Appendix J. Functional Requirements


Insert responses directly into the template below. If attachments or lengthy responses are required for a particular item, indicate the location of the attachment in the response box. Also, identify which item the response refers to if it is not included in the response box. All items must include a response.

A.

Test Packages

Address the following:

1.

List all testing units included in the solution.

Vendor Response:

2.

Specify all tests available within each package.

Vendor Response:

3.

How are these question items developed?

Vendor Response:

4.

For each of the tests, how many items do students answer to receive a score?

Vendor Response:

5.

In each test’s scoring, what competencies does each score range represent?

Vendor Response:

6.

Will the Vendor work with the MnSCU system to create a custom exam? If so, detail the process and timeline. Address costs associated with this in the cost proposal.

Vendor Response:

7.

Can the vendor provide career pathway assessments? If so, what are they? If not, are there plans to develop these?

Vendor Response:

8.

For each test package, who creates your test items? Are the test item creators in-house employees or professionals from the field? How extensively do you use current faculty in the field to develop test items?

Vendor Response:

9.

Are there fairness review panels for all test items? If so, describe the composition of the panel and the process to identify test bias and to ensure test fairness. If not, are there plans to develop these?

Vendor Response:

10.

Content Validity: Describe how each test measures what it purports to measure.

Vendor Response:

11.

Predictive Validity: Describe how each test correctly predicts performance.

Vendor Response:

12.

How do you determine predictive validity?

Vendor Response:

13.

How long are test scores valid?

Vendor Response:

14.

After how many instances does your instrument lose validity due to retesting?

Vendor Response:

15.

Are there correlations between your test packages (e.g., math, reading, writing)? If so, please describe.

Vendor Response:

B.

Test Package - Mathematics

Address the following:

1.

Specify all tests available within the mathematics package.

Vendor Response:

2.

Describe the available test question and answer formats (e.g., graphs, charts, multiple choice, values/expressions, etc.) used in each test.

Vendor Response:

3.

Describe how each test measures competencies in arithmetic, elementary algebra, intermediate algebra, college algebra, other college entry-level mathematics and statistics courses, trigonometry, and calculus I.

Vendor Response:

4.

Describe how each test measures readiness and success (a 50% likelihood that the student will earn a “C” or higher) in the course in which the student is placed, for each of the following: arithmetic, elementary algebra, intermediate algebra, college algebra, other college entry-level mathematics and statistics courses, trigonometry, and calculus I.

Vendor Response:

5.

How are the question items developed?

Vendor Response:

6.

What is the test bank size of each of the tests listed above to ensure that each testing session is sufficiently unique?

Vendor Response:

7.

What is the lifespan of the question items in each of the tests listed above? How often do the question items get refreshed?

Vendor Response:

8.

Content Validity: Describe how each test measures what it purports to measure.

Vendor Response:

9.

Provide samples of test items for each test.

Vendor Response:

C.

Test Package – Reading

Address the following:

1.

Specify all tests available within the reading package.

Vendor Response:

2.

How are the question items developed?

Vendor Response:

3.

How do your tests distinguish between varying levels of developmental readers below college level?

Vendor Response:

4.

Provide samples of passages used for each test (rhetorical, literature, etc.).

Vendor Response:

5.

What subjects and topics are included in the passages?

Vendor Response:

6.

What is the lowest level of difficulty of the passages and how is that determined?

Vendor Response:

7.

Can students review the text when answering the questions or must they answer the questions without referring back to the text?

Vendor Response:

8.

What types (literal and/or inferential) of reading comprehension questions does each test include?

Vendor Response:

9.

How does each test assess vocabulary?

Vendor Response:

10.

Content Validity: Describe how each test measures what it purports to measure.

Vendor Response:

11.

In your scoring, what competencies does each score range represent?

Vendor Response:

12.

What is the test bank size of each of your reading tests to ensure that each testing session is sufficiently unique?

Vendor Response:

13.

What is the lifespan of the question items in each of your reading tests? How often do the question items get refreshed?

Vendor Response:

14.

Provide samples of test items for each test.

Vendor Response:

15.

Describe how each test correctly predicts a student earning a “C” or better in the course in which they are placed.

Vendor Response:

D.

Test Package – Writing

Address the following about all writing tests:

1.

Specify all tests available within the writing package.

Vendor Response:

2.

How are these question items developed?

Vendor Response:

3.

Describe how each of your writing tests measures writing skills.

Vendor Response:

4.

Are writing tests scored by humans or computers?

Vendor Response:

5.

Describe how each test correctly predicts a student earning a “C” or better in the course in which they are placed.

Vendor Response:

6.

What writing skills are evaluated as part of each test?

Vendor Response:

7.

What grammar proficiencies does each writing test evaluate?

Vendor Response:

8.

For each test, what competencies does each score range represent? What is the test bank size of each test to ensure that each testing session is sufficiently unique?

Vendor Response:

9.

What is the lifespan of the question items in your writing test? How often are the question items refreshed?

Vendor Response:

Essay Writing

10.

Does the test package include analysis of a writing sample? If provided, what criteria are evaluated in your rubric for scoring writing samples?

Vendor Response:

11.

Does the solution allow access to completed essays for review?

Vendor Response:

12.

Does the Vendor allow faculty to design customized essay prompts? If so, what is the process for integrating customized prompts?

Vendor Response:

13.

How does the test package check for possible plagiarism?

Vendor Response:

14.

Provide samples of writing prompts.

Vendor Response:

E.

Test Package – ESOL

Address the following about all ESOL tests:

1.

Specify all tests available within the ESOL package.

Vendor Response:

2.

How are these question items developed?

Vendor Response:

3.

How does each test in your ESOL package test the target skills being evaluated?

Vendor Response:

4.

In your scoring, what competencies does each score range represent?

Vendor Response:

5.

How large are your test banks and how often are they refreshed?

Vendor Response:

6.

Content Validity: Describe how each test measures what it purports to measure.

Vendor Response:

7.

Predictive Validity: Describe how each test correctly predicts performance.

Vendor Response:

8.

How do your ESOL test packages align to WiDA standards?

Vendor Response:

9.

Provide samples of test items for each test.

Vendor Response:

ESOL Reading Comprehension

10.

Specify all tests available within the reading package.

Vendor Response:

11.

How are these question items developed?

Vendor Response:

12.

What types of texts do you use?

Vendor Response:

13.

What types of reading comprehension questions does your test include?

Vendor Response:

14.

How does your test assess vocabulary for reading?

Vendor Response:

15.

Describe how your test measures reading comprehension.

Vendor Response:

16.

Describe how your reading test correctly predicts reading performance.

Vendor Response:

17.

In your scoring, what competencies does each score range represent?

Vendor Response:

18.

Do you allow faculty to design a customized reading test?

Vendor Response:

19.

What is the test bank size of your reading test to ensure that each testing session is sufficiently unique?

Vendor Response:

20.

What is the lifespan of the question items in your reading test? How often do the question items get refreshed?

Vendor Response:

ESOL Listening Comprehension

21.

Specify all tests available within the listening package.

Vendor Response:

22.

How are these question items developed?

Vendor Response:

23.

What types of listening texts do you use?

Vendor Response:

24.

What types of listening comprehension questions does your test include?

Vendor Response:

25.

How does your test assess vocabulary for listening?

Vendor Response:

26.

Can students take notes during the listening test?

Vendor Response:

27.

How many times can a student choose to have a passage repeated?

Vendor Response:

28.

Are the questions used in the listening test also provided in written form?

Vendor Response:

29.

Describe how your listening test measures listening skills.

Vendor Response:

30.

Describe how your test correctly predicts listening performance.

Vendor Response:

31.

In your scoring, what competencies does each score range represent?

Vendor Response:

32.

Do you allow faculty to design a customized listening test?

Vendor Response:

33.

What is the test bank size of your listening test to ensure that each testing session is sufficiently unique?

Vendor Response:

34.

What is the lifespan of the question items in your listening test? How often do the question items get refreshed?

Vendor Response:

ESOL Writing

35.

Specify all tests available within the writing package.

Vendor Response:

36.

If you use questions, how are these question items developed?

Vendor Response:

37.

What writing skills are evaluated as part of your test?

Vendor Response:

38.

Does your test include analysis of a writing sample?

Vendor Response:

39.

How does your test assess vocabulary for writing?

Vendor Response:

40.

If provided, how does your test evaluate writing samples?

Vendor Response:

41.

Describe how your writing test measures writing skills.

Vendor Response:

42.

Describe how your writing test correctly predicts writing performance.

Vendor Response:

43.

In your scoring, what competencies does each score range represent?

Vendor Response:

44.

Do you allow access to completed essays for review?

Vendor Response:

45.

Do you allow faculty to design customized essay prompts?

Vendor Response:

46.

What is the test bank size of your writing test to ensure that each testing session is sufficiently unique?

Vendor Response:

47.

What is the lifespan of the question items in your writing test? How often do the question items get refreshed?

Vendor Response:

ESOL Speaking

48.

Specify all tests available within the speaking package.

Vendor Response:

49.

If you use prompts, how are these prompts developed?

Vendor Response:

50.

What speaking skills are evaluated as part of your test?

Vendor Response:

51.

Does your test include analysis of a speaking sample?

Vendor Response:

52.

If provided, how does your test evaluate speaking samples?

Vendor Response:

53.

What types of prompts do you use?

Vendor Response:

54.

How does your test assess vocabulary for speaking?

Vendor Response:

55.

Describe how your speaking test correctly predicts speaking performance.

Vendor Response:

56.

In your scoring, what competencies does each score range represent?

Vendor Response:

57.

Do you allow access to completed speaking samples for review?

Vendor Response:

58.

Do you allow faculty to design customized speaking prompts?

Vendor Response:

59.

What is the test bank size of your speaking test to ensure that each testing session is sufficiently unique?

Vendor Response:

60.

What is the lifespan of the question items in your speaking test? How often do the question items get refreshed?

Vendor Response:

Other Assessments for ESOL

61.

What other ESOL tests does your package include aside from those listed here (reading, listening, writing, & speaking)?

Vendor Response:

62.

If you have other ESOL tests, please provide details similar to those areas above.

Vendor Response:

F.

Test Package - Computer Skills Test

Address the following:

1.

Specify all tests available within the computer skills test package.

Vendor Response:

2.

How are these question items developed?

Vendor Response:

3.

What skills are being tested within each test?

Vendor Response:

4.

How does each of the tests in your computer skills test package test the target skills being evaluated?

Vendor Response:

5.

In your scoring, what competencies does each score range represent?

Vendor Response:

6.

How often is the computer skills assessment upgraded?

Vendor Response:

7.

How large are your test banks and how often are they refreshed?

Vendor Response:

8.

Describe how your computer skills test measures computer skills.

Vendor Response:

9.

Describe how your test package correctly predicts computer skills performance.

Vendor Response:

G.

Diagnostic Capabilities

Address the following:

1.

What product(s) are available from the Vendor that provide diagnostic capabilities in all areas mentioned previously? Describe in detail.

Vendor Response:

2.

Describe the diagnostic capabilities of the Vendor’s test and how the diagnostic features are integrated into that test.

Vendor Response:

3.

Does the solution provide recommended targeted intervention plans as a result of the diagnostic tool results? If so, describe how the diagnostic test capabilities can align diagnostic scores with targeted instruction and learning opportunities.

Vendor Response:

4.

Explain how a course placement assessment instrument aligns with a diagnostic assessment instrument; or how a diagnostic assessment can predict a course placement assessment.

Vendor Response:

5.

What is the ability of the solution to provide both a placement score and a diagnostic score as the result of one test?

Vendor Response:

6.

Describe how the diagnostic tests have the ability to branch between tests.

Vendor Response:

7.

Does the solution have the capability to weight strands on diagnostic tests? If so, describe how weights are applied. Are the weights predetermined or customizable by site?

Vendor Response:

8.

Describe how the diagnostic tests use weighted strands to branch from the diagnostic tests into the course placement tests.

Vendor Response:

H.

Alignment with K-12 Standards/Common Core

Address the following:

1.

Describe the solution’s P-20 assessment products and the relationship of these products with the Vendor’s college-level course placement instrument.

Vendor Response:

2.

Describe how the solution correlates with nationally normed college entrance exams, including the ACT.

Vendor Response:

3.

Explain how course placement assessments and diagnostic tools are predictive of career and college readiness benchmarks on nationally normed college entrance exams, including the ACT.

Vendor Response:

4.

Describe how the solution aligns with Minnesota K-12 academic standards and the Common Core State Standards.

Vendor Response:

I.

Test Design and Test Banks

Address the following:

1.

Define the solution’s ability to customize test packages, cut scores, and placement messages to ensure results that meet the specific needs of an institution.

Vendor Response:

2.

Describe how the Vendor provides a validity study based on the customer’s individualized data. How many validity studies does the Vendor conduct per contract?

Vendor Response:

3.

Describe if and how the test bank content is culturally sensitive, minimizes bias, and appropriately addresses the diverse range of cultures unique to Minnesota’s population with regard to ethnicity, race, gender, socioeconomic status, age, gender identity, religion, nationality, intellectual and physical abilities.

Vendor Response:

4.

Describe all the ways the solution addresses the inclusion and representation of traditionally underrepresented and under-served students.

Vendor Response:

5.

Describe the solution’s ability to utilize MnSCU-developed demographic questions.

Vendor Response:

6.

Describe if and how the solution will allow for students, testing administrators, and/or faculty the ability to edit student demographics and other information.

Vendor Response:

7.

Describe how the solution provides for the ability to add or administer additional test items or test sections.

Vendor Response:

8.

Describe how the solution allows for the ability to define and set locally determined cut-scores.

Vendor Response:

9.

Is the solution able to provide tools/tutorials to help students during the test event? If yes, document and describe how the solution supports user tools (e.g., calculator, spell check, graphing tools, visually-based dictionary, dictionary, thesaurus, text pop out, measurement tools, electronic annotation, formula charts, and sketch pads being tagged to specific items/tasks during the test).

Vendor Response:

10.

Describe how stakeholders (i.e., faculty members, staff members, administration, and/or students) are a part of the test question development process.

Vendor Response:

11.

Describe the types of analytics and predictive analytics the solution provides. Address costs associated with analytic functions in the cost proposal.

Vendor Response:

12.

Is the construction of each test identical when administered at different institutions? Are the results reliably consistent and equivalent?

Vendor Response:

13.

What is the size of your test bank to ensure that each testing session is sufficiently unique? Include the size of the question bank for the first question of each test.

Vendor Response:

14.

Does the solution have the ability to import student data, assign test packages, and provide testing tickets for access to tests?

Vendor Response:

15.

What tools do you offer that allow testing sites to minimize the need for human resources for test proctoring?

Vendor Response:

16.

How does the solution allow the testing site to create custom exams that allow for easy addition of images and sound?

Vendor Response:

17.

What is the process/procedure if a faculty member has a concern about a test item?

Vendor Response:

J.

Adaptive Capabilities and Branching

Address the following:

1.

Describe how the test content and cognition assessment structure will be linear, sequential, branching to higher-level and lower-level tests, and/or item adaptive.

Vendor Response:

2.

Describe the logic used to move up and down within tests and question items.

Vendor Response:

3.

Describe how the test can allow for customizable branching rules that automatically direct a student to a subsequent test.

Vendor Response:

4.

Describe if and how the item pool will have the flexibility to be delivered adaptively. If the adaptive functionality varies by test package/discipline, describe.

Vendor Response:

5.

Describe how the test allows for branching between tests and within the tests without completing the entire test.

Vendor Response:

K.

Multiple Measures

Address the following:

1.

Does the solution have the ability to weight other responses or information external to the assessment and incorporate it into the test score? If yes, please describe.

Vendor Response:

If no, what customization is available to create the ability to include multiple measures? Address costs to customize this in the cost proposal.

Vendor Response:

2.

Describe how the solution’s preregistration process could be used to collect additional information about students that can be used to inform a multiple-measures approach to placement.

Vendor Response:

3.

Does the solution have a product that assesses non-cognitive skills? If not, what customization would permit the solution to assess non-cognitive skills? Please address costs associated with additional assessments in the cost proposal. Does the solution have the capability to incorporate the results of non-cognitive assessments into the placement score?

Vendor Response:

L.

Student Testing

Address the following:

1.

Describe what student test preparations are available with this solution and how the test preparations are delivered. Describe how sample tests are included.

Vendor Response:

2.

What different technology platforms are test preparation materials compatible with?

Vendor Response:

3.

What instructions are provided to students to ensure they understand what is being asked of them (e.g., that only one item should be checked in a multiple-choice question)?

Vendor Response:

4.

Describe how the solution will allow a student to review their answers for some sections or sets of questions before moving on to the next section or completing the exam.

Vendor Response:

5.

Describe how the solution explains to the test-taker the computer adaptive nature of the assessment.

Vendor Response:

6.

Describe how the solution will allow students to have access to their answers, diagnostic information, and placement results.

Vendor Response:

7.

Describe and demonstrate how the test and item delivery will appear as a seamless experience to the student.

Vendor Response:

8.

Describe all implications associated with retakes (i.e., student costs, institutional costs, and how multiple test scores in the same area are computed and derived).

Vendor Response:

M.

Testing Process and Testing Sites

Address the following:

1.

Describe how the solution allows for security measures in the proctoring of tests. Does the solution prohibit the opening of other screens, browsers, and tabs?

Vendor Response:

2.

Describe how the solution allows for the ability to identify at which testing site a student took the exam.

Vendor Response:

3.

Describe if there are test access restrictions by testing site.

Vendor Response:

4.

How many testing sessions can the Vendor support per testing site per day?

Vendor Response:

5.

List the maximum number of concurrent testing sessions platform-wide.

Vendor Response:

6.

Is there ever a time when the MnSCU system would reach the Vendor’s maximum capacity to handle concurrent tests?

Vendor Response:

7.

Describe how the solution will allow for the ability to control retakes and number of retakes.

Vendor Response:

8.

Describe the average amount of time it takes for a user to complete each test.

Vendor Response:

9.

Describe how the solution allows for students to pause a test and return at a later time.

Vendor Response:

10.

Is there a cost associated with creating a user record prior to taking a test? If yes, address this in the cost proposal.

Vendor Response:

11.

Can tests be administered at other locations (e.g., high schools) without college personnel needing to be present?

Vendor Response:

N.

Test Scores/Results and Reporting Function

Address the following about Test Scores and Results:

1.

Must each test be completed to receive a score?

Vendor Response:

2.

How soon after completion of tests are student scores available?

Vendor Response:

3.

How soon are online reports available for faculty, advisors, or administrators?

Vendor Response:

4.

Describe how the solution will allow the optional inclusion and incorporation of interim and human scoring, particularly for writing samples, when they exist.

Vendor Response:

5.

Describe whether site test administrators have the ability to display scores immediately.

Vendor Response:

6.

Does the solution allow individual student score reports from other MnSCU testing sites to be viewed at any MnSCU site?

Vendor Response:

7.

How often are testing records purged?

Vendor Response:

Reporting Function:

8.

Provide evidence of the Vendor’s demonstrated experience developing and delivering comprehensive and customizable score reports at the state and/or institutional level(s) for users at varied levels of access and information (e.g., counselors, faculty, students, researchers, etc.).

Include any additional information to demonstrate how this can be integrated with multiple measures for informing placement of students.

Vendor Response:

9.

Describe all levels and/or roles that have access to reports (e.g., student level, program level, advisors, faculty, institutional research, etc.).

Vendor Response:

10.

Describe how the solution will allow for the generation of monthly and annual reports at the system-wide and local levels.

Vendor Response:

11.

Describe available enrollment management analytics.

Vendor Response:

12.

List and describe all available reports.

Vendor Response:

13.

How are roles and permissions set for those who have access to data and reports?

Vendor Response:

14.

How granular are the roles or settings?

Vendor Response:

15.

Describe all options for saving/exporting reports (e.g., Excel, CSV, XML, delimited text, etc.).

Vendor Response:

16.

Describe how the solution could display student data relating to multiple measures to appropriate and various stakeholders.

Vendor Response:

17.

Provide evidence of the Vendor’s demonstrated ability to design and produce customized assessment reports with minimal latency to users at varied levels of access (e.g., counselors, faculty, students, researchers, etc.) at the system-wide and local levels.

Vendor Response:

O.

Administrator Functions

Address the following:

1.

Define the solution’s ability to customize messages with images to ensure results that meet the specific needs of an institution.

Vendor Response:

2.

Describe how the solution will allow administrators to turn on and off user tools (e.g., calculator, spell check, graphing tools, visually based dictionary, dictionary, thesaurus, text pop out, measurement tools, electronic annotation, formula charts, and sketch pads).

Vendor Response:

3.

Describe how the system will allow testing administrators to monitor/control student re-testing history across the MnSCU system.

Vendor Response:

P.

User Management

Address the following:

1.

Describe how the solution offers single sign-on capability for students, faculty, and administration.

Vendor Response:

2.

Does the solution allow for MnSCU to enter a user-defined ID for sign-on?

Vendor Response:

3.

Describe how the solution provides the ability to easily add proctors and users.

Vendor Response:

4.

Describe how the solution provides the ability to add and remove testing sites.

Vendor Response:

5.

Define the solution’s standard access roles and describe how the solution manages multiple levels of access by these roles.

Vendor Response:

6.

How does the solution import student data, assign test packages, and provide tickets for test access?

Vendor Response:

7.

What tools does the solution offer to minimize the human resources needed for test proctoring?

Vendor Response:

Q.

Internet Browser Access and System Requirements

Address the following:

1.

Which browsers are supported by the solution?

Vendor Response:

2.

Describe how the solution will support new browser versions, and within how many days of version availability.

Vendor Response:

3.

What are the system requirements for:

a. Client-based product solutions (software)?

Vendor Response:

b. Cloud or web-based product solutions?

Vendor Response:

4.

Specify all platforms which are compatible with the proposed solution (e.g., Mac, Windows, Linux, or other).

Vendor Response:

R.

Computers and Workstations

Address the following:

1.

Will the solution function with non-English keyboards? If yes, document and describe.

Vendor Response:

2.

Does the solution have the ability to easily verify whether computers used to take the test meet system requirements? If yes, document and describe.

Vendor Response:

3.

Does the solution have the ability to deliver the test software securely to individual student workstations or devices? If yes, document and describe.

Vendor Response:

4.

Can the solution provide software updates that ensure performance stability during the software update process? If yes, document and describe.

Vendor Response:

5.

Is the solution compatible with classroom management systems (e.g., InSight, SynchronEyes)? If so, does the solution require an individual identifier to authenticate login?

Vendor Response:

S.

Remote Testing and Virtual Proctoring

Address the following:

1.

Does the solution permit remote testing? If yes, describe how and what is required.

Vendor Response:

2.

What security measures do you have for remote administration of your test?

Vendor Response:

T.

Mobile Applications

Address the following:

1.

List and describe mobile applications related to this solution.

Vendor Response:

2.

Which devices do the mobile applications or functions support?

Vendor Response:

3.

Which mobile platforms do the mobile applications or functions support?

Vendor Response:

4.

What functionality is included in the mobile applications related to this solution, if not already listed?

Vendor Response:

5.

What mobile functionality is currently under development and when will it become available?

Vendor Response:

U.

Usability and Ease of Use

Address the following:

1.

How is the proposed solution’s interface intuitive?

Vendor Response:

2.

How is the solution’s ease of use tested?

Vendor Response:

3.

How is the proposed solution designed for higher education?

Vendor Response:

4.

Provide sample user testing reports.

Vendor Response:

5.

Describe the various forms of help and user guides that are available and for whom they are designed (students, test administrators, etc.).

Vendor Response:

6.

Describe all forms of in-test assistive technologies available, including where and how they can be used.

Vendor Response:

V.

Accessibility Standards and ADA Compliance

Address the following:

1.

Provide the Vendor’s statement of compliance with Section 508 and the Americans with Disabilities Act (ADA).

Vendor Response:

2.

Submit Appendices M and N, the Voluntary Product Accessibility Template (VPAT) statements for this solution.

Vendor Response:

3.

How does the proposed solution address Web accessibility issues including a statement of the current level of compliance with the W3C Web Accessibility Initiative (WAI) and Section 508, and/or future plans to achieve compliance?

Vendor Response:

4.

Do all users, with or without disabilities, have equally effective access to all functions and aspects of your product, with equal quality, timeliness, and availability?

Vendor Response:

5.

Are support materials (documentation, online help, video tutorials, etc.) accessible with equally effective access, quality, timeliness, and availability? Can they be enlarged, are they closed-captioned, and will screen readers or screen-navigation software work with these materials?

Vendor Response:

6.

What tests and tools have been applied to ensure accessibility compliance?

Vendor Response:

7.

Does the company respond with timely updates to the product when accessibility issues arise?

Vendor Response:

8.

Will the solution support items/tasks that can accommodate students with disabilities (e.g., items/tasks including Sign Language, refreshable Braille, text-to-speech tags, text-magnifying software, and speech-to-text tags)? Braille support must include contracted, uncontracted, and Nemeth Braille. If yes, document and describe.

Vendor Response:

9.

Does the solution allow users with limited mobility to navigate using, for example, Dragon™ speech-recognition software?

Vendor Response:

10.

Will the solution allow test administrators to print fixed forms (to include: items, stimuli, and necessary resources) to support system accessibility requirements with appropriate security procedures? If yes, document and describe.

Vendor Response:

11.

Describe the solution’s accommodations for students with disabilities, and how the solution addresses different types of disabilities.

Vendor Response:

12.

Describe assistive technologies that have been tested and how they can be used with the proposed solution.

Vendor Response:

13.

Does the company test assistive technologies with product updates? What are the results of all tests?

Vendor Response:

14.

Can a person with a disability use the company website to learn how to navigate through the proposed solution with accessibility tools?

Vendor Response:

15.

Are mathematics symbols compatible with screen readers?

Vendor Response:

16.

Does the solution provide a quality built-in screen reader?

Vendor Response:

17.

What other testing modalities are available (e.g., Braille, paper/pencil, audio)? If available, how often do you refresh the items found in these alternate formats?

Vendor Response:

18.

Are all tests available in paper-and-pencil format?

Vendor Response:

W.

Feature Requests

Address the following:

1.

What is the process for users to request feature enhancements?

Vendor Response:

2.

How are feature enhancement requests addressed?

Vendor Response:

3.

Are users notified of a timeline for proposed feature enhancements?

Vendor Response:

4.

Are the feature enhancements created for other clients available and/or transferable to MnSCU?

Vendor Response:

5.

What is the prioritization process for feature enhancements?

Vendor Response:

X.

Training Services

Address the following:

1.

Describe in detail the Vendor’s services for training of users, including:

a.

Types of training (onsite, webinar, video tutorials, etc.).

Vendor Response:

b.

Implementation plan for a “train-the-trainer” model for consortia systems (i.e., trainer ramp-up).

Vendor Response:

c.

The user communities available for your product.

Vendor Response:

2.

Provide information and access for all available communities.

Vendor Response:

3.

Describe availability of user guides and other user help.

Vendor Response:

Y.

Implementation (Initial and Version Releases)

Detail the Vendor’s services for the following project implementation phases. This should include services for the migration period and all future version upgrades or maintenance releases of significant magnitude to warrant project management services. For each section, indicate whether these services are included in the base cost. If not, detail the costs/options in Error! Reference source not found.

Project Management

1.

List all Vendor personnel involved in conducting the project and/or in the implementation. Include each employee’s education and experience related to higher education and Student Information Systems, and list where personnel will be physically located during the time they are engaged in the work.

Vendor Response:

2.

Indicate the responsibilities that Vendor personnel will have on the project as well as how long each has been in the employ of the Vendor.

Vendor Response:

3.

Indicate the number and type of resources the MnSCU system and its member institutions or campuses will be expected to provide to assist in implementation, support, and knowledge transfer.

Vendor Response:

4.

Provide résumés describing the educational and work experiences for each of the key staff that would be assigned to the project.

Vendor Response:

Estimate of implementation tasks and time

5.

Describe all of the tasks and estimated timeline involved in setting up the solution for the MnSCU system.

Vendor Response:

6.

Describe how the enterprise solution will work. Specifically address:

a.

How would tests and data for MnSCU institutions be parsed by institution and by individual?

Vendor Response:

b.

Describe the process for defining sites within the solution so that MnSCU can identify at which testing site a test was taken.

Vendor Response:

7.

What is the Vendor’s capacity to begin the onboarding process and how quickly can this process be initiated after a contract is signed?

Vendor Response:

By signing this statement, you certify the information provided in the Functional Requirements, Appendix J (and any attachments) is accurate and that you are authorized to sign on behalf of the Respondent.

Name of Company:________________________________________________________________

Authorized Signature:______________________________________________________________

Printed Name:________________________________________Title:________________________

Date: _________________ Telephone number:__________________________________________
