Partnership for Assessment of Readiness for College and Careers

Test Development

Developing the assessment

Since 2010, more than 2,000 educators, researchers, psychometricians and others have collaborated to develop PARCC’s annual state assessments and instructional tools. PARCC was designed not only to evaluate a student’s progress, but also to provide better information for teachers and parents to identify where a student needs help, or is excelling, so they can tailor instruction to meet individual student needs.

Key milestones document major developments since 2010. PARCC states have worked closely to develop the items and tasks for the annual state assessments, as well as to set costs for the first year. The assessments were field tested in spring 2014.

Two core pieces of work have been to develop College- and Career-Ready Determination policies to allow students’ direct entry into entry-level college courses without need for remediation, and Policy-Level Performance Level Descriptors, which describe what student performance looks like at various levels of proficiency in English language arts/literacy and mathematics.

Postsecondary leaders have played a critical role in all of these efforts. The work has been overseen by the PARCC governing board of state superintendents and commissioners, as well as several advisory groups.

Milestones

2010

  • PARCC states submit their application and are awarded a grant to develop a common set of next-generation K-12 assessments in English language arts/literacy and mathematics.
  • PARCC state content teams and others begin meeting to focus on the design and development of the PARCC assessment system.

2011

  • The PARCC Governing Board approves refinements to the original PARCC design, creating the annual tests and optional instructional tools.
  • PARCC states formally engage colleges and universities in the work.
  • PARCC releases draft Model Content Frameworks for English language arts/literacy and mathematics to inform item development and help support state and district implementation of the standards.

2012

  • Item development begins.
  • Teams of K-16 educators from PARCC states meet in Chicago for the first Educator Leader Cadre meeting.
  • The states release final Model Content Frameworks for English language arts/literacy and release item and task prototypes.
  • PARCC states hold the first item review committee meeting to review PARCC passages. Item review committees meet regularly to review items and passages.
  • PARCC state K-12 and higher education chiefs adopt a College- and Career-Ready Determination policy and policy-level performance level descriptors (PLDs).
  • PARCC states release final Model Content Frameworks for mathematics.
  • PARCC states approve PARCC’s retest policy and the final high school mathematics tests: Integrated Mathematics III and Algebra II.

2013

  • Small-scale research studies begin in select PARCC states.
  • The Parcc Inc. nonprofit is launched to manage the development of PARCC’s next-generation assessments for the PARCC states.
  • PARCC states release final performance-level descriptors for all grades/courses in English language arts/literacy and mathematics, the accommodations manual, and cost estimates for the PARCC annual tests.

2014

  • One million students in 16,000 schools field test the assessment — a “test of the test.”
  • A few thousand students in high schools with “block schedules” are the first to participate in PARCC testing.

2015

  • Five million students in 12 states (11 states plus the District of Columbia) complete a PARCC test.
  • Governing Board announces design changes that shorten the test and combine the former two test windows into one.
  • Teachers, other educators and experts from PARCC states will gather in July and August to determine the range of scores that qualify for each performance level (1, 2, 3, 4, and 5).
  • States will send individual student score reports to parents in the fall. School, district, and state results will be shared publicly.

Item Development

Item and text reviews

State experts, local educators, postsecondary faculty and community members from the PARCC states conduct rigorous reviews of every item and passage being developed for the PARCC assessment system to ensure all test items are of the highest quality, aligned to the standards, and fair for all student populations. PARCC’s process allows for unprecedented collaboration among states and their K-12 and higher education communities to provide state-led quality assurance and oversight of the test development. All PARCC item reviewers are nominated by their state education agency.

The purpose of the PARCC educator committee meetings is to gather feedback on the quality, accuracy, alignment, and appropriateness of test items developed annually for the PARCC English language arts/literacy and mathematics assessments. These committees are composed of educators, university professors, and others selected by the PARCC states. Below is a list of the PARCC educator committees that review test items for inclusion on the PARCC assessments.

Download now: Lifecycle of a Test Item

State Text Review Committee (In-Person & Virtual)

Participants review and edit passages independently through an electronic display (particularly useful for multimedia passages), and then the grade-level group discusses content and bias concerns.

State Content Item Review Committee (In-Person & Virtual)

During the state content reviews, the committees will review and edit test items for adherence to the PARCC foundational documents, basic universal design principles, PARCC Accessibility Guidelines, selected metadata fields, and the PARCC Style Guide.

State Bias and Sensitivity Item Committee (In-Person & Virtual)

Educators and community members will be asked to review items and tasks to confirm the absence of bias or sensitivity issues that could interfere with a student’s ability to perform at his or her best. The objective is to provide items and tasks that do not unfairly advantage or disadvantage one student or group over another. Once items are approved by the State Content Item Review Committee, they are prepared for external bias and sensitivity review.

Editorial Review Committee (In-Person)

Prior to each editorial review meeting, Pearson will work with the partnership manager to select up to 10 percent of the items and tasks for this review. The PARCC editorial review committee participants conduct their review in Pearson’s item bank system. As with the other reviews, committee members view the items as a student would, and can vote and record their comments in the system. Unlike the other reviews, however, this is a copy-edit review rather than a content review.

Data Review Committee (In-Person)

Educators will be asked to participate in the data review meeting to evaluate item-level statistics from field-tested items. Participants decide whether each item should move forward to the operational assessments or be revised and field tested again.

Test Construction Committee (In-Person)

Educators and bias and sensitivity committee members will be asked to participate in the test construction meeting to build operational core forms that meet the PARCC assessment blueprints for the performance-based and short-answer components of the summative assessment, scheduled to be administered during the school year.

Field Test

Testing the Test

Fourteen PARCC states and the District of Columbia conducted a field test in spring 2014 to allow educators and students to “test the test.” The purposes of the field test were to examine the quality of test questions and tasks; evaluate the training materials and administration procedures; evaluate the computer-based delivery platform; and to conduct research on a range of topics, including those that will inform the reporting of results from the first round of full testing. It also gave schools the ability to check their readiness to administer the full-scale test in spring 2015.

More than one million students in 16,000 schools took the tests, or about 10 percent of the students in those states. Students with disabilities and English language learners were included among field test participants to ensure that the test works the way it should for every student. Some students took the field tests on computers, others on paper. Student responses were not scored.

The PARCC Field Test: Lessons Learned report is a comprehensive review of the field test user experiences, the test delivery platform, and other aspects of the field test. It shows that the field test went well, that PARCC states are listening to feedback and making adjustments based on that feedback, and that the testing experience for students was largely positive; for example, they were successful with keyboarding, understood directions, and found the computer-based tests engaging.

Download the report: PARCC Field Test: Lessons Learned