
Page 1: Common Core Content . . . What to Look For - eSchool News

Common Core Content . . . What to Look For

Preparing teachers and students for classroom instruction and summative assessment success is priority one as the transition to Common Core curriculum unfolds. Educators are grappling with what to look for when weighing Common Core assessment options. The choices are many—the solutions varied. How and where do educators begin to navigate through it all?

This paper offers the top questions educators should ask before they invest in Common Core content.

COMMON CORE Assessments

Office: 100 Education Way, Dover NH, 03820 | Web: MeasuredProgress.org | Toll Free: 877.432.8294

©2013 Measured Progress. All rights reserved. Measured Progress is a registered trademark and Measured Progress COMMON CORE and its logo are trademarks of Measured Progress, Inc.


IN THIS WHITE PAPER, Measured Progress offers, as guidance for educators nationwide, a list of key recommended questions to consider when researching Common Core assessment content options to prepare their students for the academic rigor that lies ahead.

1. What principles of high-quality item writing are most critical for content developers to follow? What should educators look for as key indicators of item quality?

Many item banks or assessments appear to offer thousands of items aligned to the Common Core. How did developers deliver them so quickly? It’s up to educators to determine when these items were written—as the “age” of an item could indicate that it might not measure the intent of the standard. Why? The Common Core State Standards (CCSS) were not finalized until 2012, so item sets developed before then likely do not address the full scope of several new standards.

The new Common Core instructional standards have an increased depth and rigor, with a goal of guiding students toward success in college and careers. Therefore, rather than simply aligning existing test items or assessments to the CCSS, item developers should be writing to each Common Core standard as a new construct that requires students to master skills that have rarely—if ever—been effectively assessed before by state or interim assessments. In a thorough review of the standards in 2012, content experts found that existing item banks and assessments could not adequately prepare students to meet the new standards and succeed on next-generation common assessments.

To truly address the standards, educators should seek new items and assessments that have been written recently and specifically for the Common Core. Beyond that very fundamental guideline, look for the following:

Context – Items written to the CCSS should address the overall intent of the standards. It is not sufficient to write to a single standard in isolation. While an item may be aligned to a single standard, that standard resides in the context of a broader environment. Other standards within the grade will impact the item, as will the standard’s place within the content progression.

Context is a very important part of developing items for the CCSS. It must flow naturally with the item and fit within realistic, reasonable use of the mathematical concepts. For example, converting the length of the Amazon River from kilometers to centimeters as a reason to use scientific notation would not be appropriate; using scientific notation to compare distances between planets would be. Where relevant context is used, both the problem and the solutions should be framed in terms of that context. The intent is not to have context for the sake of having context; rather, it is to determine how students can take information presented in a realistic scenario, process that information, and use their knowledge of mathematics to create meaningful results.

Consideration should be given not only to what is intended to be assessed with the item, but also to the logical prerequisite and subsequent concepts. Real-world mathematics is not restricted to a single concept within a single domain, nor should assessments be so restricted. It remains vitally important to test individual skills, but it is also necessary to assess how students integrate those skills in meaningful ways.

Page 3: Common Core Content  What to Look For - eSchool News

1. You may use this number line to help you answer the question.

[number line omitted]

Three friends are reading the same book.

● Raul has read 1/2 of the book.

● Maya has read 1/6 of the book.

● Sam has read the greatest fraction of the book.

a. Write a number sentence that compares the fraction of the book that Raul has read to the fraction of the book that Maya has read.

b. Write a fraction that could tell the part of the book that Sam has read.

Rubric

Score | Description
2 | Correct answer to part a (1/2 > 1/6 or equivalent) and part b (5/6 or any fraction greater than 1/2)
1 | Correct answer to one part
0 | Response is incorrect or contains some correct work that is irrelevant to the skill or concept being measured.
Blank | No response.

Mathematics

Figure 1: Sample items developed using principles of high-quality item writing
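The rubric above can be checked mechanically. The sketch below applies its two-point logic with exact fraction arithmetic; the fraction values come from the sample item, while the function name and scoring helper are illustrative, not part of any scoring system described here.

```python
from fractions import Fraction

def score_item(part_a_holds: bool, part_b: Fraction) -> int:
    """Apply the sample 2-point rubric: one point per correct part."""
    points = 0
    if part_a_holds:                 # part a: 1/2 > 1/6 or equivalent
        points += 1
    if part_b > Fraction(1, 2):      # part b: any fraction greater than 1/2
        points += 1
    return points

raul, maya = Fraction(1, 2), Fraction(1, 6)
print(raul > maya)                              # True: the part a number sentence holds
print(score_item(raul > maya, Fraction(5, 6)))  # 2: sample answer 5/6 earns full credit
print(score_item(raul > maya, Fraction(1, 3)))  # 1: 1/3 is not greater than 1/2
```

Using `fractions.Fraction` rather than floats keeps comparisons like 1/2 > 1/6 exact, which matches how the rubric is stated.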


Scoring Considerations – Items need to be written with scoring in mind—whether in rubrics for short-answer or constructed-response items, or in the distractors of selected-response items. The intent of the standards also has impact here. For example, if the intent of a standard is the use of a mathematical concept to solve problems, then the bulk of any scoring for that item should be based on the student’s demonstrated ability to use the concept to solve a problem, not on some other issue. More emphasis must be placed on making better use of students’ conceptual understanding (or misunderstanding), with less emphasis on the occurrence of rudimentary errors.

Complexity – Items should also be written with a range of difficulty and cognitive complexity, when appropriate to the assessment. If an assessment contains items only at the higher or lower levels, there is a risk of improper categorization of students’ abilities.

Consortia Experience – It can also be helpful if a content provider’s Common Core item-writing approach has been informed by involvement in, or at least familiarity with, the work of the national Common Core consortia, i.e., the Smarter Balanced Assessment Consortium and the Partnership for Assessment of Readiness for College and Careers (PARCC). Such experience can bring insights gained from that interaction to subsequent work building assessment tools expressly for the Common Core State Standards.

See Figure 1 below for examples of mathematics and reading items that were developed using principles such as those described above.


Reading

Lost in a Corn Maze
by Laurie Wallmark

1 DARKNESS HAS FALLEN, and the full moon casts blue-gray shadows around you. You shine your flashlight at the ten-foot tall cornstalks towering above your head. With every step, dried cornhusks crunch beneath your feet. You walk along paths filled with turns, loops, and dead ends. You hear laughter and voices, but no one is in sight. You’re lost in a corn maze.

2 A corn maze is a large, walk-through puzzle carved into a cornfield. Seen from above, its winding paths may form a picture—anything from Halloween monsters to fire-breathing dragons, flying saucers to pirate ships, or sports heroes to scary witches.

3 A farmer may enlist the help of a maze designer to create these amazing images. The designer begins with a sketch, drawn either by hand or on a computer. The design is then plowed into a cornfield using hoes, tractors, or lawn mowers. Many designers use a digital device called a Global Positioning System, or GPS, to guide their cutting.

4 Although today’s corn mazes may get a boost from modern technology, people have been designing, building, and getting lost in mazes for thousands of years. Mazes first appeared in Greek mythology; the most famous was the labyrinth at Knossos, home of the Minotaur, a half-man, half-bull monster. In the Middle Ages, gardeners built “puzzle hedges” in European castle gardens to amuse the royal court. By the 19th century, mazes had become a popular form of entertainment all over the world.

5 In 1993, producer Don Frantz and designer Adrian Fisher built the world’s first corn maze in Annville, Pennsylvania. Their dino-shaped creation, “Cornelius, The Cobasaurus,” sparked a corn maze craze—there are now approximately one thousand corn mazes across the United States, and corn mazes on every continent except for frosty Antarctica.

This question has two parts. Make sure to answer both parts of the question.

3. In Selection 2, what is the author’s point of view about corn mazes?

A. They are a good use of land around the world.

B. They are the most popular form of mazes today.

C. They are more advanced than other kinds of mazes.

D. They are a good income for entertainment business owners.

Which quote from Selection 2 supports the answer above?

A. “Many designers use a digital device called a Global Positioning System, or GPS, to guide their cutting.”

B. “In the Middle Ages, gardeners built ‘puzzle hedges’ in European castle gardens to amuse the royal court.”

C. “In 1993, producer Don Frantz and designer Adrian Fisher built the world’s first corn maze in Annville, Pennsylvania.”

D. “there are now approximately one thousand corn mazes across the United States, and corn mazes on every continent except for frosty Antarctica.”

Distractor Analysis (Part 1)

A. Although the author does mention other countries where corn mazes are now built, there is no indication that he/she thinks corn mazes are a good use of land.

B. KEY: Throughout the text, the author supports the idea that corn mazes are the kind of maze that is currently popular.

C. The author does connect ancient mazes with modern ones, but not to suggest that modern ones are more advanced (even though advanced technology is used to create them).

D. While the author does mention a business that has formed from corn mazes, he/she does not argue that this is a good form of income for entertainment businesses in general.

Distractor Analysis (Part 2)

A. This quote seems to support the idea that corn mazes are more advanced than ancient ones, but this is not supported in the article.

B. This quote seems to support the idea that corn mazes in ancient times may have been simpler than modern corn mazes, but this is not supported by the description of the Greek maze.

C. This quote seems to support the idea that corn mazes are a good form of business, but showing one business that was formed does not actually support the point.

D. KEY: This quote shows how popular corn mazes are in the U.S. and around the world.


2. Have the items been field tested?

It’s important that items used for any accountability purpose be field tested to collect data that demonstrate how well those items gather evidence of student learning for a particular standard. Field-test data on items that make up a Common Core item pool will allow developers to ultimately create a scale that allows for the comparison of individual student and group performance across interim assessment administrations. Data from a field test can validate the effectiveness of the item at collecting evidence of student understanding and measuring item performance with diverse populations of learners. Thus, a field test does more than determine the relative difficulty of each item; it can also help educators and administrators better understand how effectively each item performs for various subgroups.

3. What quality-assurance process and types of reviews have the items gone through? Who were the reviewers?

Educators should be sure that professionally developed items pass through several cycles of careful review before they reach a student. Reviews help correct potential flaws in items and make them a better measure of a student’s true learning and progress.

What to Look For

Items that Have Undergone Content and Editorial (Internal) Reviews

• To help achieve the highest possible Common Core item quality, content providers should apply the principles of Universal Design when developing items and assessments. Content specialists should review items to ensure they address precisely defined constructs, are accessible, free of bias, and amenable to accommodations. Reviewers should also make certain that items are fair and do not put test-takers from particular racial, ethnic, or gender groups at a disadvantage. The content specialist’s review should also ensure that items assess only the knowledge or skills identified as part of the standard being tested, avoiding the assessment of irrelevant factors while providing maximum readability and comprehensibility.

• An editorial review ensures items are unambiguous and free of grammatical errors, potentially insensitive content or language, and graphical miscues, thereby ensuring items display with maximum legibility.

Items that Have Undergone Teacher Review

• Good item development should reflect the experience and expertise of many people with diverse perspectives: teachers, curriculum specialists, content design specialists, editors, graphic artists, students (during field testing), and administrators. Educator item reviews play a critical role in this collaboration, as their comments and recommendations regarding the items are based on classroom experience.


• English language arts and mathematics teachers and/or curriculum and assessment coordinators should be prominent participants in item reviews to ensure that items represent a diverse spectrum of perspectives. Experts who understand curriculum, assessment, the Common Core State Standards, and content-specific teaching should help establish criteria for item review, after which committee members can independently review and provide feedback on the items.

• Bias and sensitivity reviews also play a critical role in this collaboration. Experienced educators should be invited to review and comment on items with bias and sensitivity in mind.

GUIDING PRINCIPLES AND CRITERIA

Is the item appropriate for students at this age and grade level? Is the item instructionally relevant?

• Reading difficulty is appropriate for students at this grade level.

• Vocabulary is appropriate for students at this grade level.

• Content is of interest to students at this grade level.

• Mathematical operations are appropriate for students at this grade level.

• Required reasoning skills are appropriate for students at this grade level.

Does the item align to the applicable standard?

Is the item accurate?

• There is one and only one correct answer to each selected-response item.

• The rubric for a constructed-response item reflects the expectations established in the question itself.

Is the item free of bias? In other words, is the item free of any characteristics that might result in the differential performance of two individuals of the same ability but from different subgroups?

Is the item free of issues of sensitivity? Is the item free of any characteristic that will cause a specific group of students to have a strong emotional reaction to the item?

4. Do the items successfully address and enable valid measurement of understanding of Common Core concepts?

“The Common Core State Standards require high-level cognitive demand, such as requiring students to demonstrate deeper conceptual understanding through the application of content knowledge and skills to new situations and sustained tasks.”1

For this reason, educators should look for Common Core-aligned items and assessments written to address increasing levels of complexity according to Norman Webb’s Depth of Knowledge (DOK) model. Selected-response, evidence-based selected-response, and constructed-response items can typically assess DOK levels 1-3 in reading (English language arts) and in mathematics; performance tasks can better assess DOK level 4 in all subjects. As new items and assessments evolve, educators must pay careful attention when evaluating them to determine whether item banks and tests truly assess deeper levels of knowledge.

Depth of Knowledge

Webb (1997)2 developed processes and criteria for systematically analyzing the alignment between standards and standardized assessments. This body of work offers the Depth of Knowledge model employed to analyze the cognitive expectation demanded by standards, curricular activities, and assessment tasks (Webb, 1997). The model is based on the assumption that curricular elements may all be categorized based upon the cognitive demands required to produce an acceptable response. Each group of tasks reflects a different level of cognitive expectation, or Depth of Knowledge, required to complete the task. It should be noted that the term "knowledge" as it is used here is intended to broadly encompass all forms of knowledge (i.e., procedural, declarative, etc.).

1 Smarter Balanced Assessment Consortium General Item Specifications: Measured Progress/ETS Collaborative; c2012; p. 21; http://www.smarterbalanced.org/wordpress/wp-content/uploads/2012/05/TaskItemSpecifications/ItemSpecifications/GeneralItemSpecifications.pdf

2 Webb’s Depth of Knowledge Guide; Career and Technical Education Definitions, c2009; p. 5; http://www.aps.edu/rda/documents/resources/Webbs_DOK_Guide.pdf

The following table is an adapted version of the model.

Table 2: Depth of Knowledge levels

DOK Level | Title of Level
1 | Recall and Reproduction
2 | Skills and Concepts
3 | Short-Term Strategic Thinking
4 | Extended Thinking

Educators need to evaluate whether items in an item bank or on a test form have been written to an appropriate range of difficulty and cognitive complexity, when appropriate to the assessment. If an assessment contains items only at the higher or lower levels, there is a risk of improper categorization of students’ abilities.
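A simple way to audit this is to tally the items in a bank by DOK level and flag banks clustered at only the low or only the high levels. The sketch below assumes items carry a DOK tag; the item IDs and metadata shape are invented for illustration, not drawn from any particular item bank.

```python
from collections import Counter

# Hypothetical item metadata: (item_id, DOK level 1-4)
item_bank = [
    ("M-101", 1), ("M-102", 2), ("M-103", 2),
    ("M-104", 3), ("R-201", 1), ("R-202", 4),
]

def dok_distribution(items):
    """Count how many items sit at each Depth of Knowledge level."""
    return Counter(level for _, level in items)

dist = dok_distribution(item_bank)
print(dict(sorted(dist.items())))   # {1: 2, 2: 2, 3: 1, 4: 1}

# Flag banks that cover only the low (1-2) or only the high (3-4) levels
low_only = set(dist) <= {1, 2}
high_only = set(dist) <= {3, 4}
print(low_only or high_only)        # False: this sample spans all four levels
```

A real audit would also weight the counts by how the items are assembled into forms, but even this raw tally exposes a bank written entirely at recall level.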

5. Do the items have construct validity? Do they closely match the content and standards they are intended to measure?

Assessment professionals agree that items must tightly align, in several respects, to the standards they are intended to measure. This includes ensuring that reading comprehension items are associated with “authentic,” real-life reading selections, and that item types extend beyond true-false and selected-response into more “powerful” types, such as performance-based and open-ended questions that ask students to apply their knowledge. For example, while technology-enhanced items might be engaging to students, it’s important for educators to evaluate whether the use of technology actually asks students to apply knowledge of the standard.

What to Look For:

English Language Arts Content

The Common Core English language arts standards call for the integration and assessment of reading and writing. They require students to analyze literary and informational texts, as well as compose a written response showing evidence from these sources.

The best items will be designed not only to assess students’ ability to comprehend what they read, but also to prompt them to analyze and synthesize similarities and differences between two passages and cite evidence to support their thinking. This type of assessment engages students in higher-order thinking that directly aligns to the rigor and focus of the Common Core.

Range of Text Types

The range of text types should include complete or excerpted passages from varied sources and represent a wide variety of types of literary and informational works. If possible, educators should ensure that item banks or assessments include authentic literary texts from a broad range of cultures and time periods. Similarly, informational texts should represent a wide range of types. In fact, the Common Core places greater emphasis on informational texts than do traditional standards.


Table 3: Range of Text Types

Passage Type | Examples
Literary | • Stories (fictional narratives, short stories, folktales, tall tales, legends, fables, myths, fantasy, realistic fiction)
 | • Dramas, including staged dialogue and brief, familiar scenes
 | • Poetry, including narrative, limerick, and free verse poems
Informational | • Content (reading for information)
 | • Excerpts from textbooks, encyclopedia articles, magazine articles, news articles
 | • Historical documents
 | • Biographies, speeches, editorials
 | • Science experiments, tables, graphics
 | • Technical guides

The Common Core State Standards for reading emphasize text complexity and the growth of reading comprehension. Students must be able “to comprehend texts of steadily increasing complexity as they progress through school.” Text complexity is best determined both qualitatively and quantitatively, as well as by reader and task considerations.

Qualitative Measures of Text Complexity

Expert judgment is one of the most effective ways to evaluate the complexity of literary and informational texts. Content specialists and reviewers should be able to ensure that the passages meet the requirements of the CCSS. The proof of the validity of this approach normally rests on the percentages of passages and items that eventually prove effective in field testing.

Quantitative Measures of Text Complexity

Developers can apply a number of different quantitative tools to help determine the appropriate readability level of each passage under consideration for use in a particular assessment. Since measures of text complexity must be aligned with college and career readiness, these considerations and methods can ensure that the text meets CCSS demands.
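One widely used quantitative tool of this kind is the Flesch-Kincaid grade-level formula, which estimates readability from sentence length and syllables per word. The sketch below is illustrative only: the syllable counter is a rough vowel-group heuristic, not the dictionary-based counting that production readability tools use.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count vowel groups; every word has at least one syllable."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """FK grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

sample = "A corn maze is a large puzzle carved into a cornfield."
print(round(flesch_kincaid_grade(sample), 1))   # 6.9 with this rough heuristic
```

Scores like this are only one input; as the surrounding text notes, qualitative review and reader/task considerations must weigh in alongside any formula.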

Mathematics Content

Common Core mathematics standards are grounded in a progression of learning and coherence of concepts. Because mathematics is a discipline that builds on itself, the Common Core mathematics standards combine conceptual and procedural knowledge with problem solving and reasoning. Common Core mathematics items should be designed to reflect the content standards delineated by the Common Core while integrating the Standards for Mathematical Practice3, which describe the types of expertise students should develop throughout their mathematics education.

Mathematics content should normally begin with procedural-type questions and progress to constructed-response items. The constructed-response items encourage students to conceptually integrate and apply their understanding of the key skills and concepts required by the clusters.

3 “Standards for Mathematical Practice,” Common Core State Standards Initiative, 2012. Available at http://www.corestandards.org/Math/Practice


6. Does the item bank provide a variety of item types?

Many new item types are being offered to educators. Several new types of “technology-enhanced” items—such as ones using “hot-spot” and “drag-and-drop” techniques—harness technology to allow students to show their work on computer-based tests. While these new item types are engaging, they should successfully gather information about student understanding related to a particular Common Core standard. It’s also important to look at the mix of items and the levels of Depth of Knowledge each reaches—both separately and as a set within a test form.

The recent work of the consortia points to two relatively new item types that show particular promise for measuring student knowledge not typically assessed by traditional selected-response items: evidence-based selected-response (EBSR) items and performance tasks, both of which students will see in upcoming PARCC and Smarter Balanced accountability assessments. But it is the proper integration and combination of all available item types into a real-world learning environment—and how each is used to gather information about student knowledge—that makes for an effective classroom test. In other words, just because a bank of items includes each of these different types doesn’t ensure that it’s an effective set.

Educators should be concentrating on how each type gathers evidence of student understanding, and how all work together to address groups of standards—keeping in mind that technology-based items do not necessarily dive into deeper levels of complexity.

Here is a brief summary of select item types and their recommended uses:4

Selected-response – Although these test items are the best choice for assessing fact recall and comprehension, depending on how they are written they can also be used to assess higher-order thinking such as application, analysis, and evaluation.

Evidence-based selected response – This is a newer item type that will be included in upcoming tests. EBSRs include a traditional selected-response item, followed by an item that asks students to show the text evidence that led them to the answer in the first item.

Constructed-response – These can include short-answer or extended-essay responses. While constructed-response items often give educators a deeper understanding of what students know about a particular standard, they must be written to assess the standard correctly. They must include a rubric that is consistent with the expectations stated in the question and ensures that all responses will be evaluated against the same criteria, regardless of who scores the response.

Performance tasks – PARCC and Smarter Balanced assessments will also include performance tasks that differ from traditional assessments in two ways. First, these tasks will be designed to mirror and measure real-world skills, such as research, synthesis, and writing. Second, the tasks will be completed over an extended period of time.

Technology-enhanced – These items will measure students’ understanding using technologies such as drag-and-drop, cut-and-paste, highlighting, and arranging items to show relationships.

4 For a sampling of detailed analyses pertaining to item types and their recommended applications, see also: http://cte. .edu/testing/exam/test_ques2.html#performance; http://www.smarterbalanced.org/sample-items-and-performance-tasks/; and http://www.uiowa.edu/~examserv/resources_fees/Technical_Bulletins/Improving_Test_Questions.html


7. Is there an available range of preconfigured item sets that can meet teachers’ needs for formative assessment assistance over the course of the entire school year?

Flexibility in a testing program helps educators guide their students toward demonstrating understanding of new concepts. An item bank alone, no matter how large or broad in coverage, is more effective when its items are configured into assessments that gather the information educators need to make sound and timely instructional decisions. Educators must understand the purpose of each preconfigured assessment, as well as what student responses from each can tell them about student understanding. They should seek Common Core content that includes not just an item bank, but also logical configurations of items that take into account such critical factors as:

• Test purpose

• Standards coverage and grouping

• Standards pacing

• Test length

• Test frequency

• Student evidence gathered

Different types of tests can each become part of a comprehensive classroom assessment program that prepares students for tests intended to gather achievement data, such as interim assessments or high-stakes summative assessments. These test “configurations” save teachers time and ensure that they are hitting the intended learning targets. At the same time, the item bank should have enough breadth and depth to allow teachers the flexibility to create their own quizzes and classroom formative assessment tools that zero in on areas of interest to a district, classroom, or group of students.

8. How is the item bank or assessment delivered to students?

Since educators develop and administer assessments for different reasons, good Common Core assessment content must be configurable into different types of tests and deliverable in a variety of ways for various assessment purposes. For example, formative assessment instruments must be created quickly from an item bank and given to students frequently; quantitative data may or may not be collected. District assessments might be preconfigured, or may require that educators take more time to develop them from an item bank that is “secure,” or locked down for district use only.

Likewise, student results might be more important to record for future accountability purposes. Whether quizzes or tests are delivered online or via paper and pencil, the goal is for the greatest number of students to have access to high-quality Common Core content that truly measures the new standards, regardless of what data management system their school or district uses.

Online Delivery

Item banks that can be accessed online and easily configured into tests deliverable on computers or tablets have many advantages. One is that teachers can filter items by content area, grade, standard, item type, and even Depth of Knowledge before including them in a test that best meets their needs. It’s also convenient for educators to draw from preconfigured assessments in an online system that can be scheduled and administered to students using tablets or computers. Online testing also allows for real-time data collection in many platforms, particularly for selected-response items.
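Filtering of this kind is straightforward once items carry structured metadata. The sketch below illustrates the idea with invented item records and field names; no particular platform's API is implied.

```python
# Hypothetical item records; real platforms expose similar metadata fields
items = [
    {"id": "ELA-4-01", "subject": "ELA", "grade": 4,
     "standard": "RL.4.1", "type": "EBSR", "dok": 2},
    {"id": "MA-4-07", "subject": "Math", "grade": 4,
     "standard": "4.NF.2", "type": "selected-response", "dok": 1},
    {"id": "MA-4-09", "subject": "Math", "grade": 4,
     "standard": "4.NF.2", "type": "constructed-response", "dok": 3},
]

def filter_items(bank, **criteria):
    """Return the items matching every supplied metadata criterion."""
    return [it for it in bank if all(it.get(k) == v for k, v in criteria.items())]

# Build a short quiz on one standard: filter by subject and standard code
quiz = filter_items(items, subject="Math", standard="4.NF.2")
print([it["id"] for it in quiz])    # ['MA-4-07', 'MA-4-09']
```

The same pattern extends to filtering by `dok` or `type`, which is how a teacher could assemble a form that spans the DOK range discussed earlier.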


Educators might find, however, that assessment tools in their existing online testing platforms do not include newly built, rigorous Common Core content. This can pose a problem for schools and districts that are only allowed to use those platforms recommended by educational leadership. What options do classrooms, schools, and/or districts have if the online platform they are using does not include the rigorous assessment items needed to prepare their students for the Common Core?

In addition, when evaluating Common Core content, schools and districts should look for item banks and assessments that were built to IMS Global Standards to ensure that the content can be easily “ingested” into the online platform they are using. The IMS Question and Test Interoperability (QTI) specification enables the exchange of item, test and results data between authoring tools, item banks, test construction tools, learning systems, and assessment delivery systems.5
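To make the interoperability point concrete, the sketch below generates a bare-bones QTI 2.1 selected-response item with Python's standard XML library. It is a simplified illustration: a conformant item needs more (outcome declarations, response processing, packaging metadata), so treat this as a shape sketch and consult the QTI specification for the full schema.

```python
import xml.etree.ElementTree as ET

QTI_NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"
ET.register_namespace("", QTI_NS)   # serialize with QTI as the default namespace

def minimal_qti_item(item_id, prompt, choices, correct_id):
    """Build a simplified QTI 2.1 selected-response item as an XML string."""
    item = ET.Element(f"{{{QTI_NS}}}assessmentItem",
                      {"identifier": item_id, "title": item_id,
                       "adaptive": "false", "timeDependent": "false"})
    decl = ET.SubElement(item, f"{{{QTI_NS}}}responseDeclaration",
                         {"identifier": "RESPONSE", "cardinality": "single",
                          "baseType": "identifier"})
    correct = ET.SubElement(decl, f"{{{QTI_NS}}}correctResponse")
    ET.SubElement(correct, f"{{{QTI_NS}}}value").text = correct_id
    body = ET.SubElement(item, f"{{{QTI_NS}}}itemBody")
    interaction = ET.SubElement(body, f"{{{QTI_NS}}}choiceInteraction",
                                {"responseIdentifier": "RESPONSE",
                                 "shuffle": "false", "maxChoices": "1"})
    ET.SubElement(interaction, f"{{{QTI_NS}}}prompt").text = prompt
    for cid, text in choices:
        ET.SubElement(interaction, f"{{{QTI_NS}}}simpleChoice",
                      {"identifier": cid}).text = text
    return ET.tostring(item, encoding="unicode")

xml = minimal_qti_item("demo-1", "Which fraction is greatest?",
                       [("A", "1/6"), ("B", "1/2"), ("C", "5/6")], "C")
print("choiceInteraction" in xml)   # True
```

Because the item is plain XML in the QTI namespace, any delivery system that ingests QTI can, in principle, import it regardless of which authoring tool produced it.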

Paper-and-Pencil Delivery

In other cases, it’s more convenient for teachers to assess students using traditional paper-and-pencil quizzes and tests. Responses can be scored by hand, with student results entered into a separate data management system, or scanned into a data management platform that includes access to human image scoring or automated scoring. This gives schools that lack large computer labs or one-to-one tablets the ability to expose more students to high-quality Common Core assessments. Particularly for formative purposes, paper-and-pencil quizzes and tests that focus on clusters of related standards give educators the opportunity to gauge student knowledge in one class period and afford students the opportunity to become engaged in their own learning. Item banks and assessments that can be given both online and on paper often strike the best balance.

9. Do the selected-response items include written distractor rationales for educators and students?

Teachers can learn a great deal from the items their students answer incorrectly. A valuable teaching tool, and an often-undervalued element of any good selected-response item, is its set of distractors, i.e., the incorrect answers. Distractors should be incorrect yet plausible responses. Used effectively, they can reveal misconceptions students may have about the concept being tested. As part of formative assessment, distractor rationales for well-constructed selected-response items can help teachers develop instructional strategies to correct misconceptions and help students recognize what they misunderstood.

To most easily capture this advantage, though, educators need formal, written distractor rationales that provide them with explanations about the exact misconception(s) each distractor was meant to reveal.
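In a data system, that pairing of choices and written rationales can be represented very simply. The structure and function names below are hypothetical, using the rationales from the reading item in Figure 2 as sample data:

```python
# Hypothetical structure pairing each answer choice with its written
# distractor rationale, so an incorrect response can be mapped directly
# to the misconception it was written to reveal.
item = {
    "key": "A",
    "rationales": {
        "A": "Key: Selection 1 provides the historical context.",
        "B": "The race is popular, but Selection 1 does not help the "
             "reader understand this.",
        "C": "Misreading of the text",
        "D": "Misreading of the text",
    },
}

def feedback(item, student_answer):
    """Return (is_correct, rationale) for a student's chosen answer."""
    correct = student_answer == item["key"]
    rationale = item["rationales"].get(student_answer, "No rationale recorded.")
    return correct, rationale
```

With this in place, a teacher reviewing results sees not just that a student chose B, but the specific misunderstanding the B distractor targets.
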

The next page shows two basic examples of how distractor rationales can help inform teachers’ planning for new instructional strategies.

5 For more information on IMS and QTI, see http://www.imsglobal.org/question/QTI2p1brochure.pdf.


©2012 Measured Progress. All rights reserved. | Web: measuredprogress.org

B3

CCAP1005.1

Scoring Guide

CCSS Alignment CLUSTER: Integration of Knowledge and Ideas

STANDARD: Integrate information from several texts on the same topic in order to write or speak about the subject knowledgeably.

DOK: 3

5. Which best explains how the information in Selection 1 strengthens the reader’s understanding of the information in Selection 2?

A It tells why the race started.

B It explains why the race is popular.

C It introduces the mushers who later run the race.

D It describes the people who make the race possible.

Distractor Rationales

A Key: Selection 1 provides the historical context.

B The race is popular, but Selection 1 does not help the reader understand this.

C Misreading of the text

D Misreading of the text

CCSS Alignment CLUSTER: Craft and Structure

STANDARD: Analyze multiple accounts of the same event or topic, noting important similarities and differences in the point of view they represent.

DOK: 3

6. Based on the information in both selections, which word would best describe a successful musher?

A graceful

B brave

C imaginative

D generous

Distractor Rationales

A There is evidence to support the idea that mushers must be skillful, but “graceful” implies style rather than skill and would not be a critical characteristic for a successful musher.

B Key: Evidence in both selections supports the idea that mushers must be brave in order to succeed.

C A positive word, but not supported by the selections

D Related to the idea of rewarding the last to arrive, but not correct

Figure 2: Distractor rationales as teaching and learning tools (Reading panel above; Mathematics panel below)


B2

CCAP2007.1

Scoring Guide

CCSS Alignment CLUSTER: Analyze proportional relationships and use them to solve real-world and mathematical problems.

STANDARD: Recognize and represent proportional relationships between quantities: Explain what a point (x, y) on the graph of a proportional relationship means in terms of the situation, with special attention to the points (0, 0) and (1, r) where r is the unit rate.

DOK: 2

3. Leslie is buying potatoes at a store. This graph shows the relationship between the number of pounds of potatoes she buys and the total cost.

[Graph: “Cost of Potatoes.” x-axis: Weight of Potatoes (in pounds), 0 to 5; y-axis: Cost (in dollars), 0 to 5]

Based on the graph, what is the unit cost, in dollars, of the potatoes?

A $0.75 per pound

B $1.00 per pound

C $1.33 per pound

D $1.50 per pound

Distractor Rationales

A Key

B Uses the number of bags for one unit

C Reciprocal of unit rate

D Selects based on first grid point intersected

Short-Answer Items

CCSS Alignment CLUSTER: Analyze proportional relationships and use them to solve real-world and mathematical problems.

STANDARD: Compute unit rates associated with ratios of fractions, including ratios of lengths, areas, and other quantities measured in like or different units.

DOK: 2

4. An airplane traveled 9/10 kilometer in 2/15 minute. What was its average speed in kilometers per minute?

Short-Answer Rubric

SCORE DESCRIPTION

1 Answer: 27/4 or 6 3/4 or 6.75 (km per minute) or equivalent

0 Response is incorrect or contains some correct work that is irrelevant to the skill or concept being measured.

Blank No response.
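As a sanity check on the key above, the rate computation (9/10 kilometer divided by 2/15 minute) can be verified with exact fraction arithmetic, for example:

```python
from fractions import Fraction

distance = Fraction(9, 10)   # kilometers
time = Fraction(2, 15)       # minutes

# Average speed = distance / time
speed = distance / time      # (9/10) * (15/2) = 135/20 = 27/4
assert speed == Fraction(27, 4)   # 6 3/4, i.e., 6.75 km per minute
```
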



10. Is the item bank static or dynamic?

Some assessment providers claim to have thousands of Common Core items that provide broad coverage and meet the rigor and breadth of the Common Core, but how can educators be sure this is true? A large item count doesn’t indicate a higher quality or diversity of items or assessments, nor does it indicate that the bank will continuously be refreshed with item types and assessments likely to be included in next-generation, interim, or summative assessments. In fact, this could be a sign that the item bank is “static,” consisting of older, “back-aligned” test questions that were originally developed to align to a particular state’s standards, rather than the Common Core. Even if the content has been newly built to the Common Core, it’s important for educators to know whether or not those items and assessments will be refreshed and refined to ensure that they mirror the new Smarter Balanced and PARCC content.

As the consortia collect data from field tests and feedback from both educators and students, item types, the item mix, and perhaps even the construction of the operational tests may change. That's why it’s important for item banks and assessments to be dynamic enough to support preparation for the next-generation tests, and thus to evolve and change.

Educators should seek an organization whose item development plans indicate continued expansion of their Common Core offerings and an increase in innovative item types to fulfill the blueprints outlined by the consortia.

In Conclusion . . .

In the search for Common Core success, educators have wondered what their best course of action is; they have likely asked questions similar to those addressed here. In their simplest forms, these questions and answers will help educators navigate the new Common Core landscape.

Look for high-quality, dynamic, and varied item bank content; proven success in item creation through field tests; dependable quality assurance measures; Common Core alignment at all levels; formative instruction capabilities; and a robust assessment delivery platform.

With these considerations in mind, educators can prepare teachers and students for a seamless transition to the Common Core, and ultimately, for success in next-generation summative assessments.

About Measured Progress

A New Hampshire-based, not-for-profit organization, Measured Progress is dedicated to student learning and improving instruction in the standards-based classroom. Since 1983, it has successfully partnered with more than 30 states and hundreds of districts in support of assessment programs impacting millions of students. The company develops district- and state-level assessments and is the nation’s leading provider of alternate assessments.

The Smarter Balanced Assessment Consortium awarded Measured Progress contracts to create specifications for new and innovative items; to design the consortium’s assessment technology architecture; and to develop policies and training materials related to item writing, item review, and testing accommodations and accessibility. The company has also worked with partners to develop items for the Partnership for Assessment of Readiness for College and Careers consortium.

www.measuredprogress.org/commoncore