
Page 1:

Aligning Science Assessment to Content Standards

George DeBoer, Arhonda Gogos, Cari Herrmann Abell, Kristen Lennon, An Michiels, Tom Regan, Jo Ellen Roseman, Paula Wilson

Center for Curriculum Materials in Science

Knowledge Sharing Institute

Ann Arbor, Michigan

July 10-12, 2006

This work is funded by the National Science Foundation

ESI 0352473

Page 2:

Thanks to:

Abigail Burrows for organizing the pilot testing with schools.

Ed Krafsur for developing the assessment database.

Brian Sweeney for developing illustrations for test items.

Page 3:

Strand 6: Part I

Examining the Project 2061 Criteria for Aligning Middle School Assessment Items

to Learning Goals

Page 4:

Aligning Student Assessment to Content Standards

What We Are Doing: Project Background

Creating a bank of middle and early high school science assessment items that are precisely aligned with national content standards

Providing resources to support the creation and use of assessment items aligned to content standards

Developing a database for these resources and a user interface to access them

In this session, we will focus on the criteria we use for judging alignment of assessment items to content standards.

Page 5:

Resources We Will Provide

Clarifications of the content standards (elaboration and boundary setting, i.e., what’s in and what’s out), to add precision to the alignment of assessment items.

Summaries of research on student learning (misconceptions and other ideas students hold) related to the ideas in the content standards; these provide the distractors for assessment items.

Assessment maps (which include prerequisite ideas, related ideas, ideas that come later in the learning trajectory). Useful for developing test instruments on a specific topic. Also useful in item development for deciding what knowledge is reasonable to expect students to have (e.g., bedrock).

Page 6:

List of Topics

1. Atoms, Molecules and States of Matter

2. Substances, Chemical Reactions and Conservation

3. Processes that shape the Earth / Plate Tectonics

4. Weather and Climate

5. Solar System

6. Energy Transformations

7. Force and Motion

8. Forces of Nature

9. Sight and Vision

10. Mathematics: Summarizing Data

11. Mathematics: Relationships among Variables

Page 7:

List of Topics, Continued

12. Basic Functions in Humans

13. Cells and Proteins

14. Evolution and Natural Selection

15. Interdependence, Diversity and Survival

16. Matter and Energy Transformations in Living Systems

17. Sexual Reproduction, Genes and Heredity

18. Cross-cutting Themes: Models

19. Nature of Science: Claims of Causal Relationships

20. Nature of Science: Inductive Reasoning

21. Nature of Science: Empirical Validation of Ideas about the World

22. Nature of Science: Uncertainty and Durability

Page 8:

Examples of:

Clarification statements

Summaries of research on student learning

Assessment maps

How each is used in the item development work.

Page 9:

Idea B: All atoms are extremely small (from BSL 4D/M1a).

Students are expected to know that atoms are much smaller than very small items with which they are familiar, such as dust, blood cells, plant cells, and microorganisms, all of which are made up of atoms. Students should know that the atoms are so small that many millions of them make up these small items with which they are familiar. They should know that this is true for all atoms. The comparison with very small objects can be used to test students’ qualitative understanding of the size of atoms in relation to these objects. Students will not, however, be expected to know the actual size of atoms.

Page 10:

Student Misconceptions Related to the Size of Atoms:

Atoms and/or molecules are similar in size to cells, dust, or bacteria (Lee et al., 1993; Nakhleh et al., 1999; Nakhleh et al., 2005).

Atoms and/or molecules can be seen with magnifying lenses or optical microscopes (Griffiths et al., 1992; Lee et al., 1993).

Page 11:

Page 12:

Steps in the Item Development Procedure

1. Select a set of benchmarks and standards to define the boundaries of a topic

2. Tease apart the benchmarks and standards into a set of key ideas

3. Create an assessment map showing how the key ideas build on each other conceptually

4. Review the research on student learning to identify ideas students may have about the content

5. Design items:

a. using student misconceptions as distractors

b. using the assessment analysis criteria

c. following a list of design specifications

Page 13:

Steps in the Item Development Procedure, Continued

6. Use open-ended interviewing to supplement published research on student learning

7. Use mini “item camps” to get feedback on items from staff

8. Revise items

9. Pilot test items and conduct think aloud interviews

10. Analyze pilot test data

11. Revise items

12. Conduct formal reviews of approximately 25 items using the assessment analysis criteria

13. Revise items

14. Conduct national field test of items

Page 14:

Demonstration of the Database and User Interface:

1. Items

2. Misconception List

3. Topics, key ideas, clarifications

4. Assessment Maps

5. Item Specifications
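As a rough illustration of how these five kinds of records might relate to one another in such a database, here is a minimal, hypothetical sketch in Python. All class and field names below are invented for this example; the slides do not show the project's actual schema.

# Hypothetical sketch of the record types the demonstration covers.
# Names are illustrative only, not taken from the real database.
from dataclasses import dataclass, field

@dataclass
class Misconception:
    code: str                                  # e.g., "M-A1"
    statement: str                             # the misconception itself
    citations: list[str] = field(default_factory=list)

@dataclass
class KeyIdea:
    code: str                                  # e.g., "Idea B"
    statement: str
    clarification: str                         # elaboration and boundary setting
    misconception_codes: list[str] = field(default_factory=list)

@dataclass
class AssessmentItem:
    item_id: str                               # e.g., "AM42-4"
    stem: str
    choices: dict[str, str]                    # answer letter -> choice text
    answer_key: str                            # the correct letter
    targeted_idea: str                         # code of the KeyIdea it is aligned to
    # answer letter -> code of the misconception used as that distractor
    distractor_sources: dict[str, str] = field(default_factory=dict)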

Page 15:

The Project 2061 Assessment Analysis Procedure

Page 16:

There are six parts to the analysis procedure:

1. Exploring the Learning Goal

2. Determining Content Alignment

3. Determining Whether the Task Accurately Reveals What Students Do or Do Not Know

4. Considering the Task’s Cost Effectiveness

5. Suggesting Revisions

6. Assessment Item Rating Form (not included in this version)

Page 17:

Reviewers use the following materials:

Assessment items

The content standard that is being targeted

Clarification statements

Lists of common student misconceptions and other ideas students may have.

Results of student interviews or field test results if available

Page 18:

I. Exploration Phase

Determining the alignment of an assessment task to a learning goal requires a precise understanding of the meaning of the learning goal and what knowledge and skills are needed to successfully complete the task.

Page 19:

A. The Learning Goal

1. Reviewers carefully read the clarification statement written for the targeted learning goal (content standard or benchmark).

2. Reviewers examine the list of misconceptions related to the targeted learning goal.

Page 20:

B. The Assessment Task

1. Reviewers:

a. attempt to complete the task themselves.

b. list the knowledge and skill needed to successfully complete the task.

c. consider if there are different strategies that can be used to successfully complete the task.

d. consider which misconceptions might affect student answers.

Page 21:

II. Determining the Content Alignment between the Learning Goal and the Assessment Task

Page 22:

A. Necessity

1. To be content aligned, knowledge of the ideas described in the learning goal or the clarification statement, or knowledge that certain commonly held misconceptions are not true, must be needed to evaluate each of the answer choices.

Page 23:

Reviewers are told:

If the knowledge in the learning goal is not needed to decide if the answer choices are correct or incorrect, explain how the answer choices can be evaluated using other knowledge.

Page 24:

Applying the Necessity Criterion

Which of the following is the smallest?

A.  An atom

B.  A bacterium

C.  The width of a hair

D.  A cell in your body

Page 25:

Idea B: All atoms are extremely small (from BSL 4D/M1a).

Students are expected to know that atoms are much smaller than very small items with which they are familiar, such as dust, blood cells, plant cells, and microorganisms, all of which are made up of atoms. Students should know that the atoms are so small that many millions of them make up these small items with which they are familiar. They should know that this is true for all atoms. The comparison with very small objects can be used to test students’ qualitative understanding of the size of atoms in relation to these objects. Students will not, however, be expected to know the actual size of atoms.

Page 26:

Applying the Necessity Criterion:

The knowledge in the learning goal is needed to evaluate each answer choice.

Page 27:

An example of an item for which the targeted knowledge is not needed:

Targeted Idea: Substances may react chemically in characteristic ways with other substances to form new substances with different characteristic properties (based on NSES 5-8B:A2a).

Page 28:

Which of the following is an example of a chemical reaction?

A. A piece of metal hammered into a tree.

B. A pot of water being heated and the water evaporates.

C. A spoonful of salt dissolving in a glass of water.

D. An iron railing developing an orange, powdery surface after standing in air.

Page 29:

Applying the Necessity Criterion:

The knowledge in the learning goal is not needed.

Answer choice D, the correct answer, is a specific instance of a general principle (SIGP). The student can get the item correct by knowing that rusting is a chemical reaction without knowing the general principle that new substances are formed that have different characteristic properties.

Page 30:

B. Sufficiency

To be content aligned, knowledge of the ideas described in the learning goal or the clarification statement, or knowledge that certain commonly held misconceptions are not true, must be “all that is needed” to evaluate each of the answer choices. Students should not need any additional science knowledge.

Page 31:

Reviewers are told: If the knowledge in the learning goal is not enough to evaluate each of the answer choices, indicate what additional knowledge is needed. (Do not include as additional knowledge those things that can be assumed as general knowledge and ability of students this age.)

An example of additional knowledge might include science or mathematics terminology that students are not expected to know.

Page 32:

Applying the Sufficiency Criterion

Which of the following is the smallest?

A.  An atom

B.  A bacterium (clarification statement says “microorganism”)

C.  The width of a hair

D.  A cell in your body

Page 33:

Applying the Sufficiency Criterion:

The sufficiency criterion is not met. Students need to know the term “bacterium,” which is additional knowledge. Although a listed misconception includes the word “bacteria,” in pilot testing, 25% of 193 students indicated that they did not know what a bacterium was (even though most knew what bacteria were). The item should say “microorganism” or “bacteria” to match the clarification statement and/or misconception list.

Page 34:

Applying the Sufficiency Criterion

Approximately how many carbon atoms placed next to each other would it take to make a line that would cross this dot: • ?

A.  6

B.  600

C.  6000

D.  6,000,000

Note: This item assumes a 1mm dot and a diameter of 1.5Å for a carbon atom.
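A rough check of the answer key under those stated assumptions: 1 mm is 10^7 Å, so the number of atoms across the dot is about 10^7 / 1.5 ≈ 6.7 × 10^6, i.e., on the order of 6,000,000 (choice D).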

Page 35:

Applying the Sufficiency Criterion

The sufficiency criterion is met. Students need to know that, like the other small things mentioned in the clarification statement (e.g., dust, plant cells, blood cells, and microorganisms), this small visible dot is also made of millions of atoms.

Note: This item assumes a 1mm dot and a diameter of 1.5Å for a carbon atom.

Page 36:

Idea B: All atoms are extremely small (from BSL 4D/M1a). (Not included in the workshop packet.)

Students are expected to know that atoms are much smaller than very small items with which they are familiar, such as dust, blood cells, plant cells, and microorganisms, all of which are made up of atoms. Students should know that the atoms are so small that many millions of them make up these small items with which they are familiar. They should know that this is true for all atoms. The comparison with very small objects can be used to test students’ qualitative understanding of the size of atoms in relation to these objects. Students will not, however, be expected to know the actual size of atoms [nor the order-of-magnitude relationships to other objects].

Page 37:

III. Determining Whether the Task Accurately Reveals What Students Do and Do Not Know

It’s a validity issue. Students should choose the correct answer when they know the idea and they should choose an incorrect answer when they do not know the idea.

Getting rid of factors not related to the knowledge being measured (construct irrelevant factors)

Reducing false negatives and false positives

Page 38:

A. Comprehensibility

1. It is not clear what question is being asked. Explain.

2. The task uses unfamiliar general vocabulary that is not clearly defined. List potentially unfamiliar vocabulary and explain. (Note: This is referring to general language usage, not technical scientific or mathematical terminology, which is addressed under Sufficiency.)

3. The task uses unnecessarily complex sentence structure or ambiguous punctuation that makes the task difficult to comprehend when plain language could have been used. Explain.

(Note: Rebecca Kopriva, C-SAVE, Maryland.)

Page 39:

Comprehensibility, Continued:

4. The task uses words and phrases that have unclear, confusing, or ambiguous meanings. This may include commonly used words that have special meaning in the context of science. For example, the word “finding” could be unfamiliar to students when referring to a scientific “finding.” Note all places where words (both general and scientific) do not have clear and straightforward meanings.

5. There is inaccurate information (including what is in the diagrams and data tables) that may be confusing to students who have a correct understanding of the science. Explain.

6. The diagrams, graphs, and data tables may not be clear or comprehensible. (For example, they may include extraneous information, inaccurate or incomplete labeling, inappropriate size or relative size of objects, etc.) Explain.

7. Other. Provide a brief explanation.

Page 40:

Comprehensibility:

An item with comprehensibility issues.

Page 41:

Most sidewalks made out of concrete have [cracks] [every few yards] as shown in the diagram below.  These are called [expansion joints] as labeled in the diagram below.  What happens to the width of the cracks during a hot day in the summer and why?

A.  The cracks get wider because the concrete shrinks.

B.  The cracks get wider because the concrete gets softer.

C.  The cracks get narrower because the concrete expands.

D.  The cracks get narrower because the ground underneath the sidewalk shrinks.

Page 42:

Most sidewalks made out of solid concrete have spaces between the sections as shown in the diagram below.  What happens to the width of the spaces during a hot day in the summer and why?

A.  The spaces get wider because the concrete shrinks.

B.  The spaces get narrower because the concrete expands.

C.  The spaces stay the same because the concrete does not shrink or expand.

D.  Some spaces get narrower and some get wider because some concrete expands and some concrete shrinks.

Page 43:

B. Appropriateness of Task Context

a. The context may be unfamiliar to most students. Explain.

b. The context may advantage or disadvantage one group of students because of their interest or familiarity with the context. Explain.

c. The context is complicated and not easy to understand so that students might have to spend a lot of time trying to figure out what the context means. Explain.

Page 44:

Appropriateness of Task Context, Continued

d. The information and quantities that are used are not reasonable or believable. Explain.

e. The context does not accurately represent scientific or mathematical realities or, if idealizations are involved, it is not made clear to students that it is an idealized situation. Explain.

f. Other. Explain.

Page 45:

C. Resistance to Test-Wiseness

1. Some of the distractors are not plausible. Explain.

2. One of the answer choices differs in length or contains a different amount of detail from the other answer choices. Explain.

3. One of the answer choices is qualified differently from the other answer choices, using words such as “usually” or “sometimes,” or an answer choice uses different units of measurement. Explain.

4. The use of logical opposites may lead students to eliminate answer choices. Explain.

Page 46:

Resistance to Test-Wiseness, Continued

5. One of the answer choices contains vocabulary at a different level of difficulty from the other answer choices that may make it sound more scientific. Explain.

6. The language in one of the answer choices mirrors the language in the stem. Explain.

7. There are other test-taking strategies that may be used in responding to this task. Explain.

Page 47:

An item with test-wiseness issues:

This item is targeted to Idea A from Matter and Energy Transformations in Living Systems:

“Food is a source of molecules that serve as fuel and building material for all organisms.”

Is the oxygen that animals breathe a kind of food?

A. Yes, because oxygen enters the body. M-A2

B. Yes, because all animals need oxygen to survive. M-A3

C. No, because animals do not get energy from oxygen. From clarification of Idea A.

D. No, because oxygen can enter an animal’s body through its nose. M-A1, M-A2.

Page 48:

Misconceptions and other Ideas students may have: Matter and Energy Transformations: Idea A

1. Many children associate the word food with what they identify as being edible (Driver, 1984; Driver, Squires, Rushworth, & Wood-Robinson, 1994; Lee & Diong, 1999).

2. Students see food as substances (water, air, minerals, etc.) that organisms take [directly] in from their environment (Anderson, Sheldon, & Dubay, 1990; Simpson & Arnold, 1982).

3. Some students think that food is what is needed to keep animals and plants alive (Driver et al., 1994).

Page 49:

Analyzing test-wiseness issues:

Conclusion: Answer choice D (No, because oxygen can enter an animal’s body through its nose), is not a plausible explanation for why oxygen is not food. The answer choice is likely to be eliminated because of its implausibility, which is one of the factors (C1) used in assessing test-wiseness. (In pilot testing, 5 of 29 students selected this, thinking that the point of entry is what determines if something is food. Many others questioned how the nose is relevant in a question about food.)

The answer choice could be improved by changing it to say that oxygen is not food because it is not edible (M-A1) or because it does not enter through an animal’s mouth.

Page 50:

IV. Considering the Task’s Cost Effectiveness

A. Does the task require an inordinate amount of time to complete? Ask whether the time needed for students to read the question, make calculations, interpret a data table, or read a graph is warranted. Provide a brief explanation of why the task is not cost effective and how the same information might be elicited more efficiently.

Page 51:

V. Suggesting Revisions

Based on your analysis of the task, make your suggested revisions or indicate if you think the task should be eliminated from consideration.

Page 52:

Begin Content-Focused Activities

Page 53:

Aligning Science Assessment to Content Standards

George DeBoer, Arhonda Gogos, Cari Herrmann Abell, Kristen Lennon, An Michiels, Tom Regan, Jo Ellen Roseman, Paula Wilson

Center for Curriculum Materials in Science

Knowledge Sharing Institute

Ann Arbor, Michigan

July 10-12, 2006

This work is funded by the National Science Foundation

ESI 0352473

Page 54:

Thanks to:

Abigail Burrows for organizing the pilot testing with schools.

Ed Krafsur for developing the assessment database.

Brian Sweeney for developing illustrations for test items.

Page 55:

Strand 6: Part II

Using Student Data to Inform the Design of Assessment Items in Middle School Science

Page 56:

Steps in the Item Development Process

1. Select a set of benchmarks and standards to define the boundaries of a topic

2. Tease apart the benchmarks and standards into a set of key ideas

3. Create an assessment map showing how the key ideas build on each other conceptually

4. Review the research on student learning to identify ideas students may have about the content

5. Design items:

a. using student misconceptions as distractors

b. following the assessment analysis criteria

c. following a list of design specifications

Page 57:

Steps in the Item Development Process, Continued

6. Use open-ended interviewing to supplement published research on student learning

7. Use mini “item camps” to get feedback on items from staff

8. Revise items

9. Pilot test items and conduct think aloud interviews

10. Analyze pilot test data

11. Revise items

12. Conduct formal reviews of approximately 25 items using the assessment analysis criteria

13. Revise items

14. Conduct national field test of items

Page 58:

Using Pilot Testing and Think Aloud Interviews

1. We use pilot testing and interviewing to probe student thinking about the targeted ideas and the test items.

2. We compare student answer choices to their explanations.

3. When answer selections and explanations don’t match, we look for problems with the item that could produce these mismatches.

Page 59:

Interviewing Snapshot for 2005 and 2006

7 schools (urban, suburban); ~200 interviews

Free and reduced-price lunch eligibility ranged from 2% to 78%

Some think-aloud; some open-ended

Open-ended interviews were used to inform item development. Student comments helped in the writing of distractors.

All interviews done by the item writers.

Page 60:

Think-Aloud Interview Procedure

1. Please read the question aloud, think about the answer choices, and circle the best one. Feel free to write down anything on the test paper that helps you to answer the question.

2. Could you tell me in your own words what the question is asking?

3. Why did you choose the answer you chose?

4. Were there other answer choices that you almost chose? (Why?)

Continued…

Page 61:

5. Were there any answer choices that you did not even consider? (Why?)

6. Was there an answer choice you were expecting to see but did not? What was it?

7. Were there any words or diagrams you did not really understand or situations that made the question confusing? Do you think anything would be confusing to your classmates?

8. Are you familiar with the situation that is presented in the question?

9. Where did you learn about the topic in this question? Have you seen a question like this before?

Page 62:

Getting permission to conduct interviews

We inform the school administrators that:

1. The students’ responses will be used only to judge the quality of the test questions and will NOT be used as a measure of students’ knowledge or ability, instructional quality, or the quality of the school.  

2. The students are coded to protect their identity.

3. The parents are asked to sign a permission letter.

4. Some school districts require Institutional Review Board (IRB) approval.


Page 63:

We provide incentives:

1. The revised versions of the items are made available to the teachers and administrators. 

2. We provide a report on what we learned regarding student knowledge of the targeted ideas and misconceptions students may have.

3. We offer a workshop on developing assessment items aligned to content standards to volunteering teachers and/or participating schools.

4. As a token of our appreciation, students receive a gift certificate to Borders bookstore for each interview.

Page 64:

Limitations:

1. Considerable time requirement

2. Small student sample

3. Hard to get access to students

Page 65:

Piloting snapshot:

Total of 112 classrooms across 5 content areas.

Atoms and Molecules: 726 students

Force and Motion: 610 students

Flow of Matter and Energy: 312 students

Plate Tectonics: 568 students

Control of Variables: 462 students

Page 66:

Pilot Test Schools: District-level Demographics

1. Northeast Suburban/Small Town. Middle School and High School.

40% White, 48% African American, 8% Hispanic; 25% Free and Reduced Lunch.

2. Northeast Suburban. Middle School. 95% White; 10% Free and Reduced Lunch.

3. Northeast Rural (K-8). 98% White; 49% Free and Reduced Lunch.

4. Southern Small Town. Middle School (6-8). 70% White, 24% African American; 33% “Economically Disadvantaged.”

5. Southwest Small Town. Middle School (7-8). 95% Hispanic, 95% Free and Reduced Lunch.

Page 67:

Teacher Feedback Questionnaire

1. Does the class have a special designation (e.g., honors, AP, ELL, special needs, etc.)? Please describe.

2. Please note the approximate number of students in this class with Individualized Education Plans (IEPs).

3. Approximately how much exposure have your students had to the topics that these assessment items test?

4. How long did it take to administer the test?

5. Was it difficult for the students to understand the instructions? Please comment on any difficulties they had.

6. Please add any comments or suggestions you may have.

Page 68:

Pilot-test questions

1. Is there anything about this test question that was confusing? Explain.

2. Circle any words on the test question you don’t understand or aren’t familiar with.

3. Is answer choice A correct? Yes No Not Sure

4. Is answer choice B correct? Yes No Not Sure

5. Is answer choice C correct? Yes No Not Sure

6. Is answer choice D correct? Yes No Not Sure

For items 3-6, students are asked to explain why an answer choice is correct or not.

Page 69:

Pilot-Test Questions, Continued

7. Did you guess when you answered the test question? Yes No

8. Please suggest additional answer choices that could be used.

9. Was the picture or graph helpful? If there was no picture or graph, would you like to see one?

10. Have you studied this topic in school? Yes No Not Sure

11. Have you learned about it somewhere else (TV, museum visit, etc.)? Yes No Not Sure. Where?

Page 70:

Results of Teacher Feedback

The test took 45 minutes to an hour to complete, on average.

Students sometimes had difficulty, both cognitively and motivationally, providing an explanation for each answer choice; they were not used to doing that.

Only a very small number of students did not take the task seriously, for a variety of reasons (end of the year, not graded, etc.). Most were very cooperative.

Students with learning disabilities expressed more difficulty.

The unfamiliar format was a challenge to some.

Teachers appreciated the depth of understanding that was expected.

Page 71:

Examples:

What we learn from pilot testing

Page 72:

Targeted Idea: Substances may react chemically in characteristic ways with other substances to form new substances with different characteristic properties (based on NSES 5-8B:A2a).

Which of the following is an example of a chemical reaction?

A. A piece of metal hammered into a tree.

B. A pot of water being heated and the water evaporates.

C. A spoonful of salt dissolving in a glass of water.

D. An iron railing developing an orange, powdery surface after standing in air.

Page 73:

Students who Selected Each Answer Choice

          A (metal)   B (evaporation)   C (dissolving)   D (rusting)   Not sure   Total
Number    0           14                18               43            1          76
Percent   0           18.4              23.7             56.6          1.3        100
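The percentage row is simply each count divided by the total number of students. A minimal sketch of that computation in Python (the counts come from the table above; the code is illustrative, not the project's actual tooling):

counts = {"A (metal)": 0, "B (evaporation)": 14, "C (dissolving)": 18,
          "D (rusting)": 43, "Not sure": 1}
total = sum(counts.values())  # 76 students in all
for choice, n in counts.items():
    # prints, e.g., "D (rusting): 43 (56.6%)"
    print(f"{choice}: {n} ({100 * n / total:.1f}%)")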

Page 74:

Results of piloting:

Only 5 of the 43 students who chose the correct answer D said that a new substance formed. Approximately half of the 43 students who chose D said they recognized it as an example of rusting or oxidation. Maybe these students know that rusting is a chemical reaction that produces new substances with different properties, but they may also know rusting only as a specific instance of a chemical reaction without knowing that chemical reactions involve the formation of a new substance.

None of the students chose answer choice A, suggesting that hammering a piece of metal into a tree is not a plausible answer choice. Similar results were found during interviews.

A significant number of students (42.1%) chose either B or C. This supports other research that shows that students hold the idea that phase change and/or dissolving are chemical reactions.

Page 75:

Suggested revisions:

1. Replace A with a more plausible distractor such as: “Sand being removed from sea water by filtration.”

2. Replace D with a reaction that students are not so familiar with, for example, “a white solid forming when two clear liquids are mixed together.”

Page 76:

Targeted Idea:

Organisms use molecules from food to make complex molecules that become part of their body structures.

Page 77:

When a baby chick develops inside an egg, the yolk in the egg is its only source of food. As the chick grows, the yolk becomes smaller.  Why does the yolk become smaller?

A. The yolk enters the chick, but none of the yolk becomes part of the chick.

B. The yolk is broken down into simpler substances, some of which become part of the chick.

C. The yolk is completely turned into energy for the chick.

D. The yolk gets smaller to make room for the growing chick.

Page 78:

Students who Selected Each Answer Choice

          A (not part of)   B (simpler substances)   C (turned into energy)   D (makes room for chick)   Not sure   Total
Number    8                 16                       23                       20                         7          74
Percent   11                22                       31                       27                         9          100

Page 79:

Results of piloting:

6 students commented that they did not understand the phrase “simpler substance” in answer choice B.

Only 8 of the 16 students who chose the correct answer B explained that yolk is broken down to provide building material that becomes incorporated into the body of the chick. The rest of the students indicated that the yolk is needed for the chick “to grow” or “to become bigger.” It is not clear that these students understand the idea that is being assessed, i.e., that food is broken down into smaller molecules that provide building material for the chick, which “become part of” the body structures of the chick.

One of the students who selected answer choice A commented that “Just like humans, pieces of food do not become part of us.” This student might have a correct molecular understanding of how food is made part of body structures but got the question wrong because of the student’s focus on the yolk as being broken down into “pieces of food.”

Page 80:

Suggested revisions:

Change answer choice A to read: “The yolk is broken down into simpler molecules but none of the atoms of these simpler molecules become part of the chick.”

Change answer choice B to read: “The yolk is broken down into simpler molecules that are used to make the body structures of the chick.”

Page 81:

The expansion of alcohol in a thermometer

AM42-4: The level of colored alcohol in a thermometer rises when the thermometer is placed in hot water. Why does the level of alcohol rise?

A.  The heat molecules push the alcohol molecules upward.

B.  The alcohol molecules break down into atoms, which take up more space.

C.  The alcohol molecules get farther apart, so the alcohol takes up more space.

D.  The water molecules are pushed into the thermometer and are added to the alcohol molecules.

Page 82:

Student data from pilot testing

Is Answer Choice Correct?

           A (heat molecules)   B (break down)   C (farther apart)   D (water pushed in)   % Correct
Yes        38                   6                25                  6                     26%
No         24                   52               32                  58
Not Sure   25                   28               30                  20

Page 83:

Student Responses

87 students from grades 7-9 at 3 different schools

6 students not familiar with alcohol / colored alcohol (7%)

44% chose answer choice A (plausible distractor)

6 students wrote “heat rises” as their explanation for A.

12 students may have the “heat molecules” misconception.

Answer choice A is the only one that has the word “heat” in it. (Perhaps add “as it is heated” to the end of one or more answer choices.)

Page 84:

Sample student responses

Answer choice A:

No, because “heat molecules can’t push alcohol molecules because alcohol molecules are denser.”

Yes, “I remember learning about heat molecules and knew they bump other molecules upward.”

Yes, “makes sense heat rises.”

Yes, “because heat rises and it is being heated.”

Answer choice B:

No, “The molecules don’t break down; they stay the same.”

Answer choice C:

Yes, “The space between molecules expands with the increase in temperature.”

Answer choice D:

No, “Because there is no way that the water can get pushed into the thermometer.”

No, “Because how could water get through a glass, a solid glass.”

Page 85:

Examples from plate tectonics of:

1. Determining appropriateness of terms used in assessment items

2. Identifying misconceptions

3. Identifying implausible ideas for distractors

Page 86:

Key Idea a: The solid crust of the earth - including both the continents and the ocean basins - consists of separate plates.

Students are expected to know that the rigid, outer layer of the earth is made of separate sections that are called plates and that the plates fit together so that the edge of one plate directly touches an adjacent plate with no gaps between them. They should know that plates are made of solid rock…. Students should know that each of the major plates encompasses very large areas of the earth’s surface (e.g., an entire continent plus adjoining ocean floor or a large part of an entire ocean basin) and that the boundaries of continents and oceans are not the same as the boundaries of plates.

Page 87:

1. Determining appropriateness of terminology in items

Two items were piloted in order to test student knowledge of the term “bedrock” (after typical instruction, i.e., not necessarily targeted to the meaning of the word bedrock) to determine if the word should be used in assessment and thus be part of a clarification statement.

The two items are identical except one uses the term “bedrock” and the other uses the descriptive phrase “solid rock.”

These items were piloted at two different middle schools in two eastern states at grades 7 and 8. Interviews of 9th graders (10 students) in a third school in a western state where bedrock is readily visible are consistent with these findings, but are not presented here.

Page 88:

Which of the following are part of earth’s plates?

A. Solid rock of continents but not solid rock of ocean floors.

B. Solid rock of ocean floors but not solid rock of continents.

C. Solid rock of both the ocean floors and the continents.

D. Solid rock of neither the ocean floors nor the continents.

Number of Students = 33 (3 classes, two 7th grade and one 8th grade)

Page 89:

Student data from pilot testing (solid rock)

Is Answer Choice Correct?

           A (continents only)   B (ocean floor only)   C (both)   D (neither)   % Correct
Yes        5                     3                      19         0             57.6
No         24                    26                     8          29
Not Sure   4                     4                      4          4

Page 90:

Which of the following are part of earth’s plates?

A. Bedrock of continents but not bedrock of ocean floors.

B. Bedrock of ocean floors but not bedrock of continents.

C. Bedrock of the ocean floors and the continents.

D. Bedrock of neither ocean floors nor continents.

Number of Students = 34 (3 classes, one 7th grade and two 8th grade)

Page 91:

Student data from pilot testing (bedrock)

Is Answer Choice Correct?

           A (continents only)   B (ocean floor only)   C (both)   D (neither)   % Correct
Yes        1                     2                      17         3             50.0
No         20                    17                     5          19
Not Sure   13                    15                     12         12

Page 92:

Student answers to Bonus Question: What is bedrock?

Twenty-one of 34 students responded that they did not know.

Students who attempted to define the term said:

1. “The bed of rocks on the ocean floor”

2. “The bottom layer of a rock”

3. “Like the ocean floor”

4. “The bare rock under dirt and sand”

5. “The deep rock of the crust”

6. “Bedrock is rock that is in the ground”

7. “A type of layering of loose pebbles that have been fused together”

8. “Rocks and sediments that are on the bottom of the continent or ocean”

9. “Rocks on the bottom of the ocean”

10. “Rock Maybe”

11. “It is the rock that is on the bottom of an ocean plate”

Page 93:

Analysis:

There were more “unsure” responses when “bedrock” was used: the item using “bedrock” drew 12 to 15 “unsure” responses on each answer choice, while the item using “solid rock” drew 4 on each. Uncertainty about the meaning of the term could interfere with student thinking about the idea being tested.

Thirty-two out of the thirty-four students wrote responses indicating that they do not know what bedrock is. Despite this lack of understanding of the term, 50% of the students were able to correctly answer this item, compared to 57.6% of students answering the item using “solid rock.” Students are apparently translating “bedrock” to mean “rock” without knowing for sure what it is.
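(These percentages follow from the “Yes” counts for answer choice C in the two tables above: 19 of 33 ≈ 57.6% on the “solid rock” version versus 17 of 34 = 50.0% on the “bedrock” version.)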

For now, we have decided not to include the term “bedrock” in the clarification of this idea (even though the word is used in a grade 3-5 benchmark) and not use it for assessment purposes.

Page 94:

2. Identifying misconceptions

In written comments, a number of students expressed misconceptions.

Which of the following are part of earth’s plates?

A. Solid rock of continents but not solid rock of ocean floors.

Plates can be seen and aren't under water.

The plates do not go down that far.

Ocean water and solid rock from the bottom is not part of a plate.

B. Solid rock of ocean floors but not solid rock of continents.

Yes, it's only made of rock from the ocean surface.

Page 95:

3. Identifying implausible distractors

Which of the following are part of earth’s plates?

D. Solid rock of neither the ocean floors nor the continents.

None of the 33 students selected this answer choice.

D.  Bedrock of neither ocean floors nor continents.

Three of the 34 students selected this answer choice.

Although students have misconceptions about either ocean floors or continents being part of plates, the idea that neither ocean floors nor continents are part of plates is not plausible.

This distractor is not informative and should be replaced.

Page 96:

An example from physics

Idea d: Friction is a force that makes it difficult for one object to slide on another object (from SFAA 4F-3h).

From the clarification statement:

“Students should know that friction is a force that acts in the opposite direction to the sliding of one surface on another surface.”

Page 97:

Alignment/SIGP

FM62-1 (Sixth Grade, n = 25; Eighth Grade at a different school, n = 18)

A box slides across the floor. The arrow labeled “Motion” represents the box’s direction of motion. Which force could be the force of friction acting on the box?

A.  Force A (40% Sixth / 17% Eighth)

B.  Force B (16% / 0%)

C.  Force C (40% / 44%)

D.  Force D (0% / 17%)
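(A quick consistency check against the Student Responses slide that follows: 40% of 25 sixth graders is 10 students, and 44% of 18 eighth graders is about 8 students, matching the counts of students reported there as choosing the correct answer, C.)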

Page 98:

Possible Misconceptions

Forces always act in the direction of motion (Kuiper, 1994). (Answer choice A)

Friction is a force in the vertical direction, holding an object down (Horizon Research, Inc.). (Answer choice B)

Friction is an upward force; gravity is a downward force. (Answer choice D)

Page 99:

Two routes to the correct answer:

1. Use the targeted learning goal:

“Friction opposes the sliding of two surfaces.”

2. Combine two other ideas

“A backward force slows things down.”

“Friction slows things down.” (This is a specific instance of a general principle, or SIGP.)

“Therefore, friction is a backward force.”

If students use route 2, they have not demonstrated knowledge of the learning goal.

Page 100:

Student Responses

Sixth Grade: Of the 10 students choosing the correct answer…

2 indicated that they used the targeted learning goal

2 indicated that they used the other route (false positive)

Eighth Grade: Of the 8 students choosing the correct answer…

4 indicated that they used the targeted learning goal

Zero indicated that they used the other route

Page 101:

Conclusions:

1. Pilot testing can be used successfully to reveal what students are thinking about the ideas we are testing.

2. Pilot testing provides access to a large number of students around the country, but what we learn is limited by the questions we ask and what students choose to write. Follow-up isn’t possible.

3. Student interviews allow for flexibility to follow up students’ comments with more probing questions, but one-on-one interviews are limited to smaller numbers of students.

4. A combination of the two methods is being used to provide insights into student thinking and the effectiveness of the assessment items that we are developing.