
Developing Assessments of Science Content Knowledge for Teaching
Mark R. Olson

Department of Curriculum & Instruction, Neag School of Education, University of Connecticut

There is broad agreement that content knowledge is important for high-quality science instruction. But despite the apparent consensus, there are significant questions about how much content knowledge is necessary and what the nature of that content knowledge is. In addition, the measures used – disciplinary major, passing a subject matter exam administered by a state department of education or testing company, or the possession of an advanced degree – are poor proxies of teacher content knowledge (Wilson, Floden, & Ferrini-Mundy, 2001).

This poster presents emergent work from a project to both conceptualize the content knowledge for teaching secondary science and to develop assessment instruments to measure that knowledge. Using data from preservice and inservice science teachers, it is suggested that two primary explanatory structures, narrative and paradigmatic, can be used to account for how teachers might productively use content knowledge in instructional representations such as demonstrations, activities, and examples.

Abstract

UCONN

• What is “content knowledge for teaching secondary science?”

• How might content knowledge for teaching be measured?

Questions

Here I present work aimed at two central questions: what is content knowledge for secondary science teaching, and how can it be measured? Current efforts to specify the content knowledge most useful for teachers range from having teachers experience scientific research apprenticeships to learning instructional approaches such as inquiry-based or problem-based learning. However promising such approaches might be, the relationships between such knowledge and a broader conceptualization of the science content knowledge necessary for teaching remain underspecified. Further, how to measure what is learned in such ventures remains vague and elusive.

Problem

This project takes a conjectural approach to the development of both an articulation of the content knowledge for teaching secondary science as well as measures of that knowledge. It begins with a general conception for how people reason about scientific knowledge and then describes the process for shaping this general view into a specific articulation of the content knowledge for teaching secondary science.

The general view is based on Jerome Bruner’s assertion that there are two fundamental and independent modes of sense-making: narrative and paradigmatic. Narrative construals of experience take the form of story, and paradigmatic construals of experience take the form of principled, model-based reasoning. Although many have interpreted this characterization as a distinction between humanistic (“narrative”) and scientific (“paradigmatic”) endeavors, I argue instead that both narrative and paradigmatic modes are foundational to a rich understanding of science.

In order to operationalize this general view of understanding in science into a model of the content knowledge for teaching science I have undertaken the following:

• Selected content topics that provide opportunities for the coordination between observable phenomena and unobservable conceptual entities. Conducted literature reviews of misconceptions research to inform the design of instructional scenarios.

• The instructional scenarios depict science that is typical of school instruction. The scenarios pose a phenomenon and then ask for an explanation. Two student explanations are also presented that are designed to nominate problematic features of understanding the content in the task.

• These instructional scenarios were initially vetted with science education faculty, graduate students and undergraduate education students in small-group discussion settings.

• The instructional scenarios were used as interview prompts in a study of new teachers of science and as written open-ended response tasks with undergraduate science teachers.

• Novel multiple-choice formats were produced. Pilot testing is currently underway.

• Responses to sets of tasks will be correlated with observations of teaching practice as a means of validating the assessments and characterizing the usefulness of this conception of the content knowledge for teaching science.
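As a concrete sketch, the structure of one multiple-select, ranked task and a simple way to summarize a response might look as follows. All names, content elements, and the inverse-rank weighting scheme here are hypothetical illustrations, not the project's actual instrument or scoring rules:

```python
from dataclasses import dataclass

@dataclass
class ContentElement:
    label: str      # e.g. "a", "b", "c"
    text: str       # the statement shown to the respondent
    construal: str  # "narrative" or "model-based" (hypothesized coding)

@dataclass
class TaskResponse:
    selected: list  # labels the respondent selected
    ranking: dict   # label -> rank (1 = highest)

def construal_profile(elements, response):
    """Summarize which construal the respondent's ranked selections favor,
    weighting each selected element inversely by its rank."""
    profile = {"narrative": 0.0, "model-based": 0.0}
    by_label = {e.label: e for e in elements}
    for label in response.selected:
        weight = 1.0 / response.ranking[label]
        profile[by_label[label].construal] += weight
    return profile

# Hypothetical respondent: selects a, b, and c but ranks c (model-based) first
elements = [
    ContentElement("a", "story-like account of the phenomenon", "narrative"),
    ContentElement("b", "everyday analogy", "narrative"),
    ContentElement("c", "principled, model-based account", "model-based"),
]
response = TaskResponse(selected=["a", "b", "c"], ranking={"c": 1, "a": 2, "b": 3})
print(construal_profile(elements, response))
```

The inverse-rank weighting is one of many plausible choices; the point is only that selection and ranking together carry more information than selection alone.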

Overview of Approach

Challenges

Example Task

The intent of this work is to develop sets of tasks, similar to the above, that together might suggest the extent to which a person may be able to coordinate between narrative and model-based construals of school science content.

It is important to emphasize that this conceptualization does not privilege one construal over the other. Rather, it emphasizes that both are valuable resources for teaching students.

Preliminary piloting of the above task with preservice science teachers (n=25) showed that content elements such as “a” and “b” above were selected more frequently and ranked more highly than content element “c.” However, a number of respondents selected a, b, and c but ranked c most highly; these patterns correspond to narrative (N) and model-based (MB) construals, respectively.
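Selection and ranking patterns of this kind can be tabulated very simply across a pilot sample. The sketch below uses invented responses purely for illustration, not the actual pilot data:

```python
from collections import Counter

# Invented illustrative responses: each respondent's selected elements, in rank order
pilot_responses = [
    ["a", "b"], ["a", "b", "c"], ["b", "a"], ["c", "a", "b"], ["a", "c"],
]

# How often each content element was selected at all
selection_counts = Counter(label for resp in pilot_responses for label in resp)
# How often each content element was ranked first
top_ranked_counts = Counter(resp[0] for resp in pilot_responses)

print("selected:", dict(selection_counts))
print("ranked first:", dict(top_ranked_counts))
```

Separating overall selection frequency from top-ranked frequency captures exactly the pattern noted above: an element can be selected often yet rarely ranked first, or vice versa.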

The current form of the pilot tasks also asks several open-ended questions about what the respondent characterizes as strengths and weaknesses in the responses of student A & B; what the respondent might do next as a teacher of students A & B; and what would students need to do to clearly demonstrate their understanding of the task. These responses are currently used to contextualize the responses on the multiple-choice sections as well as to generate additional content elements/responses for continued task development.

As these tasks/items are refined and as the content elements are more finely posed, the aim is to use these assessments in conjunction with observations of teaching practices. The question is whether there will be a relationship between assessment task performance and observable instantiations of narrative and model-based reasoning expressed in the course of teacher-led discussions, demonstrations, activities, and examples.
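That planned validation step, relating task performance to observation ratings, amounts at its simplest to a correlation. A minimal sketch with fabricated placeholder scores (not project data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two paired lists of scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Fabricated placeholders: assessment task scores vs. observed classroom
# ratings of narrative/model-based coordination, for six hypothetical teachers
task_scores = [3, 5, 2, 4, 5, 1]
observation_ratings = [2, 4, 2, 5, 4, 1]
print(round(pearson_r(task_scores, observation_ratings), 3))
```

In practice one would also want an inferential test and attention to rater reliability, but the basic relationship of interest is this one.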

This work continues to hold promise for articulating a richer and more productive conceptualization of the content knowledge for teaching secondary science.

Discussion

This task and other documentation are available at http://homepages.uconn.edu/~mro04002

Perspectives on the knowledge for teaching science content have typically been resolved to one of three major types: content knowledge, pedagogical knowledge and pedagogical content knowledge.

Assessments of teacher knowledge have focused on the first two of these: “straight” content knowledge exams and exams of pedagogical knowledge. Without discounting the importance of the second, the first of these is most germane to our project. Content assessments for teachers seldom appear to be different from content assessments for students. The apparent underlying presumption is that knowledge for teaching science content is the same as knowledge learned in science classes. This is a problematic presumption, perhaps rooted in the common-sense notion that a teacher must “know” science content in order to “teach” content. An important question is:

what does it mean to know science content for teaching?

A related question is: how is knowledge of content for teaching different from content knowledge learned in science class?

Content knowledge measures remain undertheorized. Perhaps this is because there have not been measures or conceptualizations of content knowledge that are truly useful for teaching practice. Perhaps the dominant testing archetypes have become de facto presumptions of what content knowledge looks like. In fact, most efforts to capture the nature of knowledge in practice point to a much more nuanced view of knowledge. The question is whether the problem is conceptual, technical, or both. I argue that it is both, and offer a hypothesis for what may provide traction in understanding this terrain. With a new conceptualization of science content knowledge, perhaps there is also a technical solution for assessing that knowledge.

Content Knowledge for Teaching Science


This work has been funded and generously supported by the Knowles Science Teaching Foundation Young Scholars Fellowship program. Please visit: www.kstf.org

Acknowledgements

[Figure: Example task layout. A content scenario and constructed student responses nominate problematic content knowledge. Each question stem is followed by a set of content elements; multiple element selections are expected for each question stem, and selected content elements are ranked.]

[Figure: Diagram of content knowledge for teaching science as two overlapping construals, narrative and model-based, with regions labeled A, B, and C.]

This conceptualization of content knowledge recognizes that people will use the same terms for concepts, phenomena, ideas, and so forth. But these scientific content entities will fundamentally play different roles in the construction of understanding. For science teaching, knowledge in both dimensions is valued. Coordinating between narrative and model-based construals of science is hypothesized to be a powerful understanding of science for teaching.

To devise items that allow for fundamentally different, yet coherent, interpretations of an assessment task is a challenge. This project is working to develop a novel approach to this challenge.

[Figure: Overlapping regions labeled CK, PK, and PCK.]

Is this conceptualization sufficient? What does “CK” represent, and how is it operationalized for teaching?