
Canadian Journal of Action Research

Volume 16, Issue 2, 2015, pages 22-41

LEARNING PHYSICS TEACHING THROUGH COLLABORATIVE DESIGN OF CONCEPTUAL MULTIPLE-CHOICE QUESTIONS

Marina Milner-Bolotin
University of British Columbia

ABSTRACT

Increasing student engagement through Electronic Response Systems (clickers) has been widely researched. The success of clickers largely depends on the quality of the multiple-choice questions used by instructors. This paper describes a pilot project focused on the implementation of an online collaborative multiple-choice question repository, PeerWise, in a Physics Methods course for secondary teachers in a large Canadian teacher education program. We investigated different facets of the PeerWise implementation, teacher-candidates' engagement with the system, and how PeerWise can support teacher-candidates' collaboration on designing high-quality conceptual multiple-choice questions. This study sheds light on the results of this implementation and suggests directions for future research.

INTRODUCTION

Multiple-choice assessment has long been criticized for its lack of validity due to student guessing, for its failure to credit partial knowledge, and, most importantly, for its overreliance on questions that evoke and assess only the least sophisticated cognitive processes (Lau, Lau, Hong, & Usop, 2011; Masters et al., 2001), such as those belonging to the Knowledge and Comprehension levels of Bloom's (1956) Taxonomy of Educational Objectives: Cognitive Domain. However, since the introduction of Electronic Response Systems (clickers) and Peer Instruction pedagogy (Mazur, 1997b) in the late 1980s and early 1990s, public attention to multiple-choice assessment has increased dramatically.

Educational researchers realized that asking "Which is better, multiple-choice or open-ended assessment?" was counter-productive, as the answer depends heavily on the purpose of the assessment, on the employed pedagogy, and, most importantly, on the construction of the multiple-choice questions. This realization, together with a deeper understanding of question design, increased the emphasis on the role of formative assessment in teaching and learning (Beatty, Gerace, Leonard, & Dufresne, 2006). Enhanced by an ever-growing knowledge of how people learn (Bransford, Brown, & Cocking, 2002), it gave rise to a recent movement aiming to reconsider, if not "exonerate," multiple-choice assessment (Dickinson, 2011; Little, Bjork, Bjork, & Angello, 2012). Contemporary educational researchers have realized that the weaknesses of multiple-choice assessment might not be inherent, but rather stem from teachers' lack of the pedagogical and content knowledge needed to write powerful multiple-choice items (Sobolewski, 1996). In this context, by powerful questions we mean questions that elicit possible student misconceptions and help students identify the main concepts behind the question and the relationships between them.

This is especially relevant to Science, Technology, Engineering, and Mathematics (STEM) education, where a number of multiple-choice assessments for K-12 and college-level subjects have been designed and implemented. These instruments were able to provide a timely and accurate snapshot of student conceptual understanding, thus helping teachers target specific student difficulties in order to help students build more sophisticated STEM understanding (Hestenes & Wells, 1992; Hestenes, Wells, & Swackhamer, 1992; Lasry, Guillemette, & Mazur, 2014). It is also well known that pedagogically effective multiple-choice questions are difficult to design, as they require the instructor to possess deep pedagogical and content knowledge of the topic (Beatty et al., 2006; Mazur, 1997a), as well as some general knowledge about multiple-choice question writing and scoring (Lau et al., 2011). In our own research we encountered these difficulties while creating multiple-choice conceptual STEM questions for the online K-12 resource database Mathematics and Science Teaching and Learning through Technologies (Milner-Bolotin, 2014a; Milner-Bolotin, Fisher, & MacDonald, 2013).

Multiple-choice test scoring methods have also improved significantly over the last decade. There are new technology-enhanced ways to score multiple-choice items, such as online tutoring systems that provide students with individually tailored feedback. This individualized feedback depends on the student's choice of a specific distractor. Since every distractor is designed around a specific misconception, the students who choose it are asked to answer additional questions while being provided with relevant hints that help them clarify the concept and switch to the correct answer. This process is akin to Socratic questioning, where the student is led to correct understanding through a questioning process (O'Byrne & Thompson, 2005).

Lastly, modern pedagogies, such as Peer Instruction, emphasize active student engagement and interactivity (Hake, 1998; Kalman, Milner-Bolotin, & Antimirova, 2010), which are more easily achieved when student understanding is continuously monitored and the pedagogy is continuously adjusted. Figure 1 illustrates a case of such a lesson.

During this process, the students first use clickers to respond individually to a multiple-choice conceptual question asking them to estimate the amount of work needed to stretch a spring a certain distance. (By conceptual we mean a question that highlights a specific concept while requiring minimal calculation; it is the antithesis of a plug-and-chug question.)

Figure 1: Example of a Peer Instruction cycle (6-8 minutes per question). In this example, the students were initially confused (only four chose the correct answer [D]). Yet, after the group discussion, 10 out of 13 answered the problem correctly (the remaining three students refrained from the second vote). The figure shows the cycle as a flow chart with the following steps:

1. A clicker question is posed.
2. Students think for 1-2 minutes without consulting peers and submit their individual responses.
3. A bar chart representing the response distribution is displayed to the class without revealing the correct answer.
4. If a large number of students answered incorrectly, students work in groups of 2-3 for an additional 1-2 minutes to discuss the question and then use clickers to resubmit individual answers.
5. Once most of the students have provided correct answers, the correct answer is revealed; the distractors are considered and discussed.
6. The instructor leads a summary discussion with the class: reasons for correct, as well as for incorrect, responses are elicited from the students (1-2 minutes).
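The branching in this cycle can be summarized compactly. The following Python sketch is illustrative only: the thresholds are common rules of thumb from the Peer Instruction literature, not values stated in this paper, and the split of incorrect first votes among the distractors is invented for the demonstration.

```python
from collections import Counter

def peer_instruction_step(votes, correct, low=0.30, high=0.70):
    """Suggest the instructor's next move after a clicker vote.

    votes   -- list of submitted answers, e.g. ['A', 'D', 'B', ...]
    correct -- label of the correct alternative, e.g. 'D'
    low, high -- rule-of-thumb thresholds (assumptions, not from this paper)
    """
    fraction_correct = Counter(votes)[correct] / len(votes)
    if fraction_correct < low:
        return "revisit the concept before any peer discussion"
    if fraction_correct < high:
        return "run a 1-2 minute small-group discussion, then revote"
    return "reveal the answer and briefly discuss the distractors"

# First vote from Figure 1: 4 of 13 students chose the correct answer (D).
# The 3-3-3 split of the other nine votes is hypothetical.
first_vote = ['D'] * 4 + ['A'] * 3 + ['B'] * 3 + ['C'] * 3
print(peer_instruction_step(first_vote, 'D'))  # -> small-group discussion, then revote
```

With 4 of 13 correct (about 31%), the sketch lands in the middle band, matching the discussion-then-revote path the class in Figure 1 actually followed.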


At this point, the students commit to their individual choices. The histogram of their responses is then shared with the entire class. The students are then asked to discuss their individual responses with peers in order to justify the reasoning behind their choices, as well as to understand the reasoning of their peers. This is especially productive when different members of the group chose different answers. After this discussion, the students are given an opportunity to respond to the same question individually once again. It has been well documented that peer discussions following students' initial commitment to their individual answers have a significant effect on student learning (Kalman et al., 2010; Lasry, 2008; Lasry, Mazur, & Watkins, 2008; Milner-Bolotin, Antimirova, & Petrov, 2010). In other words, students' active engagement in the discussion of different alternatives (distractors) has a positive impact on the depth of their understanding.

There are numerous studies exploring the implementation of clickers in K-12 and university STEM classrooms (Mayer et al., 2009; Milner-Bolotin et al., 2010). Most of them focus on the impact of clicker-enhanced pedagogy on students' STEM learning. The multiple-choice questions used in these studies are often designed by instructors, book authors, or other content providers, and are implemented using various high-tech or low-tech interactive engagement pedagogies (Kalman et al., 2010; Lasry, 2008; Milner-Bolotin, 2004; Milner-Bolotin, Kotlicki, & Rieger, 2007). While actively engaged in Peer Instruction lessons, teacher-candidates often find themselves on the receiving end of this assessment, not on the question designers' side.

However, while engaging in the process of designing multiple-choice STEM questions for our question database (Milner-Bolotin, 2014a), we noticed that we significantly increased our own Pedagogical Content Knowledge (PCK) (Fisher, Butler, MacDonald, Roll, & Milner-Bolotin, 2014; Shulman, 1986). In this process, our research team had a unique opportunity to analyze questions; to suggest, discuss, and often reject possible distractors; and to consider the pedagogical value of different versions of the same multiple-choice question. The learning experience we went through was very powerful. Yet, this is a process that we, the educators, and not the students, usually experience. This raises two questions. What if the students were asked not only to discuss pre-determined distractors, but also to come up with their own multiple-choice questions that include a number of distractors targeting specific science concepts? And what if the most common approach to implementing multiple-choice assessment in STEM classrooms forgoes one of its key benefits: the pedagogical benefit of designing effective, context-specific conceptual questions that target specific science misconceptions and learners' conceptual difficulties? This benefit of gaining experience in designing multiple-choice questions is especially valuable to prospective STEM educators. In the current paper we propose a model for using collaborative modern technologies, such as PeerWise (Denny, 2014), in STEM methods courses in order to engage physics teacher-candidates in collaborative design of conceptual multiple-choice questions.

However, before discussing the implementation and possible effects of this process on the teacher-candidates, we should outline the context and design of the current study.

STUDY CONTEXT AND DESIGN

The study employs an Action Research methodology: a collection of "systematic procedures done by teachers (or other individuals in an educational setting) to gather information about, and subsequently improve, the ways their particular educational setting operates, their teaching, and their student learning" (Creswell, 2008, p. 597). Action Research should meet four criteria: (1) it should involve teaching and learning and inquire into the teacher's own practice; (2) it should be within the teacher-researcher's locus of control; (3) the teacher-researcher should be passionate about it; and (4) the teacher-researcher should be motivated to improve their practice (Gay, Mills, & Airasian, 2012). This study was designed to examine how teacher-candidates' collaboration on designing multiple-choice conceptual questions via PeerWise in my Physics Methods course can facilitate the development of teacher-candidates' PCK and promote their positive attitudes about conceptual STEM learning.

We have been piloting this pedagogy in Physics Methods courses at a large Canadian research university for the last two years. During the first year of the study (2012-2013), we investigated the impact of modeling Peer Instruction pedagogy on teacher-candidates' Pedagogical Content Knowledge by evaluating the quality of their participation during class, as well as the quality of the conceptual questions they submitted as part of the course assessment (Milner-Bolotin et al., 2013). During the second year of the study (2013-2014), we added the PeerWise component to the course, so that teacher-candidates were able to respond and comment on the multiple-choice conceptual questions designed by their peers in addition to uploading their own questions.

The teacher-candidates who participated in the study were pursuing a BEd degree in order to be certified to teach physics at the secondary school level. The number of teacher-candidates in these courses was 12 and 10 respectively, which is a representative enrollment for a secondary science course in this teacher education program. Most of the teacher-candidates had earned a BSc degree prior to being admitted to the program, while about a quarter of them were enrolled in a concurrent BSc degree majoring in physics. Secondary methods courses constitute about 5% of the time in the BEd program, including the time teacher-candidates spend on their school practicum (Milner-Bolotin, 2014b). For example, Physics Methods courses meet for 3 hours weekly during a 13-week term (39 hours in total). In addition, teacher-candidates are required to take a general Science Methods course and a large number of "context-independent" pedagogical courses. The Physics Methods course discussed here is the only course in which teacher-candidates are challenged to develop their physics PCK, thus bridging their knowledge of physics to teaching practice. Therefore, the course activities are always grounded in Physics Education Research and practice while challenging teacher-candidates to explore novel pedagogies and relevant educational technologies. The course instructor (the author) and a Teaching Assistant worked with the teacher-candidates during the course.

As part of the course assignments, teacher-candidates were asked to design five multiple-choice conceptual questions every week and to provide feedback on ten questions designed by their peers. In addition, Peer Instruction pedagogy was modeled during most of the meetings, so that teacher-candidates became comfortable with it. It is important to mention that many teacher-candidates were already familiar with clickers, as their instructors had used them in large undergraduate science courses. However, since very few science instructors in Science Faculties have a pedagogical background or are familiar with Physics Education Research, the quality of the clicker-enhanced pedagogies and of the conceptual questions used in undergraduate science courses varies significantly. Thus, it was important to model Peer Instruction and active engagement in the Physics Methods course. Lastly, the teacher-candidates had never been asked to design conceptual multiple-choice questions themselves: they had always been the recipients of clicker-enhanced pedagogy. The goal of the Physics Methods course was to change this and to help teacher-candidates develop the PCK needed to enact clicker-enhanced pedagogy successfully in their future courses. Learning how to design and evaluate conceptual multiple-choice physics questions was a crucial step in that direction.

DESIGNING MULTIPLE-CHOICE QUESTIONS FOR PEER INSTRUCTION PEDAGOGY IN STEM CLASSROOMS

As discussed earlier, designing effective multiple-choice STEM questions is a multi-stage process (Table 1). In Stage 1, a question designer has to identify a target concept that is aligned with the goals of the desired curriculum. For example, in the question shown in Figure 1, the core concept is the relationship between the area under the force-distance graph F(x) and the work done by this force (provided the direction of the force is along the line of the object's motion). After the concept is identified, a problem, often involving a real-life situation, has to be devised that requires a student to use this concept to answer it. This is the goal of Stage 2. Multiple-choice questions used in Peer Instruction pedagogy are often self-contained (they do not require lengthy interpretation); they also tend to use multiple representations, such as a story line, a picture, a graph, or a data table. For example, the problem in Figure 1 uses three representations: a text (a story), a graph, and a schematic diagram of an experiment (Van Heuvelen & Zou, 2001). Moreover, to find the answer, one does not even need a calculator: the area of the unshaded right triangle in Figure 2 represents the work needed to stretch the spring 0.1 m from its equilibrium (unstretched) state, which the problem states equals 10 Joules. The area of the shaded trapezoid represents the work needed to stretch the spring an additional 0.1 m. As one can see in Figure 2, this area is three times larger than the area of the white triangle. Therefore, the additional work equals 30 Joules. The problem can also be solved using a more traditional algebraic approach, where W denotes the work required to stretch the spring:

$$
\begin{aligned}
W_{0\,\mathrm{m}\to 0.1\,\mathrm{m}} &= \tfrac{1}{2}k(0.1\ \mathrm{m})^2 = 10\ \mathrm{J}
\quad\Rightarrow\quad \tfrac{1}{2}k(0.2\ \mathrm{m})^2 = 40\ \mathrm{J} \\
W_{0.1\,\mathrm{m}\to 0.2\,\mathrm{m}} &= W_{0\,\mathrm{m}\to 0.2\,\mathrm{m}} - W_{0\,\mathrm{m}\to 0.1\,\mathrm{m}}
= \tfrac{1}{2}k(0.2\ \mathrm{m})^2 - \tfrac{1}{2}k(0.1\ \mathrm{m})^2
= 40\ \mathrm{J} - 10\ \mathrm{J} = 30\ \mathrm{J}
\end{aligned}
\qquad (1)
$$
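As a quick numerical check of Equation (1), the sketch below recovers the spring constant from the given 10 J for the first 0.1 m of stretch (k = 2000 N/m is inferred, not stated in the problem) and confirms the 30 J of additional work:

```python
# Work stored in a spring stretched by x: W = (1/2) k x^2.
# From the problem, W(0.1 m) = 10 J, which fixes k:
W_01 = 10.0                    # J, given in the problem
k = 2 * W_01 / 0.1**2          # = 2000 N/m (inferred)

W = lambda x: 0.5 * k * x**2   # elastic potential energy at stretch x (in metres)
print(W(0.2) - W(0.1))         # 30.0 J of additional work, matching Eq. (1)
```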


Figure 2: One of the possible graphical explanations for the question shown in Figure 1.

Reviewing the distractors shown in Figure 1, one can see that they represent possible student difficulties, stemming from either incorrect applications of mathematical reasoning or a lack of conceptual understanding. These distractors therefore have very specific pedagogical underpinnings. Figuring out potential student difficulties is the purpose of Stage 3 of the question design process (Table 1). While expert STEM teachers can rely on their teaching experience to devise these distractors, doing so is a challenge for novice teachers. Yet, it opens opportunities for them to consult the STEM education literature and use its findings in their practice. There are many peer-reviewed STEM journals, such as The Physics Teacher, The Science Teacher, Journal of College Science Teaching, Journal of Chemical Education, and American Journal of Physics, that are aimed at practitioners and publish papers addressing these issues in great detail. Stages 4 and 5 of the question design process are very important, yet they are often neglected or taken for granted: devising explanations for both correct and incorrect responses, and providing ideas for extension activities, such as experiments, demonstrations, and simulations, that can help students see how the question fits into the bigger picture of a lesson, a unit, or an entire course.

Stage | Action | Challenge
1 | Identifying a target concept | Making sure that it is a major concept that presents challenges to the students and that is imperative for students to understand.
2 | Devising a problem (often a real-life situation) that requires using this concept for its solution | Coming up with a problem that will: a) be of interest to the students; b) use multiple representations; c) be succinct and self-contained; d) be conceptual (not require extensive calculations); and e) be solvable on the basis of students' prior knowledge during the allocated time.
3 | Considering potential student difficulties (misconceptions) and coming up with not only the right answer, but also with meaningful distractors | Envisioning potential stumbling blocks and student difficulties and coming up with meaningful distractors.
4 | Devising explanations for both correct and incorrect responses | Making sure that students not only understand why the correct answer is correct, but also what is wrong with the distractors.
5 | Considering how the question fits into the bigger picture of the lesson, the unit, or even the course | Considering possible follow-up questions or activities; helping students see the purpose of the question and its goals.

Table 1: Five stages of designing conceptual multiple-choice STEM questions

This design process is rather complex and is especially difficult for prospective teachers. In the next section we describe how modern technology, such as the PeerWise online system (Denny, 2014), can help create a learning environment in which teacher-candidates can collaborate on question design, support each other, and provide and incorporate feedback to create pedagogically effective STEM questions.

USING PEERWISE TO SUPPORT DESIGN OF STEM MULTIPLE-CHOICE QUESTIONS IN METHODS COURSES

PeerWise is a freely available online environment that allows students to create, share, answer, edit, and discuss multiple-choice questions (Denny, 2014). In PeerWise, the instructor creates a course. Course participants can then author and upload their own questions to be shared with the class. In this process, they have to provide the question stem, which can include pictures, diagrams, equations, etc. (Figure 3). The question author then provides all possible alternatives (the distractors plus a correct answer) (Figure 4), along with an explanation of the alternatives. Finally, the author indicates the topics the question addresses (Figure 5). After the question has been uploaded, all students can answer it and provide their feedback, which can include editing both the question stem and the distractors, commenting on possible alternatives, or suggesting a potential sequence for the question.
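To make the authoring workflow concrete, the sketch below models the information a PeerWise author supplies, as described above. This is a hypothetical data structure for illustration only; it is not PeerWise's actual API or schema.

```python
from dataclasses import dataclass, field

@dataclass
class MultipleChoiceQuestion:
    """Illustrative record of what a question author provides (hypothetical schema)."""
    stem: str                    # question text; may reference pictures, diagrams, equations
    alternatives: dict           # label -> answer text, e.g. {'A': '...', 'B': '...'}
    correct: str                 # label of the correct alternative
    explanation: str             # why the answer is correct and the distractors are not
    topics: list = field(default_factory=list)  # curriculum topics the question addresses

# Example based on the parallel-resistor question discussed later in this paper;
# the distractor values other than the correct one are invented for the demo.
q = MultipleChoiceQuestion(
    stem="A 1 kOhm and a 1 Ohm resistor are connected in parallel. "
         "What is the equivalent resistance?",
    alternatives={'A': '1001 Ohm', 'B': '500 Ohm', 'C': '0.999 Ohm', 'D': '1 Ohm'},
    correct='C',
    explanation="For parallel resistors the equivalent resistance is always "
                "smaller than the smallest individual resistance, so only C is possible.",
    topics=['Electric circuits', 'Resistors in parallel'],
)
```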


Figure 3: A screenshot of a conceptual multiple-choice question that includes text, equations, and a figure.

Figure 4: A screenshot of the distractors and the explanation for the conceptual multiple-choice question shown in Figure 3. Notice that PeerWise displays statistics about how students answered the question and how confident they were in their responses.


Figure 5: Additional question-related information provided by the author and by the people who solved the question

The format of the PeerWise online repository encourages students to engage actively in designing their own questions, as well as in improving the questions designed by their peers. Moreover, since PeerWise collects all the individual statistics, course participants have a continuous report on their own participation and progress, and on the quality of their questions as evaluated by their peers. Peer feedback is a key element of PeerWise, and since teacher-candidates receive continuous feedback on their questions from the instructor and the Teaching Assistant, they have an opportunity to see what constructive feedback looks like and to learn how to respond to it. Learning to provide constructive feedback and to accept it in a positive manner is one of the most valuable skills in STEM teaching; PeerWise thus becomes a non-threatening environment where this skill can be modeled, practiced, and improved. PeerWise also adds a game flavour to the environment, as participants can earn points and badges while progressing through different "game" levels (Figure 6).

Figure 6: Examples of some of the statistical outputs available in PeerWise.


This process allows teacher-candidates to acquire PCK crucial for successful STEM teaching. The pedagogical discussions involved in this process constitute the first step in creating a STEM teaching community of practice. Lave and Wenger (1991) referred to this gradual induction process as legitimate peripheral participation, where newcomers (teacher-candidates) begin their participation in the STEM education community by first performing low-risk tasks (modifying already existing multiple-choice questions), then by providing feedback on the questions designed by others, and eventually by designing their own questions. This process is akin to pedagogical scaffolding of novice teachers by expert instructors and by their peers. The key to the successful functioning of this community is providing ample opportunities for teacher-candidates to practice these tasks in a low-risk environment while supporting each other along the way. This requires that Peer Instruction be modeled by the instructor, gradually helping teacher-candidates experience this pedagogy both as students and as future teachers. We described this modeling process and its effects on teacher-candidates in detail in our recent paper (Milner-Bolotin et al., 2013). The main findings of that study indicate that using Peer Instruction pedagogy in a Physics Methods course increases future teachers' interest in active learning, helps them develop Pedagogical Content Knowledge, and helps teacher educators identify gaps in the Content Knowledge of future teachers. Lastly, having teacher-candidates experience Peer Instruction both as teachers and as learners in their methods courses is a great opportunity to help them reflect on how the teaching practices they might want to use in their own classrooms affect their students.

Throughout this Physics Methods course, teacher-candidates were encouraged to use the Mathematics and Science Teaching and Learning through Technologies resource created by our research team (Milner-Bolotin, 2014a). It includes more than 1,200 conceptual multiple-choice questions relevant to the British Columbia K-12 STEM curriculum and also provides detailed discussions of the distractors, including explicit pedagogical comments. While the number of questions is large, it is far from exhaustive. Moreover, in order to use resources designed by others, teacher-candidates have to understand the philosophy behind each one of them. Teacher-candidates have to become critical consumers; in order to do that, they have to learn to design questions themselves, so that they can modify existing resources to fit their pedagogical goals.

As part of the course assignments, teacher-candidates were required to use PeerWise to contribute five multiple-choice conceptual questions and respond to ten questions designed by their peers weekly. This means that over the duration of the course, they created more than 50 conceptual questions and answered, commented on, and edited more than a hundred questions. This is a significant time investment on their part, and it is important to evaluate whether these efforts paid off in terms of teacher-candidates' PCK and their attitudes about active learning and the use of technologies in STEM education. Below we discuss the most important study outcomes.

RESULTS AND DISCUSSION

In this section we describe some of the results of our first implementation of the PeerWise system for design of, and collaboration on, conceptual multiple-choice physics questions.

We organize the discussion in the following logical sequence. First, we summarize teacher-candidates' engagement using the statistical data collected by the PeerWise software. Second, we complement these data with examples of teacher-candidates' contributions and comments. Third, we discuss our observations of the teacher-candidates' participation and engagement during the class meetings. Fourth, we speculate on how teacher-candidates' participation in PeerWise can help an instructor facilitate methods courses.

The quantitative data described in this section were collected through the PeerWise software. While the course officially lasted 13 weeks, only 10 weeks were dedicated to teacher-candidates' participation in the question design process. Therefore, we expected teacher-candidates to submit at least 50 questions and to answer at least 100 questions. As one can see from Table 2, most of them fulfilled these criteria. It is important to note that some teacher-candidates experienced considerable difficulties in designing these questions. In our view, these difficulties were caused by five factors:

a) Some teacher-candidates had never experienced conceptual questions that go beyond factual recall, and since the instructor kept emphasizing that the questions had to be conceptual, this made the design of questions more challenging.
b) Designing a pedagogically effective conceptual question requires being aware of learners' prior knowledge. This was especially difficult for international teacher-candidates, who did not have experience with North American secondary schools.
c) Designing effective distractors was difficult: while teacher-candidates could provide correct answers, it was hard for them to anticipate potential student difficulties and suggest meaningful incorrect answers.
d) Teacher-candidates very often reverted to a formulaic solution without being able to focus on the conceptual side of the question and provide explanations that would be meaningful to the students.
e) To their surprise, teacher-candidates often revealed substantial conceptual difficulties and incomplete physics understanding of their own. While this at first proved to be a significant obstacle, the process of question design helped address many of these difficulties.

Teacher-candidate | Questions created | Answers submitted | Comments written
1 | 50 | 110 | 52
2 | 50 | 85 | 40
3 | 51 | 115 | 74
4 | 50 | 90 | 34
5 | 60 | 110 | 79
6 | 50 | 112 | 82
7 | 57 | 109 | 107
8 | 50 | 192 | 14
9 | 50 | 91 | 81
10 | 50 | 100 | 50
Average | 51.8 | 111.4 | 61.3
Total | 518 | 1114 | 613

Table 2: Summary statistics of teacher-candidates' participation in PeerWise
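The summary rows of Table 2 can be re-derived from the individual counts. A minimal sketch (the three lists are transcribed from the table above):

```python
# Per-teacher-candidate counts transcribed from Table 2:
questions = [50, 50, 51, 50, 60, 50, 57, 50, 50, 50]
answers   = [110, 85, 115, 90, 110, 112, 109, 192, 91, 100]
comments  = [52, 40, 74, 34, 79, 82, 107, 14, 81, 50]

for name, col in [("questions", questions), ("answers", answers), ("comments", comments)]:
    print(f"{name}: total={sum(col)}, average={sum(col) / len(col):.1f}")
# questions: total=518,  average=51.8
# answers:   total=1114, average=111.4
# comments:  total=613,  average=61.3
```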


An example of a question designed by a teacher-candidate during the first week of the course is shown in Figure 7.

Figure 7: An example of a multiple-choice conceptual question created by a teacher-candidate during the first week of the course.

This is a standard physics question dealing with parallel resistors. However, the quality of the distractors and of the explanation is still low. The explanation provides an accurate algebraic solution, yet it misses the most important point: the equivalent (resultant) resistance of resistors connected in parallel is always smaller than the smallest of the individual resistances. Moreover, the correct answer is rounded, and it is not clear what the purpose of expressing it in this form is. In this case, one does not need to do any calculations to see that the only possible answer is C, since it is the only one less than 1 Ω. Yet, this is not emphasized by the teacher-candidate, so a teachable moment is missed. The entire conceptual purpose of the question is lost, and instead of being a powerful conceptual question, it became a traditional "plug-and-chug" question. Moreover, it is unclear where the distractors came from and what the reasoning behind them is, as the teacher-candidate did not refer to the distractors in the explanation. This also reflects the teacher-candidate's philosophy that in physics learning the only thing that matters is arriving at the correct answer. A more fruitful explanation of this question could have been along the following lines. Let us denote $R_1 = 1\ \mathrm{k}\Omega = 1000\ \Omega$ and $R_2 = 1\ \Omega$. Since the resistors are connected in parallel, we can calculate the equivalent resistance by applying the known formula; we can also prove that the equivalent resistance is going to be less than 1 Ω (the smaller of the two resistances):


$$
\frac{1}{R_{eq}} = \frac{1}{R_1} + \frac{1}{R_2} = \frac{1}{1000\ \Omega} + \frac{1}{1\ \Omega} = \frac{1001}{1000\ \Omega}
\quad\Rightarrow\quad
R_{eq} = \frac{1000}{1001}\ \Omega \approx 0.999\ \Omega
$$

$$
R_{eq} = \frac{R_1 R_2}{R_1 + R_2} = R_2\,\frac{R_1}{R_1 + R_2} < R_2
\qquad\text{and}\qquad
R_{eq} = R_1\,\frac{R_2}{R_1 + R_2} < R_1
\qquad (2)
$$
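The conceptual point behind Equation (2) can be checked numerically. A minimal sketch:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors connected in parallel (ohms)."""
    return r1 * r2 / (r1 + r2)

r_eq = parallel(1000.0, 1.0)       # 1 kOhm in parallel with 1 Ohm
print(r_eq)                        # 0.999000999... Ohm, i.e. 1000/1001 Ohm
assert r_eq < min(1000.0, 1.0)     # always below the smallest branch resistance
```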

If the teacher-candidate had thought about the question conceptually, they would also have come up with more powerful distractors. Because this question was created early in the term, it provided an excellent opportunity to discuss the design of multiple-choice conceptual questions in class.

The range of the comments submitted by teacher-candidates shows their deep engagement with the questions designed by their peers. The vast majority of the comments dealt with conceptual physics understanding. Teacher-candidates often attempted to provide clarifications and improvements to the questions, and often reflected on their own conceptual understanding and potential conceptual difficulties. As the course progressed, they started thinking more like teachers, attempting to discuss how the questions might be perceived by secondary physics students. For example, they started thinking about the background knowledge the students would need in order to answer the question, about possible alternative ways of thinking about the phenomenon, and about possible extensions to the question and how it might relate to a demonstration, a laboratory activity, or another learning experience. Table 3 shows a few of these comments.

1 | I answered "time" because I thought the question was asking what quantity is the same for both vertical motion and horizontal motion... I guess I answered this way because that's often a key piece of information to use to solve the problem.
2 | I answered "velocity" because velocity has both a horizontal and vertical component, and that is how we quantify motion in the horizontal and vertical directions respectively.
3 | It's a good question so that kids remember that you take multiple values when you make the arcsine, due to the sine law. That's what tripped me up.
4 | The question needs an introduction - explaining that it can in general be solved by Bernoulli's equations, but for the simple case that the velocity of water in the barrel is taken to be zero, by conservation of energy - according to the theorem of Torricelli: http://curricula2.mit.edu/pivot/book/ph1806.html?acode=0x0200 You didn't say the barrel was sitting on the ground [I pictured pop can on a table and chose h = H :( - my bad]. You didn't derive the solution: "kinematics equations" is too vague. You should derive time t from the equation for displacement in the vertical direction - then show that the distance in the horizontal direction follows as $v\,t = 2\sqrt{h(H-h)}$, as you correctly say. Do the students know to differentiate a function to find where it is at a maximum/minimum?
5 | The qualitative nature of the question forces students to think about the concept and not jump to equations right away, which is good!
6 | No diagram so I have no idea what you're talking about. Not part of the curriculum according to the BC Ministry of Education IRP (Instructional Resource Package) for physics 11 or 12.

Table 3: Comments on conceptual multiple-choice questions submitted by teacher-candidates
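Comment 4 in Table 3 alludes to a standard result for a jet leaving a hole at height h in a barrel filled to height H. A brief sketch of the derivation it references, assuming the barrel sits on the ground, the water speed follows Torricelli's theorem, and the jet then undergoes projectile motion:

$$
v = \sqrt{2g(H-h)}, \qquad t = \sqrt{\frac{2h}{g}}, \qquad
d = v\,t = \sqrt{2g(H-h)}\,\sqrt{\frac{2h}{g}} = 2\sqrt{h(H-h)},
$$

which matches the expression quoted in the comment and is maximized at $h = H/2$, where $d = H$.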

As the course progressed, teacher-candidates were gradually able to produce more sophisticated questions (Figure 8). This was also clear from the peer ratings of their questions. They also learned to provide more meaningful constructive feedback to their peers, which is an important skill for a teacher. At the same time, they learned to respond to the feedback provided by their peers without being upset by it. Peer feedback and peer collaboration became the backbone of the course. From our observations of teacher-candidates during their school practicum, we could see that they were able to carry their views on the importance of peer collaboration into their own classrooms (Milner-Bolotin et al., 2013). Many of them chose to use Peer Instruction as one of their main pedagogies, even though the author of the paper was not their faculty advisor and had no influence on their practicum performance assessment. Moreover, a number of teacher-candidates asked the author to come and observe their teaching and provide feedback on their implementation of active engagement pedagogies, of which Peer Instruction was a big part.

Figure 8: A question produced by a teacher-candidate during the second month of the course. It was answered by six people and generated nine comments.


As every question in PeerWise is rated by peers, every teacher-candidate could see how their peers evaluated their contributions. This was also reflected in the badges earned by teacher-candidates, as different types of course contributions were valued. For example, teacher-candidates earned badges for authoring questions, answering questions, providing critique, verifying somebody's solution or explanation, helping peers, rating peers' questions, etc. PeerWise engagement happened mostly outside of class.

Both the course instructor and the Teaching Assistant read most of the questions and commented on them. This active involvement of the instructor in the PeerWise community took considerable time and effort, as the instructor answered (and often commented on) 466 questions designed by teacher-candidates over the span of three months (which translated into 25 hours of instructor time over the course of 13 weeks). However, it was invaluable in helping the instructor prepare for class and make it as relevant and effective as possible. The instructor could see where teacher-candidates struggled, and even if not all problems could be addressed in class, they were addressed on a one-on-one basis. It was often clear from reviewing the questions designed by teacher-candidates that they held misconceptions similar to those of their future students. Many of these misconceptions were addressed in class through either a theoretical discussion or the design of a relevant experiment and the collection of experimental evidence. Thus the instructor modeled how student conceptual difficulties can be addressed while addressing the difficulties held by the teacher-candidates themselves. For example, Logger Pro sensors and video analysis were used consistently to collect real-life data and test various theoretical predictions (Antimirova & Milner-Bolotin, 2009; Milner-Bolotin, 2012; Vernier-Technology, 2014).

Lastly, an important pedagogical advantage of the PeerWise question database is that it can be accessed by teacher-candidates after the course is over. As a result of the course, they produced a collection of more than five hundred conceptual physics questions that they can use in their own teaching. All the questions can also be downloaded at the end of the course and shared with all of the teacher-candidates.

CONCLUSION

It has long been recognized that teacher-educators should practice in their methods courses the pedagogies they would like teacher-candidates to enact in their own classrooms (Grossman & McDonald, 2008). A "do as I say, and not as I do" philosophy is especially dangerous in the teacher-education context. Thus teacher-candidates should be provided with multiple "intensive, focused opportunities to experiment with aspects of practice and then learn from that experience" (Grossman & McDonald, 2008, pp. 189-190). In order to prepare future teachers to engage their future students in active technology-enhanced STEM learning, teacher education programs need to provide teacher-candidates with opportunities to learn and practice these pedagogies in their methods courses. This paper discussed a pilot implementation of the PeerWise online collaborative software in a Physics Methods course for prospective physics teachers.

We have begun to uncover the potential of PeerWise in the context of STEM teacher preparation. We have shown that PeerWise provides many valuable opportunities for teacher-candidates to discuss the design of conceptual multiple-choice physics questions, to provide feedback on the questions designed by their peers, and to practise the skills crucial for STEM teaching. In the future we will investigate how PeerWise-enhanced pedagogy can support specific elements of teacher-candidates' PCK, such as the construction of their own conceptual physics understanding, their ability to identify and address student conceptual difficulties, and their willingness and ability to use questioning to promote student-centered instruction in their own classrooms during the practicum. While this study was exploratory in nature, it revealed a number of potential applications of educational technologies in STEM methods courses, such as using interactive technologies (clickers) to obtain instantaneous feedback on student understanding, and collaborating on the creation of STEM teaching resources. However, more systematic research should be done in order to understand the impact of educational technologies on promoting the Pedagogical Content Knowledge of future STEM teachers.

REFERENCES

Antimirova, T., & Milner-Bolotin, M. (2009). A brief introduction to video analysis. Physics in Canada, 65(April-May), 74.

Beatty, I. D., Gerace, W. J., Leonard, W. J., & Dufresne, R. J. (2006). Designing effective questions for classroom response systems teaching. American Journal of Physics, 74(1), 31-39.

Bloom, B. S. (1956). Taxonomy of educational objectives: Cognitive domain (Vol. 1). New York: Longman.

Bransford, J. D., Brown, A. L., & Cocking, R. R. (2002). How people learn: Brain, mind, experience, and school. Washington, DC: The National Academies Press.

Creswell, J. W. (2008). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (3rd ed.). Upper Saddle River, NJ: Pearson.

Denny, P. (2014). PeerWise. Retrieved from peerwise.cs.auckland.ac.nz/

Dickinson, M. (2011, December 5). Writing multiple-choice questions for higher-level thinking. Learning Solutions Magazine. http://www.learningsolutionsmag.com/articles/804/writing-multiple-choice-questions-for-higher-level-thinking

Fisher, H., Butler, D., MacDonald, A., Roll, I., & Milner-Bolotin, M. (2014). Measuring short-term effects of self-regulatory prompts on problem-solving abilities in introductory genetics. Paper presented at the 3rd International Conference of STEM Education and Our Planet: Making Connections Across Contexts, Vancouver, BC. http://stem2014.ubc.ca/

Gay, L. R., Mills, G. E., & Airasian, P. (2012). Educational research: Competencies for analysis and applications (10th ed.). Upper Saddle River, NJ: Pearson.

Grossman, P., & McDonald, M. (2008). Back to the future: Directions for research in teaching and teacher education. American Educational Research Journal, 45(1), 184-205.

Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66(1), 64-74.

Hestenes, D., & Wells, M. (1992). A mechanics baseline test. The Physics Teacher, 30, 159-166.

Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force Concept Inventory. The Physics Teacher, 30(3), 141-157.

Kalman, C. S., Milner-Bolotin, M., & Antimirova, T. (2010). Comparison of the effectiveness of collaborative groups and Peer Instruction in a large introductory physics course for science majors. Canadian Journal of Physics, 88(5), 325-332.

Lasry, N. (2008). Clickers or flashcards: Is there really a difference? The Physics Teacher, 46(5), 242-244.

Lasry, N., Guillemette, J., & Mazur, E. (2014). Two steps forward, one step back. Nature Physics, 10(June), 402-403.

Lasry, N., Mazur, E., & Watkins, J. (2008). Peer Instruction: From Harvard to the two-year college. American Journal of Physics, 76(11), 1066-1069.

Lau, P. N. K., Lau, S. H., Hong, K. S., & Usop, H. (2011). Guessing, partial knowledge, and misconceptions in multiple-choice tests. Educational Technology & Society, 14(4), 99-110.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.

Little, J. L., Bjork, E. L., Bjork, R. A., & Angello, G. (2012). Multiple-choice tests exonerated, at least of some charges: Fostering test-induced learning and avoiding test-induced forgetting. Psychological Science, 23(1), 337-344.

Masters, J. C., Hulsmeyer, B. S., Pike, M. E., Leichty, K., Miller, M. T., & Verst, A. L. (2001). Assessment of multiple-choice questions in selected test banks accompanying text books used in nursing education. Journal of Nursing Education, 40(1), 25-32.

Mayer, R. E., Stull, A., DeLeeuw, K., Almeroth, K., Bimber, B., Chun, D., . . . Zhang, H. (2009). Clickers in college classrooms: Fostering learning with questioning methods in large lecture classes. Contemporary Educational Psychology, 34(1), 51-57.

Mazur, E. (1997a). Moving the mountain: Impediments to change. The Physics Teacher, 35, 1-4.

Mazur, E. (1997b). Peer Instruction: Getting students to think in class. Paper presented at the American Institute of Physics Conference Proceedings: Changing Role of Physics Departments in Modern Universities, University of Maryland, College Park.

Milner-Bolotin, M. (2004). Tips for using a peer response system in the large introductory physics classroom. The Physics Teacher, 42(8), 47-48.

Milner-Bolotin, M. (2012). Increasing interactivity and authenticity of chemistry instruction through data acquisition systems and other technologies. Journal of Chemical Education, 89(4), 477-481.

Milner-Bolotin, M. (2014a). Mathematics and science teaching and learning through technologies. Retrieved from http://scienceres-edcp-educ.sites.olt.ubc.ca/

Milner-Bolotin, M. (2014b). Promoting research-based physics teacher education in Canada: Building bridges between theory and practice. Physics in Canada, 70(2), 1-3.

Milner-Bolotin, M., Antimirova, T., & Petrov, A. (2010). Clickers beyond the first year science classroom. Journal of College Science Teaching, 40(2), 18-22.

Milner-Bolotin, M., Fisher, H., & MacDonald, A. (2013). Modeling active engagement pedagogy through classroom response systems in a physics teacher education course. LUMAT: Research and Practice in Math, Science and Technology Education, 1(5), 525-544.

Milner-Bolotin, M., Kotlicki, A., & Rieger, G. (2007). Can students learn from lecture demonstrations: The role and place of interactive lecture experiments in large introductory science courses. Journal of College Science Teaching, 36(4), 45-49.

O'Byrne, J., & Thompson, R. (2005). The tutorial benefits of on-line assignments: MasteringPhysics in first year physics at the University of Sydney. Paper presented at the 2005 National UniServe Conference: Blended Learning and Improvisation, University of Sydney, Australia.

Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4-14.

Sobolewski, S. J. (1996). Development of multiple-choice test items. The Physics Teacher, 34(2), 80-82.

Van Heuvelen, A., & Zou, X. (2001). Multiple representations of work-energy processes. American Journal of Physics, 69(2), 184-194.

Vernier-Technology. (2014). Logger Pro (Version 3.6.1). Portland, OR: Vernier Technology. Retrieved from www.vernier.com

BIOGRAPHICAL NOTE:
_____________________________
Dr. Marina Milner-Bolotin has been an Assistant Professor in the Department of Curriculum and Pedagogy at the University of British Columbia since 2010. She earned her M.Sc. from Kharkov National University in Ukraine (1991) and her Ph.D. in Science and Mathematics Education from the University of Texas at Austin (2001). She has been teaching secondary and post-secondary physics and mathematics, while working with science and mathematics teachers, for more than 20 years in Israel, the USA, and Canada. Since coming to UBC she has become an active member of the Science Education group. She has published more than two dozen papers in refereed journals, and in 2014 an introductory physics textbook she co-authored was published. She is very active in provincial, national, and international science education organizations. Her research focuses on exploring how technology can be used in mathematics and science education and how prospective and in-service teachers can be supported in effective technology integration. In 2010 she was awarded the Canadian Association of Physicists Excellence in Undergraduate Physics Teaching Medal, and in 2014 she received a UBC Faculty of Education Killam Teaching Award.
_____________________________