Stephen C. Ehrmann

Using Evidence to Improve Teaching and Learning (with Technology): Asking the Right Questions
http://tinyurl.com/cp6gs2
[email protected]


Page 1: Stephen C. Ehrmann

Using Evidence to Improve Teaching and Learning (with Technology): Asking the Right Questions

http://tinyurl.com/cp6gs2

Stephen C. Ehrmann

[email protected]

Page 2: Stephen C. Ehrmann

Thanks!

• 140+ institutional subscribers to TLT Group services

• AAC&U, ACRL, EDUCAUSE, HBCU Faculty Development Network, League for Innovation, MERLOT, NISOD, POD, SCUP

• Annenberg/CPB and AAHE

• TLT Group Founding Sponsors: Blackboard, Compaq, Microsoft, SCT, WebCT

• Washington State Univ. (Flashlight Online)
• Brigham Young (student course evaluation)
• Bucks County CC (hybrid workshops)
• Butler University (train the trainer)
• Central Michigan U (5-minute workshops)
• Drexel (evaluation of online programs)
• IUPUI (e-portfolios)
• George Washington U (faculty development strategies)
• Old Dominion U (podcasting, hybrid professional development)
• Gannon U (seven principles)
• Hamline U, NC State (TLTRs)
• Johnson C. Smith (ARQ)
• Mesa CC (diversity)
• MIT (adoption of innovation)
• U of Nebraska (LTAs)
• Oregon State (evaluation of CMS)
• U of Nevada, Reno (ARQ)
• U Queensland (ePortfolios)


Page 3: Stephen C. Ehrmann

Outline

I. Your responses to the survey

II. A “Good Problem” in education

III. Evaluating one response to that problem in order to fine-tune it

A. Methods of evaluation

B. Surveys

C. Matrix Surveys and the Scholarship of Teaching and Learning


Page 4: Stephen C. Ehrmann

Experience with Clickers

Poll results: 6%, 6%, 6%, 8%, 32%, 42%

1. I’ve never used clickers before now.

2. I’ve used a clicker before now, but not as an instructor.

3. As a teacher, I’ve mainly used clickers to see what students remember, so I can adjust what I’m doing.

4. As a teacher, I’ve mainly used clickers to quiz students and grade them instantly.

5. As a teacher, I’ve mainly used clickers to deepen students’ thinking.

6. Other

Page 5: Stephen C. Ehrmann

Survey Responses

• http://tinyurl.com/cwmrp3

– Also linked to the resource page


Page 6: Stephen C. Ehrmann

Examples of Seeking Evidence

• “In any class, I want to know what my starting point should be in order to ensure I'm not beginning in a place that will cause some students to be lost. For instance, in an entry-level composition course, I might ask some basic questions such as what is your personal process for writing? What are some ways you overcome writer's block? etc. I initiate this questioning on day 1 to begin our semester conversation.”


Page 7: Stephen C. Ehrmann

Examples of Inquiry (2)

• “What was your "research process" for this __x__ [project]? Did it include electronic library resources? If so, how did you find the information you needed? Along the path for this project, did you ask for assistance (in person or long-distance, by phone, e-mail, or chat)? If you didn't use the Library (physical or virtual), do you have any idea what sources you did NOT get access to by making that choice?”


Page 8: Stephen C. Ehrmann

Examples of MethodsExamples of Methods

• “We used student ambassadors who served as communication conduits from students to faculty. Ambassadors would solicit feedback from other students in the course. Faculty would meet with ambassadors 3-4 times a semester for feedback. I was able to make the directions for some assignments clearer from this input.”


Page 9: Stephen C. Ehrmann

Methods (2)

• “At least once a week I simply ask the class for a show of hands to find out their level of comfort with a topic, whether or not a particular activity was helpful, their preferences for types of activities, readings, evaluations, etc. I don't always go with what they say their preferences are, but I feel it is important for me to keep in contact with their point of view.”


Page 10: Stephen C. Ehrmann

Methods (3)

• “This semester I polled students at the beginning of the semester (paper and pencil) about the topics and vocabulary they wanted to explore and become more fluent with and we built the class around that.”

• “I frequently make part of my lesson plan from themes of errors I see in student homework.”

• “mid-semester survey”


Page 11: Stephen C. Ehrmann

A Good Problem

• As teachers, faculty are sometimes embarrassed by problems and seek to avoid them

• As researchers, faculty seek good problems

• Passed along by Prof. Randy Bass, Georgetown University


Page 12: Stephen C. Ehrmann

A Good Problem

• How many graduating seniors:

– Don’t understand physics, chemistry or biology enough to understand how an oak tree becomes heavier than an acorn?

– Don’t understand circuits enough to light a light bulb with a battery and one piece of wire?

– Don’t understand physics enough to predict what will happen to their reflections when they back away from a mirror?

– To see MIT and Harvard graduating seniors wrestle with problems like these, visit http://tinyurl.com/cp6gs2


Page 13: Stephen C. Ehrmann

What fraction of students completing your program with a B or better average would show this kind of misconception?

Poll results: 45%, 47%, 5%, 3%

Your Prediction

1. Almost none (Less than 5%)

2. Some (5-20%)

3. Plenty (21-40%)

4. Epidemic (41% or more)

Page 14: Stephen C. Ehrmann

New Technology, Old Trap

• Thomas Kuhn observed that old scientists often die without ever changing what has become common sense to them

• If we’re not helping students change their paradigms on campus, are we doing even worse online?

• My hunch: We need to change both, and take advantage of both, if students are to really master the ideas and skills of greatest importance


Page 15: Stephen C. Ehrmann

What Can Be Done: Example

• A study of an intermediate chemistry course at the University of Wisconsin

– See http://tinyurl.com/cp6gs2 for this and other resources

• Other useful strategies for helping students master ideas to apply them (even months or years later)?


Page 16: Stephen C. Ehrmann

Peer Instruction During Lecture

1. Pose a question that requires thought, not memory, to answer

2. Poll students (semi) anonymously using clickers or some other polling technology

3. Students pair up to discuss how to answer the question

4. Students are polled individually again
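The four-step flow above can be sketched as a simple two-round tally. Everything here is illustrative: the answer choices, class size, and response lists are invented for the sketch, not data from the talk.

```python
from collections import Counter

def tally(responses):
    """Percentage of responses per answer choice, rounded to whole percents."""
    counts = Counter(responses)
    total = len(responses)
    return {choice: round(100 * n / total) for choice, n in counts.items()}

# Hypothetical conceptual question with choices A-D; suppose C is correct.
round1 = ["B", "C", "B", "A", "C", "D", "B", "C", "A", "C"]  # poll before pairing
round2 = ["C", "C", "C", "A", "C", "C", "B", "C", "C", "C"]  # poll after peer discussion

print("C before discussion:", tally(round1)["C"], "%")  # 40 %
print("C after discussion:", tally(round2)["C"], "%")   # 80 %
```

Comparing the two polls item by item is what tells you whether the paired discussion actually moved students toward the right reasoning, or merely toward the loudest partner.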


Page 17: Stephen C. Ehrmann

Asking Students to Think

• “I’m going to combine these chemicals in a moment. What will the result look like?”

• “You’ve just listened to a student team presentation using online resources. Given the sources they cited, how trustworthy was their argument?”

• Questions you’ve used?


Page 18: Stephen C. Ehrmann

What Should Worry Faculty Most about “Assessment”?

Poll results: 42%, 24%, 20%, 0%, 0%, 4%, 10%

Example

1. Assessment reduces everything to numbers

2. Assessment is controlled by other people, not the instructor

3. Loss of privacy in teaching

4. Threat to job, P&T chances

5. Assumes all students are supposed to learn the same thing

6. Used properly, assessment is empowering, not worrisome

7. _________

Page 19: Stephen C. Ehrmann

Research v. Evaluation

• Research indicates that this strategy usually deepens understanding and helps students learn to apply ideas to unfamiliar problems

• Evaluation is necessary to:

– See whether it’s working for you, and

– Learn what you can do to make the technique work better in your own course


Page 20: Stephen C. Ehrmann

Activity v. Technology

• Why evaluating technology in order to improve learning doesn’t work

– Example: distance learning

• Other uses of clickers include taking attendance, instant graded quizzes, “did you hear what I just said?” checks, “did you read and remember the material?” checks, …

• We’ll focus our discussion on evaluating polling to support peer instruction


Page 21: Stephen C. Ehrmann

Principle #1

• If you need to understand whether a resource, technology, or facility can improve learning, and how to get more value from it, study:

– What various students actually do with that resource (if anything),

– And why they each did what they did


Page 22: Stephen C. Ehrmann

Dorbolo’s Survey

• How interesting/fun was this activity?

• How useful do you think it will be for your future life?


Page 23: Stephen C. Ehrmann

Methods: Surveys

• Suppose you had asked one or two thinking questions, polled students (etc.)

• You want to know how well it’s working, OR you want to learn how to do it better

– What are one or two questions you’d ask your students in a survey?


Page 24: Stephen C. Ehrmann

Example Feedback Form (Draft)

• http://tinyurl.com/dl5m2q

• What questions would you add or change?


Page 25: Stephen C. Ehrmann

“Traditional” Online Survey

• Respondent pool: the people who see this survey

• Every item must make sense to (and for) every person in the respondent pool

[Diagram: a single URL leading to one form containing items 1, 2, 3, ….]

Page 26: Stephen C. Ehrmann

Several Respondent Pools

• Each pool sees the same question group

• Different URLs for each pool, so pools can be analyzed separately or together

[Diagram: three URLs, one per pool, each leading to the same question group (items 1, 2, 3, …).]

Page 27: Stephen C. Ehrmann

Different Question Groups for Different Respondent Pools

• Each pool sees a different mix of question groups

• Different URLs for each pool so the respondent pools and question groups can be analyzed separately, or together

[Diagram: a matrix with the three pool URLs as columns and the question groups (items 1–3, item 4, items 5–6, items 7–9) as rows; X marks show which question groups each pool receives.]
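One way to picture the matrix design above in code: each respondent pool is assigned a subset of question groups, and responses to any one group can later be combined across pools for a larger N. The pool names, group labels, and response values below are all hypothetical.

```python
# Hypothetical matrix-survey design: each X in the matrix corresponds to a
# question group appearing on one pool's survey form.
pools = {
    "pool_a": ["items_1_3", "item_4"],
    "pool_b": ["items_1_3", "items_5_6"],
    "pool_c": ["item_4", "items_7_9"],
}

# Illustrative responses, keyed by (pool, question group).
responses = {
    ("pool_a", "items_1_3"): [4, 5, 3],
    ("pool_b", "items_1_3"): [5, 4],
    ("pool_a", "item_4"): [2],
    ("pool_c", "item_4"): [3, 3],
}

def pooled(group):
    """Combine responses to one question group across all pools (larger N)."""
    return [v for (pool, g), vals in responses.items() if g == group for v in vals]

print(pooled("items_1_3"))  # pool_a and pool_b data combined: [4, 5, 3, 5, 4]
```

Because each pool has its own URL, the same data can just as easily be analyzed per pool; pooling is a choice made at analysis time, not at survey-design time.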

Page 28: Stephen C. Ehrmann

Different Response Form, Same Survey

• http://tinyurl.com/ddd4zs


Page 29: Stephen C. Ehrmann

Advantages of Matrix Surveys

• You don’t have to ask the same questions all the time, so questions can be much more specific, concrete

• You can pool data to get a larger “N” and see tendencies you might have missed or ignored

• You can look at trends over time

• You can work with colleagues on questions, data analysis


Page 30: Stephen C. Ehrmann

Scholarship of Teaching & Learning

• Studying your own courses in order to evaluate and improve them

• Sharing what you learn with colleagues


Page 31: Stephen C. Ehrmann

Features of Flashlight Online

• Share questions with other authors (e.g., colleagues here, Drexel, Flashlight staff)

• Tailor wording (what is the polling technology called in each class?)

• Send out different response forms at different times, but still collect data in the same place

• Send out reminders to non-respondents while maintaining their anonymity


Page 32: Stephen C. Ehrmann

Learning More

• “Asking the Right Questions” (ARQ) materials

– Workshops 10-15 minutes long

– Online materials

– Pass it forward


Page 33: Stephen C. Ehrmann

Thanks

• Questions? Comments?

• If you like this kind of thing, individual membership is free (if you’re at Rolla). Go to www.tltgroup.org for information about memberships and institutional subscriptions

• And if you’d like to develop, or help me develop, the matrix survey on personal response systems, my email is below!

[email protected]

Page 35: Stephen C. Ehrmann

Red Flags

• Naïve approaches to evaluating technology use often:

– Focus on the technology itself

– Measure changes in goals (outcomes) that are the same for all users (“uniform impact”)

– Focus (only) on hoped-for gains in outcomes, compared with doing nothing

– Wait to start inquiry until things are smooth, so that there’s a good chance that the findings will be positive, and can be used to elicit rewards.
