Assessment of learning: More than just a test Office of Medical Education Creighton University School of Medicine


Page 1: Assessment of learning: More than just a test

Assessment of learning: More than just a test

Office of Medical Education Creighton University School of Medicine

Page 2: Assessment of learning: More than just a test

Session objectives

After completing this session, the learner should be able to:

• Differentiate between assessment and evaluation

• Describe common assessment types

• Create effective multiple choice questions

• Describe assessment as a process for continuous quality improvement

Page 3: Assessment of learning: More than just a test

What is assessment? What is evaluation? Is there really a difference?

Page 4: Assessment of learning: More than just a test

Definitions

The accrediting body for medical schools in the U.S. and Canada is the Liaison Committee on Medical Education (LCME). The LCME uses the following definitions:

Assess refers to assessment of medical student performance

Evaluate is reserved for faculty, resident, course, clerkship/clerkship rotation, and program evaluation

Source: LCME Functions and Structure of a Medical School, 2011

Page 5: Assessment of learning: More than just a test

Assessment Types

•Formative

•Summative

Page 6: Assessment of learning: More than just a test

Formative Assessment

• To assess learning progress

• Often provided without a grade

• Used throughout a course or clerkship

• Intended to provide opportunities for students to ask questions and improve knowledge and skills

EXAMPLE: In-class quiz using audience response system clickers or cell phone polling

Page 7: Assessment of learning: More than just a test

Summative Assessment

• To assess learning at the end of a curricular experience (e.g., course, clerkship)

Page 8: Assessment of learning: More than just a test

Summative Assessment

Traditionally

Exams

• Multiple Choice Tests

• Essay Tests

• Oral Exams

Now

Performance Tests

• Labs

• Simulations

• OSCEs

Plus papers, portfolios, journals

Page 9: Assessment of learning: More than just a test

The Assessment Process

• Assessment is more than giving an exam

• It is a process

• Like any process, there are opportunities for continuous quality improvement

Page 10: Assessment of learning: More than just a test

Assessment Process

Assessment Blueprint

Item Development

Administration And Scoring

Item Evaluation

Page 11: Assessment of learning: More than just a test

Create a blueprint for your assessment

Identify the learning outcomes and levels of expertise before you design the assessment instrument.
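A blueprint can be as simple as a table mapping each learning outcome and its expertise level to a planned number of items, which makes the topic weighting explicit before any item is written. A minimal sketch in Python; the outcomes, Bloom levels, and item counts below are hypothetical:

```python
# Minimal assessment-blueprint sketch: map (learning outcome, Bloom level)
# to a planned number of items. Outcomes and counts are illustrative only.
blueprint = {
    ("Interpret acid-base laboratory values", "Application"): 6,
    ("Recall the major buffer systems", "Knowledge"): 2,
    ("Select initial therapy for acidosis", "Analysis"): 4,
}

total_items = sum(blueprint.values())

# Each outcome's share of the exam; check it matches the intended weighting
# before writing items, so more important topics carry more weight.
weights = {outcome: count / total_items
           for (outcome, _level), count in blueprint.items()}

print(total_items)  # 12
print(weights["Interpret acid-base laboratory values"])  # 0.5
```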

Page 12: Assessment of learning: More than just a test

But first…

Ask yourself, are you assessing

Recall of memorized information?

Comprehension of material?

Application of new material?

Analysis?

Synthesis?

Evaluation?

Page 13: Assessment of learning: More than just a test

Planning and designing an exam

• Start before the course/conference begins

• Create questions that match the objectives

• Ask colleagues to review questions

• Remember to weight more important topics

• Focus on concepts, not trivia

• Focus on the application of knowledge, not recall

Page 14: Assessment of learning: More than just a test

Multiple Choice Questions

Strengths

• Efficient for large groups

• Can use to assess a range of thinking skills

• More response options reduce the chance of guessing correctly

• Grading is usually quick and straightforward

Weaknesses

• Difficult to write good questions

• It takes time to write and validate questions

Page 15: Assessment of learning: More than just a test

Anatomy of the MCQ

• Stem: the question text, typically a vignette or scenario that poses the problem

• Options: all answer choices

• Key: the correct answer

• Distractors: the incorrect answers

Page 16: Assessment of learning: More than just a test

MCQ Format used by the NBME and CUSOM

Category One: Single-best-answer formats

A (4 or more options, items or sets)

B (4 or 5 option matching, sets of 2-5 items)

R (extended matching, sets of 2-20 items)

Case, S. M. & Swanson, D. B. (2004). Constructing written test questions for the basic and clinical sciences, 3rd edition (revised). Philadelphia: National Board of Medical Examiners. http://www.nbme.org/about/itemwriting.asp

Page 17: Assessment of learning: More than just a test

One Best Answer: 4 options

A 65-year-old man comes to the physician for a follow-up examination after the results of a bronchoscopy showed squamous cell carcinoma. When the physician tells the patient the diagnosis, the patient becomes tearful and responds, “No, you’re wrong! This must be a mistake. This can’t happen to me. Let’s do more tests.” This patient is most likely at which of the following stages of grief?

(A) Anger

(B) Bargaining

(C) Denial

(D) Depression

(Case and Swanson, 2004)

Page 18: Assessment of learning: More than just a test

Rules for One-Best-Answer Items

• Each item should focus on an important concept

• Each item should assess application of knowledge, not recall of isolated fact

• The stem should pose a clear question

• All distractors should be homogeneous

• Avoid technical item flaws that cue “testwise” students or pose irrelevant difficulty

(Case and Swanson, 2004)

Page 19: Assessment of learning: More than just a test

Best Evidence for Writing MCQs

• Invest in the stem; make sure the item can be answered without looking at the options

• Include language in the stem instead of repeating it in each option

• Each item should be clear and plausible

• Avoid using different grammar (e.g., verb tense) in the stem and the options

• Avoid negatively phrased items and definitive (e.g., “always”) or vague (e.g., “many”) language

• Avoid trickery and irrelevance

Page 20: Assessment of learning: More than just a test

More MCQ Tips

• Randomly distribute the correct answer position throughout the test

• Present the options in a logical order (e.g., alpha, chronological)

• Allow 1-2 minutes per question, depending on the reading time

• Instructor should take exam first, then double or triple the time estimate
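Presenting options in a logical order and recomputing the key's letter is one simple way to keep the correct answer's position from clustering. A small sketch, assuming alphabetical ordering (the helper function is hypothetical, not part of any exam software); the sample data reuse the grief-stages item from the slide on page 17:

```python
# Sketch: sort an item's options alphabetically and recompute which letter
# is the key, so the correct answer's position varies naturally across items.
def order_options(options, key):
    """Return (alphabetically ordered options, letter of the key)."""
    ordered = sorted(options)
    return ordered, "ABCDE"[ordered.index(key)]

# The grief-stages item: the key ("Denial") lands at position C.
options = ["Denial", "Anger", "Bargaining", "Depression"]
ordered, letter = order_options(options, key="Denial")
print(ordered)  # ['Anger', 'Bargaining', 'Denial', 'Depression']
print(letter)   # C
```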

Page 21: Assessment of learning: More than just a test

One Best Answer: Extended Matching

This format is used occasionally for some exams. A good EMQ has 4 components:

1. A theme

2. An option list

3. A lead-in statement

4. At least two item stems

(Case and Swanson, 2004)

Page 22: Assessment of learning: More than just a test

Retired MCQ Formats

*No longer used by the NBME or Creighton

Category Two: True/False

• Negative A questions, e.g., “Each of the following is correct except…”

• True/False formats - select all true options

• C (A/B/Both/Neither items)

• X (simple true/false)

• K (complex true/false where one or more options may be true)

Page 23: Assessment of learning: More than just a test

For more MCQ info

• Consult the guide “Test Question Formats for M1 and M2 Quizzes and Exams at Creighton University School of Medicine” that is posted in this module folder

• Use the online NBME tutorial http://download.usmle.org/IWTutorial/intro.htm

• Review the NBME item writing guide available here: http://www.nbme.org/about/itemwriting.asp

• Contact the Office of Medical Education

Page 24: Assessment of learning: More than just a test

Testing Process

Test Blueprint

Item Development

Test Administration And Scoring

Item Evaluation

Page 25: Assessment of learning: More than just a test

The next step: Evaluating the test

• Examine reliability (consistency) and validity (did you assess or measure what you intended)

• Examine response patterns

• Examine item quality

Page 26: Assessment of learning: More than just a test

Item Difficulty

• Proportion of examinees selecting each response

• Determine item difficulty (0.00-1.00)

• Easy = 85% answer correctly

• Difficult = 35% answer correctly

• Item analysis will show you this statistic for each item

Consult your Component Director or the Office of Medical Education for exam statistics
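The difficulty statistic is just the proportion of examinees who chose the key. A minimal sketch (the function name and response data are illustrative; item-analysis software reports this for you):

```python
def item_difficulty(responses, key):
    """Proportion of examinees answering correctly (ranges 0.00-1.00)."""
    return sum(r == key for r in responses) / len(responses)

# 17 of 20 examinees chose the key: an easy item by the slide's threshold.
responses = ["C"] * 17 + ["A"] * 2 + ["B"]
print(item_difficulty(responses, "C"))  # 0.85
```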

Page 27: Assessment of learning: More than just a test

Item Discrimination

• Point Biserial Correlation indicates correlation (+1 to -1) between whether a student selected a particular alternative and the student’s total score on the test

• Large, positive RPBI indicates students with higher scores answered correctly

• A low positive RPBI (<.20) suggests that success on the item is not related to ability on the test

• Negative indicates that low-achieving students performed better than high-achieving students
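The point biserial can be computed from the mean total score of examinees who got the item right versus wrong. A sketch using the standard formula, with hypothetical score data; it assumes at least one examinee answered correctly and at least one incorrectly:

```python
from math import sqrt
from statistics import mean, pstdev

def point_biserial(correct, totals):
    """Correlation (+1 to -1) between item correctness and total test score.

    correct: 1 if the examinee answered the item correctly, else 0.
    totals:  each examinee's total test score.
    """
    p = mean(correct)  # proportion answering correctly
    m1 = mean(t for c, t in zip(correct, totals) if c == 1)
    m0 = mean(t for c, t in zip(correct, totals) if c == 0)
    return (m1 - m0) / pstdev(totals) * sqrt(p * (1 - p))

# Higher scorers answered this item correctly, so RPBI is large and positive;
# reversing the pattern yields a negative RPBI.
correct = [1, 1, 1, 1, 0, 0]
totals = [90, 85, 80, 70, 60, 55]
r = point_biserial(correct, totals)
```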

Page 28: Assessment of learning: More than just a test

Finally, take a step back and reflect on the exam overall

A difficult question (e.g., p = .35) is not necessarily a good question!

If the same item has a low positive RPBI (e.g., .18), you should review its distractor statistics or delete the item entirely.

Page 29: Assessment of learning: More than just a test

Now that the exam is over

• Take time to review exam items

• Flag items that did not perform well

• Talk to the course or component director to learn why items were flawed

• If an item was too difficult, revisit your lecture materials and update for the next iteration of your lecture

Page 30: Assessment of learning: More than just a test

Close the loop

Test Blueprint

Item Development

Test Administration And Scoring

Item Evaluation

Page 31: Assessment of learning: More than just a test

References and Resources

Anderson, L.W. & Krathwohl, D.R. (Eds.). (2001). A taxonomy of learning, teaching, and assessment: A revision of Bloom’s taxonomy of educational objectives. New York: Longman.

Bloom, B.S., Englehart, M.D., Furst, E. J., & Krathwohl, D.R. (1956). Taxonomy of educational objectives: Cognitive domain. New York: McKay.

Case, S. M. & Swanson, D. B. (2004). Constructing written test questions for the basic and clinical sciences, 3rd edition (revised). Philadelphia: National Board of Medical Examiners. http://www.nbme.org/about/itemwriting.asp

Clark, R. C. (1994). Developing technical training: A structured approach for the development of classroom and computer based instructional materials. Phoenix, AZ: Buzzards Bay Press.

Krathwohl, D., Bloom, B.S., & Masia, B. (Eds.). (1964). Taxonomy of educational objectives, handbook II: Affective domain. New York: McKay.

Page 32: Assessment of learning: More than just a test

For further reading

Clay, B. (2001). Is this a trick question? A short guide to writing effective test questions. Kansas Curriculum Center. http://www.k-state.edu/ksde/alp/resources/Handout-Module6.pdf. Accessed 08/14/14.

Cross, K.P. & Angelo, T.A. (1993). Classroom assessment techniques (2nd edition). San Francisco, CA: Jossey-Bass.

Gronlund, N. (2004). Writing instructional objectives for teaching and assessment (7th ed.). Upper Saddle River, NJ: Merrill Prentice Hall.

Clegg, V. L. & Cashin, W. E. Improving multiple-choice tests. Idea Paper No. 16, Center for Faculty Evaluation & Development. Manhattan, KS: Kansas State University.

Straight, S.H. (2002). “The difference between assessment and evaluation.” Accessed 2/28/06: http://provost.binghamton.edu/assessment/assessment_evaluation.ppt

http://scoring.msu.edu/writitem.html#intro

University of Saskatchewan. The teaching and learning guide. Accessed 2/28/06: http://www.usask.ca/tlc/teaching_guide/index.html