
Page 1: Strategies for Course Redesign Evaluation

STRATEGIES FOR COURSE REDESIGN EVALUATION

Laura M. Stapleton
Human Development and Quantitative Methodology
University of Maryland, College Park
[email protected]

Page 2

Presentation Outline

• Goal of evaluation

• A proposed framework for evaluation

• Experimental design considerations

• Examples of what (and what not) to do

• Summary and recommendations

Page 3

Goal of Evaluation

• “…to provide useful information for judging decision alternatives, assisting an audience to judge and improve the worth of some educational program, and assisting the improvement of policies and programs”

(Stufflebeam, 1983)

Page 4

Frameworks

• Summative-judgment orientation (Scriven, 1983)

OR

• Improvement orientation (Stufflebeam, 1983)


“The most important purpose of program evaluation is not to prove but to improve”

Page 5

Proposed Framework

• Stufflebeam’s CIPP framework for program evaluation
  • Context
  • Inputs
  • Process
  • Product

• Evaluation can encompass any or every one of these aspects

Page 6

Context Evaluation

• What needs are addressed, how pervasive and important are they, and to what extent are the project’s objectives reflective of assessed needs?

• What course is undergoing redesign?
• Why is it targeted for redesign?
• Are these reasons sufficient for the resource/time expenditure that redesign would require?
• Are there components of the traditional course that are good/important to keep?
• Does redesign represent potential benefits?

Page 7

Input Evaluation

• What procedural plan was adopted to address the needs, and to what extent was it a reasonable, potentially successful, and cost-effective response to the assessed needs?

• Why were the specific redesign components selected?
• What else might have worked just as well?
• What are the costs of the chosen approach versus the costs of others (to all stakeholders)?

Page 8

Process Evaluation

• To what extent was the project plan implemented, and how, and for what reasons did it have to be modified?

• Was each part of the plan in place?
• Did the components operate as expected?
• Did the expected behavioral change occur?
• How can implementation efforts be improved?

Page 9

Product Evaluation

• What results were observed, how did the various stakeholders judge the worth and merit of the outcomes, and to what extent were the needs of the target population met?
• How did outcomes compare to past/traditional delivery?
• What were stakeholders’ opinions of the change?
• Were the cost/benefit advantages realized?
• Were there unintended consequences?

Page 10

Proposed Framework

• Stufflebeam’s CIPP framework for program evaluation
  • Context
  • Inputs
  • Process
  • Product

Page 11

Process Evaluation Strategies

• Observations

• Review of extant process data

• Focus groups

• Informal or formal feedback

Page 12

Product Evaluation Strategies

• Qualitative review of judgments of stakeholders

• Quantitative comparison of measured outcomes

• Causal conclusions regarding quantitative outcomes depend on the design (Campbell & Stanley, 1963)

Page 13

Study Design

Pre-test post-test control group design:

sample → Group A: pre-test → Re-designed Instruction → post-test
sample → Group B: pre-test → Traditional Instruction → post-test

With this design, you have strong support for a causal statement.
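As a concrete illustration, the analysis for this design can compare mean pre-to-post gains between the randomized groups. A minimal Python sketch follows; all scores are invented for demonstration and are not data from any actual course:

```python
# Illustrative sketch with hypothetical exam scores (0-100); group
# labels and numbers are invented, not real course data.
from statistics import mean

redesign_pre = [62, 55, 70, 48, 66, 59]
redesign_post = [78, 70, 85, 63, 80, 74]
traditional_pre = [60, 57, 68, 50, 64, 61]
traditional_post = [68, 64, 75, 55, 70, 66]

def mean_gain(pre, post):
    """Average post-test minus pre-test score across students."""
    return mean(post_i - pre_i for pre_i, post_i in zip(pre, post))

# Because students were randomly assigned, the difference in mean gains
# estimates the causal effect of the redesigned instruction.
effect = mean_gain(redesign_pre, redesign_post) - mean_gain(traditional_pre, traditional_post)
print(round(effect, 2))  # → 8.67 exam points
```

In practice this difference would be accompanied by a standard error or an ANCOVA model, but the randomization is what licenses the causal reading.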

Page 14

Study Design

Post-test only control group design:

sample → Group A: Re-designed Instruction → post-test
sample → Group B: Traditional Instruction → post-test

With this design, you have support for a causal statement, assuming students do not drop out differentially.
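With no pre-test, the comparison reduces to a difference in post-test means. A sketch using only the standard library, with hypothetical scores, computes a Welch-style t statistic for that difference:

```python
# Illustrative sketch with hypothetical post-test scores; a Welch-style
# t statistic (unequal variances) using only the standard library.
from math import sqrt
from statistics import mean, stdev

redesign_post = [78, 70, 85, 63, 80, 74, 69, 77]
traditional_post = [68, 64, 75, 55, 70, 66, 72, 60]

def welch_t(a, b):
    """t statistic for a difference in means with unequal variances."""
    var_a = stdev(a) ** 2 / len(a)
    var_b = stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / sqrt(var_a + var_b)

# Also worth reporting: group sizes after attrition, since differential
# drop-out is the main threat to this design.
t = welch_t(redesign_post, traditional_post)
print(round(t, 2))  # → 2.44
```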

Page 15

Study Design

Non-equivalent control group design:

Group A: pre-test → Re-designed Instruction → post-test
Group B: pre-test → Traditional Instruction → post-test

With this design, initial differences in the groups may explain differences (or lack of differences) in the outcomes.
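One common (though not assumption-free) repair is to adjust the post-test comparison for the pre-test, ANCOVA-style. In this sketch the scores are invented so that the redesign group starts out stronger, showing how the raw gap can shrink once initial differences are taken into account:

```python
# Illustrative ANCOVA-style adjustment with hypothetical scores; the
# numbers are invented so the redesign group begins with stronger students.
from statistics import mean

redesign_pre, redesign_post = [70, 75, 68, 72, 80], [82, 86, 78, 84, 90]
trad_pre, trad_post = [60, 64, 58, 62, 66], [70, 74, 66, 72, 76]

def pooled_slope(groups):
    """Within-group regression slope of post-test on pre-test."""
    num = den = 0.0
    for pre, post in groups:
        mp, mq = mean(pre), mean(post)
        num += sum((x - mp) * (y - mq) for x, y in zip(pre, post))
        den += sum((x - mp) ** 2 for x in pre)
    return num / den

b = pooled_slope([(redesign_pre, redesign_post), (trad_pre, trad_post)])
raw_diff = mean(redesign_post) - mean(trad_post)
adjusted = raw_diff - b * (mean(redesign_pre) - mean(trad_pre))
# Most of the raw gap reflects who enrolled, not the redesign itself.
print(round(raw_diff, 1), round(adjusted, 2))  # → 12.4 1.23
```

The adjustment only removes bias from the covariates you measured; unmeasured selection differences remain, which is why the slide's caution still applies.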

Page 16

Study Design

Static group comparison / post-test only with non-equivalent groups:

Group A: Re-designed Instruction → post-test
Group B: Traditional Instruction → post-test

With this design, initial differences in the groups may explain differences (or lack of differences) in the outcomes.

Page 17

Examples of what to do (and not do)

• UMBC Psychology Course Redesign

• Context: Low pass rates in PSYC100 course; student course evaluations were “brutal”

• Input: Delivery of PSYC100 course content altered
  • Material on web; self-paced “labs” and quizzes
  • Dyads within lecture hall with peer facilitators
  • Lectures were more discussion based, including video and clicker questions
  • Less time in lecture, more on self-paced on-line work

Page 18

Examples of what to do (and not do)

Process Evaluation (of redesign pilot)

• Lab utilization statistics
  • Time lab completed (relative to exam and speed)
  • Number of times quiz attempted
• Qualitative reaction from redesign section instructors
• Focus group comments from students
• Reactions from small groups in lectures
  • What was working
  • What was not working
  • What changes would be helpful

Page 19

Examples of what to do (and not do)

Product Evaluation (of redesign pilot)

• Post-test Only Static Group Comparison
• Grade Distribution
  • Redesign
  • Traditional (same semester)
  • Traditional (historical)
• Common Exam
  • Redesign
  • Traditional (same semester)
• Student Course Evaluations
  • Redesign
  • Traditional (same semester)
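Since the context for the redesign was low pass rates, a grade-distribution comparison is often summarized as a DFW rate. A minimal sketch with entirely hypothetical grade counts (not UMBC data):

```python
# Illustrative sketch: comparing grade distributions via the DFW rate
# (share of D, F, and withdrawal grades); all counts are hypothetical.
redesign = {"A": 40, "B": 55, "C": 30, "D": 10, "F": 8, "W": 7}
traditional = {"A": 30, "B": 45, "C": 35, "D": 18, "F": 15, "W": 12}

def dfw_rate(grades):
    """Fraction of enrolled students earning a D, F, or W."""
    return (grades["D"] + grades["F"] + grades["W"]) / sum(grades.values())

for label, counts in [("redesign", redesign), ("traditional", traditional)]:
    print(f"{label}: DFW rate = {dfw_rate(counts):.1%}")
```

Because this is a static group comparison, such differences describe the sections but do not by themselves establish a causal effect of the redesign.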

Page 20

Summary Suggestions

• Identify a fairly independent evaluator now
• Determine what type of evaluation you need to undertake [which CIPP stage(s)?]
• Identify components of each (remember unintended consequences)
• Make it happen when it needs to happen!
• Be creative in considering sources of “data”
• Be flexible to change your evaluation plan mid-stream
• Think long-term as well as short-term

Page 21

References

• Campbell, D.T. & Stanley, J. C. (1963). Experimental and Quasi-Experimental Designs for Research. Chicago, IL: Rand McNally & Company.

• Scriven, M. (1983). Evaluation ideologies. In Madaus, G. F., Scriven, M. & Stufflebeam, D. L. (Eds.) Evaluation Models, pp. 229-260. Hingham, MA: Kluwer Academic Publishers.

• Stufflebeam, D. L. (1983). The CIPP model for program evaluation. In Madaus, G. F., Scriven, M. & Stufflebeam, D. L. (Eds.) Evaluation Models, pp. 117-142. Hingham, MA: Kluwer Academic Publishers.

Page 22

Thank you!

Contact for info: [email protected]