
TRANSCRIPT

Page 1

C R E S S T / U C L A

Evaluating the Impact of the Interactive Multimedia Exercises (IMMEX) Program:

Measuring the Impact of Problem-Solving Assessment Software

Gregory K.W.K. Chung, UCLA / CRESST
Davina C.D. Klein, UCLA / CRESST
Tina C. Christie, UCLA / CRESST

Roy S. Zimmermann, UCLA / CRESST
Ronald H. Stevens, UCLA School of Medicine

UCLA Graduate School of Education & Information Studies
Center for the Study of Evaluation

National Center for Research on Evaluation, Standards, and Student Testing

Annual Meeting of the American Educational Research Association
April 24, 2000

Page 2

Overview

IMMEX overview

Evaluation questions, design, findings

Focus on barriers to adoption

Implications for the future

Page 3

Implementation Context

Los Angeles Unified School District: 697,000 students, 41,000 teachers, 790 schools (1998)

Average class size: 27 (1998-99)

Limited English Proficiency (LEP): 46% of students (1998-99)

2,600 classrooms have Internet access (1998-99)

Page 4

IMMEX Program Goal

Improve student learning via the routine use of IMMEX assessment technology in the classroom

Explicitly link assessment technology with classroom practice, theories of learning, and science content

Provide aggressive professional development, IMMEX, and technology support

Page 5

IMMEX Program: Problem-Solving Assessment Software

Problem-solving architecture:

Students are presented with a problem scenario and given information that is both relevant and irrelevant to solving the problem

Problem solving demands embedded in design of information space and multiple problem sets (e.g., medical diagnosis)

Performance: # completed, % solved

Process: Pattern of information access yields evidence of use of a particular problem solving strategy (e.g., elimination, evidence vs. conjecture, cause-effect)
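The process measure above infers strategy from which items a student opens while working a case. A minimal sketch of that idea in Python, with hypothetical item names, thresholds, and strategy labels (illustrative assumptions, not IMMEX's actual scoring rules):

```python
# Hypothetical sketch: label a problem-solving strategy from the set of
# information items a student opened. Thresholds and labels are
# illustrative assumptions, not the actual IMMEX scoring rules.

def classify_strategy(accessed, relevant, total_items):
    """Label a strategy from the items opened while working a case."""
    opened = set(accessed)
    coverage = len(opened) / total_items                 # breadth of search
    precision = len(opened & relevant) / max(len(opened), 1)  # relevance of search
    if coverage > 0.8:
        return "exhaustive search"    # opened nearly everything
    if precision > 0.75:
        return "evidence-driven"      # mostly relevant items
    return "conjecture-driven"        # little supporting evidence

# Example: a 10-item information space with 4 relevant items
print(classify_strategy(
    accessed=["history", "lab_test", "x_ray"],
    relevant={"history", "lab_test", "x_ray", "culture"},
    total_items=10))   # -> evidence-driven
```

A rule set like this could then be applied per problem attempt, letting performance (# completed, % solved) and process (strategy label) be reported side by side.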

Page 6

IMMEX Program: Theory of Action

(Flow diagram; elements:)

Quality teacher training
Individual teacher differences
Use of IMMEX to assess students
Use of IMMEX to instruct students
Greater teacher facility with technology
Greater teacher understanding of students
Deeper teacher understanding of science content
Better classroom teaching
Increased student outcomes

Page 7

Evaluation Questions

Implementation: Is the IMMEX software being implemented as intended?

Impact: How is IMMEX impacting classrooms, teachers, and students?

Integration: How can IMMEX best be integrated into the regular infrastructure of schooling?

Page 8

Evaluation Methodology

Pre-post design

Y1, Y2: Focus on teachers and classroom impact

Y3, Y4: Focus on student impact

Examine impact over time

Page 9

Evaluation Methodology

Instruments

Teacher surveys: demographics, teaching practices, attitudes, usage, perceived impact

Teacher interviews: barriers, integration, teacher characteristics

Student surveys: demographics, perceived impact, attitudes, strategy use

Page 10

Evaluation Methodology

Data collection: Year 1: Spring 99

Year 2: Fall 99/Spring 00

Year 3, 4: Fall/Spring 01, Fall/Spring 02

Teacher sample

Y1: All IMMEX-trained teachers (~240); 45 responded to survey, 9 interviewed

Y2 Fall 99: 1999 IMMEX users (38): 18 responded to survey, 8 interviewed

Page 11

Evaluation Methodology

Data collection timeline:
Year 1: Spr 99
Year 2: Fall 99 / Spr 00
Year 3: Fall 00 / Spr 01
Year 4: Fall 01 / Spr 02

Teacher sample

Y1: Sample all teachers who were trained on IMMEX (~240); 45 responded to survey, 9 interviewed

Y2 Fall: Sample all confirmed 1999 users (38); 18 responded to survey, 8 interviewed

Page 12

Results

Teacher surveys:

High satisfaction with participation in the IMMEX program

Once-a-month use was considered high; more typical use was a few times (< 7) per school year

Implementation: assessing students’ problem solving, practice integrating their knowledge

Impact: use of technology, exchange of ideas with colleagues, teaching effectiveness

Page 13

Results

Teacher interviews:

In general, IMMEX teachers have a very strong commitment to teaching and student learning

Passionate about their work, committed to students and the profession, engage in a variety of activities (school and professional), open to new teaching methods

Strong belief in the pedagogical value of IMMEX

Page 14

Results

Teacher interviews:

In general, IMMEX teachers are willing to commit the time and effort required to implement IMMEX

Able to deal with complexity of implementation logistics

Highly motivated, organized, self-starters

Page 15

Results

Teacher interviews: General barriers

Lack of computer skills

Lack of computers

Classroom challenges

Page 16

Results

Teacher interviews: IMMEX barriers

User interface

Lack of problem sets / Weak link to curriculum

Amount of time to implement IMMEX in classroom

Amount of time to author IMMEX problem sets

Page 17

Addressing Barriers

Barriers → How addressed:

Problem sets (authoring, curriculum) → >100 problem sets, authoring capability, ongoing problem set development; finely-tuned development workshops, stipend, documentation, curriculum guides

Computer related → Basic computer skills instruction, rolling labs, on-demand technical support, Web version

Implementation → Full-service model; experienced, dedicated, focused staff with teaching and research experience

Page 18

Implications

Short-term

No widespread adoption by teachers: too many barriers for too many teachers; only the highly motivated are likely to adopt; the full-service model is itself evidence of the difficulty of adoption

Learn from the “A-team”: high-usage teachers represent best practices

Establish deployment infrastructure

Page 19

Implications

Long-term

Problem-solving instruction and assessment will remain relevant

Computer barriers: lowered (computer access, skills)

Time-to-implement barriers: lowered (problem set expansion, Web access, automated scoring and reporting)

Time-to-author barriers: uncertain (reduced mechanics of authoring, problem set expansion; the conceptual development of problem sets remains a constant)

Page 20

Contact Information

For more information about the evaluation:

Greg Chung ([email protected])

www.cse.ucla.edu

For more information about IMMEX:

Ron Stevens ([email protected])

www.immex.ucla.edu

Page 21

IMMEX Program

First used for medical school examination in 1987

First K-12 deployment (content development, teacher training, high school use), 1990-92

Page 22

IMMEX Software: Search path maps
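The slide refers to IMMEX's search path maps, which trace the order in which a student opened information items. A minimal sketch of the underlying data in Python (item names and log format are illustrative assumptions, not IMMEX's actual representation):

```python
# Hypothetical sketch of a "search path": the ordered trail of
# information items a student opened, with consecutive repeat visits
# collapsed. Item names and log format are illustrative assumptions.

def search_path(events):
    """Collapse a click log into the ordered path of items visited."""
    path = []
    for item in events:
        if not path or path[-1] != item:   # skip consecutive repeats
            path.append(item)
    return " -> ".join(path)

log = ["history", "history", "lab_test", "history", "x_ray"]
print(search_path(log))   # -> history -> lab_test -> history -> x_ray
```

A path like this is what a search path map visualizes: revisits and ordering make strategies such as systematic elimination visible at a glance.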