Planning and Sustaining Evaluation of Instructional Technology Support Programs Yvonne Belanger, Duke University Joan Falkenberg Getman, Cornell University Lynne O’Brien, Duke University http://cit.duke.edu http://dls.cornell.edu


Page 1: Planning and Sustaining Evaluation of Instructional Technology Support Programs Yvonne Belanger, Duke University Joan Falkenberg Getman, Cornell University

Planning and Sustaining Evaluation of Instructional

Technology Support Programs

Yvonne Belanger, Duke University

Joan Falkenberg Getman, Cornell University

Lynne O’Brien, Duke University

http://cit.duke.edu http://dls.cornell.edu

Page 2:

Our assumption

Schools want to know if IT services, organizations & projects are effective, but have limited resources for evaluation.

• How can we build a culture of evaluation, so that many people contribute to it?

• How can we provide a context for evaluation strategies and results?

• How can we conduct evaluation that helps with decision making?

Page 3:

Overview

• Key issues in evaluation planning

• Early planning for evaluation at Cornell

• General approaches to evaluation at Duke

• Case studies: Duke iPod project, Duke Faculty Fellows

• Resources and templates

Page 4:

Assessment v. Evaluation

• Assessment is an ongoing process aimed at understanding and improving student learning.

• Evaluation is a judgment or determination of the quality of a performance, product or use of a process against a standard. Did it work in terms of the needs being addressed or the system goal?

Article: Differentiating Assessment from Evaluation as Continuous Improvement Tools by Peter Parker, Paul D. Fleming, Steve Beyerlein, Dan Apple, Karl Krumsieg

Page 5:

Why Evaluate?

“Research is aimed at truth. Evaluation is aimed at action.”

Michael Patton

Page 6:

Cornell University

• Private, 4-year, Research I
• New York State land-grant institution
• Partner of the State University of New York
• 11 schools: 7 undergraduate and 4 grad/professional
• 19,600 FTE students, 1,540+ faculty

http://www.cornell.edu

Page 7:

Cornell Computing Environment

• IT centrally and locally supported
• Undergraduate education is campus based
• CIT strategic plan encourages selective innovation
• President’s “Call to Action”
• Provost’s Distributed Learning Initiative

Page 8:

Provost’s Distributed Learning Initiative

• Core technologies development
• Faculty development and training
• Faculty Innovation in Teaching (new)
• Lynx Student Assistant Program (new)

Page 10:

http://lynx.cornell.edu

Page 11:

Key Issues in Planning an Evaluation

• Who are the stakeholders and what do they want to know?
• What is success?
• How will you measure success?
• Do you have an evaluation team or partner?
• What is the most effective way to report evaluation findings?
• Are you allowing for discovery as well as confirmation?
• How is the data going to guide decision-making and improvements?

Page 12:

Different Stakeholders, Different Interests

• Provost
• President
• Vice President of Information Technology
• Faculty
• Students
• Deans
• Dean of Faculty
• Dean of Students
• Library
• Center for Learning and Teaching
• Cornell Adult University
• Faculty Advisory Board on Information Technology
• IT staff and Helpdesk staff
• Executive Budget and Finance Committee

Page 13:

Defining Success

• Identify different dimensions or domains for evaluation.

• Identify indicators of success in those domains.

• Data collection method and source of data will vary with indicators of success.

Page 14:

Measuring Success

Project: Student response systems (polling) in a large enrollment class.

Goals: Improve learning; implement inexpensive, low-maintenance technology with specific functionality; increase student engagement; short learning curve for faculty; adoption of polling by other large enrollment classes.

Domains:
1. Instructional: strategies, learning outcomes
2. Technology: functionality, reliability…
3. Student experience: attitude, use of technology…
4. Faculty experience: attitude, use of technology…
5. Programmatic impact
6. Cost

Page 15:

Balanced View of Success

Domains:
1. Instructional: strategies, learning outcomes
2. Technology: functionality, reliability…
3. Student experience: attitude, use of technology…
4. Faculty experience: attitude, use of technology…
5. Programmatic impact
6. Cost

Outcomes:
Students: like it in several ways and self-report improved learning
Faculty: too much time in prep; technology not meeting needs; still like the idea
IT staff: user support for faculty and facilities taxing limited staff time
Finance Office: clicker replacement and new projection system beyond budget

Was the project a success?

Page 16:

Evaluation Models and Standards

• Scientific inquiry and experimental models: emphasize values established by the research community

• Management-oriented models: emphasize decision-making (Stufflebeam’s CIPP model)

• Qualitative and anthropological models: emphasize discovery of values based on description

• Participation-oriented models: emphasize values being “socially constructed” by the community

Page 17:

Stufflebeam’s CIPP Model

Context, Input, Process, and Product evaluation
• Focus: decision-making
• Purpose: facilitate rational and continuing decision-making
• Evaluation activity: identify potential alternatives, set up quality control systems

Page 18:

Action Research

Action research is deliberate, solution-oriented investigation that is group or personally owned and conducted. It is characterized by spiraling cycles of problem identification, systematic data collection, reflection, analysis, data-driven action taken, and, finally, problem redefinition. The linking of the terms "action" and "research" highlights the essential features of this method: trying out ideas in practice as a means of increasing knowledge about and/or improving curriculum, teaching, and learning (Kemmis & McTaggart, 1982).

Page 19:

Permissions and Partners

• Check with your institution’s research office for policies on human subject research.

• Be creative: put together an evaluation team or partnership, and involve stakeholders for credibility.

Page 20:

Reporting Evaluation Results

• Format your information and customize your report to stakeholders so that it meets their interests and style.
  – Narrative
  – Video interviews
  – PowerPoint presentation
  – Excel spreadsheets
  – Images, graphical representations of numerical data

Include unexpected outcomes

Use benchmark studies for additional context

Page 21:

Focus on the Intent of Evaluation

• Evaluation uses a combination of data to present a comprehensive picture.

• Return to the original purpose of the evaluation and the types of decisions the data will inform.

• It is possible for a project or program to have some components that succeeded and others that did not.

Page 22:

Duke University

• Private, 4 year, Research I

• 9 schools: undergrad and professional

• 12,000 FTE students, 2,350 faculty

Page 23:

Duke Center for Instructional Technology

• Established 1999 in response to a general needs assessment on instructional technology at Duke

• Goals: increase faculty and student use of technology, leverage resources, coordinate planning

Page 24:

Page 25:

Duke CIT Context

• IT is both central & school-based

• Growing interest in distance ed in professional schools

• Undergrad ed = campus based classroom teaching

• Strategic plan encourages IT experimentation

Page 26:

Experiments with laptops, Blackboard, PDAs, iPods, and other technologies

Page 27:

The big questions

• Is the CIT doing a good job?

• Do students learn more when they use iPods?

• What is the best way to help faculty make good use of technology?

• Is Blackboard a success?

Page 28:

Answerable questions

• Is CIT making positive changes in the areas identified by the original needs assessment?

• Do iPods improve course logistics and increase student access to a rich set of course materials?

• Are faculty satisfied with the IT development programs they use?

• How widely used is Blackboard, and what new kinds of teaching does it enable?

Page 29:

Tools for Structuring Evaluation

• CIPP
  – Context: Environment & needs
  – Input: Strategies & resources
  – Process: Monitoring implementation
  – Product: Outcomes, both quality and significance

• Logic Modeling

Page 30:

CIPP View of Institutionalized Evaluation

Stufflebeam, OPEN, 2003

Page 31:

CIPP approach recommends…

• Multiple observers and informants
• Mining existing information
• Multiple procedures for gathering data; cross-check qualitative and quantitative
• Independent review by stakeholders and outside groups
• Feedback from stakeholders
• Be appropriately circumspect in generating and reporting conclusions

Page 32:

Faculty Fellows Program

Goals:
• Faculty Development
• Department Development

• Intensive orientation
• Occasional meetings
• One-on-one consulting
• Showcase presentation

Page 33:

Evaluating the Fellows Program

• Stakeholder and staff input to clarify program goals
• Developing consistent reporting tools
• Distributing effort
• Stakeholder review of outcomes
• Participant responsibility for disseminating results

Evaluation of Instructional Technology Fellows Program

Page 34:

Re-envisioning the Fellows

• Full week of orientation → 1–2 days + 4 additional short meetings

• Single project focus → Multiple small-scale activities

• Customized individual project → Theme-based offering

Page 35:

Duke iPod First-Year Experiment

Project goals:

• Technology innovation

• Student life, campus community

• Academic impact

Page 36:

• Distributed 1,599 20 GB iPod devices to first-year students on Aug. 19, 2004

Page 37:

Evaluation Challenges

• Baseline info unavailable
• Iffy implementation of instructors’ course evaluation plans
• How best to capture academic projects outside of CIT purview
• Quick start: experimentation; outcomes vs. predefined goals
• Proving correlation between iPod use and improved course outcomes

Page 38:

Focusing the evaluation of academic iPod use

• Feasibility of using iPod to support teaching and learning

• Improving logistics of course delivery

• Enhancing student learning and interest

Page 39:

Sharing preliminary information

• Crucial to have early understanding of project lessons

• Matrix of evaluation strategies
• Grouping uses into similar cases
• Examples:
  – Summary of iPod projects and their evaluation strategies
  – Early feedback on uses and lessons learned

Available at http://cit.duke.edu/evaluation

Page 40:

Other Resources & Templates

• http://cit.duke.edu/evaluation
  – Annotated bibliography by Cornell and Duke
  – Sample CIT reports
  – CIT Logic Model example and template

http://www.innovation.cornell.edu

• How can we all share more information about our activities and learn more from one another’s successes and failures?

Page 41:

Summary

• Understand what success is for your efforts
• Reframe questions to be answerable
• Focused rather than comprehensive evaluation
• Build culture through a distributed team approach
• Bring context and input into evaluation
• Take a formative view

Page 42:

Thank You!

Lynne O’Brien
Director, Duke Center for Instructional Technology
[email protected]

Yvonne Belanger
Program Evaluator, Duke Center for Instructional Technology
[email protected]

Joan Getman
Assistant Director, Distributed Learning Services, Cornell Information Technologies
[email protected]

Page 43: