A Model For Evaluating Institutional Research Functions
AIR 2000, May 20, 2000
Frank Doherty, Director of Institutional Research, James Madison University


Page 1

A Model For Evaluating Institutional Research Functions

AIR 2000
May 20, 2000
Frank Doherty
Director of Institutional Research
James Madison University

Page 2

Objectives

Learn how to describe what you do.
Develop a systematic plan to evaluate the IR functions.
Freebie: learn a method to evaluate other administrative offices on your campus.

Page 3

Schedule

8:00–10:00   Introduction and program design development
10:00–10:15  Break
10:15–12:00  Program design development
12:00–1:00   Lunch
1:00–4:00    Evaluation design development
4:00–5:00    Wrap-up and evaluation

Page 4

JMU OIR Evaluation

1992 SACS Visiting Team Report: “Although OPA (now OIR) has occasionally evaluated the usefulness of some of its products and services, evaluation has not been established as a routine matter. Thus, the Committee recommends that the University establish regular and ongoing evaluation mechanisms for the institutional research function.”

Page 5

Evaluation Is User-Oriented

Objective is program improvement and accountability

Seek information that will improve office

User control of evaluation is very important

Page 6

Elements of the Evaluation

Program design
Evaluation design
Program review team
Data collection and analysis
Reporting and recommendations
Improvement plan
Ongoing evaluation

Page 7

Program Design Philosophy

You cannot evaluate that which you cannot describe

First step in self-study
Facilitates clarification of program goals and operation; a wonderful communication device
Aids the planning process
Serves as an implementation guide
Provides a sense of the whole
Documents program operation

Page 8

Program Design

Discrepancy Evaluation Model
Systems approach: Inputs, Processes, Outputs
Compare performance with the standard (gap analysis); see the sketch below
OIR Program Design: Network and Input-Process-Output statements
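The gap analysis at the heart of the Discrepancy Evaluation Model is simply a comparison of observed performance against the standard stated in the program design. A minimal Python sketch of that comparison follows; the element names and ratings are hypothetical, invented for illustration rather than taken from the JMU OIR design.

```python
from dataclasses import dataclass

@dataclass
class DesignElement:
    """One element of the program design with its standard and observed performance."""
    name: str
    standard: float      # level of performance the design calls for
    performance: float   # level actually observed during the review

def discrepancy_report(elements):
    """Return (name, gap) pairs for every element that falls short of its standard."""
    return [(e.name, e.standard - e.performance)
            for e in elements
            if e.performance < e.standard]

# Hypothetical ratings on a 1-5 scale, for illustration only.
elements = [
    DesignElement("Fact book published on schedule", standard=5.0, performance=4.0),
    DesignElement("Survey results returned within 30 days", standard=4.0, performance=4.5),
]
print(discrepancy_report(elements))   # [('Fact book published on schedule', 1.0)]
```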

Page 9

Evaluation Plan Philosophy

States intentions publicly
Organizes the complexity of the evaluation effort
Facilitates and justifies evaluation resource allocation decisions
Serves as a “standard” for judging an evaluation effort

Page 10

Evaluation Design

Overall plan for the evaluation
Concerns/Issues: program-specific and common
Questions
Information sources/methodology
OIR Evaluation Design

Page 11

Program Review Team

Consists of 8-10 staff recommended by office

Chair not from office, but appointed by division head

Collect data
Write report and recommendations
Recommendations discussed with division head and supervisor
Annual objectives developed to address recommendations

Page 12

Ongoing Program Review

OIR program review is conducted every three years

Online survey
Accountability and use of results

Annual objectives based on recommendations

Page 13

Components

Components Of An Effective Program Review at James Madison University

Page 14

Summary

IR evaluation should be:
Thorough
User-oriented
Ongoing and accountable

OIR evaluation report: http://www.jmu.edu/instresrch/present/air99/oireval.pdf

Page 15

Program Design Exercise

Program design consists of two parts:
Network
Input-Process-Output statements

Page 16

Network

Numbering and levels
Functional dependencies
Let’s create a Network (see the sketch below for one way to record it)
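One way to record numbering, levels, and functional dependencies is an adjacency map keyed by component number, with each entry listing the components that must precede it. The component numbers and names below are invented for illustration; they are not the JMU OIR network.

```python
# Hypothetical network: component number -> (name, list of prerequisite component numbers)
network = {
    "1.0": ("Reporting function", []),
    "1.1": ("Collect enrollment data", []),
    "1.2": ("Produce fact book", ["1.1"]),
    "2.0": ("Survey research", []),
    "2.1": ("Administer alumni survey", ["1.1"]),
}

def prerequisites(component):
    """List the component numbers that must be complete before this one can run."""
    return network[component][1]

print(prerequisites("1.2"))  # ['1.1']
```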

Page 17

IPO Statements

Inputs
Processes
Outputs
JMU OIR Network

Page 18

Inputs

Things which set processes into motion and keep them running:
resources
receptors
staff
independent groups/organizations
preconditions
enabling outputs from other components

Page 19

Processes

Described as event sequences
Process descriptions of intended interactions of people, materials, and media, and the current context in which they take place
Be specific: indicate who is doing what to whom, how, when, where, and for how long
Linked to outputs

Page 20

Outputs—Terminal

Two types: terminal objectives and enabling objectives

Terminal objectives are changes or products which result from program-controlled processes and are intended to be fed into the external environment
Outputs for which the program holds itself accountable: the bottom line

Page 21

Outputs—Enabling

Enabling objectives result from program-controlled processes and are used within the program rather than outside it
They “enable” the achievement of terminal objectives
Can be the output of one process and the input into another

Page 22

IPO Development Exercise

Let’s develop an IPO for your office.
IPO Exercise (a worked sketch follows below)
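As a worked illustration of the exercise, an input-process-output statement for one component can be captured as a simple record: the inputs that set the process in motion, the event sequence itself, and the enabling or terminal outputs it produces. The component and its contents below are hypothetical examples, not JMU's actual IPO statements.

```python
from dataclasses import dataclass, field

@dataclass
class IPOStatement:
    component: str
    inputs: list = field(default_factory=list)     # resources, staff, preconditions, enabling outputs
    processes: list = field(default_factory=list)  # event sequences: who does what, when, how
    outputs: list = field(default_factory=list)    # enabling and terminal objectives

# Hypothetical component, for illustration only.
fact_book = IPOStatement(
    component="Fact book production",
    inputs=["Frozen census enrollment file", "IR analyst time", "Publication budget"],
    processes=["Analyst compiles tables from the census file during September",
               "Director reviews and approves the draft by October 15"],
    outputs=["Enabling: approved draft tables", "Terminal: fact book distributed to deans"],
)
```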

Page 23

Evaluation Plan

Address the primary needs of the area: “What do you need to know?”

May want to address common institutional issues and questions:
Customer satisfaction
Planning
Use of results
Etc.

Page 24

Stages of Evaluation

Design evaluation
Input evaluation *
Process evaluation *
Output evaluation *
Cost-Benefit Analysis

Page 25

Design Evaluation

Assessment of substantive adequacy of a program’s design

Is this likely to be a good program?
Examination of the substance, assumptions, and structure of a program prior to installation.

Page 26

Input Evaluation

Appropriate for: new programs and replication efforts
Installation evaluation:
Inputs are present as prescribed by the program design
Planned processes have been set in motion
Design preconditions have been met
Stipulated preconditions are critical
Fiscal monitoring

Page 27

Process Evaluation

Monitors continued operation and sequential accomplishment of enabling objectives

Formative: Discrepancy reports used to modify and improve program operations

Page 28

Process Evaluation

Sets the stage for summative evaluation
Documents and defines the “treatment” until the program process is stable
Clarifies the relationship between program activities and the accomplishment of interim objectives
Evaluation plans should emphasize process evaluation; it is particularly useful during the early stages of program operation

Page 29

Output Evaluation

Refers to terminal objectives only
Have terminal objectives been achieved?
Investigation of causation
Most useful when preceded by formative evaluation
Previous evaluation stages contribute to program stability and improvement

Page 30

Evaluation Design

Component                  Description
Evaluation Concern         3-7 aspects of the program to be evaluated
Evaluation Questions       2 or more performance questions for each concern
Design Referent            Refers to a component of the program design
Information Needed         Reason for the question; kind of information sought
Source of Information      Where the information comes from and how it will be analyzed
Date Information Needed    When the discrepancy information is needed
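Taken together, one row of the evaluation design can be held as a single record with those six fields. A minimal sketch follows; the concern, question, referent, and dates are hypothetical and are not drawn from the OIR evaluation design.

```python
from dataclasses import dataclass

@dataclass
class EvaluationDesignRow:
    concern: str              # aspect of the program being evaluated
    question: str             # performance question for that concern
    design_referent: str      # component in the program design (input, process, or output)
    information_needed: str   # why the question is asked and what kind of information answers it
    source: str               # where the information comes from and how it will be analyzed
    date_needed: str          # when the discrepancy information is needed

# Hypothetical row, for illustration only.
row = EvaluationDesignRow(
    concern="Customer satisfaction with standard reports",
    question="Do deans rate the fact book as useful for planning?",
    design_referent="1.2 Produce fact book (output)",
    information_needed="(S) Summative evidence of usefulness for the review team",
    source="Online survey of deans, analyzed as frequency distributions",
    date_needed="Before the spring review team meeting",
)
```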

Page 31

Selection Criteria

Critical functional importance
Areas that are problematic
Areas of direct concern to external evaluation audiences (e.g., an accrediting agency)
Areas of concern to internal evaluation audiences (customer satisfaction)
Areas where information is needed soon

Page 32

Evaluation Concerns Identification

Common models of organization:
By design component
By cross-cutting function
By evaluation stage

Page 33

Evaluation Questions

Derived from a larger area of concern
Guide to the collection of performance information
What kind of performance information is necessary to answer the questions posed?
Determine a standard for each variable identified

Page 34

Evaluation Questions

Develop for each evaluation concern
Should direct the systematic collection of performance information
The evaluation question directs one to the performance information

Page 35

Design Referents

Relate program design to evaluation design.

Design referent should point to a component in the program design and indicate whether the question is related to input, process, or output.

Page 36

Information Needs

Provides a rationale for each question
Explains what kind of information is sought
Indicates how collected information will be used, and by whom.

Page 37

Information Needs Justification

Record keeping
Routine monitoring
Verification of preconditions
Management troubleshooting
Functional criticality
Accountability
Bargain information

Page 38

Information Needs Continued

The information need should tell the reader the purpose of the information:
(F) for formative
(S) for summative
Sometimes it can be both F and S

Page 39

Sources of Information

Task #1: Brainstorm information possibilities for each question

Task #2: Pick and choose from possibilities

Factors to consider:
Reliability and validity
Cost (time and resources)

Page 40

Report Dates

Establish a ballpark estimate of when discrepancy reports should be available
May differ from audience to audience

Page 41

Evaluation Design Exercise

Let’s create an evaluation design:
Evaluation Concerns
Evaluation Questions
Design Referent
Information Needed
Source of Information
Date Information Needed

Page 42

Data Analysis

Questions determine the methods
Multiple methods are used:
Statistical analysis of data
Document review
Surveys
Interviews
Focus groups

Page 43

Reporting and Recommendations

Reports are organized by evaluation issue/concern

Self-Study team develops recommendations