We’re lost, but we’re making good time
Yogi Berra

TRANSCRIPT

Page 1: We’re lost, but we’re making good time Yogi Berra

We’re lost, but we’re making good time

Yogi Berra

Page 2

Assessing the Institutional Research Office

A conversation at the summer 2008 AIRPO conference

Questions rather than answers (sorry, Charlie)

Page 3

What Gets Measured Gets Improved

Anonymous

Page 4

We are the shoemaker’s children:

always helping others assess, but rarely assessing ourselves.

Page 5

What is our part in the Institutional Effectiveness play?

Page 6

Purpose & Outcomes

• Purpose of this presentation: think (not do, alas) about assessing our units in the context of assessing non-instructional units

• Outcome for the presentation: agreement to take a next step toward a cooperative effort, or not

Page 7

Should we assess ourselves?

• If we do not, how shall we improve?

• If we do not, how will we ever find out how it feels to be assessed?

• If we do, how do we ensure we learn something from the assessment?

• If we do, how do we make it productive and not just activity?

Page 8

Why assess?

• What reasons do you give your faculty?

Page 9

How to assess?

• The usual suspects:

– Hey kids, let’s put on a survey!

– Ask your customers

– Ask your college as a whole

– Qualitative

– Quantitative

Page 10

What to assess?

• The validity/reliability of the data provided?

• The speed with which you provided the data?

• The IR office’s contribution to student learning?

• The support to the direct contributors to student learning, i.e., faculty & student life types?

Page 11

What is our product?

Who are our customers?

Page 12

Who are you?

• Who are you and what do you do?

• What is your mission?

• What is inside your circle?

• To what do you say ‘no’?

• What are your current goals?

• How do they fit (or not) within your mission?

• Do you accomplish your goals?

• How do you know?

Page 13

Are you any good?

• What information do you have that will tell you if you are any good?

• Do you improve as a result of this information?

• Do these improvements move you to where you want to go?

• Where do you want to go, anyway?

Page 14

What is our role in the education of our students?

• Do we interact with students directly?

• Do we impact their learning in any way?

Page 15

What methods should we use?

• Quantitative?

• Unobtrusive?

• Qualitative?

Page 16

Rubric

• Does this have promise?

• Can we define the elements of our mission?

• Can we define what

– Exceeds?

– Meets?

– Approaches?

– Does not meet?

Page 17

What information will cause you to take action?

What kind of action do you have the power to take?

Page 18

Modes of assessment (adapted from Harris & Bell, 1986)

• Formal vs. informal: formal assessment activities, or informal judgments/observations, or measuring something unobtrusive

• Formative vs. summative: along the way, or at some ending, like the end of the school year

Page 19

Modes of assessment (cont.)

• Process or product: the report itself, or the way the report got requested/assigned/completed

• Criterion-referenced or norm-referenced: a pre-determined standard, or a comparison with peers

Page 20

Modes of assessment (cont.)

• Individual-focused vs. group-focused: the office itself, or the office in the context of the institution

• Learner-judged or teacher-judged: the customer’s opinion, or the office staff’s opinion

Page 21

Modes of assessment (cont.)

• Internal vs. external: inside the office, or outside the office/institution

• Maybe it’s not either/or but and/or?

Page 22

Random Thoughts

• As if this wasn’t already random…

• Many of our customers aren’t surveyable

• Of our customers who are, what makes them happy?

• Survey fatigue

• Our sense of where we are…

Page 23

More random thoughts…

• IR varies wildly across our schools – does that matter? Does it fundamentally change the need for assessment, or the kind of assessment we do?

• What responsibility do we have for the data?

• Is our relationship with IT part of what needs to be assessed?

• What responsibility do we have for how our customers use our data?

Page 24

More random thoughts…

• At what scale do we conduct assessment?

• Nomenclature: assessment, evaluation, institutional effectiveness, program review

• When do we do a self study?

Page 25

Mike Middaugh’s metrics

• Average total compensation at/above median of peers within 5 years

• Total financial aid increase by 100% within 5 years

• Student satisfaction shows significant gains within 5 years

• Commit to set aside X% of resources to a goal

Page 26

The IR analogies?

Page 27

Metrics?

• # of external surveys submitted on time with accurate data

• # of internal data requests completed successfully and on time

• # of times analyses contributed to institutional action (action that contributed to an increase in retention and graduation, for example)

• # of times data had to be recalled due to error

Page 28

Seeds of a modest proposal (thanks to Jennifer Gray and John Porter)

• Create a template

• We can use or create variants

• Responsible to AIRPO?

• Role of System Administration?

• Recruit volunteer evaluators

• Middle States model

• We all get to see each other in action

• Benefits accrue to both evaluated and evaluators

Page 29

Logistics

• How much time should it take?

• How will evaluators be compensated?

• Do schools have to agree to participate and support?

• What force/strength does the evaluation have?

• What is the role of System Administration?

Page 30

References

• AIR Professional File #80

• http://www.airweb.org/page.asp?page=73&apppage=85&id=83

Page 31

"If you don't know where you are going, you will wind up somewhere else."

Yogi Berra

Page 32

Notes from session as of 06/23/08

• Possible steps toward an assessment of the IR office:

• Customer satisfaction feedback

– Survey after each project

– Survey at some ending point, like the end of a school year

– Focus group

– Personal interviews

– Purpose both to get feedback and to educate customers as to good data, etc.

• Unobtrusive data: analyze your request database – who asks for what? Should what gets asked for be moved into some kind of routine report? Other possibilities?

• Scale is an important concept. Maybe IR does not have discrete assessment activities, but instead folds into its unit. Does the unit to which you report conduct its own assessment? If so, could the IR office fit in? If not, could the IR office lead the way?

• What is in our circle? To what do we say ‘no’? Where do requests from students fit in?

• All requests should be in writing.

• To whom do the data belong?

• Re IR peer review: other professional organizations sponsor this – e.g., AACRAO. Could AIRPO/NEAIR lead the way for IR?

• Look for questions on the conference evaluation survey.