Trudy Banta AAGLO Forum Melbourne May 2012


Page 2: Trudy Banta AAGLO Forum Melbourne May 2012

Discipline-Based Assessment

to Provide Convincing Evidence of

Graduate Learning Outcomes

Presented in

Australia

May 2012

by

Trudy W. Banta, Professor of Higher Education

and

Senior Advisor to the Chancellor for

Academic Planning and Evaluation

Indiana University-Purdue University Indianapolis

355 N. Lansing St., AO 140

Indianapolis, Indiana 46202-2896

tbanta@iupui.edu

http://www.planning.iupui.edu

© TWBANTA-IUPUI

Page 3: Trudy Banta AAGLO Forum Melbourne May 2012

My History

• Educational psychology

• Program evaluation & measurement

• Performance funding in Tennessee

• 1990 USDOE effort to build a national test

• 1992 Initiated evidence-based culture at

IUPUI

© TWBANTA-IUPUI

Page 4: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

ASSESSMENT

Is like a dancer’s mirror.

It improves one’s ability to see and

improve one’s performance.

Alexander Astin

1993

Page 5: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

ASSESSMENT OF INDIVIDUAL

STUDENT DEVELOPMENT

•Assessment of basic skills for use in advising

•Placement

•Counseling

•Periodic review of performance with detailed

feedback

•End-of-program certification of competence

•Licensing exams

•External examiners

Page 6: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

KEY RESULTS OF INDIVIDUAL

ASSESSMENT

•Faculty can assign grades

•Students learn their own

strengths and weaknesses

•Students become self-assessors

Page 7: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

A SECOND LOOK

•Across students

•Across sections

•Across courses

Page 8: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

•Where is learning satisfactory?

•What needs to be retaught?

•Which approaches produce the most

learning for which students?

Page 9: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

GROUP ASSESSMENT ACTIVITIES

•Classroom assignments, tests, projects

•Questionnaires for students, graduates, employers

•Interviews, focus groups

•Program completion and placement

•Awards/recognition for graduates

•Monitoring of success in graduate school

•Monitoring of success on the job

Page 10: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

ASSESSMENT . . .

“a rich conversation

about student learning

informed by data.”

-- Ted Marchese --

AAHE

Page 11: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

USE OF RESULTS OF GROUP

ASSESSMENT

•Program improvement

•Institutional and / or state peer

review

•Regional and / or national

accreditation

Page 12: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

ORGANIZATIONAL LEVELS FOR ASSESSMENT

National

Regional

State

Campus

College

Discipline

Classroom

Student

Page 13: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

GROUP ASSESSMENT REQUIRES

COLLABORATION

In setting expected program outcomes

In developing sequence of learning experiences (curriculum)

In choosing measures

In interpreting assessment findings

In making responsive improvements

Page 14: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

BARRIERS TO COLLABORATION

IN THE ACADEMY

1. Graduate schools prepare specialists

2. Departments hire specialists

3. Much of our scholarship is

conducted alone

4. Promotion and tenure favor

individual achievements --

interdisciplinary work is harder to

evaluate

Page 15: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

TO FOSTER COLLABORATION

•Name interdisciplinary committees

•Read and discuss current literature on

learning/assessment

•Attend conferences together

•Bring experts to campus

•Share good practices

•Work together on learning communities

Page 16: Trudy Banta AAGLO Forum Melbourne May 2012

MOST FACULTY ARE NOT TRAINED AS

TEACHERS

Faculty Development

Can Help Instructors:

•Write clear objectives (outcomes) for student learning in courses and curricula

•Connect learning outcomes to assignments in courses.

•Develop assessment tools that test higher order intellectual skills

© TWBANTA-IUPUI

Page 17: Trudy Banta AAGLO Forum Melbourne May 2012

Taxonomy of Educational Objectives

(Bloom and Others, 1956)

Cognitive domain categories, with sample verbs for outcomes:

Knowledge: identifies, defines, describes

Comprehension: explains, summarizes, classifies

Application: demonstrates, computes, solves

Analysis: differentiates, diagrams, estimates

Synthesis: creates, formulates, revises

Evaluation: criticizes, compares, concludes

© TWBANTA-IUPUI

Page 18: Trudy Banta AAGLO Forum Melbourne May 2012

SOME GENERIC LEARNING OBJECTIVES

•Differentiate between fact and opinion

•Gather, analyze, and interpret data

•Apply ethical principles to local,

national, global issues

•Communicate ideas in writing effectively

© TWBANTA-IUPUI

Page 19: Trudy Banta AAGLO Forum Melbourne May 2012

PROFESSIONAL PROGRAM

OBJECTIVES

Program Graduates will Demonstrate

1. Professional commitment

2. Communication skills

3. Administrative and managerial skills

4. Information technology competence

5. Research and analytic competence

Page 20: Trudy Banta AAGLO Forum Melbourne May 2012

To Ensure That Concepts Are Taught

© TWBANTA-IUPUI

[slide graphic; only the label "Time management" is recoverable]

Page 21: Trudy Banta AAGLO Forum Melbourne May 2012

ALVERNO COLLEGE 8 ABILITIES

Communication

Analysis

Problem Solving

Valuing in Decision-Making

Interacting

Global Perspectives

Effective Citizenship

Aesthetic Responsiveness

Page 22: Trudy Banta AAGLO Forum Melbourne May 2012

PRINCIPLES OF UNDERGRADUATE

LEARNING (PULS)

1. Core communication and quantitative

skills

2. Critical thinking

3. Integration and application of knowledge

4. Intellectual depth, breadth, and

adaptiveness

5. Understanding society and culture

6. Values and ethics

Approved by IUPUI Faculty Council

May 1998

Page 23: Trudy Banta AAGLO Forum Melbourne May 2012

PUL #1

CORE COMMUNICATION & QUANTITATIVE

SKILLS

Demonstrated by student’s ability to:

•Express ideas and facts to others effectively in a variety

of formats, particularly written, oral, and visual formats

•Communicate effectively in a range of settings

•Identify and propose solutions for problems using

quantitative tools and reasoning

•Make effective use of information resources and

technology

Page 24: Trudy Banta AAGLO Forum Melbourne May 2012

PRINCIPLES OF UNDERGRADUATE

LEARNING

•A distinctive feature of education at IUPUI

•Permeate the entire undergraduate

curriculum

•Are enacted differently in each discipline

Page 25: Trudy Banta AAGLO Forum Melbourne May 2012

PUL HISTORY AT IUPUI

1990 – Study group of faculty and staff

1992-98 – Series of task forces

1998 – Adoption by Faculty Council

2007 – Adoption of revised version

Page 26: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

Standardized tests

CAN

initiate conversation

Page 27: Trudy Banta AAGLO Forum Melbourne May 2012

IN USING STANDARDIZED TESTS

• Match test with curriculum

•Set expected scores on subscales

•Discuss results

•Determine what is missing

© TWBANTA-IUPUI

Page 28: Trudy Banta AAGLO Forum Melbourne May 2012

Limitations of standardized tests of generic skills

cannot cover all a student knows

narrow coverage, need to supplement

difficult to motivate students to take

them!

What are they actually measuring?

© TWBANTA-IUPUI

Page 29: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

VOLUNTARY SYSTEM OF ACCOUNTABILITY

Report Scores in

critical thinking, written communication,

analytic reasoning

using

•Collegiate Assessment of Academic Proficiency (CAAP)

•Measure of Academic Proficiency and Progress (MAPP)

•Collegiate Learning Assessment (CLA)

Page 30: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

TN = MOST PRESCRIPTIVE (5.45% OF BUDGET FOR INSTRUCTION)

1. Accredit all accreditable programs (25)

2. Test all seniors in general education (25)

3. Test seniors in 20% of majors (20)

4. Give an alumni survey (15)

5. Demonstrate use of data to improve (15)

Total: 100

Page 31: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

AT THE UNIVERSITY OF TENNESSEE

CAAP

Academic Profile (now MAPP)

COMP (like CLA and withdrawn

by 1990)

College BASE

Page 32: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

IN TN WE LEARNED

1. No test measured 30% of gen ed skills

2. Tests of generic skills measure primarily

prior learning

3. Reliability of value added = .1 (see the worked example after this list)

4. Test scores give few clues to guide

improvement actions
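
A reliability near .1 is consistent with the standard classical-test-theory result for the reliability of a difference (gain) score. A minimal worked example, using illustrative values rather than the actual Tennessee data (equal pre/post score variances, test reliabilities of .90, and a pre/post correlation of .89):

% Reliability of a gain score D = post - pre, assuming equal variances
\rho_{DD'} = \frac{\rho_{XX'} + \rho_{YY'} - 2\,\rho_{XY}}{2 - 2\,\rho_{XY}}
           = \frac{0.90 + 0.90 - 2(0.89)}{2 - 2(0.89)}
           = \frac{0.02}{0.22} \approx 0.09

The closer the pre/post correlation is to the tests' own reliabilities, the less reliable the gain score, which is why value-added estimates built on highly correlated entry and exit scores are so unstable.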

Page 33: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

AN INCONVENIENT TRUTH

.9 = the correlation between SAT

and CLA scores of institutions

thus

81% of the variance in institutions’

scores is due to prior learning
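
A quick check of the arithmetic behind the 81% figure, and the 19% that appears on the next two slides: the variance two measures share is the squared correlation (coefficient of determination).

r = 0.9 \quad\Rightarrow\quad r^{2} = (0.9)^{2} = 0.81 \quad\Rightarrow\quad 1 - r^{2} = 0.19

So roughly 81% of the between-institution variance in CLA scores is shared with SAT scores (prior learning), leaving at most about 19% for everything else, including any college effect.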

Page 34: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

HOW MUCH OF THE VARIANCE IN SENIOR

SCORES IS DUE TO COLLEGE IMPACT?

• Student motivation to attend that institution (mission differences)

• Student mix based on

• age, gender

• socioeconomic status

• race/ethnicity

• transfer status

• college major

Page 35: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

HOW MUCH OF THE VARIANCE IN SENIOR

SCORES IS DUE TO COLLEGE IMPACT?

(CONTINUED)

•Student motivation to do well

•Sampling error

•Measurement error

•Test anxiety

•College effects

Together, all of the above: 19%

Page 36: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

STUDENT MOTIVATION

• Samples of students are being tested

• Extrinsic motivators (cash, prizes) are used

We have learned:

• Only a requirement and intrinsic motivation

will bring seniors in to do their best

Page 37: Trudy Banta AAGLO Forum Melbourne May 2012

CONCERNS ABOUT VALUE ADDED

•Student attrition

•Proportion of transfer students

•Different methods of calculating

•Unreliability

•Confounding effects of maturation

© TWBANTA-IUPUI

Page 38: Trudy Banta AAGLO Forum Melbourne May 2012

Recent University of Texas Experience

30 – 40% of seniors at flagships earn

highest CLA score (ceiling effect)

flagship campuses have lowest value

added scores

© TWBANTA-IUPUI

Page 39: Trudy Banta AAGLO Forum Melbourne May 2012

WORD FROM MEASUREMENT EXPERTS

Given the complexity of

educational settings, we may never be

satisfied that value added models can be

used to appropriately partition the causal

effects of teacher, school, and student on

measured changes in standardized test

scores.

- Henry Braun & Howard Wainer

Handbook of Statistics, Vol. 26: Psychometrics

Elsevier 2007

© TWBANTA-IUPUI

Page 40: Trudy Banta AAGLO Forum Melbourne May 2012

Employing currently available

standardized tests of generic

skills to compare the quality

of institutions is not a valid use of

those tests.

© TWBANTA-IUPUI

Page 41: Trudy Banta AAGLO Forum Melbourne May 2012

OECD’S AHELO

COMPARING HEIs ACROSS NATIONS

1. Generic skills (CLA)

2. Disciplines (Engineering and Economics)

3. Value added

4. Contextual information indicators

© TWBANTA-IUPUI

Page 42: Trudy Banta AAGLO Forum Melbourne May 2012

2012: K-12 standardized test scores are used to

evaluate and compare schools

assign grades to schools

take over failing schools

evaluate, compare, and fail teachers

Yet NAEP scores have stagnated

© TWBANTA-IUPUI

Page 43: Trudy Banta AAGLO Forum Melbourne May 2012

IN FINLAND AND SINGAPORE

•No annual testing of students

•No high-stakes accountability measures for

teachers/schools

•Scholarships for best and brightest

•Starting pay like a doctor

•Must complete master’s degree

•Teachers are respected professionals

© TWBANTA-IUPUI

Page 44: Trudy Banta AAGLO Forum Melbourne May 2012

SHORT-TERM PERSPECTIVE

•Limit degrees to 120 SCH

•Penalize students who go beyond a SCH

cap

•Reward graduation in 4 years

•Consider earning potential in setting tuition

© TWBANTA-IUPUI

Page 45: Trudy Banta AAGLO Forum Melbourne May 2012

DE-PROFESSIONALIZATION –

IMMEDIATE PAYOFF

•Teacher education is first

•Industry certifications

•Partnerships to fill employers’ needs

Does apprenticeship model prepare us for

global leadership in the future?

© TWBANTA-IUPUI

Page 46: Trudy Banta AAGLO Forum Melbourne May 2012

BETTER WAYS TO DEMONSTRATE

ACCOUNTABILITY

Performance Indicators

1.Access (to promote social mobility)

2.Engaging student experience

3.Workforce development

4.Economic development

5.Civic contribution of students, faculty,

staff, graduates

© TWBANTA-IUPUI

Page 47: Trudy Banta AAGLO Forum Melbourne May 2012

IF WE MUST MEASURE LEARNING

LET’S USE:

1. Standardized tests in major fields: licensure and certification tests, ETS Major Field Tests

2. Internship performance

3. Senior projects

4. Study abroad performance

5. Electronic portfolios

6. External examiners

© TWBANTA-IUPUI

Page 48: Trudy Banta AAGLO Forum Melbourne May 2012

START WITH MEASURES YOU

HAVE

•Assignments in courses

•Course exams

•Work performance

•Records of progress through the

curriculum

© TWBANTA-IUPUI

Page 49: Trudy Banta AAGLO Forum Melbourne May 2012

METHODS OF ASSESSMENT

Paper and pencil tests

Individual or group projects

Portfolios

Observation of practice

Observation of simulated practice

Analysis of case studies

Attitude or belief inventories

Interviews and focus groups

Surveys

© TWBANTA-IUPUI

Page 50: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

Direct Measures of Learning: assignments, exams, projects, papers

Indirect Measures: questionnaires, inventories, interviews

- Did the course cover these objectives?

- How much did your knowledge increase?

- Did the teaching method(s) help you learn?

- Did the assignments help you learn?

GOOD ASSESSMENT INCLUDES BOTH

Page 51: Trudy Banta AAGLO Forum Melbourne May 2012

NILOA SURVEY: PROGRAM-LEVEL APPROACHES

1. Portfolios (80% in at least 1 area)

2. Performance assessments

3. Rubrics

4. External judges

5. Student interviews

6. Employer surveys

© TWBANTA-IUPUI

Page 52: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

STUDENT ELECTRONIC PORTFOLIO

•Students take responsibility for demonstrating core skills

•Unique individual skills and achievements can be emphasized

•Multi-media opportunities extend possibilities

•Metacognitive thinking is enhanced through reflection on contents

- Sharon J. Hamilton

IUPUI

Page 53: Trudy Banta AAGLO Forum Melbourne May 2012

More use of RUBRICS

locally developed

VALUE from AAC&U

© TWBANTA-IUPUI

Page 54: Trudy Banta AAGLO Forum Melbourne May 2012

VALUE RUBRICS

•Critical thinking

•Written communication

•Oral communication

•Information literacy

•Teamwork

•Intercultural knowledge

•Ethical reasoning

© TWBANTA-IUPUI

Page 55: Trudy Banta AAGLO Forum Melbourne May 2012

ACCOUNTABILITY REPORT

•85% achieve Outstanding ratings in writing

as defined . . .

•78% are Outstanding in applying knowledge

and skills in internships

•75% are Outstanding in delivering an oral

presentation

© TWBANTA-IUPUI

Page 56: Trudy Banta AAGLO Forum Melbourne May 2012

FOR EXTERNAL CREDIBILITY

Collaborate on rubrics

Use employers as examiners

Conduct process audits

© TWBANTA-IUPUI

Page 57: Trudy Banta AAGLO Forum Melbourne May 2012

E-PORT CHALLENGES

•Reliability of rubrics

•Student motivation if used for assessment

(Barrett, 2009)

•Differences in topics for products to be

evaluated

(Sekolsky & Wentland, 2010)

© TWBANTA-IUPUI

Page 58: Trudy Banta AAGLO Forum Melbourne May 2012

OBSTACLES TO USING

PERFORMANCE-BASED MEASURES

•Defining domains and constructs

•Obtaining agreement on what to measure

and definitions

•Defining reliability and validity

•Creating good measures

- Tom Zane

WGU

© TWBANTA-IUPUI

Page 59: Trudy Banta AAGLO Forum Melbourne May 2012

WILL IT TAKE 80 YEARS . . . ?

3 Promising Alternatives

E-portfolios

Rubrics

Assessment communities

- Banta, Griffin, Flateby,

Kahn

NILOA Paper #2 (2009)

© TWBANTA-IUPUI

Page 60: Trudy Banta AAGLO Forum Melbourne May 2012

TEAGLE ASSESSMENT SCHOLARS

•study assessment data

•visit campuses

•talk with 3-4 groups of students

•talk with faculty about their campus

assessment data

- Charles Blaich

Wabash College

© TWBANTA-IUPUI

Page 61: Trudy Banta AAGLO Forum Melbourne May 2012

NATIONAL SURVEY OF STUDENT ENGAGEMENT

AT

~ HOPE COLLEGE ~

% STUDENTS STUDYING LESS THAN

10 HOURS/WEEK

        Freshmen   Seniors

2003      38%        39%

2010      21%        28%

© TWBANTA-IUPUI

Page 62: Trudy Banta AAGLO Forum Melbourne May 2012

HOPE COLLEGE

•Considered data over supper

•Proposed solutions

•Conducted student focus groups

•Shared all data with all faculty

•Departments dedicated a meeting to prepare

strategies to increase rigor

© TWBANTA-IUPUI

Page 63: Trudy Banta AAGLO Forum Melbourne May 2012

NATIONAL INSTITUTE FOR

LEARNING OUTCOMES ASSESSMENT

•Surveys

2009 CAOs

2011 Departments

•Occasional Papers

•Website review, standards

•Quick comments (monthly)

•Calendar of events

© TWBANTA-IUPUI

Page 64: Trudy Banta AAGLO Forum Melbourne May 2012

LUMINA

Degree Qualifications Profile

- Linked student learning outcomes

AA to MA levels

Suggests transfer based on assessment

of learning outcomes

© TWBANTA-IUPUI

Page 65: Trudy Banta AAGLO Forum Melbourne May 2012

NEW LEADERSHIP ALLIANCE

FOR STUDENT LEARNING AND ACCOUNTABILITY

- Presidents’ Alliance

- Certification Process

Set ambitious goals for learning

Gather evidence of learning

Use evidence to improve learning

Report evidence and results

© TWBANTA-IUPUI

Page 66: Trudy Banta AAGLO Forum Melbourne May 2012

© TWBANTA-IUPUI

BUILD ASSESSMENT INTO VALUED

PROCESSES

1. Assessment of learning

2. Curriculum review and revision

3. Survey research

4. Program review

5. Scholarship of Teaching & Learning

6. Evaluation of initiatives

7. Faculty development

8. Promotion & tenure

9. Rewards and recognition