IDEA Student Ratings of Instruction Update
Carrie Ahern and Lynette Molstad
Selected slides reproduced with permission of Dr. Amy Gross from The IDEA Center
www.idea.ksu.edu
Presentation
Process at DSU for online IDEA surveys
Review the IDEA Student Ratings of Instruction system: forms and reports
Questions
Process for IDEA Surveys
Faculty receive an e-mail for each course with a link to the FIF (new copy feature)
Faculty receive a unique URL for each course and must provide it to students
Faculty receive status updates on how many students have completed the survey
Questions
Student Learning Model
Types of learning must reflect instructor’s purpose
Effectiveness determined by student progress on objectives stressed by instructor
IDEA Student Ratings of Instruction Overview
Faculty Information Form
Student Survey - Diagnostic Form
Faculty Information Form
Some thoughts on selecting objectives
http://www.theideacenter.org/SelectingObjectives
Video for Faculty on completing the FIF
http://www.theideacenter.org/FIFVideo
Faculty Information Form
One FIF per class being evaluated
Course Information
IDEA Department Codes
• Extended list: http://www.idea.ksu.edu/StudentRatings/deptcodes.html
12 Learning Objectives
Course Description Items
• Optional; best answered toward the end of the semester
FIF: Selecting Objectives
Select 3-5 objectives as “Essential” or “Important”
• Is it a significant part of the course?
• Do you do something specific to help students accomplish the objective?
• Does the student’s progress on the objective influence his or her grade?
In general, progress ratings are negatively related to the number of objectives chosen (Research Note 3)
Best Practices
Multi-section courses
Curriculum committee review
Prerequisite-subsequent courses
Discuss meaning of objectives with students
Incorporate into course syllabus
New feature (as of 2/2010)
Copy FIF objectives from one course to another
Previous FIFs will be available in a drop-down menu (linked by faculty e-mail address)
Student Survey
Diagnostic Form
http://theideacenter.org/sites/default/files/Student_Ratings_Diagnostic_Form.pdf
Student Survey: Diagnostic Form
Teaching Methods: Items 1-20
Learning Objectives: Items 21-32
Student and Course:
• Student Characteristics: Items 36-39, 43
• Course Management/Content: Items 33-35
Global Summary: Items 40-42
Experimental Items: Items 44-47
Extra Questions: Items 48-67
Comments
False Assumptions
Effective instructors effectively employ all 20 teaching methods.
The 20 teaching methods items are used to make an overall judgment about teaching effectiveness.
Students should make significant progress on all 12 learning objectives.
Resources: Administering IDEA
www.idea.ksu.edu > Client Resources > IDEA Resources
Best Practices
Directions to Faculty
Using Additional Questions
Some Thoughts on Selecting IDEA Objectives
Disciplinary Selection of Learning Objectives
Guide to Administering IDEA
Team Teaching
All resources are on our website.
Comparison Groups (norms)
IDEA Comparisons (Diagnostic Form)
Exclude first-time institutions
Exclude classes with fewer than 10 students
No one institution comprises more than 5% of the database
128 institutions; 44,455 classes
Updated only periodically
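These exclusion rules amount to a simple data filter. A minimal sketch in Python, where the record fields and the way the 5% cap is enforced are assumptions for illustration; IDEA's actual sampling procedure may differ:

from collections import Counter

def build_comparison_pool(classes):
    """Apply the stated exclusions to a list of class records.

    Each record is a dict with hypothetical fields:
    'institution', 'first_time' (bool), and 'enrollment'.
    """
    # Drop first-time institutions and classes with fewer than 10 students.
    pool = [c for c in classes
            if not c["first_time"] and c["enrollment"] >= 10]
    # Enforce the rule that no one institution comprises more than 5%
    # of the database (one simple way; the real procedure may differ).
    cap = max(1, int(len(pool) * 0.05))
    counts, capped = Counter(), []
    for c in pool:
        if counts[c["institution"]] < cap:
            counts[c["institution"]] += 1
            capped.append(c)
    return capped

# Usage with hypothetical records:
# pool = build_comparison_pool(
#     [{"institution": "A", "first_time": False, "enrollment": 25}, ...])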
Comparison Groups (norms)
Discipline Comparisons
Updated annually (September 1)
Most recent 5 years of data
• Approximately July 1 - June 30
Exclusions same as IDEA Comparisons
• Also exclude classes with no objectives selected
Minimum of 400 classes
Comparison Groups (norms)
Institutional Comparisons
Updated annually (September 1)
Most recent 5 years of data
• Approximately July 1 - June 30
Includes Short and Diagnostic Forms
Exclude classes with no objectives selected
Minimum of 400 classes
Norms: Converted Averages
Method of standardizing scores with different averages and standard deviations
Able to compare scores on the same scale
Uses T scores
• Average = 50
• Standard Deviation = 10
They are not percentiles
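The T-score conversion itself is just a linear rescaling. A minimal sketch in Python, with made-up numbers for the comparison-group mean and standard deviation:

def t_score(raw, group_mean, group_sd):
    """Convert a raw average to a T score: mean 50, standard deviation 10."""
    return 50 + 10 * (raw - group_mean) / group_sd

# Hypothetical example: a raw progress rating of 4.2 against a
# comparison-group mean of 4.0 and SD of 0.4 converts to 55, i.e.,
# half a standard deviation above average -- a standardized score,
# not a percentile.
print(t_score(4.2, group_mean=4.0, group_sd=0.4))  # 55.0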
Adjusted Scores
Control for factors beyond instructor’s control
Regression equations
Link to video clip explaining Adjusted Scores
http://theideacenter.org/taxonomy/term/109
Adjusted Scores: Diagnostic Form
Student Work Habits (Item 43)
Student Motivation (Item 39)
Class Size (Enrollment, from the FIF)
Student Effort (multiple items)
Course Difficulty (multiple items)
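IDEA publishes the actual regression equations in its technical reports; the sketch below only illustrates the mechanism, with invented coefficients, typical values, and field names. The raw score is shifted by the amount the extraneous factors predict for this particular class relative to a typical one:

# Illustrative only: coefficients and typical values are invented.
COEFFS = {"work_habits": 0.5, "motivation": 0.3, "enrollment": -0.001}
TYPICAL = {"work_habits": 3.8, "motivation": 3.6, "enrollment": 30}

def adjusted_score(raw, klass):
    """Remove the portion of the raw rating the class factors predict."""
    predicted_shift = sum(
        COEFFS[f] * (klass[f] - TYPICAL[f]) for f in COEFFS
    )
    return raw - predicted_shift

# A small class of motivated students with strong work habits is
# adjusted downward, since those factors alone predict higher ratings.
print(adjusted_score(4.3, {"work_habits": 4.2, "motivation": 4.0,
                           "enrollment": 18}))  # ~3.97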
The IDEA Report
Diagnostic Form Report
What were students’ perceptions of the course and their learning?
What might I do to improve my teaching?
Questions Addressed: Page 1
What was the response rate, and how reliable is the information contained in the report?
What overall estimates of my teaching effectiveness were made by students?
What is the effect of “adjusting” these measures to take into consideration factors I can’t control?
How do my scores compare to other comparison groups?
Questions Addressed: Page 2
How much progress did students report on the learning objectives that I identified as “Essential”?
How does this progress compare to the available comparison groups?
How much progress did students report on the “Important” objectives?
How does this progress compare to the available comparison groups?
Do conclusions change if “adjusted” rather than “raw” ratings are used?
Questions Addressed: Page 3
Which of the 20 teaching methods are most related to my learning objectives?
How did students rate my use of these important methods?
What changes should I consider in my teaching methods?
Do these results suggest some general areas where improvement efforts should focus?
Improving Teaching Effectiveness
IDEA Website: http://theideacenter.org/
IDEA Papers
http://www.theideacenter.org/category/helpful-resources/knowledge-base/idea-papers
Questions Addressed: Page 2
How distinctive is this class with regard to the amount of reading, the amount of other (non-reading) work, and the difficulty of the subject matter?
How distinctive is this class with regard to student self-ratings?
Questions Addressed: Page 4
What was the average rating on each of the questions on the IDEA form?
How much variation was there in these ratings?
Are the distributions of responses relatively “normal” (bell-shaped), or is there evidence of distinctive subgroups of students?
What are the results for the additional questions I used?
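A quick way to see whether an item's responses look bell-shaped or split into subgroups is to tabulate the full distribution alongside the average and spread. A minimal sketch with hypothetical ratings on the 1-5 scale:

from collections import Counter
from statistics import mean, stdev

def summarize_item(responses):
    """Average, spread, and the 1-5 response distribution for one item."""
    dist = Counter(responses)
    return {
        "average": round(mean(responses), 2),
        "sd": round(stdev(responses), 2),
        "distribution": [dist.get(v, 0) for v in range(1, 6)],
    }

# Hypothetical ratings: the average looks middling, but the clusters
# at 2 and 5 point to two distinctive subgroups of students.
print(summarize_item([2, 2, 2, 3, 5, 5, 5, 5, 2, 5]))
# {'average': 3.6, 'sd': 1.51, 'distribution': [0, 4, 1, 0, 5]}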