ICE Evaluations: Some Suggestions for Improvement

Slide 1: ICE Evaluations: Some Suggestions for Improvement

Slide 2: Outline
- Background information and assumptions
- Content of evaluation forms
- Logistical problems with processing ICE information

Slide 3: Background Information
- Exchange of e-mails by professors last summer
- The Arts and Sciences Task Team is currently looking at various ways of evaluating teaching
- My points here are mostly compatible with both

Slide 4: Background Assumptions
- Student evaluations will continue to be used
- They will be used for two purposes:
  - Instructors' own improvement of courses and teaching
  - Assessment of teachers by administrators
- We should make ICE evaluations as effective as possible for both purposes

Slide 5: Suggestions About the Content of ICE Forms
(go to evaluations file)

Slide 6: Remove the One-Number Overall Average at the Bottom of the Page
- It gives less information, not more
- It is all people will look at if it is available:
  - Administrators assessing teachers
  - Teachers planning future courses
- Not all the categories have to do with the instructor, so it is unfair to assign these ratings to the instructor
- The FAS Task Team unanimously agreed

Slide 7: Keep the Text of the Individual Questions
- In some formats of ICE reports, the questions are missing
- This encourages looking only at the numbers
- So include the actual questions

Slide 8: Why Not Also Get Rid of the Category Average Numbers?
- All the same reasons apply
- But if this is too much, then really, please, please get rid of the one-number average

Slide 9: Some Specific Questions on the ICE Form Need Revision
- #20, "The material was not too difficult," means that the highest rating goes to material that is far too easy
- Combine questions 18-20 into one question: "The difficulty and pace of the course were appropriate"

Slide 10: Question #10, "Demonstrated favorable attitude toward students"
- Task Team recommendation: change it to "treated students with proper respect"
- Reason: the old wording favors teachers who are lenient about, for example, plagiarism, arriving to class late, and talking during class

Slide 11: Other Questions to Revise
- #7, "Was readily available for consultation outside of class"
- #12, "Evaluated work fairly"

Slide 12: Too Many Questions
- Researchers seem to agree with the common-sense idea that too many questions on an evaluation form lead students to give up
- Some ICE questions seem repetitive or unnecessary

Slide 13: How to Include Fewer Questions
- Again, combine questions 18-20 into one question: "The difficulty and pace of the course were appropriate"
- Drop questions #15 and #16, about stating and covering the objectives of the course, since #17, "Course organization was logical and adequate," covers these

Slide 14: Additional Items on the ICE Form
- After the university-wide questions, a section of additional items is included
- Currently, each faculty (FAS, Engineering, etc.) can choose from an item bank of approved questions
- Instead, each department should be able to choose any questions it wants, whether from the item bank or not

Slide 15: Why Let Departments Choose?
- Departments are in the best position to design questions that are appropriate for their discipline
- For example, why think that the same questions would be appropriate to a chemistry course, an education course, and an English literature course?
- Too much bureaucratic regulation is not beneficial to a university

Slide 16: Logistical Problems with Processing ICE Information

Slide 17: Course Evaluations Are Often Lost or Assigned to the Wrong Course
- Instructors have students fill out evaluation forms, then no ICE report appears for that course
- This has happened at least five times in the philosophy department in three years
- Other professors reported the same problem in last summer's e-mail exchange

Slide 18: The Cause?
- If students fill in the wrong section number, department number, or course number, then their evaluations are all automatically assigned to the wrong course (or to no course)

Slide 19: The Solution
- Is not to assign blame (as in "Well, this is the department's fault, because the graduate assistant who gave out the evaluations must have told students the wrong numbers")
- But instead is to redesign the system so that this mistake (which is easy to make) does not corrupt the data

Slide 20: The Solution (Part II)
- A simple but less effective fix: have all instructors give the course information to students themselves, e.g. by writing it on the board (this at least makes instructors responsible)
- A (slightly) more difficult but more effective fix: have a cover sheet for each course, which the computer will read; if an individual ICE form disagrees with the information on the cover sheet, automatically assign that form to the correct course (see the sketch below)
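To make the cover-sheet idea concrete, here is a minimal sketch in Python. It illustrates only the matching logic, not OIRA's actual scanning software; all the names (CoverSheet, Form, assign_batch) and the record layout are hypothetical.

    # Hypothetical sketch of the cover-sheet fix. The scanner reads one
    # cover sheet per stack of forms; every form in that stack is then
    # assigned to the cover sheet's course, whatever the student bubbled in.
    from dataclasses import dataclass

    @dataclass
    class CoverSheet:
        dept: str      # e.g. "PHIL"
        course: str    # e.g. "210"
        section: str   # e.g. "1"

    @dataclass
    class Form:
        dept: str
        course: str
        section: str
        ratings: dict  # question number -> rating

    def assign_batch(cover: CoverSheet, forms: list) -> list:
        """Override any mis-bubbled course fields with the cover sheet's values."""
        for form in forms:
            if (form.dept, form.course, form.section) != (cover.dept, cover.course, cover.section):
                # The student wrote the wrong numbers: correct the form
                # instead of letting it drift to another course's report.
                form.dept, form.course, form.section = cover.dept, cover.course, cover.section
        return forms

    # Example: one student bubbled PHIL 211 instead of PHIL 210.
    cover = CoverSheet("PHIL", "210", "1")
    forms = [Form("PHIL", "210", "1", {1: 5}), Form("PHIL", "211", "1", {1: 4})]
    assert all(f.course == "210" for f in assign_batch(cover, forms))

Because the forms are physically collected class by class, the cover sheet is a more trustworthy record of the course identity than any individual form, which is why it wins every disagreement.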
Slide 21: A More Widespread Problem
- When the evaluations for a course are mysteriously absent, evaluations from one or two (or more) students sometimes appear anyway
- Or, when a teacher doesn't administer evaluations at all, she still gets results from one or two students
- And this "phantom evaluation" process probably occurs, undetected, in MOST courses

Slide 22: The Cause of Phantom Evaluations
- It is the same cause as for the missing evaluations for a whole course
- If one or two (or more) students write the wrong course numbers, their evaluations will be assigned to the wrong course (even if all the rest of the forms go to the right course)
- This probably happens VERY OFTEN
- So it is all the more reason to fix the problem

Slide 23: How to Avoid the Phantom Evaluation Problem
- The same way as avoiding the larger-scale assignment of evaluations to wrong courses
- Have a cover sheet for each course, which the computer will read; if an individual ICE form disagrees with the information on the cover sheet, automatically assign that form to the correct course

Slide 24: Another Logistical Problem
- The ICE report includes a "response rate" indicating the percentage of enrolled students who filled out an evaluation form
- But for at least two of the last four semesters, these figures were inaccurate

Slide 25: Why Is the Response Rate Often Inaccurate?
- The response rate is, of course, meant to indicate the percentage of students enrolled in the course who actually filled out the ICE form
- But the total number of enrolled students is not accurate
- The AUBsis site in fall 2003-2004 and fall 2004-2005 gave the total number of enrolled students at the BEGINNING of the term, not at the end
- So any students who dropped the class were still counted in the enrolled-students total
- Suppose 25 students were enrolled at the beginning of the term, 5 dropped, and 15 students filled out the ICE form. The official response rate would be 60%, but the real response rate, among students still enrolled, would be 75% (the arithmetic is worked through in the sketch below).
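A few lines of Python reproduce the arithmetic from the example above; the variable and function names are mine, chosen for illustration.

    # Response-rate arithmetic: the same 15 returned forms give 60% or 75%
    # depending on whether the denominator counts students who dropped.
    def response_rate(forms_returned: int, enrolled: int) -> float:
        return 100.0 * forms_returned / enrolled

    enrolled_at_start = 25  # AUBsis figure from the beginning of the term
    dropped = 5
    forms_returned = 15

    official = response_rate(forms_returned, enrolled_at_start)        # 60.0
    real = response_rate(forms_returned, enrolled_at_start - dropped)  # 75.0
    print(f"official: {official:.0f}%, real: {real:.0f}%")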
Slide 26: Solution to the Response Rate Problem
- If OIRA uses the AUBsis information for this, OIRA and the Registrar should coordinate the uses to which the data will be put
- The enrolled-students number must reflect the number of students enrolled at the END of the term, not the beginning

Slide 27: OIRA Office Responses to Faculty

Slide 28: OIRA Has Not Responded to Faculty Correspondence About Problems
- A delicate issue
- Numerous examples
- Why it matters
- Solution? I admit I don't know. Maybe a full-time office manager?

Slide 29: One Final Issue: Use of ICE Reports
- The literature on evaluations often mentions proper use by administrators
- A quick glance is worse than no information at all
- Items to focus on: the percentage of students responding; the type of course (graduate vs. undergraduate, introductory vs. advanced); particular questions; the distribution of answers (are one or two terrible ratings dragging the average down? see the sketch below)
- NOT one number
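As an illustration of reading the distribution rather than one number, here is a small Python sketch; the summary fields and the outlier rule are my own hypothetical choices, not anything the ICE reports actually compute.

    # Hypothetical sketch: report a question's distribution, not just its
    # mean, and flag cases where a couple of extreme ratings drag the
    # average down (ratings assumed to run 1-5).
    from collections import Counter
    from statistics import mean, median

    def summarize(ratings: list) -> dict:
        low_outliers = [r for r in ratings if r <= 2]
        return {
            "mean": round(mean(ratings), 2),
            "median": median(ratings),
            "distribution": dict(sorted(Counter(ratings).items())),
            # A mean well below the median, with only one or two low
            # ratings present, suggests the average is outlier-driven.
            "outlier_driven": 1 <= len(low_outliers) <= 2 and mean(ratings) < median(ratings),
        }

    # Two 1s pull the mean (3.75) well below the median (4.5).
    print(summarize([5, 5, 4, 5, 4, 5, 1, 1]))

Comparing the mean with the median is one cheap way to notice when one or two terrible ratings, rather than broad dissatisfaction, account for a low average.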