Overcoming the Teacher's Dilemma (Breadth vs. Depth)


TRANSCRIPT

  • Overcoming the Teacher's Dilemma (Breadth vs. Depth)
    The American River College Model of College-Wide Course-Level SLO Assessment
    Mr. John Aubert, SLO Assessment Coordinator, Geography
    Dr. Yuj Shimizu, Faculty Researcher, Psychology

  • Our Goals Today
    Provide background: our college; our transition to new standards
    Share our course-level assessment program: development and implementation; integration with institutional planning; the breadth vs. depth approach
    Discuss preliminary results

  • American River College: Background and Context
    Located in Sacramento, CA (one of four colleges in the Los Rios CCD)
    36,646 unduplicated students
    65 academic departments
    2,197 individual courses
    3,473 sections taught (Fall 2009)
    13,318 course-level SLOs defined
    Accreditation site team arrives in 5 days

  • The SLO/Assessment Index
    31: the number of times the words "assess" and "assessment" appear in the accreditation standards*
    24: the number of times the word "outcome" appears in the accreditation standards*
    *2006, Accrediting Commission for Community and Junior Colleges

  • Accrediting Commission for Community and Junior Colleges: Accreditation Workshop Today! [Aubert]

  • American River College: Transition to New Standards
    Late 1990s: Bloom's taxonomy utilized in writing course objectives
    2004: SLO creation/revision embedded in curriculum process
    The Outcomes vs. Objectives distinction treated as a continuum

  • American River College: Transition to New Standards (continued)
    2004-2006: SLO creation/revision continues
    2006-2007: Focus shifts toward assessment
    Spring 2007: Reassigned time allocated to develop assessment program
    Fall 2007: SLO Assessment Coordinator appointed (20-40%)

  • Our Task
    To develop an SLO Assessment program which would:
    Improve teaching and learning
    Address the needs of a large college
    Be consistent with our campus culture
    Meet accreditation standards
    And to survive the process

  • Our Guiding Principles
    Respect faculty workload
    Rely on faculty expertise
    Be flexible
    Integrate with existing processes
    Practice shared governance
    Meet or exceed accreditation standards

  • What do they want from us?
    Pervasive, widespread dialogue
    Assessment which is institutionalized and integrated, systematic and continuous (not episodic)
    Data analysis which leads to change
    Change which improves student learning
    Documentation!

  • Our Solution
    Comprehensive assessment program adopted September 2007
    New standing committee formed
    Program review streamlined
    65 departments divided into 3 cohorts
    Cohort 1 entered the first year of a three-year cycle
    Substantial outreach and training!

  • The American River College Six-Year Institutional Planning Cycle

  • SLO Assessment Cohorts

  • American River College Two-Part SLO Assessment for Courses

  • Assessment Cycle, Version 1.0
    DENIAL → ANGER → BARGAINING → DEPRESSION → ACCEPTANCE → (more DENIAL)

  • SLOs and Assessment
    The Problem: Are students learning what we expect them to?
    The Challenge: How can we measure this in a sensible way?

  • Our Solution: Proposed Model of Assessment
    To develop a process that would:
    Minimize workload
    Gather data broadly
    Gather data systematically
    Be specific to each course's SLOs
    Be transparent
    Improve teaching and learning

  • Student Self-Assessment Survey: Pilot Template

  • Student Self-Assessment Survey: Pilot

  • Student Self-Assessment Survey
    Survey piloted Spring 2007: fast and simple; results matched teacher predictions
    2007-2008 (1st Cohort): full-term, face-to-face courses (43% return rate)
    2008-2009 (2nd Cohort): all courses (63% return rate)
    2009-2010 (3rd Cohort): now in progress (return rate ???)

  • Student Self-Assessment Survey
    Adopted Class Climate software by Scantron to administer surveys, store survey data, and report results (centralized approach)
    Our approach can be easily modified for a decentralized / departmental approach

  • Sample Student Self Assessment Survey

  • Survey Results
    Self-assessments/student ratings can be both reliable (Ross, Rolheiser, & Hogaboam-Gray, 2002) and valid (Cohen, 1981; Fox & Dinur, 2006; Ross, 2006)

  • Criticisms of Student Self-Report Data
    Known biases in self, course, and instructor ratings (Dunning, Heath, & Suls, in press; Frye, 2005)
    Self-presentation concerns, overconfidence
    Class size
    Class format (lecture vs. seminar)
    Class purpose (elective, major, GE requirement)
    Class level (intro vs. 2nd-year course)
    Subject matter (behavioral sciences, humanities, sciences)
    Difficulty of instructor grading
    Liking for the instructor
    "Hotness" and "Ease" (Coladarci & Kornfield, 2007)

  • Interpreting Results Correctly
    Look at relative ratings, not absolute ratings
    [Chart: profile line of average student ratings for SLO 1 through SLO 7]

  • Typical Instructor Reactions to Student Self-Assessment Results
    Instructors typically agree and can make quick sense of the data:
    The ratings align with time spent on each SLO
    They realize that they do not teach that SLO anymore
    The SLO is at the end of the course and they never get to it
    They realize that the SLO is the most difficult part of the course
    The SLO is written in a way that students would not understand

  • Thus, the survey provided:
    Quick
    Broad and systematic
    Sensitive to outliers (produced actionable data)
    Yeah, but...
    Indirect
    Lacks flexibility
    Does not utilize faculty expertise
    Lacks depth

  • Screening Device: Airport Security

  • Part 2: Faculty Designed Assessment (Direct Assessment)
    Customized assessment documented on a common template:
    What course and why?
    What SLO(s)?
    When?
    How broadly?
    How will you assess? (describe tool or rubric)
    What are the criteria for successful SLO achievement?
    Who will be administering / scoring?
    Submitted to the SLO Assessment Committee

  • American River College Two-Part SLO Assessment for Courses: Three-Year Cycle for Departments
    Year 1: Part I, Student Self-Assessment (every course assessed); Part II, Faculty Designed Assessment (one course assessed)
    Year 2: Action Plan developed and implementation begun
    Year 3: Action Plan implementation continued

  • Year 2: SLO Action Plan
    Departments respond to both parts of assessment:
    Part 1: Respond to the Student Self-Assessment
    Part 2: Respond to the Faculty Designed Assessment

  • Year 2: SLO Action Plan
    Detailed instructions provided
    Action Plan and Instructions approved through an 8-month process

  • Where We Are Now
    Of the 65 academic departments:
    Two-thirds have participated in the college-wide SLO Assessment process
    One-third have completed Action Plans
    63 separate actions taken or planned as a result of SLO assessment

  • Conclusion

  • Thank you!
    John Aubert, SLO Coordinator, [email protected] or 916-484-8637
    Yuj Shimizu, Faculty Researcher, [email protected] or 916-484-8149
    http://inside.arc.losrios.edu/~slo/

    Good afternoon; thanks to all of you for joining us today. My name is John Aubert. I've been a post-secondary educator for 15 years, the last 10 years at ARC. In contrast with my colleague's obvious credentials, I'm here because I didn't run fast enough when asked to be my college's SLO Assessment coordinator. In all seriousness, my skills were more related to professional development, faculty leadership, and program planning. Our coordinator/researcher collaboration has been extremely fruitful. [Yuj]

    * Provide background: our college; our transition to new standards. Share our course-level assessment program: development and implementation; integration with institutional planning; the breadth vs. depth approach. Discuss preliminary results.

    * Located in Sacramento, CA (one of four colleges in the Los Rios CCD); 36,646 unduplicated students; 65 academic departments; 2,197 individual courses; 3,473 sections taught (Fall 2009); 13,318 course-level SLOs defined; accreditation site team arrives in 5 days.

    As of Oct 2009: 1,528 program SLOs; 84 student service SLOs; 37 GE SLOs; 7 ISLOs.

    * As a way to remind us how the new standards have changed and what is being emphasized, I'd like to share with you what I call the SLO Assessment Index. [Show figures/facts] Now, where did I learn this and other important issues related to assessment and accreditation? (click)

    * Trainings, trainings, and more trainings. Now, comedy aside, I know that this slide represents the frustrations of many across the state. We certainly don't pretend to have the key to success (though you can ask us at the end of next week), but one strategy we employed ended up being quite useful: we consistently downplayed and de-emphasized the accreditation standards in favor of highlighting how we could make SLO assessment work to our collective benefit. Easier said than done, I know.

    Transition to New Standards

    Late 1990s: Bloom's taxonomy utilized in writing course objectives. 2004: SLO creation/revision embedded in curriculum process. The Outcomes vs. Objectives distinction treated as a continuum.

    * Transition to New Standards, continued

    2004-2006: SLO creation/revision continues. 2006-2007: Focus shifts toward assessment. Spring 2007: Reassigned time allocated to develop assessment program. Fall 2007: SLO Assessment Coordinator appointed (20-40%).

    * Our Task

    To develop an SLO Assessment program which would: improve teaching and learning; address the needs of a large college; be consistent with our campus culture; meet accreditation standards; and survive the process.

    * Our Guiding Principles

    Respect faculty workload; rely on faculty expertise; be flexible; integrate with existing processes; practice shared governance; meet or exceed accreditation standards. Now, with regard to the standards... (click)

    * What do they want from us?

    Pervasive, widespread dialogue. Assessment which is institutionalized and integrated, systematic and continuous (not episodic). Data analysis which leads to change. Change which improves student learning (observable through reassessment!). Documentation!

    * Our Solution

    Comprehensive assessment program adopted September 2007; new standing committee formed; program review streamlined; 65 departments divided into 3 cohorts; Cohort 1 entered the first year of a three-year cycle; substantial outreach and training!

    Getting this plan approved involved an initial convocation presentation, a month-long campus-wide comment period, and approval by the academic and classified senates. It was formally adopted by the President on the recommendation of the Planning and Coordination Council. Once approved, we met with individuals, departments, divisions, chairs, deans: if there was a meeting, we asked to be invited.

    *What I want to do here is convey where the assessment process fits into our larger planning cycle.

    * Years; program review; SLO assessment; the annual Educational Master Plan, which aligns/realigns departmental goals and resource requests with college goals.

    * John just showed you the assessment cycle that we have developed. But I want to be straight with you: this isn't quite where we started. Our original assessment cycle looked more like this.

    [C] As you can see, the first step was DENIAL: This SLO assessment can't be happening to US! [C] Then came ANGER: Why ME? Why US? [C] Followed by BARGAINING: Look, instead of assessment, we'll attend three district-wide meetings! OK, maybe two. [C] Then DEPRESSION: Oh no... assessment is upon us, we can't fight it anymore. [C] And finally, ACCEPTANCE, followed by more DENIAL.

    So, part of our early efforts then was to work vigorously to break this vicious cycle.

    ** Well, to do that, we started by formulating the problem and the challenge that we faced in addressing this problem. [C] The problem was, basically: are students learning what we expect them to know, or be able to do, upon completing a course? That was the problem at hand.

    [C] Now our challenge was not in finding a solution, but in finding a sensible solution. Something that made sense for our college and our campus culture.

    * Our solution was our proposed model of assessment. Drawing from the broader tasks and principles that John mentioned earlier, our goal was to develop a process that would: [C] minimize workload (it would be FAST, EASY, & USER FRIENDLY); [C] gather data broadly, giving us the ability to cast a WIDE NET; [C] do so in a systematic way that would produce fair and valid data; [C] be specific to each course's SLOs, customized, flexible, and thus appropriate, capitalizing on faculty expertise; [C] be transparent, providing a clear path from assessment to interpretation to action; [C] and, most importantly, have the potential to improve teaching and learning.

    Well, what was our solution? (flip)

    * We started here. [C] This is an actual course outline of record. This is what it looks like. This particular one is for Math 410. All of our course outlines reside on Socrates, our in-house curriculum management system.

    [C] As you can see, Section 3 of this course outline is where our course SLOs live. They begin with the prompt, "Upon completion of this course, the student will be able to:" demonstrate skills and techniques in solving systems of linear equations, calculate determinants, and so on down the line.

    * [Slowly] Then, we created a student self-assessment survey template, a common template that could be used for any course. [Click] Note that each statement starts with "I am able to" followed by [insert student learning outcome].
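    To make that template-population step concrete, here is a minimal sketch of the idea: prefix each SLO from the course outline with the student-facing stem "I am able to". The course name, SLO wordings, and function name below are illustrative placeholders, not the actual Socrates or Class Climate workflow.

        # Minimal illustrative sketch (hypothetical data): turn a course's SLOs
        # into self-assessment survey statements by prefixing "I am able to".
        math_410_slos = [
            "demonstrate skills and techniques in solving systems of linear equations",
            "calculate determinants",
            "find the kernel and range of a linear transformation",
        ]

        def build_survey_items(slos):
            """Prefix each SLO with the student-facing stem used on the survey."""
            return ["I am able to " + slo + "." for slo in slos]

        for item in build_survey_items(math_410_slos):
            print(item)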

    * Then, we populated the template with the course SLOs from our curriculum management system, Socrates, directly from the course outline of record. [C] In this example, we filled in the template with the SLOs for Math 410, which I just showed you a minute ago. Instead of the phrase "Upon completion of the course you will be able to," which appears in the course outline, the survey simply starts each statement with "I am able to," from the student's perspective, and then completes that phrase with the specific SLOs for each particular course. The student's task is simply to rate their agreement or lack of agreement with each of these statements. [C] All the faculty and staff need to do is hand the surveys out to their students and collect them roughly 5 minutes later. Fast and simple.

    * This is actual pilot data, from one of our colleagues. As you can see, we found that this survey produced easily interpretable results. [C] Scores above the dotted red line are positive ratings; below are negative. [C] Here, we can easily see that these SLOs were rated positively. All's well here. But, [C] we can also see that one SLO clearly stuck out from the others. This kind of result clearly suggested that some kind of action was needed. When the professor saw these results, they matched up with his expectations. The last SLO, "I am able to find the kernel and range...": right off the bat, he said, "I don't teach that. It's at the end and we never get to it."

    * To recap, we piloted the student self-assessment survey during the Spring of 2007 and the results were very promising. First, it was fast and simple (the survey was easy to administer, analyze, and interpret). Second, what students said they learned matched what the teacher expected them to have learned. These results provided important validation for our survey approach. We believed then [pause] that this survey method would eventually allow us to assess all courses in a streamlined fashion with minimal workload, yet yield valuable and actionable data, data that would ultimately help our students. Since then, we've made some great strides in that direction. [C] In 2007-2008, we set out to assess face-to-face, full-term courses for the 1st cohort of departments going through the initial round of our college's formal SLO assessment process. [Pause] Being the inaugural rollout, we managed a 43% return rate, which, mind you, is still boatloads of data, but far below what we would have liked. [C] In 2008-2009, we expanded our process to include short-term classes and online offerings for our 2nd Cohort. With continued awareness of and participation in the SLO assessment process, we improved our return rate to 63%. [C] We are currently in progress for our 3rd Cohort, and we are hopeful that we will witness further growth in the return rate.

    * To handle all these surveys, we adopted Class Climate, a Scantron product. It's a powerful survey tool that we adopted for its robust scanning abilities, and for its ability to administer mass amounts of surveys, store survey data, and report results in an easy-to-read format, even for those with little statistical training. Now, our approach is centralized, in that the research office takes the lead on these tasks.

    That being said, our approach can be easily modified to a more decentralized, or localized, approach if that is something more akin to your campus culture.

    * This is an actual sample of a paper version of the student self-assessment survey as produced by Class Climate. At the top, it includes information about the department, course, and section number. It tells the students that these surveys are anonymous. And as with our pilot, each item starts with "I am able to...": SLO 1, "I am able to..."; SLO 2, and so on. Students rate themselves from strongly agree to strongly disagree. As you can see, they also have the option to check "don't understand," but I can tell you that this option is rarely used.

    * So, what do the results look like? Well, here is an actual sample of results from the student self-assessment survey. I've removed the course identifier, but I've kept the key parts to show you. This page of the results is called a profile line, and it's meant to give you an easy-to-see profile of how students responded. The course SLOs are listed along the left-hand side. The scale goes from Strongly Agree to Strongly Disagree. Each of the red dots is the average response for all students who rated themselves on that SLO. As you can see, the ratings are fairly even and positively rated in this case. Now, you might be thinking, "Well, aren't students ALWAYS going to tell you that they know everything?" And to be honest, we had those very same concerns. However, we found out from our pilot and our first cohort that no, they aren't; they aren't going to tell you that they know everything when something is strange, or it's an SLO they know they don't know.

    [C] At this point, I'd like to mention that there is research which supports the reliability and validity of self-assessments and student ratings. However, you may still have concerns over relying on student self-report data. So, I thought I'd list the criticisms noted in the literature.

    * Important for our purposes, there's a laundry list of known biases in self, course, and instructor ratings. Moreover, many of these biases are beyond the professor's control. For example: [C] self-presentation concerns and overconfidence; some students may not want to admit that they don't know something out of fear of feeling incompetent or stupid, while others believe that they are above average on everything, in a Lake Wobegon fashion. Students' ratings of instruction and courses are influenced by a slew of class characteristics, such as [C] class size (the bigger the class, the worse the rating); [C] class format (low-interaction classes such as lectures receive lower ratings, high-interaction classes such as seminars receive higher ratings); [C] class purpose (the reason for taking the class matters: electives, presumably taken out of self-interest, get higher ratings, while something a student feels forced to take, such as a GE requirement, gets lower ratings); [C] class level (intro = lower rating); [C] subject matter (John has it harder than I do: behavioral sciences like psychology generally receive higher ratings, while sciences receive lower ratings). Research done on Rate My Professor [Ask: how many of you are familiar with that website?] has shown that "Hotness" and "Ease" predicted ratings of overall quality and official course and instructor evaluations. In spite of all this, can student self-assessments of student learning be valid, informative, and useful? We say absolutely; let me show you how.

    * The key is in how you look at the data, or what you look for. Let's look at these two examples again; these are real data. I've removed the SLOs, but to remind you, each of these red dots represents the average score per SLO for a given course. The key is to look at relative ratings, the rating of SLOs relative to each other, and not to focus on absolute ratings, or where the ratings fall on the scale. John and I argued quite adamantly that this be the way results are looked at. All the biases I just showed you: the good thing for us is that they should influence the overall response systematically, in a uniform fashion. Nothing about those biases would predict a single SLO deviating from the others, except that there is something about that SLO, something that all the students, the ones that like us and the ones that don't, are telling us: "Hey, there is something different about this one."
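    As a rough illustration of reading relative rather than absolute ratings, the sketch below flags any SLO whose mean rating sits well away from the means of the other SLOs in the same course. The numbers, the 1.0-point threshold, and the function name are hypothetical; they are not drawn from the actual survey data or from Class Climate.

        # Illustrative sketch (hypothetical numbers): flag an SLO whose mean rating
        # deviates from the mean of the other SLOs, regardless of absolute level.
        from statistics import mean

        slo_means = {"SLO 1": 4.4, "SLO 2": 4.3, "SLO 3": 4.5, "SLO 4": 4.2,
                     "SLO 5": 4.4, "SLO 6": 4.3, "SLO 7": 2.9}

        def flag_relative_outliers(means, threshold=1.0):
            """Return SLOs rated more than `threshold` points away from the
            average of the remaining SLOs in the same course."""
            flagged = []
            for slo, value in means.items():
                others = [v for k, v in means.items() if k != slo]
                if abs(value - mean(others)) > threshold:
                    flagged.append(slo)
            return flagged

        print(flag_relative_outliers(slo_means))  # ['SLO 7'] with these example numbers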

    * Well, what do our instructors think? They typically agree and can make quick sense of the data. [C] The ratings correlate with instructional focus, i.e. time spent (Speech). [C] The curriculum is older, or the field has changed and perhaps that equipment is no longer used (Design Tech). [C] The SLO is at the end and they never get to it (several departments). [C] It's the hardest part of the course (Psychology). [C] It's written using terminology that we don't use anymore or that we don't mention in the class (English).

    The great thing is that all of these reactions lead to action.

    * Thus, the survey provided assessment results quickly, with minimal demand on workload. [C] It was broad and systematic, a college-wide process. [C] And it serves as an instrument that is sensitive to outliers, producing actionable data. All good things... [C] Yeah, but... [C] Students are telling us, not showing us, not demonstrating (it's an indirect method). [C] The survey lacks flexibility; it's a one-size-fits-all kind of assessment. [C] In that sense, then, it fails to utilize faculty expertise, something that we felt was important and a part of our campus culture. [C] And methodologically, it lacks depth: it doesn't serve as an in-depth, close inspection of how students are doing.

    So, to address these issues, we knew we needed something more.

    * We took our initial inspiration from maybe an unexpected place: the airport.

    The survey is kind of like the X-ray machine or screening device at the airport. The good thing about the X-ray machine at the airport is that it's fairly quick once you get there, and everyone has to go through it. Not a handful, but everyone. The survey for SLOs is intended to be like the X-ray machine in this regard. Now, I drove here from Sacramento. However, if you flew here, and you were lucky enough, like I've been on other occasions, you also know that security conducts a more thorough check on a subset of travelers, giving the occasional pat-down, to make sure that everything is cool. Now, we're not advocates for the TSA, but we decided that the idea of a quick, broad survey of many, combined with a closer, more direct, faculty-designed look at a few, made for a balanced approach, and a very good approach for our college. To that end... (flip)

    * We created the Two-Part Approach to course SLO Assessment at American River College. This is a graphic that we have utilized on our own campus as a training resource. [C] Part I is the student self-assessment survey. [Walk through the graphic from top to bottom.] [C] Part II, on the other hand, is our closer look: the faculty designed assessment. [Walk through from top to bottom.]

    * To assist departments in conducting their faculty designed assessments, we created a faculty designed assessment plan template. Really, it serves three functions. 1) It serves as a guide or checklist of things you need to have considered in order to carry out an assessment, such as... [Walk through.] Mention some examples of assessment tools: Psychology, a common test or assignment, some utilizing a pre-post design (Engineering); English, a graded random sample of papers using a shared rubric; Art, graded portfolios using a shared rubric; Music, judged student performances. These plans are then submitted to the SLO Assessment Committee, which brings me to the 2nd and 3rd functions of this plan. 2) This form gets submitted to the SLO Assessment Committee so that the committee can oversee that the process is being followed, rather than evaluate the merits of a department's particular plan. 3) In doing so, the assessment plan also provides documentation of detailed assessment processes, which I'm sure will come in handy for our impending accreditation visit.

    * So, there you have Year 1 of our SLO Assessment cycle: Part I, the student self-assessment, and Part II, the faculty designed assessment, which really operate as simultaneous, independent processes occurring in parallel with each other. Now, designing and collecting all that assessment data would be pointless if it didn't lead to action, actions that might improve student learning. This leads us to Year 2 and Year 3: the development and implementation of the department's Action Plan.

    * The department's Action Plan is where they document the actions they will be taking as a result of SLO Assessment. This includes Part I, responses to the student self-assessment, and Part II, responses to the faculty designed assessment. Part I is simple: list the course, list the action. Part II is more narrative in format, meant to mirror the faculty designed assessment plan: what course, were there any changes, what were the results, and what changes do you plan to make?

    As with the assessment plan, this Action Plan is submitted to the SLO assessment committee which checks that the process has been followed.

    *Along with the action plan comes a set of detailed instructions. Of note in the instructions is a list of potential actions that a department might take in an attempt to improve student learning. We highlight that this list is not meant to be prescriptive or exhaustive, just to be used as a starting point for discussion.

    Now, I want to emphasize that the Action Plan and Instructions were not created overnight, or simply from the minds of a few. Instead, they were molded by many, many groups, resulting in a series of transformative revisions. In fact, going from the 1st version, which looked very different from this, to here took us [C] 8 whole months of back and forth.

    * So, where are we now? Of the 65 academic departments... [read slide] [C] [C] Have our assessment efforts resulted in actions? You bet. [C] 63 separate and specific actions have been taken or planned as a result of SLO Assessment.

    [Do not read below] Actions planned and already completed:
    41 revisions to SLOs
    11 revisions to course topics
    5 develop a strategy to disseminate or highlight the importance of the course outline
    1 develop/adopt standard criteria for assignments consistent with SLOs
    1 update or add instructional equipment
    1 training for adjuncts
    3 develop a departmental strategy to aid retention of study material