TRANSCRIPT
Making effective use of student feedback on innovative practice to improve educational outcomes
SIT Tertiary Learning and Teaching Conference – Te Ao Hou
3 October 2014
Peter Coolbear
Plan of presentation
• Planning interventions to improve teaching and learning and demonstrating they work
• Types of evidence
• Some examples
• Fitting it into the larger picture of student evaluation and self-assessment
• Clarity of purpose; obligations to learners; it’s a cyclical process
Planning innovation
Identify problem / need / opportunity
Identify possible intervention
Implement
Planning innovation
Identify problem / need / opportunity
Identify possible intervention
Implement
Does it feel good?
Planning innovation
Identify problem / need / opportunity
Identify possible intervention
Identify what success looks like
Implement
Does it feel good?
Measure
Planning innovation
Identify problem / need / opportunity
Identify possible intervention
Identify what success looks like
Implement
Does it feel good?
Dialogue with students
Measure
Planning innovation
Identify problem / need / opportunity
Identify possible intervention
Identify what success looks like
Implement → Measure → Dialogue with students
Risk – processes not inclusive of all students
Collecting evidence to inform practice improvement
Anne Alkema
Heathrose Research
Choosing the right measures
Measures must be fit for purpose:
• What do you really want to know?
• How does your model of success inform that decision?
• How precisely do you want to know?
• How representative do your data need to be?
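The last bullet lends itself to a quick calculation. As a rough illustration (not from the presentation; the function and the class sizes below are hypothetical), a standard margin-of-error formula with a finite-population correction gives a feel for how representative a feedback sample actually is:

```python
import math

def margin_of_error(responses: int, class_size: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion estimated from a
    feedback survey, with a finite-population correction for small cohorts."""
    se = math.sqrt(p * (1 - p) / responses)          # worst case at p = 0.5
    # Finite-population correction: 18 answers from a class of 30 tell you
    # much more than 18 answers sampled from an unbounded population.
    fpc = math.sqrt((class_size - responses) / (class_size - 1))
    return z * se * fpc

# Hypothetical example: 18 of 30 students respond.
print(f"+/- {margin_of_error(18, 30):.1%}")  # roughly +/- 15 percentage points
```

Even a healthy response rate in a small class leaves a wide interval, which is one reason the planning cycle above pairs measurement with direct dialogue rather than relying on surveys alone.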
What is the nature of the evidence on which self-assessment is based?
Types of evidence
• Formal / Informal
• Benchmarking
• Data / Information
• Process / Output / Outcome
Timeliness
• Historical trends / Retrospective / Real time
Validity
Analytical capability
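One practical use of a taxonomy like this is to audit the evidence an organisation already collects against it and spot the gaps. A minimal sketch, with entirely hypothetical sources and tags (the taxonomy fields mirror the list above; nothing else is from the presentation):

```python
from dataclasses import dataclass

@dataclass
class EvidenceSource:
    name: str
    formality: str   # "formal" / "informal" / "benchmarking"
    kind: str        # "data" / "information"
    stage: str       # "process" / "output" / "outcome"
    timeliness: str  # "historical" / "retrospective" / "real-time"

sources = [
    EvidenceSource("End-of-course survey", "formal", "data", "output", "retrospective"),
    EvidenceSource("Completion statistics", "formal", "data", "outcome", "historical"),
    EvidenceSource("Post-it note feedback", "informal", "information", "process", "real-time"),
]

# Spot the gaps: which timeliness categories have no evidence source at all?
covered = {s.timeliness for s in sources}
missing = {"historical", "retrospective", "real-time"} - covered
print("timeliness gaps:", missing or "none")
```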
Examples
The good and not so good
What do we want to know, and do we really want to know it?
How sophisticated do we want to get in our analysis?
Informing action
Enhancing success for Māori Learners in the Health Sciences
Critical incident methodologies
Dr Elana Taipapaki Curtis
University of Auckland Equity Award 2012
Active engagement with students about their learning.
From post-it notes to flipped classrooms to problem-based learning to knowledge maps
Dedicated education units for nursing – MIT and CMDHB
• Employer is keen to do more
• Students complain when they can’t participate
Michael Mintrom (University of Auckland) – Creating team spirit and a culture of excellence among course participants (Politics 767: Managing Research Projects): triangulated evidence that this course supported a 1.0 shift in GPA (a calculation sketch follows these examples)
Caro McCaw (Otago Polytechnic) – The Path of Nexus: Māori student success in a design school context; the students tell the story of their enhanced learning environment
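A "1.0 shift in GPA" is a strong claim, which is why the triangulation matters. As a purely illustrative sketch of the quantitative strand alone, with made-up grades and a paired t-test standing in for whatever analysis the study actually used:

```python
from statistics import mean
from scipy import stats

# Entirely hypothetical GPAs for the same students before and after the course.
gpa_before = [4.5, 5.0, 3.8, 6.1, 4.9, 5.5, 4.2, 5.8]
gpa_after  = [5.6, 6.2, 4.7, 7.0, 5.9, 6.4, 5.3, 6.9]

shift = mean(a - b for a, b in zip(gpa_after, gpa_before))
t, p = stats.ttest_rel(gpa_after, gpa_before)

print(f"mean GPA shift: {shift:+.2f} (paired t-test p = {p:.3f})")
# A shift like this is one strand of evidence; the point of triangulation
# is that it is corroborated by other sources, not treated as proof alone.
```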
Examples
And one not so good …
Intervention: developing an assessment on-line.
Staff were really excited and put in a lot of work.
Staff claimed to see improved quality of work, but there was no evidence of improvement in overall grades.
Evidence that some students found the on-line intervention intimidating.
It’s not all straightforward
What happens if the data are ambiguous? … and they often are.
How do we document for internal and external purposes?
How do we disentangle what we want to know from other evidence collection within the organisation?
Moral obligations
The problem of causality
And finally …
We’ve got a great intervention – how do we sustain it?
The disjointed nature of evaluation: pockets of earnest endeavour
• Formal programme evaluation
• Engagement surveys
• Institutional satisfaction surveys
• Formal teacher evaluation processes
• Programme review
• Real-time discussion with students
How do you see the student evaluation processes in your organisation in the context of your job as a teaching professional? Why?
• This is mainly compliance stuff
• This really helps me with my teaching / practice
• This also works to help my students learn how to learn
• This is a fundamental part of organisational development
Sarah Stein, Dorothy Spiller, Stuart Terry, Trudy Harris, Lynley Deaker and Jo Kennedy.
University of Otago, University of Waikato, Otago Polytechnic
“Closing the loop”
Case Study – Paul
Paul has been teaching in the Sciences for 30 years. He thinks teaching is important and aims to explain core concepts to his students in a clear and accessible manner. He always tries to find fresh ways of making the material engaging for his students. He is interested in student feedback gathered through formal appraisal, but feels that often students cannot make objective judgements because their understanding of the subject is inevitably partial. He does not refer to student appraisals to inform his teaching unless he gets an unusually low score. In that instance, he may go back to the comments to try to find out what has happened. At the same time, he is not keen on ‘knee-jerk reactions’ in response to student appraisals as there is a curriculum that must be covered. Paul does not talk with colleagues or students about his appraisals.
Stein et al., 2012
Case Study – Mere
Mere is an educator on a degree programme. She sees her role as prompting students to think about, and engage with, social justice issues. She talks about transformative learning and her hope is that the learning experiences she provides will be transformative for her students. She is an avid collector of student feedback and is committed to closing the feedback loop. She believes that students need to be listened to and shown that their views matter. She does have some concerns about the quality and usefulness of the questions on the standard formal evaluation questionnaires and tries to collect feedback throughout the course and discuss it with students. She feels that the institution is too focussed on the quality dimensions of the evaluation and that it does not promote and support the professional development benefits of the instrument strongly enough.
Stein et al., 2012
The purpose of collecting evidence
Informing reflective practice (personal evaluative self-assessment)
Team evaluative self-assessment
Organisation evaluative self-assessment
Assisting student learning
Assisting students in learning about being effective learners
Support for promotion
Support for organisational decision-making
Public relations and marketing
Responsibilities to students
The data and information we collect are largely about students or about things that matter to students.
• What information do students get back?
• What information about actions do students get back?
• Is the information used to prompt any actions by students in support of their own learning?
And finally, the problem of causality
Intervention put in place → Effect observed
… but an effect observed after an intervention is not necessarily an effect caused by it.
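One common response to the causality problem is to compare the cohort that received the intervention with a similar cohort that did not, so that background changes are netted out. A minimal difference-in-differences sketch with hypothetical pass rates (both the numbers and the method are illustrative, not from the presentation):

```python
from statistics import mean

# Hypothetical pass rates (%) across three courses, before and after the
# intervention was introduced to the first cohort only.
intervention = {"before": [62, 65, 60], "after": [74, 76, 73]}
comparison   = {"before": [63, 61, 64], "after": [68, 67, 69]}

# Raw before/after change in the intervention cohort ...
raw_change = mean(intervention["after"]) - mean(intervention["before"])

# ... minus the change the comparison cohort saw anyway (new intake,
# easier assessments, institution-wide initiatives, and so on).
background = mean(comparison["after"]) - mean(comparison["before"])

print(f"raw change:   {raw_change:+.1f} points")
print(f"background:   {background:+.1f} points")
print(f"attributable: {raw_change - background:+.1f} points")
```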
In the end it’s about closing the loop