
AES, Darwin 10th September 2014

TRAVERSING THE INTERPLAY OF POLITICS AND EVALUATION: EDUCATION REFORM IN AUSTRALIA

Assoc Prof Janet Clinton, Dr Amy Gullickson, Ruth Aston, Edmund Misson, Pauline Ho

Introduction

Paper 1: Collaborative Evaluation

Paper 2: Evaluation Methodology

Paper 3: Dissemination of Evaluation Findings

Paper 1: Collaborative Evaluation

Janet Clinton & Edmund Misson, University of Melbourne & AITSL

Evaluation context - 1

Overview of the Standards

Map progression: Graduate → Proficient → Highly Accomplished → Lead

Evaluation context - 2

Dual purpose:
• Improvement: performance and development; professional learning
• Career progression: Accreditation – Graduate; Registration – Proficient; Certification – Highly Accomplished and Lead

Support Materials

Why this evaluation?
Evaluation embedded in policy implementation to inform and guide implementation.

How?
• Collaborative
• Formative
• Multi-method
• Multi-year
• Utilising existing infrastructures

Considerations
• Stakeholder communication and engagement
• Sharing information and findings
• Timing
• Alignment of requirements/needs
• Changing political context
• Evaluation is designed to add value

Paper 2: Evaluation Methodology

Amy Gullickson & Ruth Aston, University of Melbourne

Evaluation methodology

WE ARE HERE

Evaluation methodology

Mixed-methods informing policy development = Evidence-Based Policy Implementation

Evaluation phases

Phase 1
• Build evaluation foundation
• Establish stakeholder groups and team
• Lit review & program logic

Phase 2
• National Forum
• Stakeholder interviews
• Collect existing documents

Phase 3
• National Survey #1
• National online depository

Phase 4
• Case studies
• Data collection round #2

Phase 5
• Stakeholder interviews
• Final data collection round
• National Survey #2

Phase 6
• Triangulation of findings
• Revisit program logic
• Draw overall conclusion

Stage 1 – Develop and refine design
Stage 2 – Collect evidence
Stage 3 – Finalise and make recommendations

Levels of influence

[Diagram: the reform initiatives – Accreditation (Graduate), Registration (Proficient), Certification (Highly Accomplished and Lead), Performance and Development, Professional Learning, Support Materials and Resources – mapped against levels of influence: students; teachers and school leaders; schools and communities; systems/sectors & authorities; states/territories; national policy.]

National Forum
• Interviewees = 82
• Workshop participants = 174

Participants: Government teacher employers; Catholic teacher employers; Independent teacher employers; Regulatory Authorities; Union; Principal Association; National Bodies; Deans of Education; AITSL Board Members; AITSL Board Alumni.

National Survey #2 (June 2015)

National Survey #1

Respondents            Total
Teacher                4,141
School leader          1,427
Teacher educator         214
Pre-service teacher      219
Combined               6,001

Case Studies

Paper 3: Dissemination of Preliminary Findings

Edmund Misson & Pauline Ho, AITSL

Driving a dynamic communications strategy to evidence change and impact for the Evaluation
Edmund Misson, Pauline Ho

In collaboration with Sam Hussein, AITSL Communications, Online & Social Media

Introduction

The Evaluation is a complex and dynamic process of reform implementation; it involves diverse stakeholders across varying contexts, levels, interests and needs.

Engaging the Profession
• Practising teachers
• School leaders
• Pre-service teachers
• Teacher educators
• Members of key organisations and agencies
• System leaders

AITSL’s Communications & Social Media Strategy
• Part of the whole-of-organisation Universal Analytics Framework (UAF), which tracks the promulgation of, and engagement with, AITSL’s tools and resources.
• Multiple levels of analytics:
  – CEO & Board
  – Teams’ analytics to track engagement with tools and resources
  – ‘Deep dives’ to explore engagement for specific requests
  – ‘Personas’ to understand demographics, behaviours and horizon scanning of AITSL’s audience
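The idea of multiple reporting levels drawing on one underlying data stream can be illustrated with a minimal Python sketch. This is not the UAF itself; the event fields, tool names and visitor IDs are hypothetical.

```python
# A minimal sketch (not the UAF itself) of how one stream of page-view
# events could feed different reporting levels: team-level engagement per
# tool vs. a single board-level headline figure.
from collections import Counter

events = [
    {"tool": "Evaluation webpage", "visitor": "v1"},
    {"tool": "Evaluation webpage", "visitor": "v2"},
    {"tool": "National Survey", "visitor": "v2"},
    {"tool": "National Survey", "visitor": "v3"},
]

# Team-level view: engagement with each tool or resource.
per_tool = Counter(event["tool"] for event in events)

# Board-level view: one headline figure of unique visitors.
unique_visitors = len({event["visitor"] for event in events})

print("Page views per tool:", dict(per_tool))
print("Unique visitors (board summary):", unique_visitors)
```

The point is only that the team-level and board-level views are different aggregations of the same events, which is what a single whole-of-organisation framework makes possible.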

Aim of the Evaluation’s Communications & Social Media Strategy
• Methodological rigour: gain awareness of, and participation in, the key data collection activities of the Evaluation, e.g. the National Forum, National Surveys and Case Studies.
• Engage the profession: encourage professional conversations on the Evaluation’s findings, analysed and reported through a variety of data visualisations.
• Promulgate findings: share findings through various communication channels across stakeholders to add value to the policy implementation.

Ultimately, the key goal is to increase the effectiveness of the implementation of the Australian Professional Standards for Teachers in schools and organisations across jurisdictions and sectors.

Digital Marketing & Measurement Model (Avinash Kaushik)
• Acquisition – What are we doing to gain stakeholders’ attention to the Evaluation?
• Behaviour – What are they doing, and how are they accessing the Evaluation’s activities?
• Outcomes – What are the impact and outcomes of their participation?
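As an illustration of how the Acquisition / Behaviour / Outcomes stages could be tracked against the Evaluation’s own metrics, here is a minimal Python sketch. It is not AITSL’s actual tooling; the metric names are illustrative, and the values are figures quoted elsewhere in these slides.

```python
# A minimal sketch of Kaushik's Acquisition / Behaviour / Outcomes model
# applied to the Evaluation's engagement metrics. Not AITSL's tooling;
# metric names are illustrative, values are figures quoted in the slides.
from dataclasses import dataclass, field


@dataclass
class Stage:
    question: str          # the guiding question for this stage
    metrics: dict = field(default_factory=dict)


model = {
    "Acquisition": Stage(
        "What are we doing to gain stakeholders' attention to the Evaluation?",
        {"new_visitors": 6233, "returning_visitors": 2140},
    ),
    "Behaviour": Stage(
        "What are they doing, and how are they accessing the Evaluation's activities?",
        {"avg_visit_duration": "07:23", "survey_page_first_interaction": "62.3%"},
    ),
    "Outcomes": Stage(
        "What are the impact and outcomes of their participation?",
        {"national_survey_1_respondents": 6001, "case_study_submissions": 140},
    ),
}

for name, stage in model.items():
    print(f"{name}: {stage.question}")
    for metric, value in stage.metrics.items():
        print(f"  {metric} = {value}")
```

Printing the model gives a one-page summary that mirrors the three guiding questions above, one stage per block.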

Objective: Engage stakeholders of the education profession to participate in the Evaluation and to drive implementation of the Standards.

Measuring Impact
[Diagram: Engage – Agents – Participate – Reimaging Data – Measure (web analytics, social media analytics, campaigns, print) – Knowledge translation.]

Acquisition – Using a diversity of acquisition channels
• Social media: Facebook, Twitter, LinkedIn
• Print collaterals: reports, journal publications, newsletters, infographics, fact sheets
• Website & online marketing: AITSL E-News, Evaluation webpage, interactive infographics

Knowledge translation

8,143 unique visitors; 6,233 new visitors and 2,140 returning visitors.

Utility of devices: desktop 6,425, tablet 1,133, mobile 585.

Social networks: e-newsletter, Twitter, Facebook, Ning, Google+, WordPress, Blogger, Pocket, SlideShare.

% of audience accessing the survey: 70%

[Chart: referral volumes by social network.]

                   1st interaction   2nd interaction
Evaluation page         50.5%             18.7%
Survey page             62.3%             51.8%

Behaviour – Engaging with our content
• Avg. visit duration: 07:23
• Monthly analytics report tracking unique visitors

[Chart: monthly unique visitors, peaking around the National Survey (Oct–Nov 2013) and Case Studies 2014 (Mar–May 2014).]

Behaviour – Engaging with our content

[Timeline, Aug 2013 – Aug 2014: National Survey 1 (research; 6,001 respondents across jurisdictions); SR 1 launch and dissemination of findings; Case Studies launch and close (140 submissions across jurisdictions); Stakeholder 1 Fact Sheet – infographics; dissemination of findings; targeted marketing and targeted campaigns; ongoing promulgation; research; SR 2 launch (August 2014).]

Outcomes – Impact of the campaigns
• Short-term: create awareness of the Survey and Case Studies
• Mid-term: participate in ongoing data collection activities; inform findings and shape the Evaluation
• Long-term: analyse and report results and findings; investigate the impact of the implementation of the Standards on improving teacher quality

Next Steps

• Not purely an online/social media strategy: it involves other forms of stakeholder engagement, including workshops, symposiums, forums and meetings
• Rethinking how we deepen our understanding of the Evaluation through other ways of working across jurisdictions, sectors, schools and organisations

Paper 4: Bringing it together

Janet Clinton & Edmund Misson, University of Melbourne & AITSL

Final Comments

Adding value

Defining boundaries of collaboration

Incorporating knowledge translation

Transparency

Demonstrating worth

Understanding Collaborative Evaluation

• 2013 – defining the approach and how it was going to work
• Difficult – the literature often promotes collaborative evaluation but doesn't really tell us how
• It is all about: RELATIONSHIPS, UNDERSTANDING, TRUST, RIGOUR, ROLE DEFINITION

Collaborative Evaluation

How can we be truly collaborative and maintain objectivity?
• Relationship: how it evolves; practically how it works; what it means for methods
• Understanding the influences
• Rigorous methods that are open and transparent

Understanding the Influences for each Party
Collaboration is shaped by historical, economic, social, psychological, political, cultural and contextual influences.

Have the conversations and act!
• Relationship tensions
• Strong personalities
• Different management styles
• Variable flexibilities
• Extended partners

Conversations:
• Open & frank
• Lose the emotion
• Process a solution

Collaboration about Methods
• Forums
• Case Studies
• Survey
• Conversations

Model of Objectivity

OBJECTIVITY
• Recognised standard evaluation framework
• Rigorous methodology
• Mixed methods
• Evidence of analysis at every level
• Triangulation clarity
• Sophisticated social science analysis
• High level of dissemination
• Evidence-based content
• Every step transparent & reproducible

DIAMOND STANDARD APPROACH
[Diagram contrasting stakeholder behaviours from high to low: re-analyse, suggest, advise, reflect, promote, use versus analyse, demand, contact, micro-manage, change content, little dissemination.]

High-stakes evaluation: if it doesn’t work…
But…
HANDLE WITH CARE

Questions?

Contacts
Assoc Prof Janet Clinton – jclinton@unimelb.edu.au
Ruth Aston – ruth.aston@unimelb.edu.au
Pauline Ho – Pauline.Ho@aitsl.edu.au
