xDELIA evaluation framework
TRANSCRIPT
Game and Sensor Solutions for Financial Decision Training and Support
Work package 6
Kick-Off Meeting, CIMNE Barcelona, 9-11 March 2009
[The Institute of Educational Technology, The Open University]
Objectives
- To reach agreement on the project’s overall evaluation framework, which embeds formative evaluation at each stage of the project
- To ensure that all participants start from a common basis and make use of the latest developments in the different RTD areas
Objectives and Aims
The main objective is the development of a sound evaluation framework within which RTD activities can be effectively organised and executed.
The framework provides the vehicle to ensure that comprehensive, ongoing evaluation is built into all facets of the project and that evaluation findings feed back into the project’s ongoing development activities.
- Project stakeholder workshops (m4, m33)
- Evaluation framework (m9)
- Research questions and instruments (throughout)
- Evaluation studies (m12, m24, m36)
Milestones
- M6.1 Initial stakeholder workshop has taken place (m4)
- M6.2 xDELIA evaluation framework initial draft established (m4)
- M6.3 First iteration of pilot studies undertaken (m24)
- M6.4 Second iteration of pilot studies undertaken (m33)
- M6.5 Final stakeholder workshop (m33)
- Need to work closely with all partners
- Importance of getting a shared understanding and terminology for the project
- Evaluation needs to feed iteratively into the R&D activities, and vice versa
- The state-of-the-art report in m6 is a collation of reports from WPs 2-5
Methods of working
- Tools to be used: what do partners currently use?
- Means of communication: what do partners currently use? Project website, blogs, Twitter, etc.?
- Role of the OU evaluation research fellow: coordinating, contributing?
Approach and theoretical perspectives
Approach: participatory, iterative, useful
Draws on the following:
- Utilization-focused evaluation
- Participatory design
- Learning design
Evaluation - definition
‘Evaluation is the systematic acquisition and assessment of information to provide useful feedback about some object’ (http://www.socialresearchmethods.net/kb/intreval.htm)
The generic goal is to provide useful, empirically driven feedback to stakeholders that can influence decision-making or policy formulation
There are many different types of evaluation; our approach is ‘participant oriented’ and formative (i.e. it feeds into the project’s developments on an ongoing basis)
Utilization-focused evaluation
Evaluations should be judged by their utility and actual use
Evaluations are designed with use in mind, i.e. with attention to how real people in the real world will apply evaluation findings and experience the evaluation process
Intended users are more likely to use evaluations if they understand and feel ownership of the evaluation process and findings; they are more likely to understand and feel ownership if they've been actively involved
Participatory design
- Active engagement of the end users in the design process to ensure it meets their needs
- Take account of the context of use and the users
- Move away from the computerisation of human skills towards giving users better tools to work with
- Users’ perceptions of and feelings about technology are taken into account
- Views technology in context, as processes rather than products
Learning design
A new research field that has emerged in recent years, partly in response to a desire to see better use of technologies to support learning
Concerned with the development of tools, models and schema for supporting and making explicit design decisions in the creation of learning interventions/activities
Design and evaluation framework
[Diagram: a cycle in two halves. Design: formulation of intervention, design of intervention, implementation of intervention, reflection on results. Evaluation: formulation of questions, data collection, data analysis, utilization of results.]
Adapted from www.socialresearchmethods.net/kb/pecycle.php
Design and evaluation framework
[Diagram: the same cycle with ‘activity’ in place of ‘intervention’. Design: formulation of activity, design of activity, implementation of activity, reflection on results. Evaluation: formulation of questions, data collection, data analysis, utilization of results.]
Example
Design
- Formulation: agree an outline for a game for use in WP2 to address specific research questions
- Design: creation of the game
- Implementation: use in a specific context and collection of data
- Reflection: on findings
Evaluation
- Formulation: agree which evaluation questions are to be considered
- Data collection: agree appropriate methods
- Data analysis: analyse findings
- Utilization: feedback and application
Examples
- PI stakeholder design workshop. Participants: children, teachers, local centre representatives. Structured day – outline of the problem, presentation of the inquiry model, brainstorming design ideas in groups, reflections
- Representing pedagogy: use of the CompendiumLD tool to articulate a design
Examples continued…
- Brainstorming and refining research questions: use of mind mapping to develop a shared, collective set of research questions
- Design interviews: stakeholders’ design perceptions. Teachers (how do they design, where do they get ideas, how do they represent/share designs, how do they evaluate effectiveness?)
Examples continued…
- ‘Cloudfests’: agile development, iterative presentation and feedback
- Design challenge workshop: design a short course in a day; participants work in teams supported by resource stalls
- Evolving understanding: working papers and a project definition wiki
Instructions
- Write down evaluation research questions, potential data collection methods, and stakeholders and their interests
- Write one item per post-it note and stick it up
- Read other people’s post-its
- Cluster together those that are related
- Draw connections between questions, methods and stakeholders
Post-its
- Write down evaluation research questions you would like to see addressed in the project
- Write down suggestions for how the questions might be answered – what data collection methods could be used? Browse the LTDI evaluation cookbook for ideas: www.icbl.hw.ac.uk/ltdi/cookbook/contents.html
- Who are the stakeholders for the evaluation and what are they interested in?
- What methods have you used before?
Division of labour
CENTRAL TEAM
- Capturing initial vision statements
- Development of the D & E framework
- Ongoing evaluation of xDELIA processes
- Support and advice
- Synthesis of local evaluations
LOCAL TEAMS
- Input into the development of the D & E framework
- Design of local interventions
- Data collection and analysis of findings
- Contribute to evaluation of xDELIA processes
- Production of trial reports
- Agree focus for the stakeholder workshop: to agree a cross-project design and evaluation framework
- Agree intended outcomes: shared understanding and clear definitions; a design and evaluation framework of research questions, methods and indicative timescales
- Arrange date for the stakeholder workshop
Useful references
Oliver, Harvey, Conole and Jones (2007), ‘Evaluation’, in G. Conole and M. Oliver (Eds), Contemporary Perspectives in E-learning Research, London: RoutledgeFalmer
LTDI evaluation cookbook, http://www.icbl.hw.ac.uk/ltdi/cookbook/contents.html
Research methods knowledge base, http://www.socialresearchmethods.net/kb/evaluation.php
Patton, M. (1997), Utilization-focused evaluation, 3rd Edition, Sage Publications
Schuler, D. and Namioka, A. (1993), Participatory design, Lawrence Erlbaum Associates
Lockyer, L., Bennett, S., Agostinho, S. and Harper, B. (Eds), Handbook of Research on Learning Design and Learning Objects: Issues, Applications and Technologies, Hershey, PA: IGI Global