Learning Analytics – Research challenges arising from a current review of LA use
TRANSCRIPT
The European Commission’s science and knowledge service
Joint Research Centre
Learning Analytics – Research challenges arising from a current review of LA use
Aulanko, Finland, April 5, 2017
Dr. Riina Vuorikari, DG JRC – Directorate Innovation and Growth
Unit B4 Human Capital and Employment
The Joint Research Centre (JRC)
Directorate Growth & Innovation, Seville
• Focus on the priorities of the European Commission: working for more than 20 policy DGs
• Policy neutral: has no policy agenda of its own
• Independent: no private, commercial or national interests
Current JRC research on Digital Age Learning and 21st Century Skills
(Projects spanning anticipatory studies, policy & society, organisations and individuals)
• DigComp (DG EMPL)
• EntreComp (DG EMPL)
• DigCompConsumers (DG JUST)
• OpenEdu Policies (HE) (DG EAC)
• MOOCKnowledge (DG EAC)
• Blockchain (DG JRC)
• OPTEV (DG JRC)
• MOOCs4inclusion (DG EAC)
• Learning Analytics (DG JRC)
• DigCompEdu (DG EAC)
• DigPolEdu (DG EAC)
• CPD models (DG EAC)
• ICT in PISA (DG EAC)
• CompuThink (DG JRC)
• DigCompOrg4Schools (DG EAC)
• OpenEdu (HE) (DG EAC)
• DigCompOrg (DG EAC)
Riina Vuorikari
• A research fellow at the JRC in Seville since June 2013
• 2000–2013 at European Schoolnet as Senior Research Analyst and Project Manager
• Background: OKL in Savonlinna (MEd); studies abroad (exchange and postgraduate studies), e.g. hypermedia (DEA), the web, and the use of ICT in education
• Doctorate in 2009 from the Dutch research school for Information and Knowledge Systems
• https://www.slideshare.net/vuorikari
The European Commission’s science and knowledge service
Joint Research Centre

Outline:
• Part 1: Introduction to the Report: aims, inventory of LA (10 min)
• Part 2: Some results of the study (10 min)
• Part 3: Research challenges (10 min)
• Part 4: Short discussion (10 min)
Learning analytics involve the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. (source)
Learning analytics have their roots in many fields of educational and technical research, including assessment, personal learning and social learning, but also in business intelligence and data mining.
The field draws on theory and methodologies from disciplines such as statistics, artificial intelligence and computer science (Dawson et al., 2014).
The Study behind the JRC Report
• Goal: Provide research evidence on the use of learning analytics and discuss their implications for education policy
• Study conducted between September 2015 and June 2016
• Design of the study: the JRC in Seville, Unit of “Human Capital and Employment”
• Research: The Open University, UK, under contract to and supervision of the JRC
To access the Inventory
• Google “leap inventory learning analytics”
• http://cloudworks.ac.uk/cloudscape/view/2959

To access the Report
• Google “learning analytics JRC science hub”
What does the Study contain?
• An inventory of recent implementations of learning analytics:
  • Tools, practices and policies (60 examples)
  • 5 case studies
  • Review of research literature on implementation
• To critically reflect on the impact, potential and limits of using learning analytics in education
• To consider the implication for education policy: “The Action List for Learning Analytics”
The Inventory: Tools
• 26 examples with international focus
• Descriptions available online

A template used to describe the tools (a sketch of one entry follows below):
• Inventory type: design and planning tool; learner support tool; analytics for assessment; general analytics tool; recommendations; “smart system”; learning environment tool
• Role of analytics – the different uses of analytics: summary and description; visualisation; statistical inference; modelling; alerting; prediction; adaptation
• Data sources – where the data originate: uses own data; other (VLE; MIS; social media; statistical services)
• Keywords, context info, maturity and evidence, further info…
• …

Not an exhaustive list!
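To make the template fields concrete, here is a minimal sketch, assuming a plain key-value record is an acceptable way to hold one inventory entry; the tool name, field values and the quick filter at the end are hypothetical illustrations, not taken from the Report.

```python
# A minimal sketch: one inventory entry as a Python dict whose keys mirror the
# template fields above. All values are hypothetical, for illustration only.
tool_entry = {
    "name": "ExampleTutor",                        # hypothetical tool name
    "inventory_type": "Learner support tool",      # e.g. design/planning, assessment, ...
    "role_of_analytics": ["visualisation", "prediction", "alerting"],
    "data_sources": {"uses_own_data": True,
                     "other": ["VLE", "MIS", "social media"]},
    "keywords": ["higher education", "retention"],
    "context": "distance learning",
    "maturity_and_evidence": "pilot; no formal validation yet",
    "further_info": "http://cloudworks.ac.uk/cloudscape/view/2959",
}

# Quick filter: list entries whose analytics role includes prediction.
inventory = [tool_entry]
predictive = [t["name"] for t in inventory if "prediction" in t["role_of_analytics"]]
print(predictive)
```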
Example 1: Inventory of Tools
• Specific models of domain knowledge (in maths) and of learner responses (cognitive models)
• Stand-alone application that generates its own data.
http://cloudworks.ac.uk/cloud/view/9633
Example 2: Inventory of Tools
• Data sources include VLE, social media, “card swipes” (e.g. using the student card to go to the library), libraries, housing
Example 3: Inventory of Tools
The Inventory: Tools (1)
• Tools in the Inventory target: compulsory education (13); HE (8); workplace (2); any (6)
• “Stand-alone” tools; custom-made solutions; add-ons to an existing VLE
• Different data sources (a sketch of combining them follows below):
  • Student digital traces from the platform or outside of it, e.g. interaction data, social media, libraries
  • Data from offline sources, e.g. evaluations by the learner, demographic data, nation-wide test data/evaluations
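As a rough illustration of bringing these data sources together, the sketch below joins per-student digital traces with offline data into one combined view; the column names, identifiers and values are assumptions for illustration only.

```python
# A minimal sketch of joining per-student digital traces with offline data;
# the columns and values are hypothetical, not taken from the Report.
import pandas as pd

traces = pd.DataFrame({
    "student_id": [101, 102, 103],
    "vle_logins": [34, 5, 21],           # interactions on the platform
    "library_visits": [3, 0, 7],         # traces from outside the platform
})

offline = pd.DataFrame({
    "student_id": [101, 102, 103],
    "self_evaluation": [4.2, 2.8, 3.5],  # evaluation by the learner
    "national_test_score": [78, 51, 66], # nation-wide test data
})

# One combined record per student, as a basis for the analytics roles above.
combined = traces.merge(offline, on="student_id")
print(combined)
```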
The Inventory: Tools (2)
• Different target beneficiaries of analytics: learners, teachers, tutors, advisors, counsellors, school heads/managers, policy-makers, …
• Different contexts: e.g. face-to-face learning in class, distance learning, blended learning out of school
• Actions on data: scaffold, support, recommend, predict, …
• Actions based on: past behaviour, similarity in grades, domain knowledge, right answers, statistics, … (a minimal prediction sketch follows below)
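As a hedged sketch of the “prediction” and “alerting” actions based on past behaviour and statistics, the example below trains a small classifier and flags learners whose estimated risk of non-completion exceeds a threshold; the features, training data and threshold are invented for illustration and do not come from any tool in the Inventory.

```python
# A minimal sketch of prediction and alerting: flag learners whose past
# behaviour resembles that of earlier non-completers. All data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per learner: [logins_per_week, assignments_submitted, avg_grade]
X_train = np.array([[5, 4, 72], [1, 1, 40], [3, 2, 55],
                    [7, 5, 80], [2, 1, 45], [6, 4, 68]])
y_train = np.array([0, 1, 1, 0, 1, 0])   # 1 = did not complete the course

model = LogisticRegression().fit(X_train, y_train)

def alert_if_at_risk(features, threshold=0.5):
    """Return True if the predicted risk of non-completion exceeds the threshold."""
    risk = model.predict_proba(np.array([features]))[0, 1]
    return risk > threshold

print(alert_if_at_risk([2, 1, 48]))  # likely flags this learner for support
```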
The Inventory: Practices (18)
Cases where learning analytics are being deployed or developed at scale:
• Examples of institutions’ practices at scale (8); pilots (6); candidates for mainstreaming (2)
• Networks and organisations concerned with development (LACE, SoLAR, SNOLA, Jisc, Kennisnet)
• Reports on practice and related issues

A template used to describe the practices:
• Learning – educational sector to which the practice applies
• Geographical – where the practice is applied
• Pedagogic – theory of teaching and learning that underpins the practice
• Tools used – any relevant tools
• Design and implementation – how the practice developed and is applied
The Inventory: Policy-related documents
• 14 examples with international focus
• Descriptions available online

A template used to describe the policy documents:
• Document source – where the policy originated
• Geographical – region where the policy applies
• Relationships – areas covered by the policy
5 Case studies:
1. Developing school sector awareness, knowledge and skills around learning analytics in the Netherlands (Kennisnet)
2. The process of developing an institutional ethics policy (The Open University, UK)
3. Learning analytics in the context of a data-intensive strategy (University of Technology, Sydney)
4. Open-source software and architecture as an option (The Apereo Foundation Learning Analytics Initiative)
5. Commercial providers of learning analytics critically moving the whole field forward (Blue Canary)
The European Commission’s science and knowledge service
Joint Research Centre

Outline:
• Part 1: Introduction to the Report: aims, inventory of LA (10 min)
• Part 2: Some results of the study (10 min)
• Part 3: Research challenges (10 min)
• Part 4: Short discussion (10 min)
What do we learn from the study (1)?
Example 4: Inventory of Tools
What do we learn from the study (1)?
• Evidence of impact: the research evidence documented in this study shows that there is little formal validation of tools
  • e.g. whether the tools fulfil their intended purpose, such as having a positive impact on learning, encouraging more efficient learning, or more effective learning, …
LACE Evidence Hub: 37 examples
LACE Evidence Hub: 32 examples
What do we learn from the study (2)?
Example 5: Inventory
Examples from The Open University, UK:
• Tools: Open Essayist and OU Analyse
• Practices: Ethical use of student data policy
• Case study on “The process of developing an institutional ethics policy” (part of the Report)
What do we learn from the study (2)?
• Impact: the research evidence documented in this study shows that, currently, most of the impact of learning analytics in education and training seems to take place around related issues, with little impact on changing practices yet:
  • E.g. sensitive issues of personal data and privacy are at the centre of discussion
Europe’s General Data Protection Regulation (GDPR)
• Europe has taken the position that individual privacy is important and that changes to current practices in general analytics are needed
• Institutions will need to understand their responsibilities and obligations with regard to data privacy and data protection and will have to put procedures in place to ensure that they are compliant with the legislation.
http://ec.europa.eu/justice/data-protection/reform/
A word of warning!
What do we learn from the study (3)?
Example 6: Inventory
Examples from Australia:
• Policy documents: interesting examples include a report by the Australian Government Office for Learning and Teaching on “improving the quality and productivity of the higher education sector”
• Practices: Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement
• Tools: Loop, an open-source analytics tool that connects with Moodle or Blackboard (funded by the same government body)
• Case study: “Learning analytics in the context of a data-intensive strategy” (University of Technology, Sydney)
What do we learn from the study (3)?
• Impact: the implementation of learning analytics seems to be a long-term process requiring a vision and a strategy, policy and structure, but also knowledge and skills in technology and pedagogy
• E.g.:
  • Case study at UTS (AU): vision of becoming a data-intensive university in 2011; strategy and a new centre in 2014; tools are being developed and piloted now
  • Kennisnet working with schools and vendors since 2014 (products that have useful features!), now also focusing on standardisation of student data, etc.
What do we learn from the study (4)?
Example 7: Inventory of Practices
Narrowing the attainment gap: Georgia State University
At the university, predictive analytics have been used to tackle the achievement gap for low-income and first-generation students. The GSU graduation rate rose from 32% in 2003 to 54% in 2014.
In the process, the university claims to have removed the achievement gap between students from minority backgrounds or lower socioeconomic status and their peers.
Inclusive education, equality!
Strategic objectives for European cooperation in education and training (ET2020)
1. Relevant and high-quality skills and competences for employability, innovation, active citizenship
2. Inclusive education, equality, non-discrimination, civic competences (indicator: reducing school drop-out rates to less than 10%)
3. Open and innovative education and training, including by fully embracing the digital era
4. Strong support for educators
5. Transparency and recognition of skills and qualifications
6. Sustainable investment, performance and efficiency of education and training systems
Is the “school drop-out rate” the low-hanging fruit of Learning Analytics?
What about the other priorities and visions for the purpose of LA?
What do we learn from the study (4)?
• The majority of current learning analytics work is not strongly aligned with the European Union’s priority areas for education and training
• E.g. the Strategic objectives for European cooperation in education and training (ET2020)
The Action List for Learning Analytics
1. Policy leadership and governance practices
2. Institutional leadership and governance practices
3. Collaboration and networking
4. Teaching and learning practices
5. Quality assessment and assurance practices
6. Capacity building
7. Infrastructure
The European Commission’s science and knowledge service
Joint Research Centre

Outline:
• Part 1: Introduction to the Report: aims, inventory of LA (10 min)
• Part 2: Some results of the study (10 min)
• Part 3: Research challenges (10 min)
• Part 4: Short discussion (10 min)
Research challenges arising from the Report
• Challenge 1: Create a common vision for LA in European education and training
  • Use the ET2020 priority areas to make policy hooks
• Challenge 2: Build LA tools that help teachers and learners
  • There is now too much focus on the supply side!
  • Help generate demand: talk to teachers and learners to understand what they want
• Challenge 3: Conduct research that ends up in the LACE Evidence Hub
  • Validation of tools and their promises should lead the research
Ready to debate the challenges? Would you sign up? Are they feasible? Are they desirable? What do you think: should policy goals drive the research?
The European Commission’s science and knowledge service
Joint Research Centre

Check the research of our team at the JRC Science Hub:
https://ec.europa.eu/jrc/
New skills agenda: https://ec.europa.eu/education/news/20160610-education-skills-factsheet_en
Thank you!