Analytics – presentation in DkIT


Learning Analytics - value what we measure or measure what we value 

Dr Mark Glynn

@glynnmark

Contact details

• mark.glynn@dcu.ie
• @glynnmark

• http://enhancingteaching.com

Outline

• Introduction
• Motivation and goals
• Challenges
• Examples
• Technical bits
• Discussion

– What would you like to analyse

– Collaboration

Teaching Enhancement Unit


Online and Blended Learning

Support

Awards and Grants

Credit Earning Modules

Professional Development

Workshops

Data Analytics

Data analytics is the science of extracting actionable insight from large amounts of raw data

DIT – MSc in Computing

Youtube

Tesco

Definition

Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs. A related field is educational data mining.

– Wikipedia

Discussion

What data do we already collect?

So much student data we could use

Demographics
• Age, home/term address, commuting distance, socio-economic status, family composition, school attended, census information, home property value, sibling activities

Academic Performance
• CAO and Leaving Cert, University exams, course preferences, performance relative to peers in school

Physical Behaviour
• Library access, sports centre, clubs and societies, eduroam access yielding co-location with others and peer groupings, lecture/lab attendance

Online Behaviour
• Mood and emotional analysis of Facebook, Twitter, Instagram activities, friends and their actual social network, access to the VLE (Moodle)

Motivation

Discussion

What challenges do you foresee in your institution?

Core Principles – Open University UK

Learning analytics is a moral practice which should align with core organisational principles

The purpose and boundaries regarding the use of learning analytics should be well defined and visible

Students should be engaged as active agents in the implementation of learning analytics

The organisation should aim to be transparent regarding data collection and provide students with the opportunity to update their own data and consent agreements at regular intervals

Modelling and interventions based on analysis of data should be free from bias and aligned with appropriate theoretical and pedagogical frameworks wherever possible

Students are not wholly defined by their visible data or our interpretation of that data

Adoption of learning analytics within the organisation requires broad acceptance of the values and benefits (organisational culture) and the development of appropriate skills

The organisation has a responsibility to all stakeholders to use and extract meaning from student data for the benefit of students where feasible

Amazon

Examples

The earlier we diagnose, the earlier we can treat 

John Carroll
Mark Glynn
Eabhnat Ni Fhloinn

@glynnmark

Maths Diagnostic test

Data Analytics on VLE Access Data
How much can we mine from a mouse click?

John Brennan
Owen Corrigan
Aly Egan
Mark Glynn
Alan F. Smeaton
Sinéad Smyth

@glynnmark

No significant difference in the entry profiles of participants vs. non-participants overall

PredictED Participant Profile

Total Moodle Activity – notice the periodicity

One example module – ideal!

LGxxx – Predictor confidence (ROC AUC)

All modules

Chart: Course content – breakdown of Moodle activity types (files, folders, pages, books, URLs, glossaries, databases, feedback, choices, lessons, SCORM, quizzes, assignments, forums, wikis, workshops) as a percentage (0–100%) of content for each module: LG116, MS136, LG101, HR101, LG127, ES125, BE101, SS103, CA103, CA168

Study by numbers

• 17 modules across the University (first year, high failure rate, use Loop, periodicity, stability of content, lecturer on board)
• Offered to students on an opt-in/opt-out basis, over-18s only
• 76% of students opted in, 377 opted out, no difference among cohorts
• 10,245 emails sent to 1,184 students who opted in, over 13 weekly email alerts

The Interventions – Lecturers’ Experience

Student Engagement - individual

Student Engagement - individual

Regression analysis – class grades

What to measure?

Content Transfer

Interaction Collaboration

Assessment

Collaboration

Modules which work well …

• Have periodicity (repeatability) in Moodle access
• Confidence of predictor increases over time
• Don't have high pass rates (< 0.95)
• Have large number of students, early-stage
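For illustration only, these criteria could be applied to a module summary table along the lines below; the DataFrame, its column names and the final-week AUC values are hypothetical placeholders (the pass rates and cohort sizes for LGxxx, BExxx and SSxx echo the example slides later in the deck, while PS122's row and the AUC column are guesses).

```python
# Illustrative sketch: filter candidate modules by the criteria listed above.
# The table contents are made up for the example, not the study's actual data.
import pandas as pd

modules = pd.DataFrame({
    "code":      ["LGxxx", "BExxx", "SSxx", "PS122"],
    "students":  [110, 300, 150, 40],
    "pass_rate": [0.78, 0.86, 0.92, 0.97],
    "final_auc": [0.75, 0.72, 0.68, 0.55],   # predictor confidence at end of semester (guessed)
})

usable = modules[
    (modules["pass_rate"] < 0.95)            # pass rate not already near 100%
    & (modules["students"] >= 100)           # large, early-stage cohort
    & (modules["final_auc"] >= 0.6)          # predictor reaches an acceptable AUC
]
print(usable["code"].tolist())               # e.g. ['LGxxx', 'BExxx', 'SSxx']
```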

LGxxx: law based subject

Students / year = ~110
Pass rate = 0.78

Student Interventions: Feedback

Relative data

Student Experience of PredictED

Students who took part were asked to complete a short survey at the start of Semester 2 - N=133 (11% response rate)

Question                                                              Group 1 (more detailed email)   Group 2
% of respondents who opted out of PredictED during the semester      4.5%                            4.5%
% who changed their Loop usage as a result of the weekly emails      43.3%                           28.9%
% who would take part again / are offered and are taking part again  72.2% (45.6% / 26.6%)           76.6% (46% / 30.6%)

33% said they changed how they used Loop. We asked them how?

• Studied more
  – "More study"
  – "Read some other articles online"
  – "Wrote more notes"
  – "I tried to apply myself much more, however yielded no results"
  – "It proved useful for getting tutorial work done"
• Used Loop more
  – "I tried harder to engage with my modules on loop"
  – "I think as it is recorded I did not hesitate to go on loop. And loop as become my first support of study."
  – "I logged on more"
  – "I read most of the extra files under each topic, I usually would just look at the lecture notes."
  – "I looked at more of the links on the course nes pages, which helped me to further my understanding of the topics"
  – "I learnt how often I need to log on to stay caught up."

Did you change Loop usage for other modules?

• Most who commented used Loop more often for other modules
  – "More often"
  – "More efficient"
  – "Used loop more for other modules when i was logging onto loop for the module linked to PredictED"
  – "Felt more motivated to increase my Loop usage in general for all subjects"

One realised that lecturers could see their Loop activity:
"I realised that since teachers knew how much i was using loop, i had to try to mantain pages long on so it looked as if i used it a lot"

Subject  Description                                       Non-Participant  Participant
BE101    Introduction to Cell Biology and Biochemistry     58.89            62.05
CA103    Computer Systems                                  70.28            71.34
CA168    Digital World                                     63.81            65.26
ES125    Social & Personal Dev with Communication Skills   67.00            66.46
HR101    Psychology in Organisations                       59.43            63.32
LG101    Introduction to Law                               53.33            54.85
LG116    Introduction to Politics                          45.68            44.85
LG127    Business Law                                      60.57            61.82
MS136    Mathematics for Economics and Business            60.78            69.35
SS103    Physiology for Health Sciences                    55.27            57.03
Overall  All modules                                       58.36            61.22

Average scores for participants are higher in 8 of the 10 modules analysed, and significantly higher in BE101 and CA103

Module Average Performance Participants vs. Non-Participants
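For illustration, a per-module comparison of this kind could be run along the following lines; the input file and column names (module, participant, score) are hypothetical placeholders, not the study's actual data handling.

```python
# Illustrative sketch: per-module comparison of participant vs. non-participant scores.
# `grades.csv` with columns module, participant (0/1) and score is a hypothetical input.
import pandas as pd
from scipy import stats

grades = pd.read_csv("grades.csv")

for module, grp in grades.groupby("module"):
    part = grp.loc[grp["participant"] == 1, "score"]
    non_part = grp.loc[grp["participant"] == 0, "score"]
    t, p = stats.ttest_ind(part, non_part, equal_var=False)   # Welch's t-test per module
    print(f"{module}: participants {part.mean():.2f} vs "
          f"non-participants {non_part.mean():.2f} (p = {p:.3f})")
```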

Measuring the Flipping effect 

Patrick Doyle
Mark Glynn
Evelyn Kelleher

@glynnmark

Assessment Challenge

Logistics

• 200+ students
• 4 assignments each
• 5 minutes per assignment
• 10 lecturers
• 2 weeks of assessment

Marking guide

Related research

Comparing students who watched versus did not watch video one
Comparing means [t-test assuming unequal variances (heteroscedastic)]

Descriptive statistics
Group          Sample size   Mean       Variance
Didn't watch   84            51.86905   691.58505
Watched        102           63.15686   576.74743

Two-tailed distribution: p-level 0.00284, t critical value (5%) 1.97402
One-tailed distribution: p-level 0.00142, t critical value (5%) 1.65387
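As a rough check, this result can be reproduced directly from the summary statistics above with a Welch (unequal-variance) t-test; a minimal sketch using SciPy, taking only the figures shown in the table:

```python
# Minimal sketch: Welch's t-test recomputed from the summary statistics above.
# Group sizes, means and variances are taken from the slide; nothing else is assumed.
from math import sqrt
from scipy import stats

t, p = stats.ttest_ind_from_stats(
    mean1=63.15686, std1=sqrt(576.74743), nobs1=102,  # Watched
    mean2=51.86905, std2=sqrt(691.58505), nobs2=84,   # Didn't watch
    equal_var=False,                                  # unequal variances (Welch)
)
print(f"t = {t:.3f}, two-tailed p = {p:.5f}")         # p comes out close to 0.00284
```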

Discussion

What would you like to measure?

Selectively Analyzing your Course data?

@glynnmark @drjaneholland

Dr Jane Holland, RCSI
Eric Clarke, RCSI
Dr Mark Glynn, DCU
Dr Evelyn Kelleher, DCU

Constructive Alignment

Learning Outcomes

Particulars

• Attendance
  – Tutorials
  – Labs
• Moodle logs
• Defined times
• Assessment results

Excel results Video tracking

Chart: What students watched "x" amount of videos – percentage of students (0–70%) by number of videos watched (zero to seven), for "Watched" and "Watched before"

All activities

One activity in particular

Multiple activities

Health warning

Questions and discussion…

Talking to one another

LMS

SRS

CMS

Timetable

Wifi

Library

Databridge

MITM

Course Database

Timetable

ePortfolio

Wifi

LMS

Library

SRS

Additional slides

Building classifiers for each week/each module

Training Data / Testing

Notes on model confidence
• Y axis is confidence in ROC AUC (not probability)
• X axis is time in weeks
• 0.5 or below is a poor result
• Most modules start at 0.5 when we don't have much information
• 0.6 is acceptable, 0.7 is really good (for this task)
• The model should increase in confidence over time
• Even if confidence overall increases, due to randomness the confidence may go up and down
• It should trend upwards to be a valid model and viable module choice

BExxx: Intro to Cell Biology

Results / year = ~300
Pass rate = 0.86

BExxx

SSxx: Health Sciences

Results / year = ~150
Pass rate = 0.92

MSxxx

LGxxx

HR101

CAxxx

Some unusable modules: modules where the ROC AUC increases slowly (e.g. stays below 0.6), e.g. PS122

Timescale for Rollout

• Still some issues on Moodle access log data transfer to be resolved

• Still have to resolve student name / email address / Moodle ID / student number

• Still to resolve timing of when we can get new registration data, updates to registrations (late registrations, change of module, change of course, etc.) …

• Should we get new, “clean” data each week ?

Why did you take part?
• The majority of students wanted to learn/monitor their performance

• Many others were curious

• Some were interested in the Research aspect

• Some were just following advice

• Others were indifferent

How easy was it to understand the information in the emails? (1 = not at all easy, 5 = extremely easy)

• Average 3.97 (SD= 1.07)

• Very few had comments to make (19/133)
  – Most who commented wanted more detail.

Weeks 3–9: Training Data / Testing split diagrams, with the training window growing by one week each time
