
Page 1: Programming Assessment and Data Collection

Programming Assessment and Data Collection

Petri Ihantola

Page 2: Programming Assessment and Data Collection

Programming Assessment and Data Collection

Petri Ihantola
● Assistant Professor at Tampere University of Technology (2014–)
● D.Sc. (Tech.) from Aalto University in 2011
● Software Engineer in Test at Google (2007–2009)
● Taught various large-class programming courses at Aalto University, formerly Helsinki University of Technology (2004–2014)

Page 3: Programming Assessment and Data Collection

Arto Vihavainen, Ville Karavirta, Juha Helminen,

Juha Sorva, Otto Seppälä, ...

Page 4: Programming Assessment and Data Collection
Page 5: Programming Assessment and Data Collection
Page 6: Programming Assessment and Data Collection
Page 7: Programming Assessment and Data Collection

image: http://www.fatandsassymama.com/wp-content/uploads/2013/08/baking.jpg

Programming is a process

Page 8: Programming Assessment and Data Collection

Programming is a process

Feedback should be given on how students do what they do, not only on whether the end product tastes good.

Page 9: Programming Assessment and Data Collection

Traditionally, feedback has focused on the end products

image: https://www.flickr.com/photos/clement127/15004844674 cc (by-nc-nd)

Page 10: Programming Assessment and Data Collection

Traditionally, feedback has focused on the end products: correctness, efficiency, style, design, ...

Ala-Mutka. A survey of automated assessment approaches for programming assignments. Computer Science Education, 15(2):83-102, 2005.

Page 11: Programming Assessment and Data Collection

May encourage ineffective trial and error processes

image: https://www.flickr.com/photos/oliveira_comp/14261335089 cc (by-nc-sa)

Page 12: Programming Assessment and Data Collection

May encourage ineffective trial and error processes, tackled by limiting the number of submissions/feedback, using time penalties, making each exercise unique, organizing contests, ...

Ihantola et al. (2010). Review of recent systems for automatic assessment of programming assignments. In Proceedings of the 10th Koli Calling International Conference on Computing Education Research, 86–93.
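As a toy illustration of two of those countermeasures, the Python sketch below caps the number of submissions and applies a time penalty to rapid resubmissions. The grade function, its thresholds, and the point values are hypothetical assumptions for this sketch, not taken from any of the surveyed systems.

    # A toy submission policy (hypothetical, not from any surveyed system):
    # reject past a submission cap, and dock points for resubmissions that
    # arrive within a cooldown window of the previous attempt.

    def grade(base_points, submission_times, max_submissions=5,
              cooldown=600, penalty=2):
        """Return the points awarded after the cap and time penalties."""
        if len(submission_times) > max_submissions:
            return 0                           # over the cap: no points
        rushed = sum(
            1
            for prev, cur in zip(submission_times, submission_times[1:])
            if cur - prev < cooldown           # resubmitted within 10 minutes
        )
        return max(base_points - penalty * rushed, 0)

    # Three submissions at t=0s, 120s, 300s: both resubmissions were rushed.
    print(grade(10, [0, 120, 300]))            # 6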

Page 13: Programming Assessment and Data Collection

Hey, wait a moment... isn't this already an example of providing feedback on the process?

Page 14: Programming Assessment and Data Collection

So what makes it hard to provide even better feedback (from processes)?

Page 16: Programming Assessment and Data Collection

Systems collect data, but when trying to get the big picture, we still have to make many assumptions.

image: unknown

Page 17: Programming Assessment and Data Collection

image: unknown

Page 18: Programming Assessment and Data Collection

Houston, we have a problem

image: NASA, PD

Page 19: Programming Assessment and Data Collection

Let's look at easier problems first

Ihantola & Karavirta (2011). Two-Dimensional Parson’s Puzzles: The Concept, Tools, and First Observations. In: Journal of Information Technology Education: Innovations in Practice 10, pp. 1–14.

Page 20: Programming Assessment and Data Collection

Helminen, Ihantola, Karavirta & Malmi (2012). How Do Students Solve Parsons Programming Problems? An Analysis of Interaction Traces. In Proceedings of the 8th International Computing Education Research Conference, pp. 119–126, Auckland, New Zealand.

Karavirta, Helminen & Ihantola (2012). A mobile learning application for Parsons problems with automatic feedback. In Koli Calling '12: Proceedings of the 12th Koli Calling International Conference on Computing Education Research, pp. 11–18, Koli, Finland. ACM. (best system paper award)

Looks like the student got stuck here, let's help.
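How might a tool decide that a student "got stuck"? Below is a minimal Python sketch that flags long pauses or cycling between the same arrangements in an interaction trace. The trace format, the looks_stuck helper, and both thresholds are illustrative assumptions, not the analysis from the papers above.

    # A minimal sketch of flagging a "stuck" student from a Parsons puzzle
    # interaction trace: a list of (timestamp_seconds, state) pairs, where
    # state is any hashable encoding of the current line arrangement.
    # Both thresholds are illustrative guesses.

    def looks_stuck(trace, pause_limit=120, revisit_limit=3):
        """True if the trace shows a long pause or heavy state cycling."""
        seen = {}
        for i, (t, state) in enumerate(trace):
            # A long silence since the previous move suggests being stuck.
            if i > 0 and t - trace[i - 1][0] > pause_limit:
                return True
            # Revisiting the same arrangement repeatedly suggests cycling.
            seen[state] = seen.get(state, 0) + 1
            if seen[state] >= revisit_limit:
                return True
        return False

    # Example: the student keeps toggling between two arrangements.
    trace = [(0, "ABC"), (10, "ACB"), (20, "ABC"), (30, "ACB"), (40, "ABC")]
    print(looks_stuck(trace))   # True: "ABC" is reached a third time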

Page 21: Programming Assessment and Data Collection

Back to real life and real programming environments

Page 23: Programming Assessment and Data Collection

How much information is lost when storing snapshots at different granularities (submissions, save points, keystrokes)?

Vihavainen, Luukkainen & Ihantola. 2014. Analysis of source code snapshot granularity levels. In Proceedings of the 15th Annual Conference on Information Technology Education (SIGITE '14). ACM.
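To make the granularity levels concrete, here is a minimal Python sketch, with a made-up event stream, of how many snapshots each level would retain. The event format and counts are illustrative only; the cited paper measures information loss on real course data.

    # A minimal sketch, with a made-up event stream, of how many snapshots
    # each granularity level retains. "key" events are individual edits;
    # "save" and "submit" mark the coarser snapshot points.

    events = [
        ("key", "i"), ("key", "n"), ("key", "t"), ("save", None),
        ("key", " "), ("key", "x"), ("save", None), ("submit", None),
    ]

    def snapshot_count(events, granularity):
        """Count the snapshots a given granularity level would store."""
        kept = {
            "submission": {"submit"},
            "save": {"save", "submit"},
            "keystroke": {"key", "save", "submit"},
        }[granularity]
        return sum(1 for kind, _ in events if kind in kept)

    for level in ("submission", "save", "keystroke"):
        print(level, snapshot_count(events, level))
    # submission 1, save 3, keystroke 8: each coarser level discards
    # intermediate states that cannot be interpolated back afterwards.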

Page 24: Programming Assessment and Data Collection

Novice programmers

image: https://www.flickr.com/photos/donnieray/8658314801/ cc (by)

Page 25: Programming Assessment and Data Collection

Novice programmers

● Introduction to Programming (MOOC)
● Spring 2014, University of Helsinki
● 1166 students
● 93231 submissions
● 1.3 million saves, runs and tests
● 37 million events (insert, remove, paste)
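Keystroke-level events like these can be replayed to reconstruct every intermediate source state. Here is a minimal Python sketch; the (op, offset, argument) event format is a hypothetical illustration, not the format used by the course's data collection.

    # A minimal sketch of replaying insert/remove/paste events into
    # successive source snapshots.

    def replay(events):
        """Yield the source text after applying each edit event in order."""
        text = ""
        for op, offset, arg in events:
            if op in ("insert", "paste"):      # paste is a multi-character insert
                text = text[:offset] + arg + text[offset:]
            elif op == "remove":               # arg = number of characters removed
                text = text[:offset] + text[offset + arg:]
            yield text

    events = [
        ("insert", 0, "print()"),
        ("paste", 6, '"hi"'),
        ("remove", 7, 1),
    ]
    for snapshot in replay(events):
        print(snapshot)
    # print()
    # print("hi")
    # print("i")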

Page 26: Programming Assessment and Data Collection

Some findings

● 50% of students work on assignments that they never submit, so there is no information on their progress in such (harder?) assignments

● Programmers with previous experience proceed more directly (take fewer sidesteps)

● On average, 6.3 snapshots per submission and 30 key events per snapshot

Page 27: Programming Assessment and Data Collection

So... collect the data while you can. It cannot be regenerated later, e.g., by interpolation.

Page 28: Programming Assessment and Data Collection

Any examples of what to do with more accurate data?

Page 29: Programming Assessment and Data Collection

Can we automatically detect students' perceived difficulty as they are working on programming tasks?

Petri Ihantola, Juha Sorva, and Arto Vihavainen. 2014. Automatically detectable indicators of programming assignment difficulty. In Proceedings of the 15th Annual Conference on Information Technology Education (SIGITE '14). ACM, New York, NY, USA, 33–38. (best paper award)
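As a toy illustration of the idea, the sketch below scores difficulty from a few per-assignment trace statistics. The indicators (time worked, snapshot count, deletion ratio), the caps, and the equal weighting are assumptions made for this sketch; the paper's actual indicators and analysis differ.

    # A toy difficulty score (not the paper's indicators or model) built
    # from three hypothetical per-assignment statistics.

    def perceived_difficulty(seconds_worked, snapshot_count, deleted_ratio):
        """Return a 0..1 score; higher means the task likely felt harder."""
        time_score = min(seconds_worked / 3600.0, 1.0)    # cap at one hour
        churn_score = min(snapshot_count / 100.0, 1.0)    # cap at 100 snapshots
        delete_score = min(deleted_ratio, 1.0)            # chars deleted / chars typed
        return (time_score + churn_score + delete_score) / 3

    # A long session with many snapshots and heavy deleting scores high.
    print(perceived_difficulty(seconds_worked=2700,
                               snapshot_count=80,
                               deleted_ratio=0.5))        # ~0.68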

Page 30: Programming Assessment and Data Collection

Can we understand how the way students type their code evolves over time?

Arto Vihavainen, Juha Helminen, and Petri Ihantola. 2014. How novices tackle their first lines of code in an IDE: analysis of programming session traces. In Proceedings of the 14th Koli Calling International Conference on Computing Education Research (Koli Calling '14). ACM, New York, NY, USA, 109-116.
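One simple way to quantify such typing behavior, shown in the sketch below, is the share of insertions that append at the end of the text so far; strictly top-to-bottom typing scores 1.0. This metric and its event format are illustrations, not the analysis from the paper.

    # An illustrative linearity metric: the share of insert events that
    # append at the end of the current text.

    def linearity(events):
        """events: (offset, inserted_text) pairs applied in order."""
        text, appends = "", 0
        for offset, inserted in events:
            if offset == len(text):            # typed at the end of the file
                appends += 1
            text = text[:offset] + inserted + text[offset:]
        return appends / len(events) if events else 0.0

    # Two appends, then a jump back to the top to add a comment.
    print(linearity([(0, "a = 1\n"),
                     (6, "print(a)\n"),
                     (0, "# init\n")]))        # 0.666...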

Page 31: Programming Assessment and Data Collection

What next?

Page 32: Programming Assessment and Data Collection

The three main goals of feedback are to help a learner understand and learn about

1. the learning goals
2. their own progress towards these goals
3. the activities needed to make better progress

Hattie & Timperley (2007). The Power of Feedback. Review of Educational Research, 77(1), 81-112.

Page 33: Programming Assessment and Data Collection

The time perspective in educational data mining will shift towards more fine-grained analysis.

Page 34: Programming Assessment and Data Collection

Plenty of research opportunities, from course-level analysis to modeling individual students

Page 35: Programming Assessment and Data Collection

However, we should not ignore the vast amount of previous research

e.g., Juha Helminen, Petri Ihantola, and Ville Karavirta. 2013. Recording and analyzing in-browser programming sessions. In Proceedings of the 13th Koli Calling International Conference on Computing Education Research (Koli Calling '13). 13-22.

Page 36: Programming Assessment and Data Collection

ITiCSE working group in July

Page 37: Programming Assessment and Data Collection

https://us.pycon.org/2015/events/edusummit/

Python Education Summit: voting on the topics is open.