
Richard Baraniuk, Mr. Lan, Andrew Waters, Christoph Studer

learning and content analytics

learning analytics

Goal: assess and track student learning progress by analyzing their interactions with content

data (massive, rich, personal)

close the learning feedback loop

[Diagram: feedback loop between content, data, and learning analytics]

learning analytics assume content is organized (“knowledge graph”)

http://www.newscientist.com/article/mg21528765.700-the-intelligent-textbook-that-helps-students-learn.html

“While such results are promising, perhaps it's a little soon to crown Inquire the future of textbooks. For starters, after two years of work the system is still only half-finished. The team plan to encode the rest of the 1400-page Campbell Biology by the end of 2013, but they expect a team of 18 biologists will be needed to do so. This raises concerns about whether the project could be expanded to cover other areas of science, let alone other subjects.”

content analytics

standard practice

[Figure: instructor’s grade book listing each student’s scores]


Goal: using only “grade book” data, infer:

1. the concepts underlying the questions (content analytics)

2. each student’s “knowledge” of each underlying concept (learning analytics)

from grades to concepts

[Figure: grade book matrix; rows = problems, columns = students]

data – graded student responses to unlabeled questions – a large matrix with entries:
white: correct response; black: incorrect response; grey: unobserved

standard practice – instructor’s “grade book” = sum/average over each column

goal – infer underlying concepts and student understanding without question-level metadata
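To make the baseline concrete, here is a minimal sketch of the column-average “grade book” computation; the toy numbers and the NaN encoding for unobserved responses are assumptions for illustration, not data from the talk.

import numpy as np

# rows = problems, columns = students; NaN marks an unobserved response
grade_book = np.array([
    [1.0,    0.0,    np.nan],
    [1.0,    np.nan, 0.0],
    [np.nan, 1.0,    1.0],
    [0.0,    1.0,    np.nan],
])

# standard practice: average each student's column over the observed entries
student_scores = np.nanmean(grade_book, axis=0)
print(student_scores)   # -> [0.667 0.667 0.5] (rounded)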


key observation – each question involves only a small number of “concepts” (low rank)

from grades to concepts

statistical model

[Figure: problems × students matrix of real-valued abilities; red = strong ability, blue = weak ability]

a probit or logistic “coin flip” transformation converts each ability into a 0/1 graded response (~ Ber)

estimate of each student’s ability to solve each problem (even unsolved problems)
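A minimal sketch of that coin-flip transformation; the ability value z = 0.8 and the use of both link functions side by side are illustrative assumptions, not values from the slides.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
z = 0.8                                         # hypothetical ability of one student on one problem
p_logistic = 1.0 / (1.0 + np.exp(-z))           # logistic link: ability -> probability of a correct answer
p_probit = norm.cdf(z)                          # probit link: same idea via the Gaussian CDF
graded_response = int(rng.random() < p_probit)  # the Bernoulli coin flip yields the observed 0/1 grade
print(p_logistic, p_probit, graded_response)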

SPARse Factor Analysis (SPARFA)

[Figure: the problems × students ability matrix factors into a sparse problems × concepts matrix times a concepts × students matrix, plus a per-problem difficulty term; graded responses are Bernoulli (~ Ber) draws of the transformed entries]

each problem involves a combination of a small number of key “concepts”

each student’s knowledge of each “concept”

each problem’s intrinsic “difficulty”
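Putting those three ingredients together, the model can be written compactly as below; the symbols are our own labeling of what the slide describes (question–concept loadings, concept knowledge, intrinsic difficulty, and the probit/logistic link).

% Y_{i,j} : graded response of student j on question i (1 = correct, 0 = incorrect)
% w_i     : sparse, nonnegative concept-loading vector of question i
% c_j     : concept-knowledge vector of student j
% mu_i    : intrinsic difficulty of question i
% Phi     : probit (or logistic) link, the "coin flip" transformation
\[
  Y_{i,j} \sim \mathrm{Ber}\!\left( \Phi\!\left( \mathbf{w}_i^{\top} \mathbf{c}_j + \mu_i \right) \right),
  \qquad (i,j) \ \text{observed},
\]
with each \(\mathbf{w}_i\) sparse and nonnegative, so every question touches only a few concepts.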

solving SPARFA

factor analyzing the grade book matrix is a severely ill-posed problem

significant recent progress in relaxation-based optimization for sparse/low-rank problems

– matrix-based methods (SPARFA-M)
– Bayesian methods (SPARFA-B)

similar to compressive sensing
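For intuition, here is a toy alternating-minimization sketch in the spirit of the matrix-based approach. It is an illustration under stated assumptions (logistic link, plain gradient steps, soft-thresholding for sparsity and nonnegativity), not the authors’ SPARFA-M algorithm; every function and variable name here is ours.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sparfa_sketch(Y, mask, K=5, lam=0.1, step=0.05, iters=500, seed=0):
    """Y: Q x N 0/1 grade book; mask: True where a response was observed."""
    rng = np.random.default_rng(seed)
    Q, N = Y.shape
    W = np.abs(rng.normal(scale=0.1, size=(Q, K)))   # question-concept loadings (kept sparse, nonnegative)
    C = rng.normal(scale=0.1, size=(K, N))           # per-student concept knowledge
    mu = np.zeros((Q, 1))                            # intrinsic question difficulty
    for _ in range(iters):
        Z = W @ C + mu
        R = mask * (sigmoid(Z) - Y)                  # logistic-likelihood residual on observed entries only
        W -= step * (R @ C.T)                        # gradient step in W ...
        C -= step * (W.T @ R)                        # ... and in C
        mu -= step * R.sum(axis=1, keepdims=True)
        W = np.maximum(W - step * lam, 0.0)          # soft-threshold: sparsity and nonnegativity on W
    return W, C, mu

# tiny synthetic check: hide roughly 20% of the grades, then predict them
rng = np.random.default_rng(1)
Q, N, K = 20, 30, 3
W_true = np.maximum(rng.normal(size=(Q, K)), 0) * (rng.random((Q, K)) < 0.3)
C_true = rng.normal(size=(K, N))
Y_full = (rng.random((Q, N)) < sigmoid(W_true @ C_true)).astype(float)
mask = rng.random((Q, N)) < 0.8
W, C, mu = sparfa_sketch(Y_full * mask, mask, K=K)
P_hat = sigmoid(W @ C + mu)                          # predicted success probability for every (question, student) pair
print("mean predicted probability on held-out entries:", float(P_hat[~mask].mean()))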


Grade 8 science
• 80 questions
• 145 students
• 1353 problems solved (sparsely)
• learned 5 concepts

[Figure: estimated question–concept graph, with each question shown with its estimated inherent difficulty, linked to the 5 concepts; one student’s knowledge profile over the concepts reads 87, 55, 23, 93, 62]

summary

scaling up personalized learning requires that we exploit the massive collection of relatively unorganized educational content

we can estimate content analytics on this collection at the same time as we estimate learning analytics

related work: Rasch model, IRT
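For background (standard material, not from the slides): the Rasch / one-parameter IRT model assigns a single scalar ability per student and a single difficulty per question,

\[
  \Pr(Y_{i,j} = 1) = \frac{1}{1 + e^{-(\theta_j - b_i)}},
\]

where \(\theta_j\) is student j’s ability and \(b_i\) is question i’s difficulty; the approach above instead estimates a multidimensional, concept-level knowledge vector per student together with a sparse question–concept association.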

integrating SPARFA into

Mr. Lan, Andrew Waters, Christoph Studer

