
Distributed Interactive Learning Environments

Beverly Park Woolf Department of Computer Science,

University of Massachusetts, U.S.A.

Bev@cs.umass.edu

http://ccbit.cs.umass.edu/ckc/

Short term goals

• Move away from digitizing existing teaching: classrooms, lectures and publishing

• Engage students; learn by doing, active learning

• Computer technology
– User models, machine learning, Bayesian/probabilistic reasoning
– Multimedia, wireless, hand-held computers
– Cognitive pre-tests

Long Term Goals

Motivate students

Identify gender effects

Increase visual learning

Increase interactive feedback

Identify cognitive development effects

The Multimedia Dimension

[Diagram: three strands (Artificial Intelligence, Multimedia, Distributed Systems), with the Multimedia strand expanded into 3D Animation, Interactive Multimedia, and Video/Sound.]

The Distributed Dimension

[Diagram: three strands (Artificial Intelligence, Multimedia, Distributed Systems), with the Distributed Systems strand expanded into Digital Libraries and Web-based Homework.]

The A.I. Dimension

[Diagram: three strands (Artificial Intelligence, Multimedia, Distributed Systems), with the Artificial Intelligence strand expanded into User Modeling, Machine Learning, and Natural Language.]

I will talk about 3 systems:

1. Grade School mathematics tutor

2. Undergraduate inquiry tutor

3. Undergraduate homework system

All are effective at teaching; most use artificial intelligence technology.

Machine Learning Mathematics Tutor

Carole Beal, Psychology; Joe Beck, Ivon Arroyo, Beverly Woolf, David Marshall, David Hart, Computer Science; Klaus Schultz, Education

Supported by National Science Foundation EHR, HRD

AnimalWatch tutor

• Teaching arithmetic to 4th - 6th graders

• One Goal: increase self-confidence;

• Problems cast in terms of environmental biology/endangered species;

• Deployed in three local elementary schools;

• Intelligence done via heuristics.
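The slides do not spell out the heuristics themselves. As a rough illustration of how a heuristic difficulty adjuster of this kind can work, here is a minimal Python sketch; the thresholds and the Student record are assumptions, not AnimalWatch's actual rules.

    from dataclasses import dataclass

    @dataclass
    class Student:
        recent_errors: int   # mistakes on the last problem
        recent_time: float   # seconds spent on the last problem

    def adjust_difficulty(level: int, student: Student, max_level: int = 5) -> int:
        """Heuristic adjustment: step up after quick, error-free work;
        step down after many mistakes; otherwise keep the current level."""
        if student.recent_errors == 0 and student.recent_time < 30:
            return min(level + 1, max_level)
        if student.recent_errors >= 3:
            return max(level - 1, 1)
        return level

    # A student who solved the last problem quickly with no errors moves up a level.
    print(adjust_difficulty(2, Student(recent_errors=0, recent_time=20.0)))  # -> 3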

Procedural Hint

Interactive Hint

Conclusions, AnimalWatch

• Significant improvement in learning (fewer mistakes, less time for similar problems).

• Significant improvement in self-confidence and liking of math.

• The system correctly adjusted problem difficulty.

Extensive analysis with 300 students measured topics learned, hints, and efficiency.

Gender and Cognitive Development Effects

• Gender effects
– Girls are adversely affected by text-only hints
– Boys are positively affected by text-only hints

• Cognitive development effects
– Students with low cognitive development improve with hint intensity
– Students with high cognitive development decrease in performance with hint intensity
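One way a tutor could act on findings like these is to bias hint selection by the student profile. The sketch below only illustrates that idea under assumed rules; it is not the policy the tutor actually used.

    def choose_hint_style(gender: str, cognitive_dev: str) -> dict:
        """Pick hint properties from the reported effects: text-only hints
        helped boys but hurt girls, and high hint intensity helped students
        with low cognitive development while hurting those with high
        cognitive development. Rules are illustrative only."""
        return {
            "text_only": gender == "boy",              # richer media for girls
            "high_intensity": cognitive_dev == "low",  # intense hints for low cog. dev.
        }

    print(choose_hint_style("girl", "low"))
    # -> {'text_only': False, 'high_intensity': True}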

Interesting questions

• How can the system learn about a student?

• What reasoning about the student can be generated?

• How do we improve adaptivity?

• What are major stumbling blocks of integrating machine learning into a tutor?

High-level ADVISOR architecture

[Architecture diagram: data from prior users of the tutor trains the Population Student Model (PSM); the Pedagogical Agent (PA), given a teaching goal, issues teaching actions, observes the result predicted by the PSM, and learns a teaching policy.]

Overview of PSM construction

• Gather data for each student response
– Student: proficiency, cognitive development
– Topic: type of operands/operator
– Problem: difficulty, size of operands
– Context: student's prior work on the current problem, time since last response
– Action: tutor's previous action

• The PSM associates these features with student performance

Component evaluation

• Construct the PSM and PA with gathered data
– 2 prior studies with AnimalWatch
– 11,000 training instances (student responses)
– 10% semi-random teaching actions

• Test the PSM's predictions against actual student performance (from the gathered data)
– low-risk
– allows experimentation

Evaluating the PA’s improvement

• Goal was to minimize time per problem
– perhaps not pedagogically interesting
– graph time (exponential average) vs. number of trials

• Initially 40 seconds, eventually 16 seconds

• Also a strong result
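The "exponential average" here is the usual exponentially weighted moving average of time per problem. A minimal sketch of that bookkeeping on synthetic trial times follows; the numbers only imitate the reported drift from roughly 40 seconds toward roughly 16 seconds and are not the study's data.

    import random

    def exp_average(times, alpha=0.1):
        """Exponentially weighted moving average of time per problem,
        the quantity graphed against the number of trials."""
        avg, history = times[0], [times[0]]
        for t in times[1:]:
            avg = alpha * t + (1 - alpha) * avg
            history.append(avg)
        return history

    # Synthetic trial times drifting from about 40 s toward about 16 s
    # as the Pedagogical Agent's policy improves (illustrative only).
    random.seed(0)
    times = [max(16.0, 40.0 - 0.3 * i) + random.uniform(-3, 3) for i in range(120)]
    curve = exp_average(times)
    print(round(curve[0], 1), round(curve[-1], 1))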

Student's Cognitive Development (Arroyo, 1999)

Hint symbolism

[Examples: a low-symbolism hint for addition; a highly symbolic hint for multiplication.]

Hint interactivity

Example hint: "Divide all the things that you have into 5 groups. How many things are there in each of these groups?"

[Example hints range from highly interactive to low interactive, and from low symbolic to highly symbolic.]

Data Analysis

• Cognitive development is independent of gender

• 2636 cases (pairs of contiguous problems)

• Linear regression model for predicting hint effectiveness (explains 62% of variance) from:
– Difficulty of the problem
– Impact of the hint
– Student proficiency at the moment the hint was seen
– Amount of information that the hint provided
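As a sketch of fitting such a regression, the snippet below runs ordinary least squares over four predictors standing in for the ones named above, on synthetic data; the real model was fit to the 2,636 problem pairs, so the weights and R^2 printed here are purely illustrative.

    import numpy as np

    # Stand-ins for: problem difficulty, hint impact, student proficiency
    # when the hint was seen, and amount of information in the hint.
    rng = np.random.default_rng(0)
    n = 2636
    X = rng.random((n, 4))                         # the four predictors
    true_w = np.array([-0.5, 0.8, 0.6, 0.4])       # illustrative weights
    y = X @ true_w + 0.1 * rng.standard_normal(n)  # "hint effectiveness"

    # Ordinary least squares with an intercept column
    A = np.column_stack([np.ones(n), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    r2 = 1 - np.sum((y - A @ coef) ** 2) / np.sum((y - y.mean()) ** 2)
    print(np.round(coef, 2), round(r2, 2))         # fitted weights and R^2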

I will talk about 3 systems:

1. Grade School mathematics tutor

2. Undergraduate inquiry tutor

3. Undergraduate homework system

All are effective at teaching; most use artificial intelligence technology.

The Multimedia Component

[Diagram: three strands (Artificial Intelligence, Multimedia, Distributed Systems), with the Multimedia strand expanded into 3D Animation, Interactive Multimedia, and Video/Sound.]

A GENERAL INQUIRY TOOL

Supported by National Science Foundation EHR, CCLI

U.S. Department of Education, FIPSE

Inquiry skills are difficult to teach

Students need to:

• Make good observations, ask good questions, gather evidence.

• Justify the need for additional data to support conjectures.

• Critique a hypothesis and judiciously find support for hypotheses.

• Recognize the inquiry cycle.

Students need to:

• Pose open ended questions

• Plan queries & do research

• Recognize salient data and distinguish the known from unknown

• Be mindful of what they do and monitor their progress.

• Engage in multiple cases for diagnoses and interpretation

• Identify data from examination or interview

• Identify data as “observed” or “inferred”

• Track their observations, data and hypotheses in an Inquiry Notebook.

Rashi scaffolds students in each of these activities.

Welcome to Rashi

The Case of the Retired Runner

Interview Patient

Students interview the patient through free text (e.g., "nutrition"). The tutor responds with video and a transcript, e.g., "I have trouble sleeping and am very nervous. I have palpitations and have a weakness in my legs."

Examination of Head

The Examination Tool enables students to measure weight, pulse, blood pressure, etc. In this example the student selected the head and is given choices of viewing exam results for the eyes, ears, neck, etc.

Examination of Torso

The student selected the torso and is given the choice of viewing exam results for the lungs, abdomen or intestines.

Students need to:

• Pose open ended questions

• Plan queries & do research

• Recognize salient data and distinguish the known from unknown

• Engage in multiple cases for diagnoses and interpretation

• Identify data from examination or interview

• Identify data as “observed” or “inferred”

Rashi scaffolds students in each of these activities.

Propositions about the Patient

Student observations in the exam and the interview are automatically recorded in the Inquiry Notebook.

Students indicate each entry's type (observation, inference or hypothesis).

Students edit hypotheses (e.g., "she has mono") and type the text of the deduction into the search engine, which returns a list of possible matching propositions.

Edit relationships

Students edit each fact, hypothesis or principle, add a belief value and provide "supports" or "refutes" links.
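A minimal sketch of how such notebook entries and links could be represented; the class and field names are assumptions for illustration, not Rashi's actual data model.

    from dataclasses import dataclass, field
    from typing import List, Literal

    @dataclass
    class Proposition:
        """One Inquiry Notebook entry with student-assigned links."""
        text: str
        kind: Literal["observation", "inference", "hypothesis"]
        belief: float = 0.5                        # student-assigned belief value
        supported_by: List["Proposition"] = field(default_factory=list)
        refuted_by: List["Proposition"] = field(default_factory=list)

    mono = Proposition("Patient has mono", "hypothesis", belief=0.4)
    sleep = Proposition("Patient reports trouble sleeping", "observation", belief=0.9)
    mono.supported_by.append(sleep)                # a "supports" link

    print(mono.text, "is supported by:", [p.text for p in mono.supported_by])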

Drag Hypotheses to Change Level

Patient History Contradicts Hypothesis

Rashi is a General Platform

• Rashi is extensible

• Currently Rashi explores cases in:
– geology (recognize and predict earthquakes)
– forestry (read the forest landscape)
– engineering (diagnose a bridge failure)

Reading the Forest Landscape*

The student:
• makes observations about changes in forest composition (stumps, bark, growth rings),
• develops hypotheses to explain observed changes (fire, farming, global warming),
• seeks evidence to support or reject the hypothesis,
• revises hypotheses based on evidence.

* Adapted from Tom Wessels (1997), Reading the Forested Landscape, The Countryman Press, Vermont.

The Case of Age Discontinuity

"Why are there no medium-sized trees?" The student used stickies to note:
• The basal scar supports the hypotheses of logging or fire.
• The hypothesis of a young forest is refuted by the existence of trees around 100 years old.
• The student links each note to supporting evidence.

Case of the Abandoned Lake

"When did beavers abandon this lake?" The student noticed: 1) the pond is surrounded by conifers; 2) the hemlock shows a wound; 3) the dam is leaking; 4) the stumps are blond-covered.

Pocket Inquiry System

Student observations are automatically recorded in the Personal Digital Assistant (PDA) Inquiry Notebook. The student indicates the type (observation, inference, hypothesis, etc.) of entered data.

Forest Ecology Pocket Inquiry Notebook

Rashi Evaluation

• The Case of the Retired Runner was examined at Hampshire College in Spring 2003

• Responses were highly positive toward the available features, especially the tools that allowed data gathering: the interview and examination tools.

• The knowledge base, the examination and interview tools, and the inquiry notebook were all used.

• A number of suggestions came forth, mostly improvements to the inquiry notebook, such as a more streamlined method for encouraging hypothesis generation; these suggestions are being integrated.

Rashi Innovations

• The system:
– reasons about a student's inquiry
– identifies lack of knowledge
– identifies inquiry cycle steps
– provides support for separate student tasks
– is portable to multiple domains (engineering, geology and biology)
– is used in secondary and higher education

I will talk about 3 systems:

1. Grade School mathematics tutor

2. Undergraduate inquiry tutor

3. Undergraduate homework system

All are effective at teaching; most use artificial intelligence technology.

The Distributed Component

[Diagram: three strands (Artificial Intelligence, Multimedia, Distributed Systems), with the Distributed Systems strand expanded into Digital Libraries and Web-based Homework.]

On-line Web Homework [OWL]

David M. Hart, Executive Director; Alan Peterfreund, Kenneth Rath, Evaluators

William J. Vining, Beverly P. Woolf

Supported by National Science Foundation, CCLI

U.S. Department of Education, FIPSE

What is OWL?

• Web-based learning and assessment tool
• Automatically grades assignments and stores results
• Key features: immediate feedback, parameterized questions
• 20 departments, 17,000 students/year at UMass
• 20 other colleges, 20,000 additional students
• Improves student performance
• Reduces cost in large service courses

Undergraduate Classes

• Case study at UMass – what can we learn from it?

• Sample OWL activities

• Impact on performance and cost

• Predictors for success (can we generalize?)

Can we systematically improve large undergraduate classes using online assessment technology?

Basic OWL Provides….

• Electronic homework, automatic grading
• Course management tools
• Authoring tools to create and customize
• Curriculum content inclusion
• Advanced features include:

– centralized course management

– powerful customization capabilities

– parameterized questions

– extensive multimedia and Java support

– open architecture for extension
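To make the parameterized-question and automatic-grading ideas concrete, here is a small sketch; the question template, tolerance, and grading rule are assumptions for illustration and not OWL's actual format or grader.

    import random

    def make_question(seed: int) -> dict:
        """Generate one parameterized question: each student gets different
        numbers but the same underlying problem."""
        rng = random.Random(seed)
        mass = rng.randint(10, 90)
        molar_mass = rng.choice([18.0, 44.0, 58.4])
        return {
            "prompt": f"How many moles are in {mass} g of a substance "
                      f"with molar mass {molar_mass} g/mol?",
            "answer": mass / molar_mass,
        }

    def grade(question: dict, submitted: float, tol: float = 0.01) -> bool:
        """Automatic grading with a relative tolerance, enabling immediate
        feedback and stored results."""
        return abs(submitted - question["answer"]) <= tol * abs(question["answer"])

    q = make_question(seed=42)
    print(q["prompt"])
    print("correct" if grade(q, q["answer"]) else "try again")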

General Chemistry – Questions

General Chemistry

General Chemistry, 100 Discovery

Statistics Question

Statistics Answer with Feedback

OWL Domains

Accounting

Art History^*

Astronomy*

Biochemistry

Chemistry, General^

Chemistry, Organic^

Communication^

Computer Science

Economics*

Education

Entomology

Environmental Health & Safety

Finance*

French & Italian

Mathematics & Statistics^

Nutrition^

Physics^

Psychology*

Resource Economics*

^ External Grant    * Davis Participant

OWL’s Impact

• Students
– learn the material and keep pace
– appreciate immediate feedback
– like the multimedia, simulation, and visualization
– appreciate the 24/7 access

• Instructors notice that students are:
– learning the material
– getting vital feedback
– keeping pace

• Classroom changes: less in-class drill and practice, fewer lectures; some material is assigned to be done online exclusively (especially in honors classes)

Improves Student Performance

• Physics: improved student exam performance in 7 large undergraduate sections by an average of 10%*

• Calculus: coupled with other interventions, improved retention in a large course from 48% to 72% (grade of C or better)

• Art History: improved essay exams (from 8/16 to 11/16)

• Statistics: significant increase in student exam performance, attributed by a multiple regression model to the OWL homework assignments

• A power user of WebCT converted to OWL and found student satisfaction increased significantly (3.61 to 3.91 out of 5)

* Dufresne, R., Mestre, J., Hart, D., & Rath, K. (2002).

Analyzing OWL's Impact

[Figure comparing Pre-OWL and OWL conditions.]

The Advantage of Doing Physics Homework Increases When OWL Used

Conscientious students do much better using OWL!

[Chart: scores (y-axis from 30 to 80) for Low Homework vs. High Homework groups; series: PPH and WBH.]

Students with weaker skills gain an advantage

Weaker students do better using OWL!

[Chart: scores (y-axis from 30 to 80) for Low SAT vs. High SAT groups; series: PPH and WBH.]

Students with lower exam grades still gain an advantage

Average exam scores for all classes across Exam groups

[Chart: average exam scores (y-axis from 30 to 80) for Low Exam vs. High Exam groups; series: PPH and WBH.]

Communication Problem (Conversation Analysis)

Organic Chemistry Structure-Drawing Problem

Computer Science Java Programming Problem (Submit Code)

Art History

Art History Interactive Practice

Microeconomics – Assessment

Cost Model

• Surprise! OWL generates income

• Development and maintenance: broke even in 2000

• Cost savings (Chemistry and Physics): 230K/yr

• External awards (shared): 450K/yr

• Licensing Revenue: 50-100K/yr since 2001

• Annual return on investment was approximately 3:1

Contact Information

GENERAL: http://ccbit.cs.umass.edu/ccbit/ dhart@cs.umass.edu 413-545-3278, 413-545-1309

OWL: http://owl.cs.umass.edu/ owl-dev@cs.umass.edu 413-545-2617 (Cindy Stein)
