
The Power of Pearson’s MyLab and Mastering Programs—Case Study Results

Raising the Bar

What are students saying?

MyMathLab

“ I never would have thought that I could have this much help in math while taking it online, but I really think that MyMathLab is a great resource and everyone should use it.”

—Student, University of South Florida

MasteringPhysics

“ MasteringPhysics was very clear and simple, and occasionally it would give you hints. If you're having problems solving a problem, it will take you through step-by-step. For other students, I would recommend it as great practice and preparation for the test and overall as a good learning experience.”

—Student, University of Central Florida

MasteringChemistry

“ MasteringChemistry is a great tool to sharpen skills and prepare for exams. A great thing about it is the wide range of difficulty —some easy problems, some intermediate, and some difficult. I frequently use MasteringChemistry as exam prep and I scored well on every chapter.”

—Student, Temple University

MyPsychLab

“ My favorite MyPsychLab feature is the pre- and post-labs. They help me prepare for the exams by testing my knowledge on the chapters that will be on the exam. It also gives you a study plan for what sections you should study more.”

—Student, Pacific University

MyWritingLab

“ I liked MyWritingLab because it really helped me to refresh what I thought that I had forgotten. I have more confidence to move on to the next class thanks to MyWritingLab.”

—Student, Rend Lake College


Table of Contents

Introduction

Case Studies

MyMathLab
• Lone Star College–Montgomery
• University of Alabama
• University of Wisconsin–Stout

MasteringPhysics
• Evidence of Learning and Problem-Solving Transfer: Massachusetts Institute of Technology

MasteringChemistry
• Evidence of Learning and Problem-Solving Transfer: Louisiana State University
• High Reliability of Assessment: Massachusetts Institute of Technology

MyPsychLab
• Georgia College & State University

MyWritingLab
• Shelton State Community College

MySpanishLab
• Metropolitan State College

MyEconLab
• Western Michigan University

myitlab
• Bunker Hill Community College

MyAccountingLab
• Spokane Community College


Creating High-Impact Technology for Today’s Learners

Determined to drive improvements in student achievement and student retention, Pearson Education is partnering with institutions to develop superior, discipline-specific, interactive, online resources.

Our research has shown that online resources, when created to meet the needs of students and instructors, make teaching and learning more effective. When educators require and integrate these products into their course, they and their students experience the highest success rates and return on investment.

With that goal in mind, Pearson Education has created a broad portfolio of over 80 robust and accessible online products. Branded as the MyLab and Mastering programs, these online resources not only deliver traditional textbook content; they also offer outcomes-based self-assessment, personalized study paths, customized teaching resources, and powerful results reporting.


Introduction

Benefits of Integrating Online Resources into College Courses

This report illustrates the consistently positive impact of the MyLab and Mastering programs on student success and cost reduction in higher education instruction. It examines how these programs can be implemented easily in any environment—lab-based, hybrid, distance learning, and traditional. Each case study describes the measurable gains that integrated usage has had on student retention, subsequent success, and overall achievement.1

Benefits for Students
• Increased motivation and satisfaction among a group for whom technology is familiar and appealing
• Support for different learning styles through a multi-sensory learning experience
• Skill preparation that will help students compete and succeed in a technologically sophisticated career beyond graduation
• A student-focused learning environment that empowers users to take responsibility for their own learning
• Improved learning outcomes
• Cost-effective, fingertip access to course materials and supporting resources

Benefits for Educators
• Less time spent on rudimentary material, and more time teaching deeper concepts
• Frequent, individualized student feedback
• Increased ability to manage more students and more classes, more effectively
• Quick and easy modification of course content
• Reduced (or eliminated) time spent grading assignments
• Real-time viewing and analysis of student performance

Benefits for Institutions
• Broader access to educational curricula can be provided
• Best-in-class pedagogical practices
• Increased marketability through fulfilled institutional goals and priorities
• Reduced space needs and operating costs
• Increased student retention rates
• The ability to leverage investments in infrastructure

Analysis and Results

This report presents the salient findings from eleven case studies evaluating the effectiveness of ten MyLab and Mastering products. These case studies provide compelling evidence of how Pearson’s MyLab and Mastering programs increase student learning and achievement. Based on these results, it is clear that partnering with Pearson is more than an educational advantage for instructors and students—it’s a good business decision for institutions.

1 MyMathLab report, Making The Grade, V.3: A Compendium of Data-Driven Case Studies on the Effectiveness of MyMathLab and MathXL, http://www.mymathlab.com/making-grade

MyMathLab
www.mymathlab.com

Lone Star College–Montgomery

Course Names Pre-Algebra, Introductory Algebra, Intermediate Algebra, College Algebra

Credit Hours Three

Semesters Covered Fall 2004–Spring 2007

Types of Data Reported Success Rates, Subsequent Success, Retention

Type of Implementation Lab-Based

MyMathLab Course Results

Tables 1 through 5 show more-than-statistically-significant differences in retention, pass rates, and subsequent success between students who have completed the MyMathLab outcomes assessment program and those who have not.

In spring 2006 alone, retention rates among students using the LOAL/MyMathLab program in Introductory Algebra, Intermediate Algebra, or College Algebra were all above 90 percent. In the case of College Algebra, the 98 percent retention rate reflects a 75 percent gain over the 56 percent retention rate among students who did not use the LOAL and MyMathLab. See Table 1.

MyMathLab Course Structure

Course Design
Faculty at the Learning Outcome Assessment Lab (LOAL) at Lone Star College–Montgomery use MyMathLab and a flexible schedule to help students better retain course material by testing outcomes individually throughout the course rather than only at the end of the semester.

Students participate in class lectures and weekly review sessions and follow a weekly syllabus—all under the supervision and appropriate intervention of the chair and faculty, who stay current on student progress via weekly grade updates and weekly news and procedure updates.

Assessments
The LOAL uses MyMathLab to deliver assessments based on desired course outcomes. Lab hours range from early morning to late at night Monday through Thursday, with morning and afternoon hours on Friday and Saturday. Students may take the assessments at any time the lab is open, may make as many as seven attempts at any one assessment, and are required to pass all outcomes lab assessments in order to pass the course. LOAL assessments count for 20 percent of the final grade. Students do not pass the course without at least 60 percent on each outcome and a minimum 70 percent overall average.

MyMathLab’s Gradebook enables instructors to track student participation and progress and to intervene if necessary via tutoring or other support services.

Courses typically require 8 to 10 concept-based assessments. Regular assessment empowers students to take charge of their learning. By continually evaluating their strengths and weaknesses via the immediate feedback provided by MyMathLab, students know exactly where they need further study and are less likely to fall behind early in the semester, when it is hardest to rebound.

MyMathLab Implementation
Students use MyMathLab for testing and homework, which contributes 20 to 30 percent of their final course grade. The LOAL’s emphasis on continual assessment and early intervention directly benefits from MyMathLab’s core qualities of proactivity, time efficiency, flexibility, immediate feedback, security, and student accountability. The commitment of all faculty members to training in the software and to workshops on pedagogy further ensures program success.

Some instructors import grades into MyMathLab from other sources.

Table 1. Retention Rates, Spring 2006*
*Data reflect unduplicated student enrollment.

                        Without MyMathLab    With MyMathLab    Percent Increase
Introductory Algebra          65%                  97%                49%
Intermediate Algebra          71%                  92%                30%
College Algebra               56%                  98%                75%
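The Percent Increase column is the relative gain of the With MyMathLab rate over the Without MyMathLab rate (the same arithmetic behind the "75 percent gain" cited above). A minimal Python check using the Table 1 values:

```python
# Retention rates from Table 1 (percent).
rows = {
    "Introductory Algebra": (65, 97),
    "Intermediate Algebra": (71, 92),
    "College Algebra": (56, 98),
}

for course, (without_mml, with_mml) in rows.items():
    increase = (with_mml - without_mml) / without_mml * 100
    print(f"{course}: {increase:.0f}% increase")  # 49%, 30%, 75%
```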


Textbooks in Use with MyMathLab

Basic Mathematics, 10e, 2007, Bittinger; Introductory Algebra, 10e, 2007, Bittinger; Intermediate Algebra: Concepts and Applications, 7e, 2006, Bittinger, Ellenbogen; College Algebra, 9e, 2005, Lial, Hornsby, Schneider

Awards: 2006 Bellwether Award; 2007 Star Award


Pass rates from fall 2004 to spring 2007 comparing students who had completed the MyMathLab regular outcomes assessment program in the previous course with students who had not show the unequivocal benefit of regular outcomes assessment and early intervention. College Algebra, the first college-level class students typically take after completing the developmental math sequence, saw the most significant pass rate difference: an average of 79 percent for those who had worked in the MyMathLab-powered Learning Outcome Assessment Lab in their previous course work versus an average of 64.5 percent for those who had not. Tables 2 to 4 detail subsequent success comparisons from spring 2005 to fall 2006 for Introductory Algebra, Intermediate Algebra, and College Algebra.

Course enrollments are also increasing across all levels. Table 5 shows percent enrollment increases—most marked in Calculus, where spring enrollment increased by 30 percent.

Table 2. Subsequent Success: Pass Rates in Introductory Algebra, Spring 2005–Fall 2006*
*Data reflect unduplicated student enrollment.

              Without MyMathLab   With MyMathLab    Percent
              in Pre-Algebra      in Pre-Algebra    Increase
Spring 2005        45%                 54%             20%
Fall 2005          57%                 59%              4%
Spring 2006        54%                 59%              9%
Fall 2006          54%                 62%             15%

Table 3. Subsequent Success: Pass Rates in Intermediate Algebra, Spring 2005–Fall 2006*
*Data reflect unduplicated student enrollment.

              Without MyMathLab   With MyMathLab     Percent
              in Intro Algebra    in Intro Algebra   Increase
Spring 2005        36%                 46%              28%
Fall 2005          50%                 55%              10%
Spring 2006        50%                 60%              20%
Fall 2006          51%                 65%              27%

Table 4. Subsequent Success: Pass Rates in College Algebra, Spring 2005–Fall 2006*
*Data reflect unduplicated student enrollment.

              Without MyMathLab   With MyMathLab     Percent
              in Inter Algebra    in Inter Algebra   Increase
Spring 2005        55%                 73%              33%
Fall 2005          76%                 84%              11%
Spring 2006        72%                 80%              11%
Fall 2006          55%                 79%              44%

Table 5. Percent Increase in Course Enrollments

                Spring 2005 to Spring 2006   Fall 2005 to Fall 2006
Developmental              4%                          5%
College Level             13%                         11%
Calculus                  30%                         10%

“The way MyMathLab has enabled us to integrate outcomes assessments into our courses and then standardize them across the department—it’s revolutionized our whole program.”

—Maureen Loiacano, Lone Star College–Montgomery

Conclusions

The faculty at Lone Star College–Montgomery continue to build upon the winning combination of MyMathLab and continual outcomes assessment in a mandated mathematics lab. By employing the following set of proven components, faculty find consistent, replicable success:

• Faculty and students with joint ownership of the math curriculum
• A technology-based outcomes assessment program with superior training and communication
• Understanding that outcomes achievement, not finishing lessons, defines course completion
• Giving students the tools to assess their own progress, which leads to improved success rates
• Supportive faculty who are willing to offer their time, energy, and ideas

Submitted by Maureen Loiacano, Chair, Mathematics and Education, Lone Star College–Montgomery

MyMathLab
www.mymathlab.com

University of Alabama

Course Names Beginning Algebra, Intermediate Algebra, Finite Math, Precalculus Algebra

Credit Hours Three

Semesters Covered Fall 2000–Fall 2007

Types of Data Reported Success Rates, Retention

Type of Implementation Lab-Based

MyMathLab Course Structure

Course Design
Beginning Algebra and Intermediate Algebra courses have one required meeting per week in which students report to the math lab and work on homework and quizzes. In other courses, weekly meetings consist of lecture on key topics for the week. The math lab is open 71 hours a week; students may receive individualized assistance from a staff of instructors and tutors. Students work at their own pace within the deadlines stated in the syllabus. Some students finish the course within eight weeks of the semester start, but the majority work according to deadlines.

Course format comprises the following:

• 30- to 50-minute classes that introduce students to topics and course objectives
• 3 to 4 hours in the lab or elsewhere working independently and using course software that presents a series of topics covering specific learning objectives
• Instructors and tutors available in the Mathematics Technology Learning Center 71 hours a week to provide individualized assistance

Assessments
Within each section of content there is a homework and quiz requirement that contributes to the course grade. Four major tests (not cumulative) each contribute 10 percent to the course grade. A comprehensive final exam counts as 30 percent of the final course grade. In addition, students have lab and class attendance requirements.

MyMathLab Implementation
The University of Alabama uses the majority of features offered in MyMathLab, including customization, homework, quizzes, tests, and prerequisites—contributing 93 percent of each student’s final course grade. UA imports grades into its own grade book.

In summer 2000, UA redesigned the math program using MyMathLab and the Math Emporium model developed by Virginia Polytechnic Institute and State University and the National Center for Academic Transformation’s Course Redesign program. UA’s College of Arts and Sciences assigned the Mathematics Technology Learning Center to the course; the center started out as a 70-seat computer lab and now seats 500.

MyMathLab Course Results

By spring 2006, Intermediate Algebra pass rates had risen an average of 20.2 percent from 2000 rates, with the percentage of As and Bs increasing from 36.7 percent to 58.3 percent. For those courses in which the department had not fully made the switch to the redesign, side-by-side data revealed not only that the pass rate in the MyMathLab-redesigned Business Calculus course was significantly higher than in the traditional counterpart (64.7 percent versus 51.3 percent) but also that the failure rate decreased and the withdrawal rate dropped by more than half.

Tables 1 through 5 illustrate MyMathLab’s wide range of positive impact. They show measurable outcomes in pass rates and retention data for individual classes and the overall mathematics department—by test, by semester, and as these outcomes relate to subsequent success.

In addition, faculty at UA note the following advantages of the technology-assisted redesign: flexibility in scheduling, ability to move at an individual pace, instant feedback, availability of individual help, equality of presentation, equality of testing, and elimination of language problems.

Textbooks in Use with MyMathLab

Beginning Algebra, 10e, 2008, Lial, Hornsby, McGinnis; Intermediate Algebra, 3e, 2007, Martin-Gay; Finite Mathematics, 8e, 2005, Lial, Greenwell, Ritchey; Precalculus, 3e, 2008, Beecher, Penna, Bittinger; Calculus with Applications, 9e, 2008, Lial, Greenwell, Ritchey

2009 IMS Learning Award


Table 1. Success Rates of MyMathLab Implementation by Semester, Fall 2000–Fall 2007

Course           Fall 00  Spring 01  Fall 01  Spring 02  Fall 02  Spring 03  Fall 03  Spring 04  Fall 04  Spring 05  Fall 05  Spring 06  Fall 06  Spring 07  Fall 07
Beg Algebra         -        -         -        -        54.6%    35.8%      56.5%    37.2%      60.6%    49.7%      64.2%    65.5%      73.6%    53.2%      74.0%
Inter Algebra     50.2%    35.8%     60.5%    49.8%      62.9%    38.9%      78.7%    61.8%      76.2%    59.1%      67.2%    56.2%      73.8%    59.8%      75.2%
Finite Math         -        -         -        -        67.0%    63.5%      66.5%    56.2%      70.0%    65.0%      66.0%    56.3%      70.3%    62.0%      74.8%
Precalc Algebra     -        -         -        -        60.5%    66.6%      70.3%    68.5%      71.8%    65.0%      71.6%    62.6%      66.0%    57.2%      69.2%
Trigonometry        -        -         -        -        68.2%    59.7%      55.1%    66.8%      65.1%    66.1%      65.1%    75.2%      45.1%    69.0%      66.8%
Precal Alg/Trig     -        -         -        -        78.5%    62.2%      80.0%    61.4%      79.7%    80.6%      79.7%    54.2%      80.6%    71.4%      73.2%
Business Calc       -        -         -        -          -        -        50.7%    54.9%      64.7%    74.2%      64.7%    60.6%      60.4%    69.8%      61.9%
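The Conclusions below quote per-term averages of the success rates in Table 1 (for example, Intermediate Algebra averaging roughly 70 percent in fall semesters and 60 percent in spring semesters). A minimal sketch of that kind of per-term averaging; the choice of window (fall 2003 onward) is an assumption for illustration, not the report’s stated method:

```python
# Intermediate Algebra success rates from Table 1, Fall 2003 through Fall 2007
# (an illustrative post-redesign window; the exact window is an assumption).
fall   = [78.7, 76.2, 67.2, 73.8, 75.2]   # Fall 03, 04, 05, 06, 07
spring = [61.8, 59.1, 56.2, 59.8]         # Spring 04, 05, 06, 07

print(f"fall average:   {sum(fall) / len(fall):.1f}%")      # about 74%
print(f"spring average: {sum(spring) / len(spring):.1f}%")  # about 59%
```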

Table 2. Retention Rates of MyMathLab Implementation by Semester, Fall 2002–Fall 2007

Course           Fall 02  Spring 03  Fall 03  Spring 04  Fall 04  Spring 05  Fall 05  Spring 06  Fall 06  Spring 07  Fall 07
Beg Algebra       72.8%    53.3%     75.0%    57.0%      79.7%    76.2%      83.7%    73.6%      88.1%    69.4%      85.3%
Inter Algebra     77.2%    59.3%     85.8%    72.3%      86.4%    77.7%      80.1%    73.4%      86.7%    77.0%      85.9%
Finite Math       74.3%    72.4%     80.8%    71.8%      85.5%    78.8%      80.5%    70.8%      84.2%    75.7%      85.1%
Precalc Algebra   73.7%    81.7%     78.7%    80.8%      84.5%    82.9%      83.7%    80.9%      84.3%    82.0%      85.6%
Trigonometry      79.9%    83.1%     70.3%    80.4%      82.8%    79.5%      77.0%    85.8%      68.9%    81.9%      81.5%
Precal Alg/Trig   91.8%    84.2%     93.9%    88.4%      96.3%    84.4%      91.6%    75.0%      95.5%    85.7%      87.7%
Business Calc       -        -       69.3%    67.6%      64.6%    83.8%      71.3%    76.4%      75.0%    81.7%      77.2%

Table 3. Intermediate Algebra Retention per Test, Fall 2001–Fall 2005*
*Data reflect the percentage of students enrolled in the course who took each test.

             Test 1   Test 2   Test 3   Test 4   Final
Fall 2001    92.4%    89.3%    83.8%    81.6%    78.6%
Fall 2002    92.3%    89.7%    84.7%    79.4%    77.2%
Fall 2003    92.1%    91.2%    88.6%    86.3%    85.8%
Fall 2004    94.4%    92.2%    90.0%    86.6%    86.4%
Fall 2005    93.6%    89.7%    82.7%    79.7%    80.1%

Table 4. Pass Rates for Subsequent Courses before and after MyMathLab Implementation

Semesters                      Pass Rate for Subsequent Course
Without MML
  Fall 1998–Spring 1999              57.4%
  Fall 1999–Spring 2000              54.6%
  Fall 2000–Spring 2001              58.0%
With MML
  Fall 2001–Spring 2002              74.6%
  Fall 2002–Spring 2003              81.4%

Table 5. Business Calculus Retention per Test, Fall 2005*
*Data reflect the percentage of students enrolled in the course who took each test.

               Test 1   Test 2   Test 3   Test 4   Final
Without MML    88.4%    83.0%    67.0%    64.9%    67.3%
With MML       94.6%    92.2%    85.6%    82.6%    81.4%

Conclusions

The use of MyMathLab has significantly improved student success rates. Prior to the implementation of MyMathLab, success rates averaged 40 to 45 percent. Today, success rates in Intermediate Algebra have averaged 70 percent in the fall semesters and 60 percent in the spring semesters.

As studies have become more longitudinal, UA has realized how MyMathLab works best: as part of a larger redesign that includes mandated use by students. The results consistently show a direct correlation between required attendance in the labs and higher success rates.

Longitudinal studies have also increased UA’s awareness of MyMathLab’s impact on subsequent success. By 2006, students who came out of a MyMathLab-redesigned Intermediate Algebra course passed their subsequent course, Precalculus Algebra, at an average rate of 71.3 percent compared with the overall average of 48.3 percent.

Based on these data, the University of Alabama is convinced: MyMathLab in an Emporium redesign setup can enhance student learning; can increase success rates, particularly for underserved students; and can reduce resource demands.

Plans for the future include using even more of the tools offered by MyMathLab (e.g., item analysis and pooling) to further increase student success rates.

Submitted by Jamie Glass, Mathematics Technology Learning Center Lab Coordinator, University of Alabama


MyMathLab
www.mymathlab.com

University of Wisconsin–Stout

Course Names Beginning Algebra, Intermediate Algebra

Credit Hours Two, Four

Semesters Covered Fall 2004–Spring 2008

Type of Data Reported Retention

MyMathLab Course Results

The cornerstone of UW-Stout’s math program is daily computer-graded homework that counts significantly (about 25 percent) toward the course grade and is continually monitored by the classroom instructor, who actively intervenes as soon as a student shows signs of falling behind. What distinguishes this curriculum from exclusively online courses is the blending of online homework and tests with required daily classroom sessions in a dedicated, technology-enhanced classroom/tutor lab complex. Another key factor is a new tutoring service dedicated exclusively to supporting Beginning Algebra and Intermediate Algebra students.

Since its inception in fall 2004, the Math Teaching and Learning Center (TLC) has served 2,140 students. During this time, the combined failure/withdrawal rate for the 501 students who have taken remedial Beginning Algebra under the new system has decreased by 55 percent (from 29 percent to 13 percent). See Figure 1.

Results in the Intermediate Algebra course show a less dramatic, 39 percent reduction in nonpass rates (17.8 percent for 1,639 students over seven semesters versus 29 percent pre–Math TLC). See Figure 2.

MyMathLab’s tracking features reveal that 95 percent of students are submitting all homework assignments, earning an average score of 92 percent. Students are spending an average of 95 minutes on each day’s homework assignment—a figure for which no previous comparison data exist but which most teachers of these courses anywhere would find astonishing. Attendance rates now average 94 percent for Intermediate Algebra and 85 percent for Beginning Algebra. From 150 to 200 students visit the tutor lab per week, compared with the 75 to 80 tutoring sessions per semester logged by the campuswide, free tutoring service for students in these two courses before the program began.
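The reductions quoted above are relative drops in the failure/withdrawal rate rather than percentage-point drops; a minimal check using the rates given in the preceding paragraphs:

```python
def relative_reduction(before, after):
    """Relative drop in a rate, as a percentage of the original rate."""
    return (before - after) / before * 100

print(f"{relative_reduction(29, 13):.0f}%")    # Beginning Algebra: 55%
print(f"{relative_reduction(29, 17.8):.0f}%")  # Intermediate Algebra: 39%
```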

MyMathLab Course Structure

Course Design
Beginning Algebra meets twice a week for a total of two hours; Intermediate Algebra meets four times a week for a total of four hours. MyMathLab homework is scheduled and due each class day. It may be done at any location, but all quizzes and tests are taken in the classroom or lab with a proctor. Students may work ahead.

Assessments
Beginning Algebra
• 5 quizzes and 2 tests in MyMathLab, plus one quiz outside MyMathLab, each with an accompanying practice quiz/test available in MyMathLab
• 21 MyMathLab homework assignments

Intermediate Algebra
• 7 quizzes and 4 tests in MyMathLab, each with an accompanying practice quiz or test available in MyMathLab
• 42 MyMathLab homework assignments
• 5 homework assignments outside of MyMathLab

MyMathLab Implementation
Use of MyMathLab contributes 90 percent to each student’s final course grade and includes homework, proctored tests, and quizzes; prerequisites for some homework and tests; the Individual Settings feature; and the Coordinator course feature. Grades are not imported from other sources.

Textbooks in Use with MyMathLab

Beginning Algebra, 5e, 2009, Martin-Gay; Intermediate Algebra, 4e, 2005, Martin-Gay; Algebra and Trigonometry, 3e, 2007, Blitzer

2009 Outstanding Teacher Award


In addition to passing these two courses at unprecedented rates, students are registering greater engagement and satisfaction with the learning experience—despite the greater demands placed on them. On an anonymous and voluntary survey distributed to Math TLC students at the end of each semester, 91 percent of respondents indicated that they learned as much as or more than they expected to learn coming into the course; 84 percent said they’d be likely to take a course using this structure again. And despite the prominence of online homework and learning tools, students still rated “my teacher” as the top factor influencing their learning out of seven choices (online homework, online help, lectures, tutors, my teacher, open lab, textbook)—strong evidence that the personal interaction this program provides in the classroom and lab is an essential feature distinguishing this approach from strictly online and even most hybrid course structures.

Written responses to student survey questions are typified by this verbatim quote: “This class completely changed my views on math. Before this class I hated math and never wanted to do it. I hated math even in grade school! After this course I love math and am considering a math minor. I’m even thinking of being a tutor in the Math TLC next year. I would never have imagined me teaching and helping others with math.”


Figure 1. Year-by-year data of the percent of students receiving failure or withdrawal grades in Beginning Algebra

Figure 2. Year-by-year data of the percent of students receiving failure or withdrawal grades in Intermediate Algebra

Conclusions

When asked about her view of the future of mathematics instruction vis-à-vis her experience with MyMathLab, Jeanne Foley, director of UW-Stout’s Math Teaching and Learning Center, replied with the following.

“Technology will become the universal course delivery medium. There will still be a market for so-called brick-and-mortar universities and on-site rather than online classes—despite the accelerating growth of online universities and online course work at traditional institutions. It is especially the students who struggle with math and who are taking remedial or introductory math classes who most need the hands-on support of a live teacher in a real classroom. There is a real synergy between the benefits offered by MyMathLab and the face-to-face interactions between students and their classroom teacher and tutors.

“Students’ learning habits have changed tremendously in the past 10 years. Although it does seem that students’ attention spans for lectures and their ability to focus for extended times on traditional assignments have diminished, their willingness to spend one or two or sometimes even three hours a day on interactive homework like the MyMathLab exercises has increased. Students in UW-Stout Intermediate Algebra classes are spending an average of 95 minutes a day on their MyMathLab homework—far more than they spent when these courses were taught traditionally.”

Submitted by Jeanne Foley, Director, Math Teaching and Learning Center, University of Wisconsin–Stout

MasteringPhysics
www.masteringphysics.com

Evidence of Learning and Problem-Solving Transfer

Successful learning must lead to problem-solving transfer—that is, the ability to apply what is being learned during one instance of problem solving to another instance in which a similar set of knowledge and methodologies is required. Studies conducted using the Mastering programs show evidence of learning from Mastering’s tutorial items and from the ability gained by the students to transfer that learning where required.

Study 1. MasteringPhysics
Introductory Newtonian Mechanics, Fall 2003
Massachusetts Institute of Technology

Study design: After the first six weeks of the semester, the approximately 430 students in Introductory Newtonian Mechanics were divided, based on homework scores, into two equally skilled groups. The groups were given related tutorial problem pairs (they both entailed the same concepts and methods), which they solved in opposite order relative to each other without any intervening problems. For example, if problem A was presented to one group followed by problem B, then problem B was presented to the other group followed by problem A. Thus, one group was unprepared for problem A while the other group was prepared for it by having solved problem B, and vice versa. The two groups were named prepared and unprepared relative to a tutorial problem pair under consideration. Six tutorial problem pairs were assigned for credit in the concept domains of linear momentum, energy conservation, angular motion, gravitation, torque, and rotational dynamics. (For more details, see R. Warnakulasooriya, D. J. Palazzo, and D. E. Pritchard, Evidence of problem-solving transfer in Web-based Socratic tutor, Proceedings of the 2005 Physics Education Research Conference, pp. 41–43; and R. Warnakulasooriya, D. J. Palazzo, and D. E. Pritchard, Time to completion of Web-based physics problems with tutoring, Journal of the Experimental Analysis of Behavior, 2007, 88, pp. 103–113.)


Figure 1. The rate-of-completion graphs for a tutorial problem for the prepared and the unprepared groups, plotted against logarithmic time, where the time to completion is measured in seconds. For the prepared group, the peak rate of completion is shifted toward shorter times compared with the unprepared group.


Results: Three results were noted.

1. The prepared group solved a given problem on average 15 ± 3 percent more quickly¹ than the unprepared group. This effect was observed across the six concept areas and hence on all 12 problems, providing robust evidence of learning from a prior problem leading to problem-solving transfer.


Figure 2. The prepared group solves a given tutorial problem in 15 ± 3% less time on average compared with the unprepared group.


Figure 3. The prepared group solves a given tutorial problem with 11 ± 3 percent fewer errors compared with the unprepared group.

2. The prepared group requested 15 ± 6 percent fewer hints on a given problem compared with the unprepared group.

3. The prepared group made 11 ± 3 percent fewer errors on a given problem compared with the unprepared group.

Summary
Students engaging with the MasteringPhysics tutorials demonstrated learning and near-term problem-solving transfer as measured by the time to completion of problems, the number of errors made, and the number of hints requested on follow-up problems. The learning effect was a repeatable finding seen across the concept areas considered in the study (linear momentum, energy conservation, angular motion, gravitation, torque, and rotational dynamics).

¹ “The quickness” was determined by finding the time at which the highest rate of completion for the respective groups was observed and calculating the difference. Time to completion is defined as the time interval between the first opening of a problem and the submission of the completed problem, in the sense that all the main parts of a given problem were answered correctly, without any log-ins/log-offs.
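The footnote’s peak-based measure of quickness can be reproduced from raw completion times by histogramming them on a logarithmic time axis and locating each group’s peak (as in Figure 1). The sketch below is illustrative only; the sample times are synthetic, not the study’s data.

```python
import numpy as np

# Synthetic completion times in seconds (illustrative only, not study data).
rng = np.random.default_rng(0)
prepared = rng.lognormal(mean=np.log(18 * 60), sigma=0.8, size=200)
unprepared = rng.lognormal(mean=np.log(27 * 60), sigma=0.8, size=200)

def peak_completion_minutes(times_sec, bins=30):
    """Bin completion times on a log scale and return the bin center (minutes)
    where the rate of completion is highest."""
    counts, edges = np.histogram(np.log10(times_sec), bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    return 10 ** centers[np.argmax(counts)] / 60

print(f"prepared peak:   {peak_completion_minutes(prepared):.1f} min")
print(f"unprepared peak: {peak_completion_minutes(unprepared):.1f} min")
```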


MasteringChemistry
www.masteringchemistry.com

Study 2. MasteringChemistry
Introductory Chemistry, Fall 2007
Louisiana State University

Study design: The students were assigned weekly homework in MasteringChemistry. Twelve regular homework assignments (not counting the introductory MasteringChemistry assignment) were given to the class, which consisted of about 260 students. The regular homework assignments had about 15 problems on average per assignment, and the end-of-chapter (EOC) problems were always assigned after the tutorial problems within an assignment. A two-parameter item response model was fitted to the data, scored dichotomously based on whether or not a student obtained the correct answer to a given part of a problem on the first attempt without requesting any help from MasteringChemistry, hence obtaining the difficulty and discrimination parameters of each problem.

Results: Problem difficulty correlates with position in the assignment at -0.32 ± 0.09 on average for the 10 homework assignments in which a linear association between problem difficulty and problem order can be identified. Thus, problem difficulty decreases over a given assignment; in other words, problems given later in an assignment are easier than those given earlier.
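For reference, the two-parameter item response model mentioned in the study design treats each item as a logistic function of student ability, with a difficulty and a discrimination parameter; the correlation between difficulty and order can then be computed directly from the fitted difficulties. The sketch below is illustrative only: the function is the standard two-parameter logistic form, and the difficulty values are invented, not the study’s estimates.

```python
import numpy as np

def p_correct(theta, a, b):
    """Two-parameter logistic (2PL) model: probability that a student of
    ability theta answers an item of difficulty b and discrimination a
    correctly on the first attempt."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# A student one standard deviation below average (theta = -1) has a 50%
# chance on an item of difficulty -1, as in the figure captions.
print(p_correct(theta=-1.0, a=1.2, b=-1.0))  # 0.5

# Invented difficulty estimates for one assignment, listed in problem order.
difficulty = np.array([0.6, 0.4, 0.1, -0.1, -0.3, -0.2, -0.5, -0.8])
position = np.arange(1, difficulty.size + 1)

r = np.corrcoef(position, difficulty)[0, 1]     # difficulty vs. order
slope = np.polyfit(position, difficulty, 1)[0]  # SD change per problem
print(f"correlation = {r:.2f}, slope = {slope:.2f} SD per problem")
```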


Figure 4. The difficulty of the problems decreases along the order in the assignment: Chapter 1 of Brown/LeMay/Bursten, Introduction: Matter and Measurement. The problem difficulty is reported on a standard-deviation scale. A single-part problem with difficulty -1 means that a student who is one standard deviation below average in skill has a 50% chance of successfully answering the problem on the first attempt.


It is highly plausible that the decrease in problem difficulty is due to an overall effect of learning within a given assignment. The instructor followed the best-practice recommendations given in MasteringChemistry and selected a roughly equal number of tutorials and EOCs as much as feasible within an assignment. The tutorial and EOC problems were selected so that they covered important parts of each chapter. Though the 1 (easy) through 5 (hard) difficulty scale was not actively used by the instructor in selecting problems from the MasteringChemistry item library, the problems selected mainly fell in the difficulty range 1–3. Even if the EOC problems (which were assigned at the end of an assignment) were inherently easy, that general negative correlation does not explain the decrease in difficulty we see among the tutorial problems along the order. Since the instructor did not consciously select problems in decreasing order of difficulty within an assignment, it is reasonable to infer that on average we see a learning effect from one problem to the next within an assignment. The average decrease in difficulty per problem within an assignment is -0.26 ± 0.13. Thus, the difficulty of the next problem within an assignment effectively decreases by about 0.26 standard deviations. Since student skill and problem difficulty are placed on the same standard-deviation scale in an item response model, this also suggests that the increase in skill from one problem to the next within an assignment is about 0.26 standard deviations.

Summary
In 10 of the 12 regular assignments given in MasteringChemistry, a linear decrease in problem difficulty occurs, with the earlier problems in an assignment being more difficult than the later problems. The average correlation between problem difficulty and its order within an assignment is -0.32 ± 0.09, while the decrease in difficulty from one problem to the next is -0.26 ± 0.13 standard deviations. Hence, the learning effect attributable to a problem is about 0.26 standard deviations.


Figure 5. The difficulty of the problems decreases along the order in the assignment: Chapter 11 of Brown/LeMay/Bursten, Intermolecular Forces, Liquids, and Solids. The problem difficulty is reported on a standard-deviation scale. A single-part problem with difficulty -1 means that a student who is one standard deviation below average in skill has a 50% chance of successfully answering the problem on the first attempt.

With acknowledgments to Prof. David E. Pritchard, Massachusetts Institute of Technology; and Prof. Randall W. Hall and Prof. Leslie G. Butler, Louisiana State University.


High Reliability of Assessment

Study 1. Massachusetts Institute of Technology

… regression line. In contrast, a similar study for problems in the paper-based final exam accounted for about 40% of the variance. (See D. E. Pritchard and E. S. Morote, Reliable assessment with CyberTutor, a Web-based homework tutor, World Conference on E-Learning in Corporate, Government, Health, & Higher Education, 2002, pp. 785–791.) Thus, MasteringPhysics data such as time to first correct response, number of incorrect responses without receiving feedback, and number of hint requests can be used to reduce the measurement error by a factor of two. (The statistical uncertainty in the correlation is between 0.89 and 0.93 with high confidence, and therefore we can be fairly confident that we would obtain higher reliability values under repeated measurement under similar conditions.)

Study 2. MasteringChemistry
General Chemistry, Fall 2006
Randomly selected from the MasteringChemistry database

Study design: A General Chemistry class was randomly selected from the MasteringChemistry database, the only criterion being that the course comprise at least 300 students and at least 60 assigned tutorial items throughout the semester. The random selection avoided any biases that would be introduced if the study were conducted explicitly to verify the reliability of assessment. The 80 tutorial items given throughout the semester were divided randomly into two sets of 40 items each. The average difficulty of a given set for a student was computed as a linear combination of the average values of time to first correct response, the number of incorrect responses given when feedback is absent (except “try again”), and the number of hint requests. The results were based on 347 students.

Results: The correlation between the average difficulty of the first item set and the average difficulty of the second item set is about 0.85, yielding a high reliability of about 92%. The high correlation implies that about 72% of the variance is explained by the regression line. (The statistical uncertainty in the correlation is between 0.80 and 0.89 with high confidence, and therefore we can be fairly confident that we would obtain higher reliability values under repeated measurement under similar conditions.)
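The jump from a 0.85 split-half correlation to roughly 92% reliability is consistent with the standard Spearman–Brown correction for doubling test length, and the 72% of variance explained is simply the squared correlation; a minimal check of those figures:

```python
r_half = 0.85  # correlation between the two 40-item sets (from the results above)

# Spearman-Brown correction: reliability of the full-length (80-item) measure.
reliability = 2 * r_half / (1 + r_half)
print(f"reliability ~ {reliability:.2f}")          # ~0.92

# Share of variance in one half explained by the other (the regression line).
print(f"variance explained ~ {r_half ** 2:.2f}")   # ~0.72
```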

Summary
Mastering content offers highly reliable assessment (over 90%), as evidenced by the aforementioned two studies. Such high reliability aids instructors by providing a high level of confidence in Mastering content and assessments, in that the information provided by Mastering, whether on an individual student or the class as a whole, is validated for further intervention and instruction.

With acknowledgment to Prof. David E. Pritchard, Massachusetts Institute of Technology.

Figure 2. Correlation of average difficulty between two sets of tutorial items containing 40 items each showing the high reliability of assessment. A point corresponds to a single student.


MyPsychLab
www.mypsychlab.com

Case Study: Georgia College and State University
J. Noland White, Ph.D., Assistant Professor of Psychology

Georgia College and State University (GCSU),founded in 1889, gained a new mission in 1996: toprovide students with a private liberal arts college

experience at a public university price. Academicallycompetitive and student centered, the university currentlyoffers 36 undergraduate degrees and more than 25 graduateprograms on a campus lauded as one of the top 50 wirelesscampuses in the United States.

Since GCSU currently uses WebCT, Noland White,assistant professor of psychology, was well versed withonline courseware when he sought a product for hisIntroduction to Psychology class. Despite a variety ofcompetitors, White chose MyPsychLab (MPL). “MPLoffers all the components I need in a single resource,” hesays. “It’s all there for me, already created and ready to use.I can link easily and directly from WebCT to MPLmaterials; more important, so can my students. That kind of ease of use increases the likelihood that they’ll use it.”

White includes MPL as a required component of bothhomework and a hybrid lecture format. “MPL supplementsmy style of teaching,” he says. “The interactive modelfacilitates more communication—between professor andstudent and between students—and actively promotes a student-focused model of teaching and learning.”

White recognizes that most freshmen are tremendouslychallenged by the concepts of time management and takingresponsibility for their learning. “It’s very different fromtheir high school experience, in which many of them weretold exactly what to do and when to do it,” says White.“MPL offers students a structure in which they experiencethat making an effort achieves results. They learn how toapproach deadlines and what it means to be accountable for their own learning.

“Students who use MPL are more invested in theirperformance,” says White, “because they can get immediate,individualized feedback and know someone is payingattention to what they’re doing [via the Gradebook]. Theimmediate feedback keeps students motivated to improve—either from an individual perspective or when needed—like

a pat on the back that the professor can’t always be there to give. This interactive component is critical to the processof learning. Before a big test comes along, students usingMPL are aware of what they haven’t mastered and ofexactly what they need to do to gain mastery. Thisforewarning enables them to get help before it’s too late.”

The majority of GCSU’s students are technically savvy;they were neither upset nor challenged by the concept ofonline courseware or a Web interface. “I received lots ofpositive feedback,” White says.

White also had compliments for MPL’s instructor benefits. “I like that MPL shifts learning time from inside the classroom to outside the classroom,” he says. “It promotes the students’ interacting with the material as much as possible and doesn’t limit their learning experiences to the class period. When students are involved with the material outside of class as well, I can engage them in more-productive classroom discussions, and they retain more.”

One of MPL’s unique features is that it enables instructors to acknowledge and respond to a range of educational goals and learning styles. “MPL gives me the freedom and the flexibility to assign text materials by chapter or by module depending on which is more appropriate for the class,” says White. “Its variety of modalities taps into the individual learning style of each student: some respond to the simulations; others, to the reading; and still others, to the visual cues. MPL is effective with all of them and should work equally well in a large classroom or a small one, in distance learning, or on-site.”

What is his favorite MPL feature? “It gives me an opportunity to do my job better,” says White, a professor whose students are clearly his number one priority.

accurate assessment • convenient reporting • individualized study plans www.prenhall.com/mypsychlab

Case Study: Georgia College and State University
J. Noland White, Ph.D., Assistant Professor of Psychology

“Technical support was responsive and quick. I’m very impressed both with the individuals themselves and with Pearson in general.”


A call for change

In 2005, students in Shelton State’s mainstream composition courses were not succeeding. Instructors found them unprepared to craft essay-length work; indeed, many students struggled to create sound sentences. Students’ grasp of basic grammar was inadequate. Clearly, significant changes were needed to improve student outcomes.

In response, the instructors in SSCC’s English department undertook a restructuring and standardization of developmental English courses and enlisted Pearson’s MyWritingLab as the engine and cornerstone of the program. Standardization of SSCC’s two semester-length developmental writing courses, Basic English 1 and 2, began in late 2005; full implementation across all sections was achieved in 2007 and continues today.

Basic English 1 and 2 are now taught in a laboratory environment, and all classes use MyWritingLab daily. MyWritingLab comprises 25% of the course grade: 5% of the grade is the pre-diagnostic test and 20% is the final test on grammar. Additionally, students use MyWritingLab each day to work on their individualized student learning plans—specific grammar exercises designed to prepare students to succeed on the final exam.

Results

Comparison of pre- and post-test scores

In 2007, with the start of full standardization, pre- and post-diagnostic results from MyWritingLab were analyzed. Eighty-six students in five classes participated. Of the 84 students completing the course (only two students failed to complete), 97.6% showed improvement. Students improved their average diagnostic test scores from 64% (pre-test) to 79.4% (post-test), a gain of more than 15 percentage points.
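For readers who want to retrace the arithmetic, the short sketch below recomputes the headline numbers from the figures reported above. The variable names and the count of improving students (implied by the 97.6% figure) are illustrative assumptions, not an export from MyWritingLab.

```python
# Minimal sketch: recompute the Shelton State pre/post-diagnostic summary
# from the figures reported in the case study text above.

students_completing = 84      # of the 86 who started, two did not finish
students_improving = 82       # implied by the 97.6% figure reported above
pre_test_avg = 64.0           # average pre-diagnostic score (%)
post_test_avg = 79.4          # average post-diagnostic score (%)

share_improving = students_improving / students_completing * 100
gain_points = post_test_avg - pre_test_avg

print(f"Share of completers who improved: {share_improving:.1f}%")  # ~97.6%
print(f"Average gain: {gain_points:.1f} percentage points")         # ~15.4
```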

Professor Dice explains, “MyWritingLab is a really valuable tool. I was trained in the sciences so I have a healthy respect for data and MyWritingLab offers a great way to keep score on student outcomes. It’s just a superior way to teach the subject.”

Case study

Shelton State Community College, Tuscaloosa, Alabama
Texts: Resources for Writers with Readings, 2/e, by Elizabeth C. Long; Resources for Writers with Readings, 3/e, by Elizabeth C. Long
Instructors: Professor Richard (Ric) Dice, Instructor; Professor Janice Filer, Chair of Language and Communications
Course: English 093: Basic English 1 and 2, covering sentence-level writing

“ MyWritingLab is a really valuable tool. I was trained in the sciences so I have a healthy respect for data and MyWritingLab offers a great way to keep score on student outcomes. It’s just a superior way to teach the subject.”

—Professor Richard Dice

Breakdown of Results (all classes): Pre- to Post-diagnostic

Percentage Improvement    # of Students
+0-4                       1
+5-9                      15
+10-14                    21
+15-19                    22
+20-24                    14
+25                       11

[Figure: bar chart of number of students by percentage improvement, pre- to post-diagnostic; 97.6% of students improved.]

2009 CODiE Award


Results

Follow-up study

Six classes of students who successfully completed English 093 were tracked through English 101, the department’s mainstream composition course. Fifty-seven students enrolled in English 101. Of the 44 students who completed English 101 and received a letter grade, 33 passed for a success rate of 75%; 11 failed (19%). These figures compared favorably to the college-wide student success rates for English 101 of 51% passing and 27% failing.
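The two percentages above appear to use different denominators; the sketch below makes that explicit. The figures come from the paragraph above, and the reading of the 19% as failures among all 57 enrollees (rather than among the 44 completers) is an inference from the arithmetic, not a statement from the case study.

```python
# Minimal sketch reproducing the follow-up study percentages reported above.
# Values come from the case study text; the choice of denominators is inferred
# from the arithmetic (33/44 = 75%, 11/57 ≈ 19%), not stated in the source.

enrolled = 57               # students who enrolled in English 101
completed_with_grade = 44   # completed and received a letter grade
passed = 33
failed = 11

print(f"Pass rate among completers: {passed / completed_with_grade:.0%}")  # 75%
print(f"Fail rate among all enrolled: {failed / enrolled:.0%}")            # ~19%
```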

“I’m surprised each semester when I pull the data that students have improved as much as they do. And the results clearly document MyWritingLab’s effectiveness because I haven’t changed my teaching approach. I’m really acting now as an administrator for MyWritingLab,” asserted Professor Dice.

Course administration transformed

Instructors note that administering the course is a pleasure with MyWritingLab. Professor Dice notes, “We’ve gone from workbooks—really deadly—to CD-based instruction to the online tools offered by MyWritingLab—just a huge advancement.” Students appreciate the change, too. It was not uncommon for students to doze off during workbook-based drill exercises. Now, with MyWritingLab, students work at their own pace, progressing through a learning plan completely customized to their individual needs by MyWritingLab.

There are other benefits as well. Professor Janice Filer, chair of Language and Communications, notes that the self-guided nature of MyWritingLab gives her flexibility in staffing. She can augment the English department teaching staff with content specialists trained to administer MyWritingLab. This enables Professor Filer to offer lab-based sections of Basic English to even small numbers of students who would otherwise be closed out of the classes they need due to insufficient enrollment numbers.

Analysis

Without question, students’ writing has improved. Professor Dice asserts, “You sense the students’ excitement when they can write the thoughts they imagine. Without a grasp of basic grammar and construction, students would self-edit and dumb down their thoughts to a rudimentary expression they could put down on paper. After working with MyWritingLab, students are confident enough in basic grammar and construction to truly express their thoughts and engage in writing worthy of their imaginations.”

For a product tour or to find out more, please visit www.mywritinglab.com or www.pearsonhighered.com/english

“I’m surprised each semester when I pull the data that students have improved as much as they do. And the results clearly document MyWritingLab’s effectiveness because I haven’t changed my teaching approach. I’m really acting now as an administrator for MyWritingLab.”

—Professor Richard Dice

Of the 44 students who completed English 101 using MyWritingLab, 75% passed. College-wide passing rates for English 101 without MyWritingLab were only 51%.


Founded in 1965, Metropolitan State College of Denver offers an affordable, quality education in a downtown setting to a large and diverse population. In keeping with Metro State’s commitment to provide a relevant education with strong technology integration, Professor Lunden MacDonald coordinated a review of available online resources for the Modern Language department’s introductory Spanish courses. In addition to improving student performance and equipping students with a suite of 21st-century skills through more technology integration, the department hoped to:

• Ensure a uniform curriculum and testing platform for more than 1,000 introductory Spanish students annually

• Engage an automated grading system to relieve time-crunched faculty

• Enhance faculty’s professional skills development by training them to teach using cutting-edge technology resources

The department determined that ¡Arriba! with MySpanishLab was the best fit for their students and began integrating MySpanishLab in Intro to Spanish I sections in fall 2008. In spring 2009, the department required MySpanishLab for all students in all sections of both Intro to Spanish I and Intro to Spanish II. MySpanishLab accounts for 50% of students’ final grade. Readiness Checks, Student Activity Manual exercises, and Oral Practices are 15%; the mid-term exam is 10%; and the final exam is 15%.

Results

Instructors have responded very positively to MySpanishLab’s automated grading and to the performance notification system that displays students’ progress on MySpanishLab assignments. The performance notification system quickly displays overall course performance and individual student performance, allowing instructors to identify students who are struggling in the course.

Case study

Metropolitan State College, Denver, Colorado
Instructor: Dr. Lunden E. MacDonald, Assistant Professor of Spanish
Courses: SPA 1010 Intro to Spanish I; SPA 1020 Intro to Spanish II
Level: Elementary
Texts: ¡Arriba!: Comunicación y cultura, 5/e, with MySpanishLab, by Eduardo Zayas-Bazán, Susan M. Bacon, and Holly Nibert
Term covered: Spring 2009
Contribution of MySpanishLab to final grade: 50%
Types of data reported: Student performance
Course structure: Wired classroom

[Figure: Final grade distributions (number of total assigned grades, A’s through D’s) for a Spring 2009 section of Spanish I using MySpanishLab; the average final grade for the 27 students was 85.]

“MySpanishLab has made for a noticeably higher level of communicative ability among my students.”

—Dr. Lunden E. MacDonald

2010 IMS Learning Award


The automated grading saves instructors time that they can dedicate to engaging with students and helping them to stay on track.

Students’ performance improved with MySpanishLab. “Grades were noticeably higher this semester with MySpanishLab,” asserts Professor MacDonald. Professor MacDonald feels that MySpanishLab’s SAM activities are a key contributor to students’ higher grades. She says, “The machine-graded SAM activities force the students to keep up with their homework, whereas in prior semesters I noticed that students didn’t even bother to do workbook activities. MySpanishLab’s instant feedback really motivates students and makes their homework assignments meaningful. Students either understand the grammar concepts prior to class or they come to class with informed, intelligent questions due to the SAM practices.” And, because less time in class is devoted to grammar review, says Professor MacDonald, “we are able to focus more on speaking and communicative practice. MySpanishLab has made for a noticeably higher level of communicative ability among my students.”

In a representative section of Intro to Spanish I for which final grades were recorded in spring 2009, the average grade for 27 students was 85. Grade distribution was: A, 6; B, 15; C, 4; D, 2.

Expanding use of MySpanishLab

Professor MacDonald is exploiting more of MySpanishLab’s features in two accelerated summer courses (5 credits in just 5 weeks). She reports, “These classes are normally delivered over 16 weeks. In the summer, it is not infrequent to have a quiz over a chapter on one day and then a big exam or midterm the next day. By doing SAMs and quizzes in MySpanishLab, I know that my students are getting the immediate feedback without having to wait a night for me to take the quiz home, grade it manually, etc. They can use the grading feedback immediately and we can maintain our extremely tight schedule.”

Reflecting on the department’s choice to integrate MySpanishLab, Professor MacDonald says, “MySpanishLab helps teachers teach better and students learn better! MySpanishLab provides best-practice technological support for instructors’ pedagogical goals while effectively meeting the varied learning needs of students. We will absolutely continue to use MySpanishLab and incorporate it further and further into our daily instruction.”

Future

Looking ahead to fall, Professor MacDonald expects to continue expanding the use of MySpanishLab in the introductory courses and is reviewing MySpanishLab for use in a new pre-introductory course and for courses at the intermediate level as well. She reports that her colleagues teaching Italian have adopted MyItalianLab for their program and that the German faculty are eagerly awaiting MyGermanLab, currently in development.

For a product tour or to find out more, please visit www.myspanishlab.com
Also available: MyFrenchLab and MyItalianLab
Available for fall 2010 classes: MyChineseLab
Available for spring 2011 classes: MyLatinLab
Coming soon: MyGermanLab, MyPortugueseLab, MyRussianLab

“MySpanishLab provides best-practice technological support for instructors’ pedagogical goals while effectively meeting the varied learning needs of students.”

—Dr. Lunden E. MacDonald


MyEconLab usage increased from one semester to the next. Student end-of-course reviews suggest that the increase was due in part to student word of mouth about the program.

Data in Table 1 indicate a clear correlation between time spent on MyEconLab and exam scores. By combining exam scores for both classes and time spent on the program for both classes, one can see that of those students whose score decreased one letter grade or more from exam 1 to exam 2, 64 percent spent less time on the program before exam 2 than they did before exam 1. Of those students whose score decreased more than 10 points, 75 percent spent less time on MyEconLab before the second exam.

Similarly, of those students whose score increased one letter grade or more, 63 percent spent more time on MyEconLab before the second exam than they did before the first. Of those students whose score increased more than 10 points, 72 percent spent more time on MyEconLab before the second exam.
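The tabulation described in the two paragraphs above can be made concrete with a short sketch. The record layout and function below are illustrative assumptions (the case study does not say how exam scores and time-on-task were stored or joined), but they show the kind of filter-and-count computation the reported percentages imply.

```python
# Illustrative sketch of the tabulation described above: among students whose
# exam score changed by a given amount, what share also changed their
# MyEconLab time-on-task in the same direction? The field names are assumed
# for illustration; they are not MyEconLab's data format.

from dataclasses import dataclass

@dataclass
class StudentRecord:
    exam1: float               # score on exam 1 (points)
    exam2: float               # score on exam 2 (points)
    time_before_exam1: float   # hours in MyEconLab before exam 1
    time_before_exam2: float   # hours in MyEconLab before exam 2

def share_with_less_time(records, min_drop=10):
    """Among students whose score fell by more than `min_drop` points,
    return the fraction who also spent less time before the second exam."""
    dropped = [r for r in records if r.exam1 - r.exam2 > min_drop]
    if not dropped:
        return 0.0
    less_time = [r for r in dropped if r.time_before_exam2 < r.time_before_exam1]
    return len(less_time) / len(dropped)
```

An analogous filter with the inequalities reversed would give the shares reported for students whose scores improved.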

For both semesters, use of the Study Plan was minimal before the first exam and had increased by semester’s end, before the final exam.

• In spring 2007, 35 students used the Study Plan between exams 1 and 2; 61 students (32 percent) used the Study Plan before the final exam.

• In fall 2007, 42 students used the Study Plan between exams 1 and 2; 67 students (35 percent) used the Study Plan before the final exam.


Western Michigan University

Course Design

Principles of Microeconomics is a semester-long, on-campus, lecture-format course. Classes comprise more than 200 students and meet two days a week for 75 minutes per session. There are no discussion sessions.

Students are required to complete one MyEconLab homework assignment for approximately every week in which they do not have an exam. Homework is assigned at the end of discussion of the chapter; students have approximately one week to complete it.

Assessments

Students must complete a practice quiz and earn a minimum score of 70 percent before attempting the weekly homework assignment. Multiple attempts to pass the quiz are allowed.

Students can earn a total of 600 points:

300 points Three exams (two midterms, one final)

100 points Two in-class quizzes

200 points Eight MyEconLab homework assignments

MyEconLab Implementation

Homework is completed in MyEconLab and consists of 25 to 35 multiple-choice questions. When appropriate, homework questions take advantage of the MyEconLab graphing capabilities. All MyEconLab support tools—including E-text, Guided Solutions, and E-mail the Instructor—are allowed. Use of the Study Plan is encouraged but not required.

Off-line quiz and test scores are entered into MyEconLab. This ensures easy, centralized access to all student scores.

Contribution of MyEconLab to Final Grade

Use of MyEconLab contributes 33 percent of the final course grade.

MyEconLab Course Structure and Results

Textbook in use with MyEconLab: Microeconomics, 2/e, 2006, Hubbard and O’Brien
Course name: Principles of Microeconomics
Credit hours: Three
Semesters covered: Spring 2007–Fall 2007
Types of data reported: Correlation of final grades and time spent on task


WESTERN MICHIGAN UNIVERSITY, CONT’D.

Based on MyEconLab’s contribution to increased student success, Ryan is making the following changes to his curriculum:

• Increased emphasis on MyEconLab’s Study Plan: The data show that student performance increased with its use, so he plans to work toward requiring its use.

• Requirement of a MyEconLab quiz before each homework assignment.

• Additional office hours each week in a computer lab: This will enable Ryan to work directly in MyEconLab with his students to actively reinforce both their knowledge of the material and their understanding of MyEconLab.

Submitted by Michael Ryan, Associate Professor, Department of Economics, Western Michigan University

Conclusions

MyEconLab provides students the timely feedback they desire and gives professors more time to spend doing what we want to do: teach.

—Michael Ryan, Western Michigan University

Table 1. Correlation of Final Grades and Time Spent Using MyEconLab, Spring 2007 and Fall 2007 (n=391)

Final Grade | Avg. Time on Required Homework (Spring 2007 / Fall 2007) | Avg. Time on Optional Study Plan (Spring 2007 / Fall 2007) | Avg. Total Time (Spring 2007 / Fall 2007)
A (4.0)     | 11 hrs, 28 mins / 13 hrs, 44 mins | 2 hrs, 20 mins / 3 hrs, 43 mins | 13 hrs, 48 mins / 17 hrs, 27 mins
BA (3.5)    | 15 hrs, 02 mins / 20 hrs, 59 mins | 0 hrs, 38 mins / 2 hrs, 40 mins | 15 hrs, 40 mins / 23 hrs, 39 mins
B (3.0)     | 14 hrs, 30 mins / 16 hrs, 13 mins | 1 hr, 02 mins / 2 hrs, 12 mins  | 15 hrs, 32 mins / 18 hrs, 25 mins
CB (2.5)    | 11 hrs, 45 mins / 12 hrs, 13 mins | 1 hr, 32 mins / 2 hrs, 38 mins  | 13 hrs, 17 mins / 14 hrs, 51 mins
C (2.0)     | 10 hrs, 37 mins / 11 hrs, 35 mins | 2 hrs, 04 mins / 3 hrs, 58 mins | 12 hrs, 41 mins / 15 hrs, 33 mins
DC (1.5)    | 7 hrs, 18 mins / 9 hrs, 23 mins   | 0 hrs, 04 mins / 0 hrs, 46 mins | 7 hrs, 22 mins / 10 hrs, 37 mins
D (1.0)     | – / 9 hrs, 20 mins                | – / 0 hrs, 32 mins              | – / 9 hrs, 52 mins
E (0.0)     | 8 hrs, 52 mins / 3 hrs, 25 mins   | 0 hrs, 16 mins / 1 hr, 07 mins  | 9 hrs, 08 mins / 4 hrs, 32 mins
Total       | 13 hrs, 47 mins / 13 hrs, 15 mins | 1 hr, 24 mins / 3 hrs, 15 mins  | 15 hrs, 11 mins / 16 hrs, 30 mins

What Students Are Saying

Ryan’s students had the following comments about MyEconLab.

MyEconLab gives me the chance to practice as much as I need to.

The homework link to the e-text has helped me learn how to read textbooks.

Practice, practice, practice! MyEconLab gave me so many opportunities to do practice problems.

MyEconLab helped me study. I like that I received instant feedback when I answered a question.

I wish Dr. Ryan had made the Study Plan mandatory. It really helped me.

I love the calendar. I wish I could put everything on it.



Bunker Hill Community CollegeBunker Hill Community College (BHCC) is a multicampus institution comprising two main campuses and five satellite campuses in and around the Greater Boston metropolitan area. Founded in 1973, BHCC enrolls more than 8,900 students in day, evening, weekend, Web-based, and distance-learning courses and programs. Through more than 65 associate degree and certificate programs, the college prepares students for both immediate employment and transfer to four-year universities.

As early adopters of technology-based learning systems, Mike Puopolo, professor and chairperson of Bunker Hill Community College’s (BHCC’s) Computer Information Technology (CIT) department, and his colleagues were familiar with the benefits of delivering learning via online and multimedia venues. But by spring 2006, they were looking for a learning management system that was more comprehensive than the one they were then using.

In March 2006, Puopolo began working with a Pearson team on customizing the entire course series—from content to delivery. “All of BHCC’s myitlab courses share a common portal, navigation theme, and user interface,” says Puopolo. “Students log in once and never have to go anywhere else. Customized course content, plus all of the resources they could ever need—podcasts, sound bites, demo documents—is available through myitlab in one centralized location. We even integrated YouTube into it.”

The myitlab application provides Puopolo and the other CIT instructors with the full learning system they’d been seeking. “We love the Gradebook, the discussion features, the textbook-based training, the pre- and posttests—all of it; myitlab has enabled us to streamline our teaching and present more material in a more coherent fashion,” he says.

After examining the changing learning habits of the more and more technologically savvy students entering his classroom, Puopolo decided to tailor his course offerings with those students in mind. “We’re piloting a customized eBook in six sections of the course,” he says. “It reduced the price of the textbook, and the students seem to prefer it. If end-of-semester surveys indicate, we’ll move all 40-plus sections to eBooks this spring.”

myitlab’s customized eBooks enable Puopolo’s students to do universal searches of the book for individual terms—and thereby facilitate their test taking. Other benefits include the integration of multimedia directly into the eBook.

Puopolo was one of the first to volunteer for the testing of myitlab’s Grader. “It’s a great addition for instructors and students,” he says. “It gives me a definite advantage in terms of being able to assign projects. My students are receiving much more detailed information about what they’ve done right and what they’ve done wrong than I could offer them in the same amount of time.”

Puopolo specifically appreciates Grader’s underlying pedagogy. “It dovetails with task-based testing,” he says. “Students work on the same document, correcting and resubmitting until they’ve mastered a skill. By seeing what they’ve done wrong and correcting it themselves, the learning is reinforced. Results go directly into the Gradebook, where they can then see immediately their work’s impact on their grade.”

BHCC’s enrollments are up 25 percent from last year. The result is an exponential increase in the number of sections and the times they are offered. “Our president is committed to not turning away students,” says Puopolo. “Some departments are even running midnight courses. myitlab enables us to handle the increased scale without losing integrity. All instructors, whether full-time or adjuncts, whether on-site or online, are presented with a fully populated course, including assessments, PowerPoints, demos, multimedia features, and all the assignments. We train adjuncts to deliver this course in a quality fashion in about six hours.”

Making it Click 2010
2010 IMS Learning Award

Many of Puopolo’s newest students are returning for workforce retraining and were priced out of four-year schools. “myitlab offers these students the kinds of real, on-the-ground skills they’ll need regardless of career path,” he says. “You need these skills everywhere.”

BHCC is part of the National Science Foundation’s Boston-area Advanced Technological Education Consortium—a consortium of colleges and universities promoting workforce development in information technology in the Boston region. Fulfilling the consortium’s mission required commissioning a cross-industry workforce study in 2006. “We learned that small and large companies alike assume that graduates have the requisite technical skills to succeed,” says Puopolo. “What they look for above and beyond that are soft-side skills: the abilities to work in teams, to creatively problem solve, to write clearly, and to communicate verbally. Pearson’s vast wealth of content meant that we could respond to what we learned by further customizing myitlab with Pearson’s Self-Assessment Library, thereby creating our ideal learning management system.”

Puopolo’s myitlab course folder also includes (1) a customized Pearson text entitled Professionalism: Real Skills for Workplace Success, (2) a group running case culminating in two formal presentations of the collaborative work done by each group, and (3) links to online meetings and collaborative workplace tools, such as Office Live.

Off campus, BHCC leverages myitlab to further its commitment to providing computer and information technology training for community-based organizations in its service area. The college is able to provide elsewhere the same myitlab courses offered on its main and satellite campuses: on-site at organizations, including Action for Boston Community Development, Jewish Vocational Services, the New England Center for Homeless Veterans, and the Urban League of Eastern Massachusetts. During the past two years, this kind of community outreach has benefited more than 300 students, who receive full college credit for courses taken in their own community settings.

“myitlab is much more flexible than I previously thought,” says Puopolo. “We’re still a computer department and a computer course, but we can now provide our students with all of the 21st-century skills they need to get and keep a good job. It has enabled us to attain our goal of enhancing our students’ skill sets beyond grade point average to employability and promotability. It’s an excellent product.”



The Power of Practice 2010: MyAccountingLab and 21st Century Accounting Instruction

Spokane Community CollegeMore than 13,000 students (over 6,000 full-time equivalents) attend Spokane Community College (SCC). Whether they work full-time and attend school part-time to obtain the skills necessary for on-the-job advancement or whether they plan to transfer to four-year institutions, students at SCC know they will be supported by a breadth of curricular and student services designed to fulfill the school’s tagline: In the business of helping you change your life.

Jeffrey Waybright, accounting instructor at Spokane Community College, is no stranger to online courseware. Asked to develop online classes for the college several years ago, he learned an available program and created the classes but was never completely satisfied with it. Nearly four years ago, Waybright was introduced to MyAccountingLab. “My previous experience left me hesitant,” he says. “But implementation of MyAccountingLab went smoothly, and it worked better than I anticipated it would. I really appreciate the Gradebook feature and the enhanced testing features.”

Since then, Waybright has integrated MyAccountingLab into all of his distance-learning and on-site courses, using it for everything: homework, quizzes, testing, study plans, tracking, grading, and communication with students. Recently, he increased the amount of homework he requires. “I’m trying to find a middle ground between students who don’t need to do a lot of homework and motivating those who need more time on task,” he says.

MyAccountingLab helps Waybright respond to the needs of both distance-learning and on-site students. “Distance students don’t have the same access to me,” he says. “MyAccountingLab offers them access to resources and the ability to walk through a problem [Help Me Solve This] immediately, at any hour. One distance-learning student recently told me, ‘Don’t feel bad, but I don’t really need you. I’ll e-mail you if I do.’”

Waybright doesn’t feel bad. In fact, during a recent quarter, the scores on the first test in his online class were higher than the scores in his on-site class. “The classes have the same demographic,” he says. “The only difference is that some couldn’t get the class time they wanted.”

Students in Waybright’s on-site classes also embrace the program. “Once students realize that all the resources they need are available in MyAccountingLab, they’re no longer intimidated by the topic,” he says. “They have at their fingertips everything they need to succeed. I can watch their homework and quiz scores and see that they’re doing great. They’re learning, they’re getting their grades, and they’re happy.”

Data from SCC’s Introduction to Accounting courses (see page 4) support Waybright’s observations. Of those students who completed the class, the percentage earning 3.1 to 4.0 has steadily risen since implementation of MyAccountingLab—from approximately 34 percent prior to adoption to 52 percent now, three years after adoption.

Students who want more instructor contact can use the Ask My Instructor feature, which Waybright encourages. “In the past, when students e-mailed me questions, I frequently wondered what they were talking about,” he says. “Student communication isn’t always clear. Ask My Instructor links me to exactly where they are and what the problem is. It facilitates my giving them meaningful responses that can really help them.”

“MyAccountingLab’s immediate feedback helps, too,” says Waybright. “It holds the interest of a newer generation of learners, who are used to instant gratification. And as a learning tool, it not only indicates right or wrong but also shows students why they didn’t get it the first time and how to do it right. Students master one concept before moving on to the next.”

For Waybright and the rest of his department, one of the primary motivations to use MyAccountingLab is the consistency of assessment. “No matter who your teacher is or whether your class is on-site or online, using MyAccountingLab ensures that our assessments and our quality standards are the same and that our measurement outcomes are the same,” he says. “It means our students will always be prepared for the next course.”

Even the adjuncts have embraced MyAccountingLab. “As coordinator of the course, I set up everything ahead of time: the quizzes are written, and the homework is all there,” says Waybright. “It makes it easy for adjuncts to step in. I think it has helped us get adjuncts, particularly those with time considerations, such as working CPAs.”

It’s that kind of full-course support that has hooked Waybright: he knows that students are more likely to use a program if they have a positive experience. It saves him time, streamlines departmental communication, and enables him to gain support from adjuncts without compromising assessment integrity. Most important, it works.


MySpanishLab

“ I use MySpanishLab a lot because I’m always on the computer and it’s right there, so if I’m feeling like I can take out some homework assignments, I’ll just log in to it, do a few while I can, and it really helps me not procrastinate on things like leaving a workbook until the very end. I would definitely recommend MySpanishLab to anyone getting involved in a foreign language program.”

—Student, Illinois State University

MyEconLab

“ MyEconLab is an online program that helps you understand material better. While studying for a test or going over notes, it is a perfect complement for you to gain understanding of the material. As long as you take the time to help yourself, MyEconLab will definitely help you and your grade.”

—Student, Northern Illinois University

myitlab

“ It’s been 20 years since I’ve taken a computer course. Thanks to the trainings in myitlab and the instructor’s patience, I have completed the course with some very important skills.”

—Student

MyAccountingLab

“ I think that MyAccountingLab is a great tool to have with such a difficult class. Accounting itself takes a lot of practice—you need to really work at it to get it, so having all the practice quizzes, homework, and everything there for you makes it a lot easier for you to succeed in such a hard class.”

—Student, Penn State University

What are students saying?


Inside this report are case studies about:

Now available! Hit the ground running with batch registration, advanced reporting, and superior 24/7 phone support for students and instructors.

Find out more at www.mylabsplus.com