


Readiness, Behavior, and Foundational Mathematics Course Success
By Kevin Li, Richard Zelenka, Larry Buonaguidi, Robert Beckman, Alex Casillas, Jill Crouse, Jeff Allen, Mary Ann Hanson, Tara Acton, and Steve Robbins

ABSTRACT: This study examines the effects of math readiness and student course behavior (e.g., attendance, participation, homework completion) on knowledge gain and course success using two samples of students enrolled in foundational skills (noncredit-bearing) mathematics courses. As hypothesized, entering student mathematics readiness and course behavior predicted posttest mathematics knowledge. Posttest knowledge and course behavior predicted course success (i.e., passing the course). Results highlight the importance of mathematics readiness and student behavior for understanding mathematics knowledge gains and course success. Implications for institutional policy and practice using effective diagnostic testing and behavioral monitoring are discussed.

Few studies have looked at the impact of specific and observable course behaviors [on academic performance].

Kevin Li
Dean of Instruction
[email protected]

Richard Zelenka
Chair of Foundational Studies Department

First-year student retention and successful course completion are challenges for postsecondary institutions, particularly community colleges. According to the latest national figures, approximately 45% of degree- or certificate-seeking community college students fail to maintain enrollment or earn a credential (certificate or degree) within the first 2 years of enrollment (Provasnik & Planty, 2008). The longer-term retention and degree-attainment rates are even more concerning: Only 28% of community college students received any type of certificate or degree within 6 years (Provasnik & Planty, 2008). Important factors contributing to low degree attainment are lack of academic preparation for college-level coursework (Strayhorn, 2011) and a lack of student motivation (Rosenbaum, Redline, & Stephan, 2007). For example, based on the ACT College Readiness Benchmarks, which are tied to a 50% likelihood of earning a B or better in credit-bearing college general education courses, only 24% of high school graduates were academically prepared for college in all four core subject areas (ACT, 2010). When considering the ACT Mathematics Benchmark alone, 43% of high school graduates are ready to succeed in college-level mathematics courses. Students who do not have the necessary prerequisite skills and knowledge, or who may not be fully committed to attaining a degree, are less likely to succeed in college courses or to return for a second year (Robbins, Allen, Casillas, Peterson, & Le, 2006).

Larry Buonaguidi
Quality Assurance Coordinator

Robert Beckman
Director of Testing

Wilbur Wright College
4300 N. Narragansett Ave.
Chicago, IL 60634

Alex Casillas
Senior Research Associate
[email protected]

Jill Crouse
Jeff Allen
Mary Ann Hanson
Tara Acton

Research Division, ACT, Inc.
500 ACT Drive, P.O. Box 168
Iowa City, IA 52243

Steve Robbins
Director of Research Innovations
Educational Testing Service (ETS)
Rosedale Road MS 18-E
Princeton, NJ 08541


The primary method for helping students who are underprepared for postsecondary coursework progress toward successful degree attainment is developmental instruction (see Attewell, Lavin, Domina, & Levey, 2006). Due to their open enrollment policies, much of the remediation challenge has fallen to community colleges. Between 2000 and 2008, the percentage of two-year college students taking at least one developmental course to improve their foundational skills rose from 39% to 44% (NCES, 2010). The estimated percentage of students needing remediation tends to be greater in mathematics (70%) than in English (34%; Biswas, 2007).

Course placement is used to put students into courses in which they are able to master material designed to prepare them for eventual successful completion of college algebra (Smith & Michael, 1998). The dilemma involved in setting rigorous course placement standards has been highlighted by Jacobsen (2006), who has pointed out that placing students in noncredit-bearing developmental mathematics courses increased their mastery of the foundational skills needed for college algebra. However, it also has resulted in reduced program completion due to high attrition rates. Bahr (2010) reinforces this point by highlighting the disparities in course completion rates at differing levels of initial math skills. Within developmental mathematics, Bahr's results suggest that student persistence behavior is as critical for understanding course completion and success as the math "skill gaps" themselves.

The effectiveness of developmental instruction has been a topic of research for approximately 2 decades (e.g., Bailey, Jeong, & Cho, 2010; Boylan, 2009; Conley, 2007). Specific to developmental math, researchers have tried to better understand which approaches and strategies work best to strengthen adult students' math skills to help them progress into college-level courses (Golfin, Jordan, Hull, & Ruffin, 2005). Promising practices include the use of technology to assess and accurately place students into appropriate-level math courses and tailored instruction as opposed to traditional lecture and lab format.

JOURNAL of DEVELOPMENTAL EDUCATION


The potential of the tailored instruction approach was demonstrated by Waycaster (2001), who successfully used individualized computer-aided instruction in developmental math courses at several two-year institutions. Similarly, Quinn (2003) studied a college that used a computer-based placement test to route students into the appropriate courses. Students who did not meet a particular cutoff score were assigned courseware modules to improve their skills and then allowed to take the placement test again. A general conclusion of this line of work has been that effective course placement is essential to success.

In a separate line of research, a variety of primary studies and meta-analyses have demonstrated the importance of psychosocial constructs, including personality, study skills, and other academic behaviors (e.g., time management) to academic performance and persistence in postsecondary settings (Crede & Kuncel, 2008; O'Connor & Paunonen, 2007; Poropat, 2009; Robbins et al., 2004). However, few studies have looked at the impact of specific and observable course behaviors related to psychosocial skills, such as attendance and participation. For example, Dollinger, Matyja, and Huber (2008) examined the extent to which controllable factors such as attendance and study time contribute to college academic performance after controlling for past performance, aptitude, and personality characteristics. Although the best predictors of performance were ability and past performance, study time and attendance were related to performance and explained an additional 6-10% of the variance. Similarly, studies by Conard (2006) and Farsides and Woodfield (2003) found that attendance, as an indicator of "application and effort" (Farsides & Woodfield, 2003, p. 1236), predicted academic performance (GPA) above and beyond the effects of intellectual ability and personality characteristics such as conscientiousness. Thus, based on available research, course behavior appears to be an important factor in course success.

Taken together, the preceding lines of research suggest that placing students in courses in which they are likely to succeed depends on both entering skill levels and student motivation or effort. Examining these two issues can help shed light on the best ways to improve the success rates of community college students. The premise of the current study is that the combination of effective course placement and behavioral effort factors is essential to successful completion of developmental mathematics courses. We applied the best practices in individualized delivery of developmental math education along with our knowledge of academic behavior (cf. ACT, 2012; Robbins et al., 2004; Robbins, Oh, Le, & Button, 2009) to better understand developmental mathematics course success.

VOLUME 37, ISSUE 1 • FALL 2013

The model tested in this study (shown in Figure 1) combines academic skills (entering math skills) and course behavior (instructor-reported effort ratings) measures to predict course success. The model proposes that entering students' mathematics readiness (i.e., initial knowledge) directly predicts later math knowledge (measured via a posttest) which, in turn, is predictive of course success (completing with a passing grade). This model also allows for examination of the extent to which precourse academic skills and student behaviors influence course success separately as well as together. Specifically, math readiness is expected to show a strong direct effect on posttest math knowledge and a strong indirect effect on course success. Course behavior is expected to have direct effects on both math knowledge and course success. Students who expend higher levels of effort are expected to demonstrate higher score gains and success rates than those students who expend less effort.

Method

Demographics

Institution. The study institution is an open admissions two-year public college in a large Midwestern U.S. city serving more than 14,000 students. Based on institutional enrollment data, approximately 70% of the students are part time and 58% are female. The largest share of students (47%) are Hispanic, 33% are White, 9% are African American, and 8% are Asian/Pacific Islander. Nearly all participants reside in state, and 47% are age 24 or younger.

Participants. A total of 1,254 students were enrolled in developmental mathematics courses at the study institution during the Fall 2008 (N = 819) and Spring 2009 (N = 435) semesters. These students were enrolled in developmental courses because they did not have the skills required for success in credit-bearing courses based on the results of a computer-adaptive placement test (COMPASS). COMPASS provides reliable and valid measures of current mathematics skill levels and instructional needs (ACT, 2006). Students were placed into a mathematics tier based on COMPASS placement test scores. Only new students were used in the study; spring students who had enrolled in a developmental mathematics course in the previous fall were removed.

Participant demographics are shown, by semester, in Table 1. Demographics are shown for the entire starting sample of participants for each semester, as well as for the subset that achieved course success. Average student age ranged from 21 to 23 across samples (SD = 6 years), and students were predominantly female and Hispanic. Demographics are very similar for the samples of

Table 1
Demographics for the Fall 2008 and Spring 2009 Semesters

                          Fall 2008                            Spring 2009
Demographics       Started course    Completed course    Started course    Completed course
                   (N = 819)         (N = 507)           (N = 435)         (N = 231)
Age                M = 21.1          M = 21.2            M = 23.1          M = 22.6
                   SD = 5.9          SD = 6.3            SD = 6.3          SD = 6.2
Gender
  Female           59.4              60.3                54.5              55.9
  Male             40.6              39.7                45.5              44.1
Ethnicity
  Asian             5.6               8.0                 6.4               8.0
  Black            13.3              10.2                15.7              12.9
  Hispanic         57.5              59.1                56.7              57.1
  Native American   0.5               0.2                 0.0               0.0
  Other             0.4               0.5                 1.1               1.8
  White            22.7              21.9                20.0              20.3

Note. Gender and ethnicity values are percentages.


students who began the course and the subsets that completed the course successfully.

Predictor Variables

Mathematics readiness. The COMPASS mathematics diagnostic tests were administered to measure students' strengths and weaknesses in specific mathematics content areas. For each of two developmental mathematics courses, six diagnostic pretests were administered for content areas mapped to the course content. The prealgebra diagnostics include operations with integers, fractions, and decimals; positive integer exponents; square roots and scientific notation; ratios and proportions; as well as percentages and averages. The algebra diagnostics include substituting values into algebraic expressions; setting up equations; operations and factoring of polynomials, exponents, and radicals; rational expressions; and linear equations.

Because the diagnostic tests are adaptive, traditional measures of internal consistency reliability are not applicable. Instead, the marginal reliability coefficient is used to report reliability. Based on this measure of reliability, the COMPASS diagnostic scales have internal consistency reliability ranging from .84 to .92, with a median of .88. Each diagnostic test score is an estimate of the percentage of test items the student would answer correctly if they were administered all items in the COMPASS item pool in that content area. Because each diagnostic measure uses the same scale, an aggregate measure of initial mathematics readiness level is defined as the mean of the six diagnostic scores.

Figure 1. Proposed and tested model for combining cognitive and behavior measures to predict course success. [Path diagram along the semester timeline (Week 1 → Week 4 → Week 12 → Week 16): Math Readiness (COMPASS pretest) predicts Course Behavior and Math Knowledge (COMPASS posttest); Course Behavior predicts Math Knowledge and Course Success; Math Knowledge predicts Course Success. Path coefficients (SEs) are shown on each arrow.]
Note. Fall 2008 coefficients are above the line; Spring 2009 coefficients are below the line.
* Significant paths are 2 SEs from zero; ns = nonsignificant paths.

Student course behavior. During the developmental mathematics courses, instructors rated students' levels of participation, attendance, and completion of homework assignments. A 5-point rating scale ranging from 1 (almost never) to 5 (always) was used to rate the following four items:

• active participation in group work (student is actively engaged during group work; helps other students with assignments; does his/her fair share of the work, etc.),

• active participation in lecture (student is alert and attentive during class; asks/answers questions; etc.),

• attendance (student attends class; stays for whole period, etc.), and

• completion of homework assignments (student completes assignments thoroughly and completely; turns assignments in on time).

These items were combined into a composite rating of student course-behavioral effort. Internal consistency of this composite was .92 using Cronbach's coefficient alpha, suggesting that these behaviors were tapping the same construct. Since students in each course section were rated by a single instructor, it was not possible to calculate interrater agreement of behavioral effort ratings.

However, research on course behavioral ratings suggests that these types of scales can be used reliably by single raters (Das, Frost, & Barnowe, 1979; Grussing, Valuck, & Williams, 1994). Further, the research suggests that behavioral ratings are related to performance in both academic and work settings, including course grades and other indicators of academic achievement (ACT, 2012; Knapp, Campbell, Borman, Pulakos, & Hanson, 2001; Motowidlo & Borman, 1977).

Dependent Variables

Mathematics knowledge. The COMPASS diagnostic tests administered at the beginning of the course were readministered during the 12th week of the semester to measure mathematics knowledge gain. The reliability estimates for the diagnostic tests, as well as how the six scores were combined to form a composite, remained as previously described for the measure of initial mathematics readiness. Controlling for initial mathematics readiness allows for measurement of the effect of course behavior on growth in mathematics knowledge (Wright, Sanders, & Rivers, 2006). Within this model of course success, posttest mathematics knowledge was treated as an intermediate rather than a final outcome.

Course success. As with most developmental mathematics courses, the developmental courses at the study institution were competency based. Students who successfully completed all required examinations and homework received a passing grade. For the purposes of this study, a dichotomous variable of course success was used that was based on pass/fail grades assigned by course instructors. Students who dropped out were coded as failures, as were the students who remained in the course but received a failing grade. Course success was defined as both staying enrolled and passing the course.

Procedures and Instructor Training

Students placed into the developmental mathematics (noncredit-bearing) courses completed COMPASS diagnostic tests during the first week of classes to provide a deeper assessment of initial skill levels in prealgebra and algebra. Instructors and students were provided pretest diagnostic results. Individual student action plans were created and reviewed with each student, and lesson plans and homework assignments were tailored to a modal class profile of diagnostic strengths and weaknesses. Instruction focused more on content in which students scored low and less on content in which students scored high.

Instructors rated students' behavioral effort in the course during week 4 of classes, after instructors had become familiar with students and their work habits. During week 12 of the semester, students were then readministered the COMPASS diagnostic tests to assess knowledge gains. As with pretest results, students and instructors were provided COMPASS diagnostic posttest reports so that individual scores could be compared to target achievement goals. Weaker content areas were targeted for additional in-course review. At the end of the semester (week 16), instructors assigned grades based on students' performance in the developmental courses.

Training for instructors on how to rate students was provided via written materials and, for those who requested it, in person. Despite these efforts, some instructors did not follow through with the ratings even after expressing initial commitment to the study and after multiple reminders.

Analytic Methods

The analyses were based on four variables: initial mathematics readiness, course behavior, mathematics knowledge, and course success. Course behavior and mathematics knowledge (COMPASS posttest) data were missing when students withdrew and where instructors did not follow through in rating students' course behavior. In addition, students who had poor attendance were less likely to take the COMPASS posttest. Instead of removing these students from the analyses altogether, a method for estimating path coefficients from the correlation matrix of the dependent and independent variables was used (see Wright, Sanders, & Rivers, 2006). This approach allowed students with valid data for the two variables forming a path to be included in that path's coefficient estimation, even if they were missing other variables in the model that were not part of the estimated path.

The path coefficients can be interpreted as standardized regression weights, where B is the number of standard deviations Y changes given a one standard deviation change in X. Standard errors (SEs) for the coefficients were calculated using bootstrap estimation. Path coefficients greater than two SEs from zero are considered statistically significant (p < .05).
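The bootstrap procedure just described can be sketched in a few lines. This is an illustration, not the authors' code: it assumes simple case resampling and a single-predictor path, in which the standardized weight reduces to a correlation; all names and the simulated data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def std_beta(x: np.ndarray, y: np.ndarray) -> float:
    """Standardized regression weight of y on x (equals their correlation
    in the single-predictor case)."""
    return np.corrcoef(x, y)[0, 1]

def bootstrap_se(x: np.ndarray, y: np.ndarray, n_boot: int = 2000) -> float:
    """Bootstrap SE: resample student records with replacement, re-estimate
    the coefficient each time, and take the SD of the estimates."""
    n = len(x)
    estimates = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample records with replacement
        estimates[b] = std_beta(x[idx], y[idx])
    return estimates.std(ddof=1)

# Hypothetical pretest and posttest scores for 200 students
pretest = rng.normal(size=200)
posttest = 0.5 * pretest + rng.normal(scale=0.8, size=200)

beta = std_beta(pretest, posttest)
se = bootstrap_se(pretest, posttest)
significant = abs(beta) > 2 * se  # the paper's two-SE criterion
```

The two-SE cutoff corresponds to an approximate 95% normal-theory test, which is why the authors equate it with p < .05.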

To cross-validate the model, data for fall andspring semesters were analyzed separately. Withineach semester, two general types of developmentalmath courses were included: prealgebra and alge-bra. Data from the two course types were pooled.To estimate each path coefficient, an indicator forcourse type was used as a covariate to eliminateconfounding associated with course type.

Results

Testing the Proposed Path Model

Table 2 summarizes the direct, indirect, and total effects of each variable in the path model on three outcomes: course success, posttest math knowledge, and course behavior ratings. Figure 1 illustrates the same results in a path diagram. Although course success is the primary outcome of interest, the prediction of the other two outcomes (using math readiness) can help in explaining the effects on this primary outcome. Results are reported by semester, and the statistically significant coefficients are marked by an asterisk in Table 2 and Figure 1.

Fall semester effects. The path coefficients for posttest math knowledge (B = .347) and course behavior (B = .343) suggested that these variables had fairly strong direct effects on course success. Course-behavior ratings also had a moderately strong direct effect on posttest math knowledge (B = .268); thus, the indirect effect of course behavior on course success (B = .093) was significant, reinforcing the importance of course behavior on both knowledge gain and course success. Due to its strong direct effect on posttest math knowledge (B = .506) and smaller effect on course behavior (B = .130), math readiness had a moderate indirect effect on course success (B = .232). In other words, mathematics readiness (pretest) impacted later mathematics knowledge (posttest) and course behavior, which in turn affected course outcome. The overall effect of posttest math knowledge and course behavior on course success was strong, with a resulting model multiple R of 0.53.
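The indirect effects in this kind of path model are products of direct coefficients summed over the connecting routes. Using the fall direct effects just cited, the reported values can be reproduced (a sketch of standard path-tracing arithmetic, not the authors' code):

```python
# Fall 2008 direct path coefficients (from Figure 1 / Table 2)
readiness_to_behavior = 0.130
readiness_to_knowledge = 0.506
behavior_to_knowledge = 0.268
knowledge_to_success = 0.347
behavior_to_success = 0.343

# Indirect effect of course behavior on success (via posttest knowledge)
behavior_indirect = behavior_to_knowledge * knowledge_to_success  # ~ .093

# Indirect effect of math readiness on success: sum over the three routes
# readiness -> knowledge -> success,
# readiness -> behavior -> success, and
# readiness -> behavior -> knowledge -> success
readiness_indirect = (
    readiness_to_knowledge * knowledge_to_success
    + readiness_to_behavior * behavior_to_success
    + readiness_to_behavior * behavior_to_knowledge * knowledge_to_success
)  # ~ .232
```

Both products round to the values reported in the text (.093 and .232), which is a useful internal-consistency check on the tabled coefficients.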

Spring semester effects. The pattern of direct and indirect effects on spring course success was similar to the fall semester. Posttest math knowledge (B = .295) and course behavior (B = .251) again had moderately strong direct effects on course success. The total effect of course behavior on course success (B = .294) was just slightly larger than its direct effect. Math readiness had a moderate indirect effect on course success (B = .222), which was again likely due to its strong effect on posttest math knowledge (B = .662) and smaller effect on course behavior (B = .106).

The spring results differed from the fall in that course behavior did not have a significant direct effect on posttest math knowledge (B = .145, SE = .077). The overall effect of posttest math knowledge and course behavior on course success was strong, with a resulting model multiple R of 0.41; however, this was noticeably smaller than the R for the fall semester sample.

Table 2
Estimates of Effects on Path Outcomes

                                    Fall semester        Spring semester
Outcome and effect                  Effect     SE        Effect     SE
Course behavior
  Direct
    Math readiness                  .130*      .038      .106*      .048
Posttest math knowledge
  Direct
    Course behavior                 .268*      .039      .145       .077
    Math readiness                  .506*      .042      .648*      .055
  Indirect
    Math readiness                  .035*      .011      .014       .010
  Total
    Math readiness                  .541*      .041      .662*      .054
Course success
  Direct
    Posttest math knowledge         .347*      .040      .295*      .065
    Course behavior                 .343*      .036      .251*      .051
  Indirect
    Course behavior                 .093*      .016      .042       .025
    Math readiness                  .232*      .028      .222*      .047
  Total
    Course behavior                 .435*      .029      .294*      .047

* Denotes statistically significant coefficients (at two or more standard errors from zero).




Overall effects. Overall, the structural model confirmed expectations that students' entering math skills and behavioral effort have direct and indirect effects on course success. The pattern of direct, indirect, and total effects on course success was similar across semesters, with one exception: The direct effect of course behavior on posttest math knowledge and its indirect effect on course success was significant in the fall but not the spring. Some possibilities for this inconsistency are addressed in the discussion.

In general, both math readiness and course behavior were useful predictors of posttest math knowledge, with math readiness being the strongest predictor. Not surprisingly, the math knowledge attained by week 12 of the semester was predictive of eventual course success. Course behavior had consistent and sizeable direct effects. Descriptive statistics and correlation matrices used to generate the path coefficients for each sample are included in the Appendix.

Another View of the Effects on Knowledge Gains and Course Success

In addition to the path models, we examined the effects of math readiness and course behavior on knowledge gains and course success by classifying students by level of behavior. To examine knowledge gains by effort level, students' course-behavior ratings were split into three levels: low (bottom 25th percentile), medium (middle 50th percentile), and high (top 25th percentile). Next, student knowledge gain was compared by effort level. Not surprisingly, students who were rated by their instructors as demonstrating a higher level of effort during the course made larger gains in math knowledge.

During the fall semester, students achieved the following mean gains in math knowledge (as measured by COMPASS) when sorted on course-behavior ratings: 7.1 (low effort), 10.6 (medium effort), and 15.9 (high effort). During the spring semester, students achieved the following mean gains when sorted based on course behavior: 8.9 (low), 10.0 (medium), and 11.9 (high). These gain scores are another way to demonstrate that effort (i.e., attending class, completing homework, participating in class discussion) is related to learning.

We also examined course success rates by students' math readiness and instructors' average course-behavior ratings. Table 3 presents fall semester success rates and Table 4 presents spring semester success rates. Each table shows students split into three levels of math readiness (low, medium, high) as well as three levels of course-behavior rating (low, medium, high). Similar to the knowledge gains previously presented, Tables 3 and 4 show substantial differences in course success rates by course behavior, both by the three overall effort levels as well as by behavior within each readiness level.
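A 25/50/25 split of the kind described above can be sketched as follows, assuming percentile cutoffs on the composite rating (the exact tie-handling rule is an assumption, and the ratings shown are hypothetical):

```python
import numpy as np

def effort_levels(ratings: np.ndarray) -> np.ndarray:
    """Classify course-behavior ratings into low (bottom 25%),
    medium (middle 50%), and high (top 25%) effort groups."""
    q25, q75 = np.percentile(ratings, [25, 75])
    levels = np.full(ratings.shape, "medium", dtype=object)
    levels[ratings <= q25] = "low"   # ties at the cutoff fall in the outer groups
    levels[ratings >= q75] = "high"
    return levels

# Hypothetical composite course-behavior ratings for eight students
ratings = np.array([1.0, 1.5, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])
levels = effort_levels(ratings)
```

With real rating data, ties at the quartile cutoffs would make the groups deviate somewhat from an exact 25/50/25 split; the paper does not say how such ties were resolved.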

As can be seen, students at all math readiness levels are more likely to do well if they engage in appropriate classroom and homework activities, whereas those students who do not exert sufficient effort have lower probabilities of success. In particular, those students at the lower end of the readiness spectrum who demonstrate low effort are at high risk for failure. Indeed, only 19% of students in this category completed the course successfully in the fall and 10% completed the course successfully in the spring.

DiscussionThe effects of math readiness and course behavioron course success were examined using a sampleof students enrolled in developmental mathemat-ics courses. Specifically, this model combinedacademic skills (math readiness) and academicbehavior (in-class behavior ratings) to predictcourse success. Results from this study reinforcethe importance of understanding the combinationof academic skills and student effort on develop-mental course success. Math readiness showedstrong direct effects on posttest math knowledgeas well as indirect effects on course success viacourse behavior. Posttest math knowledge alsoshowed strong direct effects on course success.Further, student course behavior showed a strongdirect effect on course success, as well as indirecteffects through posttest math knowledge. Across

both semesters, the model results confirmed our expectations that students' entering math readiness and course behavior have direct and indirect effects on course success.

Although the model estimates were mostly consistent across the two semesters, there were some differences that may not be completely due to chance. The overall model fit for predicting course success was larger for fall (R = 0.53) than for spring (R = 0.41). The spring sample was a mixture of new enrollees and students who enrolled in the fall but took the developmental math course in the spring. The two samples were quite similar in terms of demographics and mean math readiness, but more students experienced course success in the fall (62%) than in the spring (53%).
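The article reports fitted path coefficients but not the estimation procedure. As a rough sketch only, the path structure described above can be approximated as two sequential least-squares regressions on standardized variables; this is a simplification of the authors' (unspecified) method — in particular, a binary success outcome would more typically be modeled with logistic regression — and all variable and function names below are illustrative, not from the study's data files.

```python
import numpy as np

def standardize(x):
    return (x - x.mean()) / x.std()

def fit_path_model(readiness, behavior, posttest, success):
    """Sketch of the path model as two sequential OLS regressions
    on standardized variables:
      (1) posttest ~ readiness + behavior
      (2) success  ~ posttest  + behavior
    Returns the direct effects plus the indirect effect of course
    behavior on success routed through posttest knowledge."""
    r, b, p, s = map(standardize, (readiness, behavior, posttest, success))
    ones = np.ones_like(r)
    # Stage 1: posttest knowledge predicted by readiness and behavior.
    beta1, *_ = np.linalg.lstsq(np.column_stack([ones, r, b]), p, rcond=None)
    # Stage 2: course success predicted by posttest knowledge and behavior.
    beta2, *_ = np.linalg.lstsq(np.column_stack([ones, p, b]), s, rcond=None)
    return {
        "readiness->posttest": beta1[1],
        "behavior->posttest": beta1[2],
        "posttest->success": beta2[1],
        "behavior->success (direct)": beta2[2],
        # Product-of-coefficients estimate of the indirect path.
        "behavior->success (indirect)": beta1[2] * beta2[1],
    }
```

An indirect effect here is simply the product of the coefficients along the path, which is how "direct and indirect effects" are decomposed in standard path analysis.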

Differences in the success rates and differences in the model results might be related to factors associated with fall and spring enrollment. One hypothesis is that spring enrollees are more likely to have delayed enrollment for reasons that also cause them to have additional challenges to course success (e.g., competing priorities such as family obligations and needing to work more hours). This hypothesis is supported by prior research that has shown that delayed enrollment is related to lower odds of degree completion (even after controlling for achievement, socioeconomic factors, and high school completion)

Table 3
Success Rates for Fall 2008 Based on Math Readiness and Course Behavior

                         Effort level (course behavior)
Math readiness        High      Medium      Low
High                   .92        .80       .59
Medium                 .86        .67       .29
Low                    .74        .50       .19

Note. N = 713.

Table 4
Success Rates for Spring 2009 Based on Math Readiness and Course Behavior

                         Effort level (course behavior)
Math readiness        High      Medium      Low
High                   .83        .65       .50
Medium                 .68        .51       .30
Low                    .64        .43       .10

Note. N = 304.
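Each cell in Tables 3 and 4 is simply the proportion of students in that readiness-by-effort group who passed. A minimal sketch of that computation follows; the record format and the `success_rates` helper are hypothetical, not taken from the study's materials.

```python
from collections import defaultdict

def success_rates(records):
    """Pass rate within each (readiness, effort) cell, mirroring the
    layout of Tables 3 and 4. Each record is a tuple of
    (readiness level, effort level, passed flag as 0/1)."""
    cells = defaultdict(lambda: [0, 0])  # (readiness, effort) -> [passes, total]
    for readiness, effort, passed in records:
        cells[(readiness, effort)][0] += passed
        cells[(readiness, effort)][1] += 1
    return {cell: round(passes / total, 2)
            for cell, (passes, total) in cells.items()}
```

For example, a low-readiness, low-effort group in which 19 of 100 students passed yields the .19 entry reported for fall.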


and that delayed enrollees are more likely to transition to other life roles such as spouses or parents before entering college (Bozick & DeLuca, 2005).

The findings of this study generally demonstrate that better course behavior was predictive of larger knowledge gains. Further, higher course-behavior ratings were also associated with a much greater likelihood of course success (defined as remaining enrolled and passing the course). In other words, as important as diagnostic assessment and effective course placement are to knowledge gain and academic mastery, a student's willingness to do the work is essential for learning and course success. Although this is an intuitive finding, it is not clear that institutions are successfully addressing both issues (academic skill and student behavior deficits) as part of their core mission of classroom instruction. Our findings suggest that a two-pronged approach to address both issues is necessary for students to succeed in developmental courses.

Study findings indicate that behavioral components are critical to the academic mastery and persistence that culminate in degree attainment. Further, it is evident that "hybrid" interventions are effective in helping students to succeed in postsecondary studies (cf. Robbins et al., 2009). Hybrid interventions are those involving a combination of assistance with academic skills content (e.g., prealgebra and algebra skills in a developmental mathematics course) as well as assistance in developing effective academic behaviors (e.g., goal setting, time management, working productively with peers and instructors, regulating affect and behavior, etc.). It is also evident that when students at risk are (a) identified on a timely basis, (b) provided interventions to address their needs, and (c) make even mild-to-moderate use of the prescribed interventions, they benefit with increased GPA and persistence in enrollment (Robbins et al., 2009). The model being developed at the study institution contains the components necessary to continue to examine the use and effectiveness of hybrid interventions in community colleges.

Limitations

The study used a well-defined path model that was tested for 2 semesters, which allowed assessment of the model's consistency and reduced potential problems related to sampling error. However, the sampling error issue is illustrated by the differences in path coefficients from instructor-reported behavior to posttest results for the fall and spring samples. This difference may be due to chance or to variations in the samples associated with spring- and fall-enrolled developmental math students. Including data from subsequent semesters would allow even more robust estimates and minimize the possibility that conclusions are due to sampling error.

In addition, this study was based only on developmental mathematics courses. Replicating

20

these findings across more academic areas considered essential for college readiness and success (e.g., English and science) would lend further credence to the model. Also, this research is limited to students from one institution; conducting this work at multiple institutions (preferably from different geographical regions) would strengthen the generalizability of the results. Further, as noted in the methods section, despite attempts to provide instructors with training on assigning behavioral ratings, some instructors did not follow through with the ratings. We looked at institutional records to see if this issue may have impacted our results in any systematic way and found that students enrolling in different sessions (i.e., different instructors) of the same course showed similar retention, course success rates, and behavioral patterns. Thus, we believe that the student effort distributions from the missing cases would resemble those we included in this paper. Based on this information, we do not believe that missing ratings had a systematic impact on our findings.

Behavioral components are critical to academic mastery and persistence.

A final limitation of the study is caused by missing data associated with course failure: Because students with lower effort and poor attendance were less likely to take the COMPASS posttest, it is possible that our estimates of the direct effect of posttest knowledge on course success are biased.

Implications for Practice

Our findings provide an initial step to empirically support the assessment and intervention model that is being developed at the study institution. In this model, each new student is assessed from both academic and behavioral risk perspectives and subsequently referred to resources for academic and behavioral skill development. The diagnostic score profiles help faculty and students to better understand areas of strength and need in mathematics content areas. These profiles can be used to more effectively target instruction and intervention both at the individual and classroom levels. Not all students require remediation in all content areas. Thus, helping instructors and students target class-modal and individual profiles is likely to improve student learning as well as student motivation and engagement. Using timely and individualized information about specific skills may make the course seem more approachable, thus increasing students' sense of mastery, which in turn is related to increased motivation and engagement (e.g., Hsieh, Sullivan, & Guerra, 2007; Rosenbaum et al., 2007; Senko, Hulleman, & Harackiewicz, 2011).

For example, institutions making use of this information could arrange course sections based on students' profiles so that those students with similar needs (e.g., difficulty with prealgebra concepts) could be grouped together and provided with additional instructional time designed to address their specific skill deficit. This grouping could also be done by instructors so that instruction time and/or content modules can be organized to address the topics that most students find challenging. In addition, institutions and instructors could use students' profiles to match students with strengths in certain content areas with those who have deficits in those same areas as part of a tutoring program. Such a program may have the added benefit of helping to strengthen students' social connectedness within the academic context, which previous research shows is related to persistence in degree attainment (e.g., Robbins et al., 2006). Finally, students' profile information can help to provide a clearer picture to students about their specific skills relative to what they need to know to be successful in future courses and keep them better apprised of their progress toward degree completion.

Institutions could apply the same principles of diagnostic testing and differentiated instruction of academic needs to address students' behavioral needs, such as time management and organizational skills, self-discipline, study habits, communication skills, working in teams, and building resilience when faced with challenges. The potential positive impact of behavioral assessment is driven to some extent by the strength of the predictive relationship between assessed behaviors and college outcomes. It is driven even more so by improvements in behavioral intervention programs that can result from the assessment, such as those that address engagement, study skills, and academic motivation (Allen, Robbins, & Sawyer, 2010). For example, students often arrive in college with the goal to major in a particular field without concrete strategies for how to successfully achieve that goal. Community colleges may need to help students develop such strategies, including creating schedules, keeping assignments and priorities well-organized despite competing demands related to work and family, making time to periodically meet with instructors regarding coursework, constantly improving study skills, developing strategies for dealing with obstacles that may come along, and knowing what resources are available (e.g., academic advising, career center, wellness programs, etc.) to make use of them when needed. These topics are often part of an onboarding or first-year experience short course at many four-year colleges. Given the behavioral needs of college students in general, such courses may resonate with students at community colleges. Thus, institutions must have tight alignment between assessment and behavioral intervention just as they must have alignment between academic assessment and instructional programs. In fact, institutions that have created a crosswalk between students' behavioral needs and available campus resources and that explicitly raise students' awareness of such resources via email communications, advising sessions, and early warning systems have found that (a) students are more likely to use available resources, and (b) they experience direct and measurable benefits in the form of increased academic performance and persistence in enrollment from one semester to the next (e.g., Robbins et al., 2009; Tampke & Casillas, 2011).

Implications for Future Research

This study serves as an important first step in evaluating the impact of individualized assessment and instruction for community college students. Future research needs to more carefully control and evaluate model components, as well as expand the model to include interventions. Specifically, future work can concentrate on capturing students' perceptions of services as well as actual service use patterns at community colleges. Such research is essential to gain a better understanding of (a) how students decide to pursue service referrals after their needs have been identified by the institution, (b) which services facilitate the most effective outcomes, and (c) which students benefit the most from said services. The findings from this study suggest that a joint focus on academic skills and academic behaviors is imperative for students to experience success in developmental courses. However, it would also be interesting to examine whether a similar model is effective for a broader range of entry-level courses (e.g., mainstream, introductory English and science, certification programs, etc.).

Conclusion

The current study integrated approaches (academic skills used for course placement and ratings of course behavior) typically used in different lines of research to better understand students' success in developmental math courses. Study findings are consistent with previous literature in that academic skills (e.g., Bahr, 2010) and course behavior (e.g., Farsides & Woodfield, 2003) are both important to successful completion of developmental mathematics courses. However, to our knowledge, this is among the first studies to examine the interconnected roles of academic skills and effort for predicting student performance in developmental courses. These findings underscore the need for institutions to take a multifaceted approach to assessing and identifying student academic and behavioral skill


gaps, and, in turn, provide resources designed to address these gaps. For students, this approach has the potential to provide more customized instruction and feedback based on their specific needs, thus leading to higher rates of successful course completion and degree attainment. For institutions, this approach has the potential to contribute to a tighter alignment between classroom instruction and support services, thus supporting institutions' ultimate purpose of helping individuals achieve their educational goals. In order to move away from a "one-size-fits-all" approach to instruction, service delivery, and student success in educational systems, it seems essential to develop a fuller understanding of the skills (whether based on academic content areas or behavior) that lead to student persistence and success.

References

ACT, Inc. (2006). COMPASS Internet version reference manual. Iowa City, IA: Author.
ACT, Inc. (2010). The condition of college & career readiness 2010. Iowa City, IA: Author.
ACT, Inc. (2012). ACT Engage teacher edition user guide. Iowa City, IA: Author.
Allen, J., Robbins, S., & Sawyer, R. (2010). Can measuring psychosocial factors promote college success? Applied Measurement in Education, 23(1), 1-22.
Attewell, P., Lavin, D., Domina, T., & Levey, T. (2006). New evidence on college remediation. Journal of Higher Education, 77(5), 886-924.
Bahr, P. (2010). Making sense of disparities in mathematics remediation: What is the role of student retention? Journal of College Student Retention, 12(1), 25-49.
Bailey, T., Jeong, D. W., & Cho, S. W. (2010). Referral, enrollment, and completion in developmental education sequences in community colleges. Economics of Education Review, 29, 255-270.
Biswas, R. R. (2007). Accelerating remedial math education: How institutional innovation and state policy interact. Retrieved from http://office.achievingthedream.org/sites/default/files/resources/RemedialMath.pdf
Boylan, H. R. (2009). Targeted intervention for developmental education students (T.I.D.E.S.). Journal of Developmental Education, 32(3), 14-23.
Bozick, R., & DeLuca, S. (2005). Delayed enrollment in the high school to college transition. Social Forces, 84(1), 531-554.
Conard, M. A. (2006). Aptitude is not enough: How personality and behavior predict academic performance. Journal of Research in Personality, 40, 339-346.
Conley, D. T. (2007). Toward a more comprehensive conception of college readiness. Eugene, OR: Educational Policy Improvement Center.
Crede, M., & Kuncel, N. R. (2008). Study habits, skills, and attitudes: The third pillar supporting collegiate academic performance. Perspectives on Psychological Science, 3, 425-453.
Das, H., Frost, P. J., & Barnowe, J. T. (1979). Behaviorally anchored scales for assessing behavioral science teaching. Canadian Journal of Behavioural Science, 11(1), 79-88.
Dollinger, S. J., Matyja, A. M., & Huber, J. L. (2008). Which factors account for academic success: Those which college students can control or those they cannot? Journal of Research in Personality, 42, 872-885.
Farsides, T., & Woodfield, R. (2003). Individual differences and undergraduate academic success: The roles of personality, intelligence, and application. Personality and Individual Differences, 34, 1225-1243.
Golfin, P., Jordan, W., Hull, D., & Ruffin, M. (2005). Strengthening mathematics skills at the postsecondary level: Literature review and analysis. Washington, DC: U.S. Department of Education.
Grussing, P. G., Valuck, R. J., & Williams, R. G. (1994). Development and validation of behaviorally-anchored rating scales for student evaluation of pharmacy instruction. American Journal of Pharmaceutical Education, 58(1), 25-37.
Hsieh, P., Sullivan, J., & Guerra, N. (2007). A closer look at college students: Self-efficacy and goal orientation. Journal of Advanced Academics, 19, 454-476.

Appendix

Table A1. Descriptive Statistics and Intercorrelation Matrix for Fall 2008 Sample

Variable                       N      M     SD     1     2     3
1. Math readiness             724   28.9    7.7
2. Posttest math knowledge    436   41.7   12.1   .52
3. Course behavior            800   14.0    4.8   .13   .33
4. Course success             819    0.6    0.5   .30   .45   .46

Table A2. Descriptive Statistics and Intercorrelation Matrix for Spring 2009 Sample

Variable                       N      M     SD     1     2     3
1. Math readiness             363   32.3   10.2
2. Posttest math knowledge    195   40.0   13.4   .67
3. Course behavior            370   15.2    4.7   .22   .12
4. Course success             435    0.5    0.5   .19   .34   .30
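The appendix tables report a separate N for each variable because not every student has every measure (for example, many students did not take the posttest). A minimal sketch of how such descriptive statistics and a pairwise-complete correlation matrix can be computed follows; the `describe` helper and variable arrays are illustrative, not the study's actual code.

```python
import numpy as np

def describe(columns):
    """Per-variable N, mean, and SD plus pairwise-complete correlations,
    matching the layout of Tables A1 and A2. `columns` maps variable
    name -> 1-D array, with np.nan marking missing observations."""
    stats = {}
    for name, x in columns.items():
        obs = x[~np.isnan(x)]
        stats[name] = (obs.size, round(obs.mean(), 1), round(obs.std(ddof=1), 1))
    names = list(columns)
    corr = {}
    for i, a in enumerate(names):
        for b in names[:i]:
            x, y = columns[a], columns[b]
            # Pairwise deletion: use only cases observed on both variables.
            mask = ~np.isnan(x) & ~np.isnan(y)
            corr[(a, b)] = round(np.corrcoef(x[mask], y[mask])[0, 1], 2)
    return stats, corr
```

Pairwise deletion is one common convention for such matrices; listwise deletion (dropping any case with a missing value anywhere) is an alternative, and which the authors used is not stated here.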
