

This article was downloaded by: [The University of British Columbia] on: 09 December 2014, at: 19:06. Publisher: Routledge. Informa Ltd Registered in England and Wales, Registered Number: 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Journal of Education for Business. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/vjeb20

The Impact of Problem Sets on Student Learning. Myeong Hwan Kim (a), Moon-Heum Cho (b) & Karen Moustafa Leonard (a). (a) Indiana University–Purdue University Fort Wayne, Fort Wayne, Indiana, USA; (b) Kent State University–Stark, North Canton, Ohio, USA. Published online: 01 Feb 2012.

To cite this article: Myeong Hwan Kim, Moon-Heum Cho & Karen Moustafa Leonard (2012) The Impact of Problem Sets on Student Learning, Journal of Education for Business, 87:3, 185-192, DOI: 10.1080/08832323.2011.586380

To link to this article: http://dx.doi.org/10.1080/08832323.2011.586380


Taylor & Francis makes every effort to ensure the accuracy of all the information (the "Content") contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions


JOURNAL OF EDUCATION FOR BUSINESS, 87: 185–192, 2012
Copyright © Taylor & Francis Group, LLC
ISSN: 0883-2323 print / 1940-3356 online
DOI: 10.1080/08832323.2011.586380

The Impact of Problem Sets on Student Learning

Myeong Hwan KimIndiana University–Purdue University Fort Wayne, Fort Wayne, Indiana, USA

Moon-Heum ChoKent State University–Stark, North Canton, Ohio, USA

Karen Moustafa LeonardIndiana University–Purdue University Fort Wayne, Fort Wayne, Indiana, USA

The authors examined the role of problem sets on student learning in university microeconomics. A total of 126 students participated in the study in consecutive years. An independent samples t test showed that students who were not given answer keys outperformed students who were given answer keys. Multiple regression analysis showed that, along with pre-GPA and student major, a problem set with or without an answer key significantly explained student learning in economics. The authors discuss the role of answer keys and implications for teaching university economics courses.

Keywords: microeconomics, problem sets, student evaluation, student learning

Homework is one of the most common instructional strategies instructors use to enhance student learning (Cooper & Valentine, 2001; Warton, 2001). Homework refers to any assignment given by teachers to students that is designed to be done outside of regular class hours (Cooper, Robinson, & Patall, 2006; Cosden, Morrison, Albanese, & Macias, 2001; Olympia, Sheridan, & Jenson, 1994). Cooper et al. suggested that there are instructional and noninstructional roles for homework in Grades K–12. From an instructional perspective, homework is an important opportunity for students to review, practice, and synthesize skills or concepts. By completing homework, students are expected to master skills or knowledge and transfer them to different contexts (Becker & Epstein, 1982; Cooper et al.; Epstein & Voorhis, 2001). From a noninstructional viewpoint, homework creates a communication opportunity between parents and young students, as well as providing information about the class and school to parents (Epstein & Voorhis). In addition, homework fulfills directives of school administrators, but it can also be used as a way to punish students. However, in higher education, the

Correspondence should be addressed to Myeong Hwan Kim, Indiana University–Purdue University Fort Wayne, Department of Economics, 2101 E. Coliseum Boulevard, Fort Wayne, IN 46805, USA. E-mail: [email protected]

instructional aspect of homework is more critical than its noninstructional roles because students are independent learners who must take responsibility for their own learning.

Researchers have found that homework is more effective in secondary than in elementary school classes (Cooper et al., 2006; Keith, Reimers, Fehrman, Pottebaum, & Aubrey, 1986). In their meta-analysis, Cooper et al. found a positive influence of homework on student learning in general. However, Cooper et al. showed stronger correlations between homework and learning in Grades 7–12 than in K–6. In addition, Cooper and Valentine (2001) and Cooper et al. consistently found that the relationship between time spent on homework and learning was weaker in elementary than in secondary school. Cooper and Valentine suggested that younger students have less ability to pay attention to the homework and use less effective study habits. Therefore, it seems homework is a more effective instructional strategy in higher grade levels than in lower grade levels.

However, most of the homework studies were conducted in K–12 settings. Very little research has been done in higher education (Radhakrishman, Lam, & Ho, 2009), although homework plays a significant role in instruction at this level (Emerson, 2011). Unlike high school students, university students are expected to take more responsibility for their own learning. Whereas younger students received assistance from parents, university students cannot expect the same or a similar type



186 M. H. KIM ET AL.

of help. Students are independent learners required to take responsibility for their own learning.

In the present study, we investigated the role of homework on student learning in a university setting, specifically in an economics class, and particularly the role of answer keys given with homework problem sets. The specific research question guiding the study was the following:

Research Question 1: Do college students in a condition where they received problem sets with answer keys perform better in an economics class than students in a condition where they only received problem sets?

We believe that the study results expand our understanding of the relationship of homework to student achievement in learning economics in higher education.

Problem Sets as Homework in Economics Classes

Although there are diverse forms of homework in an economics class, one of the most common is problem sets given as homework (Becker & Watts, 2001; Bonham, Deardorff, & Beichner, 2003; Emerson, 2011; Grove, Wasserman, & Grodner, 2006; Miller & Westmoreland, 1998). This is a particularly common strategy in mathematics, physics, and computer science, as well as in economics. Because the content in these areas is regarded as particularly difficult for many students, problem sets are used to help students solve different types of problems and to practice ways to approach the problems. Often, instructors choose some or many of these problems for subsequent tests.

The types and numbers of problems in each set vary depending on the instructor. Ways to use problem sets as an instructional strategy are also very diverse, including individual work with problem sets, collaboration on problem sets, and presentation of the results of problem sets. Although many faculty members use problem sets, empirical research investigating their effect is very rare. Much research related to problem sets in economics classes has been concerned with the role of student aptitude variables (e.g., Scholastic Aptitude Test [SAT] scores) in learning (Cooper et al., 2006; Okpala, Okpala, & Ellis, 2000). Okpala et al. argued that, in a university macroeconomics class, academic efficacy and study habits were positively and significantly related to students' performance. Okpala et al. also showed that SAT scores and accumulated credit hours were significantly related to performance for above-average students but not for below-average students. That is, these effects could be found for students doing well, but not for those doing poorly.

There is a lack of experimental research investigating the role of problem sets in student learning (Grove & Wasserman, 2006). There are few examples of empirical research using problem sets in an economics course. Miller and Westmoreland (1998) conducted an empirical study to investigate the effectiveness of alternative versus traditional

ways of grading in college economics courses. Students in the experimental group were asked to solve all the problem sets as homework. The instructor solved only two selected problems during class, and students in the experimental group did not know which questions the instructor was planning to solve. Students in the control group also were asked to solve all the problem sets as homework. However, the instructor in the control group solved all of the problems during class. Miller and Westmoreland did not find any significant differences in student learning between the two groups. They recommended that, when problem sets are used as an instructional strategy to enhance students' learning, instructors should solve only a small number of problems and not let students know which questions they will solve in class.

Grove and Wasserman (2006) investigated the effectiveness of using grade incentives for completing problem sets on college freshmen's learning in an economics course. The problem set assignment was part of the course grade for students in the experimental group, but not for those in the control group. Grove and Wasserman showed that grade incentives significantly enhanced students' learning. However, this effect was stronger for average freshmen than for those above or below average.

Online systems can be used to assist in assigning problem sets as homework. Emerson (2011) used Aplia, an automated online homework program designed to grade problem sets and provide feedback on solved problems, in a microeconomics course. Students in the experimental group were required to solve problem sets using Aplia, whereas its use was optional for students in the control group. Emerson argued that students who were required to use Aplia achieved significantly better final exam and course grades than did students in the control group. Emerson found that assigning homework can enhance student learning and that an online system can reduce the instructor's workload.

In the present study, we investigated the effects of problem sets with and without answer keys on students' learning. In particular, we chose the most common assignment, giving a large number of problems, and examined its effect on student learning. Specifically, our hypothesis, working from the prior literature, was the following:

Hypothesis 1: Students receiving problem sets with answer keys are expected to do better than those who receive problem sets without answer keys.

METHOD

Participants were students taking a principles of microeconomics course during fall 2008 and fall 2009 at a Midwestern university, taught by the same instructor. During fall 2008, 100 students were enrolled in the course, and during fall 2009, 75 students were enrolled. The demographics of




these students are shown in Appendix A, including the exam score mean, standard deviation, and range for all students in the classes and for those students for whom all information was available.

In each class, taught by the same instructor, students were evaluated by two quizzes (15% each), two exams (20% each), and one comprehensive final exam (30%). The main teaching method in both years was lecture. The instructor provided students with the same lecture notes and problem sets in each class. Further, the instructor gave identical exams (quizzes, midterms, and comprehensive final exam) to both classes. The only difference between the two samples was the provision or lack of an answer key to the problem sets. The instructor did not provide the answer keys to students enrolled in the three sections of fall 2008; however, students enrolled in the two sections of fall 2009 received answer keys. Students were informed that over 90% of the quizzes, midterms, and final exam multiple choice test questions would be drawn from the problem sets.

The sample size dropped from 100 to 75 for fall 2008 and from 75 to 51 for fall 2009 because we were unable to obtain pre-GPAs for all students. We assume that selection bias was not a problem. We compared the mean difference between the two sample groups before and after the drop in observations. The results, shown in Table 1, indicate that the null hypothesis of equal means was rejected in both tests, meaning that the means of the two samples differed significantly both before and after the drop. Because the pattern of group differences was unchanged, we conclude that the drop in observations did not introduce selection bias into our study.

In addition, Appendix A contains the final exam score, gender, age, high school GPA percentile, student major, and whether the student was part of a learning community. Learning communities are groups of students in linked or paired courses. At the university in this study, students have the opportunity to take introductory microeconomics as part of a learning community experience. Learning communities are generally designed to improve the learning experience, study habits, and problem-solving skills of students (Schroder, 2010). Therefore, it is a needed control variable in this study because students involved in learning communities might be expected to have higher grades in a principles of microeconomics course.

To determine the role of the problem sets with and without the answer key, an independent samples t test was performed to calculate the mean difference between the two groups. We then conducted multiple regressions. The basic approach was to estimate a simple linear equation with the exam score as the dependent variable and with environment-related variables, such as the same-gender ratio in the classroom, and previous academic performance-specific variables considered as explanatory variables. We included the same-gender ratio to see whether students performed better if more students of the same gender were in their class, assuming that students got along well with others of the same gender, as group study was encouraged by the instructor.

These considerations led to the following model:

Exam Score_i = α0 + α1 Gender_i + α2 Age_i + α3 Pre-GPA_i + α4 Learning Community_i + α5 Business Major_i + α6 Gender Ratio_i + α7 Problem Sets_i + ε_i    (1)

where Exam Score denotes the total points earned from all tests and a comprehensive final exam at the end of the academic term; Gender is a binary dummy variable equal to unity if student i is male and zero otherwise; Age denotes age; Pre-GPA denotes i's GPA before the course in question; Learning Community is a binary variable equal to unity if i is in the learning community; Business Major is a binary variable equal to unity if i is majoring in business; Gender Ratio denotes the same-gender ratio of i's class (male-to-female if i is male, female-to-male if i is female); Problem Sets is a binary variable equal to unity if i received an answer key for the problem sets; α is a vector of coefficients; and ε represents the omitted other influences on the exam score and is assumed to be well behaved.
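Equation 1 can be estimated by ordinary least squares. A minimal sketch with statsmodels follows; the data, variable names, and coefficient values below are simulated stand-ins chosen to loosely echo the signs reported later, not the authors' dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 126  # sample size used in the study

# Simulated stand-ins for the study's variables (illustrative only).
df = pd.DataFrame({
    "gender": rng.integers(0, 2, n),           # 1 = male
    "age": rng.integers(18, 30, n),
    "pre_gpa": rng.uniform(1.0, 4.0, n),
    "learning_community": rng.integers(0, 2, n),
    "business_major": rng.integers(0, 2, n),
    "gender_ratio": rng.uniform(0.3, 0.7, n),  # same-gender share of the class
    "problem_sets": rng.integers(0, 2, n),     # 1 = answer key provided
})
# Simulated outcome: positive pre-GPA and major effects, negative answer-key effect.
df["exam_score"] = (30 + 6 * df["pre_gpa"] + 6 * df["business_major"]
                    - 5 * df["problem_sets"] + rng.normal(0, 10, n))

# OLS estimation of Equation 1.
model = smf.ols(
    "exam_score ~ gender + age + pre_gpa + learning_community"
    " + business_major + gender_ratio + problem_sets",
    data=df,
).fit()
print(model.summary())
```

With real data, the coefficient of interest is α7 on `problem_sets`, read off `model.params["problem_sets"]`.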

The final form of the model specification based on the general functional form (Equation 1) was obtained from a general-to-specific specification search, which was popularized

TABLE 1
Comparison of Background in the Principles of Microeconomics Course

                           Answer is not given group    Answer is given group
Variable                   M         SD                 M         SD             t          p
Exam score (all)           68.893    11.614             63.045    13.688         2.981      0.003
Exam score                 67.553    10.971             61.852    12.508         2.637      0.010
Final exam                 56.300    14.940             52.843    15.653         1.240      0.100
Gender                     0.693     0.464              0.745     0.440          -0.634     0.528
Age                        21.760    3.816              22.431    4.006          -0.941     0.349
High school percentile     65.213    22.825             61.216    20.995         1.013      0.313
Pre-GPA                    2.711     0.677              2.588     0.496          1.182      0.240
Learning community         0.227     0.421              0.333     0.476          -1.292     0.199
Business major             0.427     0.498              0.471     0.504          -0.362     0.718
Gender ratio               0.575     0.165              0.608     0.194          -34.962    0.000




particularly by Hendry (1993, 1995), Hendry and Mizon (1990), and Mizon (1995). Hendry and Krolzig (2001) recommended the use of multiple search paths in the process of moving from a generalized unrestricted model (GUM) to a parsimonious specification. The reason for this recommendation is to avoid the risk of deleting, along any single search path, an important variable that should ideally be retained in the final specification, and to minimize the risk of retaining variables as proxies for a missing variable, with the result that the final model is over-parameterized. In addition, this approach is well suited to searching over unspecified functional forms because the underlying theory is sufficiently loose to admit a wide range of candidate regressors. Therefore, the final form of each term in the generic Equation 1 was determined by parsimony and satisfactory performance against diagnostic tests, together with the Schwarz criterion.
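One search path of such a general-to-specific simplification can be sketched as a greedy backward elimination on the Schwarz criterion (BIC). The data and candidate regressors below are simulated and illustrative (including a deliberately irrelevant `noise_var`), not the study's:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 126
df = pd.DataFrame({
    "pre_gpa": rng.uniform(1.0, 4.0, n),
    "age": rng.integers(18, 30, n).astype(float),
    "business_major": rng.integers(0, 2, n),
    "noise_var": rng.normal(0, 1, n),  # an irrelevant candidate regressor
})
df["exam_score"] = (40 + 6 * df["pre_gpa"] + 6 * df["business_major"]
                    + rng.normal(0, 8, n))

def simplify(dep, regressors, data):
    """Starting from the general unrestricted model, repeatedly drop the
    regressor whose removal most lowers the BIC; stop when no removal helps."""
    current = list(regressors)
    best_bic = smf.ols(f"{dep} ~ " + " + ".join(current), data).fit().bic
    improved = True
    while improved and len(current) > 1:
        improved = False
        for r in list(current):
            trial = [c for c in current if c != r]
            bic = smf.ols(f"{dep} ~ " + " + ".join(trial), data).fit().bic
            if bic < best_bic:
                best_bic, current, improved = bic, trial, True
    return current

kept = simplify("exam_score", ["pre_gpa", "age", "business_major", "noise_var"], df)
print(kept)
```

A full Hendry-style search would follow multiple elimination paths and run diagnostic tests at each step; this single greedy path only illustrates the BIC-based pruning.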

RESULTS

Several statistical methods were used to answer our research question, including correlations, a t test, and multiple regressions. First, we present a simple correlation coefficient matrix of the data set in Appendix B.

Table 1 shows the background comparison between the two course groups. Group differences were observed in exam scores, both before and after the drop in observations, as well as in the comprehensive final exam and the gender ratio. In the 2008 economics class, 57.5% of the students were men and 42.5% were women. In the 2009 class, 60.8% of the students were men and 39.2% were women. The number of men versus women, age, high school percentile, previous GPA, number of learning community students, and number of business majors were statistically equivalent between the two groups. No other differences were found between the attributes of the two groups.

Using a t test, the mean differences between the courses in fall 2008 and fall 2009 were compared (Group 1 in 2008: answer keys were not given; Group 2 in 2009: answer keys were given). The difference in mean exam scores between the two groups was significantly different from zero (p = .010). Inspection of the two group means suggests that students who were not given answer keys in fall 2008 (x̄1 = 67.553) statistically outperformed students who were given answer keys in fall 2009 (x̄2 = 61.852), as well as on the comprehensive final exam.
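An independent samples t test of this kind can be reproduced in outline with scipy. The scores below are simulated to match the reported group means and standard deviations, not the actual student records, and Welch's variant (which does not assume equal variances) is used as one reasonable choice:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Scores simulated to match the reported group statistics:
# no answer key:  M = 67.553, SD = 10.971, n = 75 (fall 2008)
# with answer key: M = 61.852, SD = 12.508, n = 51 (fall 2009)
no_key = rng.normal(67.553, 10.971, 75)
with_key = rng.normal(61.852, 12.508, 51)

# Independent samples t test, Welch's variant (equal_var=False).
t_stat, p_value = stats.ttest_ind(no_key, with_key, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```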

We conducted multiple regression analysis to determine whether variables other than the provision of answer keys may contribute to group differences in student learning. We ran two different sets of regressions for two different dependent variables: total scores earned from quizzes, midterms, and the final comprehensive exam, and the final comprehensive exam alone. First, we ran simple ordinary least squares (OLS), which is a standard and the most popular approach to the approximate solution of overdetermined systems. Second, we used the "areg" technique in Stata (2007), which fits a linear regression absorbing one categorical factor and calculates heteroskedasticity-robust standard errors. This method implements a fixed-effects regression and is equivalent to adding a dummy for each subject, but the value of each dummy is not shown (Stata). The number of valid observations for multiple regression analysis was 126. Results are shown in Table 2. Multiple regression analysis clearly demonstrates that pre-GPA, student major, and problem sets with or without an answer key significantly explain students' final scores and comprehensive final exam scores. In other words, provision of problem set answers had a negative effect on the final score and on the comprehensive final exam.
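Outside Stata, the areg step can be approximated by adding one dummy per absorbed category with heteroskedasticity-robust standard errors. A sketch on simulated data follows; the section identifier and all values are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 126
df = pd.DataFrame({
    "pre_gpa": rng.uniform(1.0, 4.0, n),
    "section": rng.integers(0, 5, n),  # hypothetical absorbed categorical factor
})
df["exam_score"] = (40 + 6 * df["pre_gpa"] + 2 * df["section"]
                    + rng.normal(0, 8, n))

# areg-style fixed effects: absorbing a categorical factor is equivalent
# to adding one dummy per level (C(section)); cov_type="HC1" requests
# heteroskedasticity-robust standard errors.
fe = smf.ols("exam_score ~ pre_gpa + C(section)", data=df).fit(cov_type="HC1")
print(fe.params["pre_gpa"], fe.bse["pre_gpa"])
```

As the note to Table 2 observes, any regressor that is constant within the absorbed category (here, a treatment applied at the section level) cannot be separately identified and drops out.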

Results from various sensitivity analyses are shown inTable 3. First, we divided the sample by gender to determinewhether there are any differences between men and women.

TABLE 2
Regression Results

                        OLS                                          AREG
                        Exam score        Final exam                 Exam score        Final exam
Variable                Coeff.     SE     Coeff.     SE              Coeff.     SE     Coeff.     SE
Gender                  -7.173     5.344  -5.742     7.169           -7.173*    1.896  -5.742     5.129
Age                     0.224      0.255  0.289      0.347           0.224†     0.086  0.289†     0.107
Pre-GPA                 6.309**    1.558  5.358*     2.103           6.309*     1.857  5.358*     1.241
Learning community      -3.616     2.217  -3.506     3.003           -3.616†    1.537  -3.506     1.810
Business major          6.228**    1.963  6.753*     2.646           6.228*     1.436  6.753      4.299
Problem sets            -5.291**   1.970  -4.602†    2.657           —                 —
Gender ratio            20.990     13.680 27.659     18.379          20.990†    8.618  27.659     15.749
Constant                36.650**   8.675  21.475†    11.698          34.508*    7.701  19.657†    7.763
Observation             126               124                        126               124
R2                      0.263             0.175                      0.263             0.175

Note. AREG fits a linear regression absorbing one categorical factor, with a large dummy-variable set. Robust standard errors were calculated through clustering by crn and absorbed by term. Problem sets is dropped in the AREG specifications because it does not vary within category. OLS = ordinary least squares.

†p < .10. *p < .05. **p < .01.



TABLE 3
Sensitivity Analysis (coefficients, with standard errors in parentheses)

Gender subsamples:
Variable              Male only           Female only
Gender                —                   —
Age                   0.482 (0.324)       0.220 (0.492)
Learning community    -1.600 (2.722)      -7.282 (4.510)
Business major        3.738 (2.432)       12.970** (3.842)
Problem sets          -5.410* (2.537)     -7.420 (4.784)
Gender ratio          23.409 (18.083)     8.352 (30.898)
Constant              35.329* (14.348)    41.397* (19.410)
Observation           90                  36
R2                    0.128               0.503

Pre-GPA subsamples:
Variable              Pre-GPA ≥ 3         3 > Pre-GPA ≥ 2     2 > Pre-GPA ≥ 1
Gender                -8.300 (7.956)      -4.995 (7.547)      -3.487 (44.676)
Age                   0.333 (0.286)       -0.418 (0.575)      1.153 (3.171)
Learning community    -0.726 (3.415)      -8.615** (3.256)    20.247 (16.684)
Business major        3.144 (3.149)       3.731 (2.783)       5.917 (22.314)
Problem sets          -6.073† (3.094)     -4.049 (2.709)      9.453 (12.832)
Gender ratio          10.928 (20.354)     27.434 (19.044)     -65.639 (142.809)
Constant              64.470** (14.242)   56.890** (15.781)   79.965 (66.771)
Observation           39                  74                  11
R2                    0.189               0.251               0.622

High school percentile subsamples:
Variable              HS% ≥ 80            80 > HS% ≥ 60       60 > HS% ≥ 0
Gender                -9.316 (7.836)      -8.751 (11.265)     2.390 (9.873)
Age                   -2.119† (1.048)     0.406 (0.350)       1.071† (0.581)
Learning community    -0.884 (3.309)      -3.100 (4.634)      -4.028 (4.164)
Business major        6.351† (3.154)      5.300 (4.128)       5.683† (3.327)
Problem sets          -6.486† (3.506)     -9.971* (3.801)     -1.160 (3.443)
Gender ratio          10.842 (21.380)     28.460 (28.618)     2.594 (26.080)
Constant              112.800** (26.087)  45.918** (13.413)   35.156† (18.097)
Observation           35                  41                  50
R2                    0.340               0.310               0.142

†p < .10. *p < .05. **p < .01.




For men, the analysis shows that having the answer key had a significant negative effect on their learning. However, the results for women were inconclusive because the coefficient for the problem set dummy variable was not statistically significant. Further, the analysis shows that women who were business majors performed better than their male counterparts in this study. Second, we divided the sample by students' GPA before they took the class. One interesting point in this analysis is that having the answer key for the problem sets was negatively correlated with students' learning when their pre-GPA was relatively higher. Finally, we divided the sample based on the students' high school percentile. Results are similar to those for pre-GPA, which means that having the answer key for the problem sets negatively affected student performance when students had a relatively higher high school percentile.

As a final point, in testing the effect of providing the answer key for the problem sets on student learning, we found a few methodological issues to be addressed. Most business programs require a minimum grade for introductory courses. For instance, a C or above is a passing grade at the study institution. Therefore, we constructed a binary dependent variable in our data using students' final scores: when a student's final score was greater than or equal to 70 (equal to a letter grade of C or better), we recorded it as 1 (and 0 otherwise). Using this binary dependent variable, we conducted logit and probit regressions. We provide these results in Table 4. The results are similar to those in Table 1. They indicate that our key variable of interest, provision of the answer key for the problem sets, was negative and statistically significant. This means that when the answer key was given to students, their performance was worse than that of students who did not have the answer key.
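The logit and probit step can be sketched with statsmodels. The threshold of 70 follows the text, while the data and the regressor subset are simulated and illustrative, not the study's:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 126
df = pd.DataFrame({
    "pre_gpa": rng.uniform(1.0, 4.0, n),
    "problem_sets": rng.integers(0, 2, n),  # 1 = answer key provided
    "final_score": rng.normal(65, 12, n),
})
# Binary outcome: 1 if the final score is at least 70 (letter grade C or better).
df["passed"] = (df["final_score"] >= 70).astype(int)

# Logit and probit on the binary outcome.
logit = smf.logit("passed ~ pre_gpa + problem_sets", data=df).fit(disp=0)
probit = smf.probit("passed ~ pre_gpa + problem_sets", data=df).fit(disp=0)

# Marginal effects (dy/dx), as reported alongside the coefficients in Table 4.
print(logit.get_margeff().summary())
```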

DISCUSSION

The purpose of the present study was to investigate the role ofproblem sets on college students’ achievements in learning

economics. Expanding the line of existing research (Emerson, 2011; Grove & Wasserman, 2006; Miller & Westmoreland, 1998), we further investigated whether having the solutions to homework problem sets aided student learning, using grades on the exams and the course as outcome variables. The statistical analyses indicated that provision of answers to problem sets did not improve student learning, including performance on the comprehensive final exam. Further, they consistently showed that students in the experimental group, who were not given answer keys, performed better than students in the control group, who were given answer keys.

In our research, we did not find evidence to support students' belief that having the answer key for the problem sets helps them solve economics problems on examinations (i.e., achieve better grades on examinations). We believe that the uncertainty and anxiety surrounding learning is not relieved, in reality, by providing problem set answers. Instead, we believe that the results clearly show that having the problem set answers actually inhibits students' ability to do well on the examinations as a whole. This likely stems from the true nature of learning in the experiential classroom: having to solve the problems themselves gives students closer exposure to the material, which allows them to solve the problems on the examination and similar problems that they may encounter after leaving the class or university.

It seems that the experimental condition (in which students were given problem sets without answer keys) provided a more realistic opportunity for students to practice and review the content than the control condition (in which students were given problem sets with answer keys). Without the answer keys, students in the experimental group might have felt more challenged, taken responsibility for their own assignments, and used active learning strategies such as seeking help from the instructor, reviewing class notes, or rereading the book. However, students in the control group

TABLE 4
Logit and Probit

                        Logit                    dy/dx                Probit                dy/dx
Variable                Coeff.     SE            Coeff.     SE        Coeff.     SE         Coeff.     SE
Gender                  -1.430     1.154         -0.339     0.261     -0.870     0.704      -0.334     0.260
Age                     0.112*     0.066         0.026†     0.015     0.069†     0.040      0.026†     0.015
Pre-GPA                 1.109**    0.384         0.257**    0.088     0.626**    0.205      0.238**    0.078
Learning community      -0.888†    0.519         -0.191†    0.101     -0.551†    0.309      -0.197*    0.102
Business major          1.435*     0.459         0.328**    0.098     0.890**    0.270      0.333**    0.096
Problem sets            -1.095*    0.462         -0.242**   0.095     -0.670*    0.271      -0.245**   0.093
Gender ratio            3.769      3.006         0.875      0.697     2.278      1.806      0.864      0.685
Constant                -7.114**   2.333                              -4.198**   1.318
Observation             126                                           126
R2                      0.191                                         0.192

Note. Marginal probability (dy/dx) is for a discrete 0-to-1 change for dummy variables.

†p < .10. *p < .05. **p < .01.




might not have felt challenged because they had answer keys for the problem sets, and the instructor told them that more than 90% of the questions would come from the problem sets. Because these students were not challenged, they might not have felt the necessity of actively seeking help or finding resources to aid them in completing the homework. The results showed that challenging students to self-regulate their own learning is important when instructors design homework (Bembenutty, 2009; Zimmerman & Kitsantas, 2005).

Providing challenging opportunities for students to actively study and self-regulate their learning seems to be one of the most effective instructional strategies for enhancing students' learning in a university economics class, based on previous studies with problem sets (Emerson, 2011; Grove & Wasserman, 2006; Miller & Westmoreland, 1998), homework research (Bembenutty, 2009; Zimmerman & Kitsantas, 2005), and our findings. As our research showed, if the aim is to encourage learning, instructors should not provide an answer key with problem sets. In this way, instructors encourage students to use active study strategies to complete homework. Students can come to the instructor with questions, voluntarily organize study groups, or use university services (e.g., tutors).

We believe that providing problem set answers gives students false assurance that they understand the material. All they need to do is look at the question, guess the answer, and then proceed to the next question. The students do not need to think deeply or reflect on their learning processes to ascertain whether their answer is correct or to understand why the answer is correct. Using problem set answers restricts their opportunity to explore when an answer might be incorrect; that is, they do not necessarily have to understand the context of the problem. Students with higher GPAs may be more likely to think they have understood the material, particularly because they have done so well up to the point of this class. These students may be more likely to believe that they can memorize the answers to the questions as they have in lower level classes; however, the material in the class in this study requires more than memorization: understanding and reflection are required. This research demonstrates that problem set answers are not the answer to assisting student learning, at least in the course examined in the present study.

CONCLUSION

In the present study, we examined the role of problem sets on student learning and satisfaction in university microeconomics. Our research is meaningful because we investigated effective ways to present problem sets as homework in the classroom. Results from the independent samples t test clearly showed that students who were not given answer keys statistically outperformed students who were given the answer keys. Further, multiple regression analysis revealed that, along with pre-GPA and student major, problem sets

with answer key significantly explained student learning ineconomics. The coefficient for the problem sets dummy isnegative and statistically significant, implying that having theanswer key for the problem sets hindered student success insolving economics problems on examination, although theirevaluation for the course was higher.
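The two analyses described above (an independent-samples t test on exam scores, then an OLS regression with an answer-key dummy) can be sketched on synthetic data. This is an illustration only: the group sizes (75 and 51) follow the study's samples, but the score means, spreads, and the stand-in pre-GPA covariate are invented, not the authors' data.

```python
# Illustrative sketch only: synthetic scores, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
no_key = rng.normal(70, 12, 75)    # hypothetical scores, no answer key given
with_key = rng.normal(63, 12, 51)  # hypothetical scores, answer key given

# Independent-samples t test (pooled variance), as in the study's first analysis.
n1, n2 = len(no_key), len(with_key)
sp2 = ((n1 - 1) * no_key.var(ddof=1) + (n2 - 1) * with_key.var(ddof=1)) / (n1 + n2 - 2)
t_stat = (no_key.mean() - with_key.mean()) / np.sqrt(sp2 * (1 / n1 + 1 / n2))

# Multiple regression with a problem-sets dummy (1 = answer key provided)
# and a hypothetical pre-GPA covariate, estimated by ordinary least squares.
scores = np.concatenate([no_key, with_key])
dummy = np.concatenate([np.zeros(n1), np.ones(n2)])
gpa = rng.normal(3.0, 0.4, n1 + n2)
X = np.column_stack([np.ones(n1 + n2), dummy, gpa])
beta, *_ = np.linalg.lstsq(X, scores, rcond=None)

# A positive t statistic and a negative dummy coefficient mirror the
# reported direction of the effect (no-key group scores higher).
print(t_stat > 0, beta[1] < 0)
```

With these synthetic means, both checks come out in the reported direction; the point of the sketch is only the shape of the analysis, not the magnitudes.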

We found some other interesting results in this study. Student evaluations of the instructor may suffer if no answer key is provided, as the instructor found from the students' written evaluations of the course from fall 2008 (experimental group). It should be noted that the instructor provided answers when a student asked a specific question. This complaint was likely reflected in the numerical evaluations of the course and the instructor, which were lower in fall 2008 than in fall 2009. The implication is that, even though providing an answer key for the homework problem sets hinders students' learning, evaluations of the course were higher when the answer key was provided.

We believe our research contributes to the existing economic education research and existing homework strategies because we discuss problem sets as an instructional strategy whose effect is counterintuitive. We believe that providing answers to problem sets is counterproductive if the aim is education (i.e., aiding learning). However, providing answers is productive if the aim is to obtain better student evaluations of the course and instruction for merit, reappointment, promotion, and tenure purposes. This concerns us greatly, as instructors in higher education may encounter a dilemma similar to The Lady or the Tiger?

Limitations of our study include the sample being drawn from one university and one instructor. However, we also believe this is a strength, as variability in instructional expectations and instructor style was eliminated. We are somewhat concerned about generalizing this study to disciplines other than economics, because of the nature of the problem sets and the difficulties encountered by some students taking a course they consider somewhat esoteric, at least within the business school.

There are some interesting questions resulting from this study that would benefit from further research. One concerns the use of problem sets in more advanced economics classes: At the senior level, is the same effect found? The answer would help instructors understand whether the problem set issue is a matter of immaturity or of poor learning skills. Another question concerns student evaluations: How heavily did students weight the problem set issue when rating the professor? Did it play a large part in the lower evaluations, or were there other issues that students weighted more heavily? Answers to these and other similar questions will help clarify the findings of this study.

REFERENCES

Becker, H. J., & Epstein, J. L. (1982). Parent involvement: A survey of teacher practices. Elementary School Journal, 83, 85–102.


Becker, W., & Watts, M. (2001). Teaching methods in U.S. undergraduate economics courses. Journal of Economic Education, 32, 269–279.

Bembenutty, H. (2009). Self-regulation of homework completion. Psychology Journal, 6, 138–153.

Bonham, S. W., Deardorff, D. L., & Beichner, R. J. (2003). Comparison of student performance using web and paper-based homework in college-level physics. Journal of Research in Science Teaching, 40, 1050–1071.

Cooper, H., Robinson, J. C., & Patall, E. A. (2006). Does homework improve academic achievement? A synthesis of research 1987–2003. Review of Educational Research, 76, 1–62.

Cooper, H., & Valentine, J. C. (2001). Using research to answer practical questions about homework. Educational Psychologist, 36, 143–153.

Cosden, M., Morrison, G., Albanese, A. L., & Macias, S. (2001). When homework is not home work: After-school programs for homework assistance. Educational Psychologist, 36, 211–221.

Emerson, T. L. N. (2011). Homework: To require or not? Online graded homework and student achievement. Perspectives on Economic Education Research, 7(1), 20–42.

Epstein, J. L., & Voorhis, F. L. V. (2001). More than minutes: Teachers' role in designing homework. Educational Psychologist, 36, 181–193.

Grove, W. A., & Wasserman, T. (2006). Incentives and student learning: A natural experiment with economics problem sets. The American Economic Review, 96, 447–452.

Grove, W. A., Wasserman, T., & Grodner, A. (2006). Choosing a proxy for academic aptitude. Journal of Economic Education, 37, 131–147.

Hendry, D. F. (1993). Econometrics: Alchemy or science? Economica, 47, 387–406.

Hendry, D. F. (1995). Dynamic econometrics. Oxford, England: Oxford University Press.

Hendry, D. F., & Krolzig, H.-M. (2001). Automatic econometric model selection using PcGets. London, England: Timberlake Consultants Press.

Hendry, D. F., & Mizon, G. E. (1990). Procrustean econometrics: Or stretching and squeezing data. In C. W. J. Granger (Ed.), Modelling economic series (pp. 121–136). Oxford, England: Clarendon Press.

Keith, T. Z., Reimers, T., Fehrman, P. G., Pottebaum, S. M., & Aubrey, L. W. (1986). Parental involvement, homework, and TV time: Direct and indirect effects on high school achievement. Journal of Educational Psychology, 78, 373–380.

Miller, E., & Westmoreland, G. (1998). Student response to selective grading in college economics courses. Journal of Economic Education, 29, 195–201.

Mizon, G. E. (1995). Progressive modelling of macroeconomic time series: The LSE methodology. In K. D. Hoover (Ed.), Macroeconometrics: Developments, tensions, and prospects (pp. 107–170). Boston, MA: Kluwer.

Okpala, A. O., Okpala, C. O., & Ellis, R. (2000). Academic efforts and study habits among students in a principles of macroeconomics course. Journal of Education for Business, 75, 219–224.

Olympia, D. E., Sheridan, S. M., & Jenson, W. (1994). Homework: A natural means of home-school collaboration. School Psychology Quarterly, 9, 60–80.

Radhakrishnan, P., Lam, D., & Ho, G. (2009). Giving university students incentives to do homework improves their performance. Journal of Instructional Psychology, 36, 219–225.

Schroder, R. (2010). Teaching across disciplines: Using collaborative instruction in undergraduate education. Journal of Economics & Finance, 34, 484–488.

Stata. (2007). Stata 10 manual. College Station, TX: Stata Press.

Warton, P. M. (2001). The forgotten voices in homework: Views of students. Educational Psychologist, 36, 155–165.

Zimmerman, B. J., & Kitsantas, A. (2005). Homework practices and academic achievement: The mediating role of self-efficacy and perceived responsibility beliefs. Contemporary Educational Psychology, 30, 397–417.

APPENDIX A—Descriptive Statistics

Variable                  Observations(a)   M          SD         Min       Max
Exam score (all)          175               66.21639   12.42765   27.0150   94.3857
Exam score                126               65.24554   11.90477   27.0150   91.6929
Final exam                124               54.51613   15.07999   25        92.5
Gender                    126               0.71429    0.45356    0         1
Age                       126               22.03175   3.89243    19        46
High school percentile    126               63.59524   22.10454   6         99
Learning community        126               0.26984    0.44565    0         1
Major                     126               0.44444    0.49889    0         1
Problem sets              126               0.40476    0.49281    0         1
Gender ratio              126               0.58842    0.17709    0.2286    0.7714

(a) Due to the availability of other explanatory variables, samples dropped from 100 to 75 and from 75 to 51 for Fall 2008 and Fall 2009, respectively. Two students did not take the comprehensive final exam.

APPENDIX B—Correlation Matrix

Variable                    (1)      (2)      (3)      (4)      (5)      (6)      (7)      (8)      (9)
(1) Exam score              1.000
(2) Final exam              0.834    1.000
(3) Gender                  0.094    0.166    1.000
(4) Age                     0.116    0.051    −0.075   1.000
(5) High school percentile  0.245    0.084    −0.163   −0.203   1.000
(6) Learning community      −0.136   −0.114   −0.017   −0.228   0.132    1.000
(7) Business major          0.251    0.218    0.178    −0.162   0.165    0.124    1.000
(8) Problem sets            −0.195   −0.147   0.045    0.067    −0.089   0.111    0.042    1.000
(9) Gender ratio            0.133    0.193    0.919    −0.087   −0.127   −0.045   0.137    0.080    1.000
