

Educational Apps: Using Mobile Applications to Enhance Student Learning of Statistical Concepts

Chen Ling, Drew Harnish, and Randa Shehab

School of Industrial & Systems Engineering, University of Oklahoma, Norman

Abstract

This research sought to understand whether the use of mobile applications (e.g., iPhone apps) had an impact on students’ learning of new statistical concepts. A control group (n = 12) was compared against a group that used a statistical mobile app (n = 13) during a simulated statistics lecture on the topic of normal distribution. Students received a classroom lecture followed by a period of either pencil-and-paper only or technology-assisted examples. They then took a quiz on the material. The overall quiz results showed that the app group outperformed the control group. When learning through examples with the mobile app, students performed better on problems that required them to apply their knowledge, which corresponds to the third level of learning in Bloom’s taxonomy. A postexperimental survey showed that students in the app group felt strongly that mobile apps helped them understand the new concepts more clearly and were more confident in their ability to learn the new material more quickly than the control group. Overall, this research demonstrates that incorporating mobile apps into lectures has the potential to positively affect student learning. © 2014 Wiley Periodicals, Inc.

Keywords: Mobile application; Engineer education; Statistics learning; Bloom’s taxonomy; Active learning

1. INTRODUCTION

As technology-enabled tools are increasingly prevalent in the classroom, there is much potential for these tools to benefit education. Research has shown that people are more motivated to learn when they are actively engaged in learning (Felder & Brent, 2003). However, traditional classrooms are usually structured as “direct instruction” lectures, where professors verbally explain material to students with unremarkable visual aids. With the nationwide emphasis on STEM (science, technology, engineering, and mathematics) education in the United States, there has been much research in

Correspondence to: Chen Ling, 202 W. Boyd, Room 124, Norman OK 73019; e-mail: [email protected]

Received: 4 June 2012; revised 22 January 2013; accepted 24 January 2013

View this article online at wileyonlinelibrary.com/journal/hfm

DOI: 10.1002/hfm.20550

the technology-assisted learning field, with the notion that the introduction of new technology into the classroom will help students learn better. Many forms of technology-assisted learning tools have been studied, including online tutorials, animation, video games, and interactive multimedia systems. Mobile applications are a recent form of technology-assisted learning that are becoming more popular teaching tools.

Mobile applications are software applications that run on mobile devices such as smartphones or tablet computers. The concept of mobile “apps” has only been around since Apple launched the Apple App Store in July 2008. Their use has continued to rise in recent years, and there are now more than 350,000 apps available at the Apple App Store, thousands of which are geared toward “every subject and every stage of learning” (Apple.com). Even prior to the launch of the Apple App Store, it was already noted that people were widely using mobile devices, such as smartphones and PDAs, to engage in “informal learning” while on

532 Human Factors and Ergonomics in Manufacturing & Service Industries 24 (5) 532–543 (2014) © 2014 Wiley Periodicals, Inc.


the go (Clough, Jones, McAndrew, & Scanlon, 2008). Since the inception of this new app phenomenon, researchers have noticed a trend in the public’s desire to learn through mobile devices (Jeng, Wu, Huang, Tan, & Yang, 2010).

Mobile apps can afford an additional way of interaction through direct manipulation on the touch-screen interface of many mobile devices. Coupling this form of interaction with the vast user acceptance of mobile apps and the recent push from universities toward experiential learning, one can see the potential for educational apps to become a part of everyday learning in the near future. Furthermore, apps are increasingly popular among college students. Compared to traditional technology such as desktop computers, apps are highly portable, which means they can be used on the go and allow students to learn from anywhere.

This research focuses on the educational effectiveness of mobile apps in a simulated statistics lecture. Currently, there is limited research on the educational effectiveness of mobile apps (e.g., iPhone and iPad apps). This study aims to explore whether mobile apps can facilitate learning in the classroom. The researchers sought to investigate whether a lecture augmented with a mobile app could enhance student learning of a basic engineering statistics topic—the normal distribution.

2. THEORETICAL BACKGROUND

There are many learning theories that seek to explain how people learn. The following sections highlight some key areas in learning, including the theory of learning styles, the theory of learning according to Bloom’s taxonomy, and technology-assisted learning. Then, these theories will be blended to present an integrated conceptual model of learning. This model will be used to evaluate the effectiveness of a mobile app used to facilitate learning in this study.

2.1. Theories of Learning Styles

People learn differently. In the engineering education domain, the Felder and Silverman model (Felder & Silverman, 1988) is most frequently used. According to Felder and Silverman (1988), individuals learn through three learning modalities: visual (pictures, diagrams, charts), verbal (sound, words), and kinesthetic (touch, taste, smell). The model considers receiving and processing new information as the two main steps in learning. Four dimensions that the learning style focuses on

are the sensing/intuitive dimension, the visual/verbal dimension, the active/reflective dimension, and the sequential/global dimension. The sensing/intuitive dimension determines how the learner tends to perceive the world. Sensing involves observing and gathering data through the five senses, while people who use their intuition prefer learning with principles and theories. The visual/verbal dimension is related to how people receive information. Visual learners remember best what they see and will probably forget much of what is said to them. However, verbal learners remember best 1) what they hear and 2) what they hear and then say. Information that is comprehended is converted to knowledge by either active experimentation or reflective observation. These two categories make up the active/reflective dimension of Felder–Silverman’s learning style. The sequential/global dimension deals with how the information is presented to the learner. If one is a sequential learner, he or she masters the material as it is presented in a logically ordered fashion. Instead, global learners learn in what is referred to as “fits and starts.” These learners make spontaneous leaps in their learning and are often unable to explain how they came up with the solution. Global learners are better at divergent thinking and synthesis. Felder and Spurlin (2005) reported that the majority of engineering students are sensing, visual, active, and sequential learners.

There seems to be a mismatch between engineering students’ learning styles and the way engineering education is delivered. Carl Jung (1971) asserted that most college engineering education tends to favor intuitors (i.e., those who prefer abstract theories) over sensors (i.e., those who prefer applications). In addition, most engineering education is taught by lecture with mostly verbal content, although most students are visual learners (Waldheim, 1987). To make lectures more effective for all students regardless of their learning preference, Waldheim suggested using both text-based modules (handouts, PowerPoint) and visual aids (pictures, diagrams, flowcharts). Technology-assisted learning tools provide new learning experiences that could help accommodate diverse learners with different learning styles.

2.2. Bloom’s Taxonomy

Since its inception, Benjamin Bloom’s taxonomy of educational objectives has served as a framework for designing curricula based on varying depths of


comprehension (Bloom, 1956). Its revised version (Anderson & Krathwohl, 2001) describes learning in two dimensions: the cognitive process dimension, which describes the process of learning, and the knowledge dimension, which describes the kind of knowledge to be learned. The cognitive process dimension contains six levels of learning—remember, understand, apply, analyze, evaluate, and create. According to the theory (Anderson & Krathwohl, 2001), remembering is the ability to recall pertinent and previously learned material from memory. Understanding is the ability to extract and explain meaning from learned material or the ability to make interpretations based on existing knowledge. Applying is the ability to implement existing knowledge to additional concepts. Analyzing is the ability to break down information into parts and to understand how they are related. Evaluating is the ability to determine relationships between multiple concept elements and where they fit together. Creating is the ability to organize and reorganize concept elements together in new ways.

The knowledge dimension in the revised Bloom’s taxonomy has four levels—factual knowledge, conceptual knowledge, procedural knowledge, and metacognitive knowledge (Anderson & Krathwohl, 2001). Factual knowledge refers to essential facts and terminology that students must know to understand a discipline. In the context of the topic of the normal distribution, this factual knowledge could be definitions of mean, standard deviation, and probability distribution. Conceptual knowledge includes principles, theories, and models pertinent to a discipline. For example, understanding that the shaded area under a normal distribution curve represents the probability of that range is a type of conceptual knowledge. Procedural knowledge refers to algorithms, techniques, specific skills, or methodologies in a field. In the context of the current study, this knowledge may include the ability to look up a cumulative standard normal distribution table to find the corresponding z value for a given probability. Such knowledge is important for learners to perform step-by-step calculation of probability. Metacognitive knowledge is strategic or reflective knowledge about the process of solving problems (Anderson & Krathwohl, 2001). This type of knowledge is gained when a learner is able to form a strategy for problem solving after repeated practice. The revised Bloom’s taxonomy is intended to be used as an authentic tool for curriculum planning, instructional delivery, and assessment (Anderson & Krathwohl, 2001). Many institutions find this taxonomy especially helpful with outlining course requirements for new faculty or with developing new courses (Starr, Manaris, & Stalvey, 2008). Educators can characterize a learning activity on the combined dimensions of cognitive process and knowledge to assess whether instruction is actually affecting learning. Researchers also use Bloom’s taxonomy in evaluating the levels of learning that a technology-assisted learning tool helps the learner to achieve (Wang, Vaughn, & Liu, 2011).

2.3. Technology-Assisted Learning Tools

As technology develops, more educational technology is being introduced into classrooms. Because technology-assisted learning tools have the potential to facilitate deep learning by actively engaging the learner in the learning process (Evans & Gibbons, 2007), there has been much research on the educational effectiveness of these tools. Many forms of technology-assisted learning tools have been studied, including computer-based tutorials (Holman, 2000), computer-assisted videos (McNaught et al., 1995), interactive multimedia (Evans & Gibbons, 2007), animation (Wang et al., 2011), and computer games (Coller & Scott, 2009).

There have been mixed results on the effectiveness of technology-assisted learning on the learning outcome. Holman (2000) found that posttest scores of students taught by computer tutorial were not significantly different from those of students taught by traditional classroom instruction (i.e., both methods were equally effective). A study by Merino and Abel (2003) had similar results but also found that these tutorials seemed to supplement learning when paired with classroom instruction. However, other studies demonstrate the positive effect of technology-assisted learning tools. McNaught et al. (1995) found that a prelab computer-assisted video shown in a chemistry lab effectively raised performance of all students. Evans and Gibbons (2007) found that students who used an interactive multimedia system completed problem-solving tests faster and better when compared to the noninteractive group. Similarly, Wang et al. (2011) found that animation enhanced student learning of hypothesis testing concepts when compared to static information. Three forms of animation—simple animation, change input animation, and practice mode animation—were compared against static instruction.


Figure 1 Integrated view of technology-assisted learning.

The static instruction mode involves posting information online in PDF format, which is similar to a traditional lecture. In simple animation, users could control the sequence and pace of three animations for calculating Type I error, Type II error, and p value. They could also choose to view the animations repeatedly. In change input animation, users could actively change input to modify the animation that they see. In practice mode animation, users could calculate Type I error, Type II error, and p value themselves and receive immediate feedback. The study found that all three levels of animation enhanced learning at the understanding level and the applying level of Bloom’s taxonomy when compared to static instruction. However, the study did not find any significant differences between the three levels of animation. Wang et al. (2011) noted that the complexity associated with the practice animation mode, which was originally thought by the researchers to be the most helpful in assisting the learner to form mental models, might have hindered its effectiveness. Therefore, although interactivity designed into educational tools may enhance learning, careful design is needed to bring out the best learning outcome.

Video games are another tool to facilitate learning (Gee, 2003; Rapeepisarn, Wong, Fung, & Khine, 2008). Gee (2003) noted the motivating effect of video games on teaching people to think critically and to solve problems in a complex environment. Coller and Scott (2009) evaluated the use of an educational game in a mechanical engineering course and found that students spent approximately twice the amount of time on their coursework outside of class while also showing a deeper understanding of presented concepts as compared with students in a traditional lecture setting.

Reflecting on why certain technologies work andothers do not, the discrepancy could be attributed to

how the interactions in the technology-assisted learning tools are designed and incorporated into the curriculum. Although new technologies afford additional means to positively influence learning by providing new ways of rendering information, the designer of these tools has to carefully design the delivery method to ensure a positive effect on the learners.

2.4. An Integrative Conceptual Model of Learning

Considering past research in learning styles, learning according to Bloom’s taxonomy, and technology-assisted learning, we derived an integrative conceptual model for learning with technology-assisted learning tools, which illustrates the relationship of these key concepts (see Figure 1). According to this model, a technology-assisted learning tool serves as a channel for the knowledge to reach its learners. The delivery of knowledge via technology-assisted tools is critical to influencing diverse learners and to achieving the final goal of learning. Learners have different learning styles, different engagement levels, and different motivation levels while learning. The goal of technology-assisted learning tools is to actively engage learners and to motivate them to continue learning. These tools are enabled by technology to provide different learning experiences characterized by new ways of information presentation and user interaction. The design of these tools in terms of their content, functionality, interface design, and interaction, as well as the actual delivery mechanism of knowledge (e.g., timing of use and rules of use), is critical to their effectiveness. For example, mobile apps on a touch-screen mobile device afford interaction via the sense of touch. Designers of learning tools should carefully leverage such technological capability to achieve optimal knowledge delivery


Figure 2 Screenshots of LearnStatistics.

to the learner. In terms of timing of use and rules of use, for instance, an instructor could have the students use a mobile app during the lecture or after class. They could also have students use a mobile app to collaborate while solving problems or to learn individually. All aspects of the learning experience have to work together to make it effective. Hence, it is easy to understand why certain technology-assisted tools worked to increase student learning while others failed to do so. Introducing new technology is not a cure-all. Designing the tools to deliver knowledge to learners in an optimal way is the key to success.

3. DERIVATION OF RESEARCH HYPOTHESIS

For a mobile app to be an effective learning tool, it must impart knowledge to facilitate learners’ understanding via the interactions designed in the app. In evaluating the educational app, it is important to identify which characteristic(s) of a mobile app as a technology-assisted learning tool affect the learner and, in turn, contribute to learning as defined by Bloom’s taxonomy. The next section describes the basic characteristics of the app used in this study as well as how the design of the app affects learning of the normal distribution topic.

3.1. Description of the Mobile App LearnStatistics

The mobile app evaluated in this study was LearnStatistics, a publicly available mobile app developed by GO2STAT LLC, designed to provide real-time feedback

and interactive feedback to the user. After users enter the statistical parameters (e.g., mean and standard deviation), LearnStatistics displays a distribution curve, as shown in the screenshots in Figure 2. After users select the “normal” distribution option in this app, they can choose to view one of four screens by tapping the corresponding tab: PDF (probability density function), CDF (cumulative distribution function), Options, and Notes. In the “Options” screen (see Figure 2a), users can enter parameters (e.g., mean and standard deviation) of a normal distribution; this allows two types of calculations: 1) given the left and right bounds of a distribution, calculate the corresponding probability, and 2) given a probability, calculate the cut-off points. The “PDF” and “CDF” screens allow the user to view the distribution graphically for calculation problem types 1) and 2), respectively. When users tap “PDF,” they can view the image of the probability density function of the normal distribution with the selected area shaded according to the user-provided left and right bounds. A movable slide bar allows users to adjust the left and right bounds of an interval and visualize how the probability and shaded area change in real time (see Figure 2b). When users tap “CDF,” they can view the image of the cumulative distribution function with the percentile value they entered in the “Options” screen. A movable slide bar allows users to visualize how the cumulative probability and corresponding x value change in real time as the x values are adjusted (see Figure 2c). The “Notes” screen provides basic information on topics including the normal distribution, z score, and the central limit theorem, as well as instructions on using the app.
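The two calculation types described above can be sketched in Python using only the standard library; the function names here are illustrative and are not part of LearnStatistics:

```python
import math

def norm_cdf(x, mu=0.0, sigma=1.0):
    # cumulative probability P(X <= x) for a normal distribution
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def prob_between(lo, hi, mu, sigma):
    # calculation type 1: given left and right bounds, return the probability
    return norm_cdf(hi, mu, sigma) - norm_cdf(lo, mu, sigma)

def cutoff_for(p, mu, sigma):
    # calculation type 2: given a cumulative probability, find the cut-off
    # point (inverse CDF via bisection, mirroring the slide-bar adjustment)
    lo, hi = mu - 10 * sigma, mu + 10 * sigma
    for _ in range(100):
        mid = (lo + hi) / 2
        if norm_cdf(mid, mu, sigma) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

For a standard normal distribution, `prob_between(-1, 1, 0, 1)` returns roughly 0.683, the familiar one-standard-deviation interval.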


TABLE 1. LearnStatistics Characteristics Mapped to Learning Levels in Bloom’s Taxonomy

Remember
  Content: Provides both probability values and distribution curve; allows user input of distribution parameters
  Functionality: Options tab allows users to input values needed to define the distribution, and the app provides answers immediately
  User Interaction: Touch-screen interaction allows real-time user manipulation of data
  Interface Design: Reinforces memory by using the touch sense to bring about data changes and the visual sense to observe them

Understand
  Content: Probability output changes based on user manipulation of input
  Functionality: Movable slide bar allows manipulation of left and right tail truncation points
  User Interaction: Movement of slide bar shows immediate change in shaded area
  Interface Design: Ability to toggle between graph and data entry to see connections

Apply
  Content: Relationship between displayed probability and shaded area in distribution curve
  Functionality: App displays probability changes associated with truncation points
  User Interaction: Observing probability change and correlation with shaded area
  Interface Design: Change in response to user input shows relationship between values and visuals

3.2. Effect of App Design on Learning

The LearnStatistics app is designed specifically to provide real-time feedback to user inputs under a set of user-supplied parameters, including mean and standard deviation. It is meant as a tool to help learners interpret and conceptualize statistical information and solve problems based on user inputs. It supports learning of the normal distribution by helping learners to remember the basic concepts, to understand the relationships between concepts (e.g., the relationship between the left and right bounds of an interval and the corresponding probability), and to apply this knowledge in solving problems. Because it does not present new concepts or the relationships between them, it does not support the higher-order thinking skills of Bloom’s taxonomy (i.e., analyzing, evaluating, and creating). Therefore, only the three lower levels of learning in Bloom’s taxonomy (i.e., remembering, understanding, and applying) were considered in evaluating the effectiveness of the mobile app. A detailed analysis of the app’s design was performed to map the app’s characteristics in terms of its content, functionality, mode of user interaction, and interface design to the three lower levels of learning in Bloom’s taxonomy (see Table 1). It can be seen from Table 1 that many design characteristics of LearnStatistics can support students’ abilities to remember, understand, and apply what they have learned

about the normal distribution. Thus, it is reasonableto hypothesize that a lecture augmented with this mo-bile app will enhance student learning when comparedwith a traditional lecture without the mobile app.

4. METHODOLOGY

4.1. Experimental Design

The independent variable in this study, lecture type, consisted of two levels: without app (control group) and with app (app group). The dependent variables used to measure learning were comprehension quiz score (compared overall and by each question) and average quiz scores of problems in the three lower learning levels of Bloom’s taxonomy (i.e., remember, understand, and apply), as well as the postexperiment survey responses. A t-test was used to evaluate the difference in mean quiz scores between the control group and the app group. A nonparametric Wilcoxon rank-sum test was used to analyze the difference in average scores of problems in the three levels of learning and to analyze the survey results.
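As a rough illustration of the nonparametric test used here, a Wilcoxon rank-sum statistic with a normal approximation can be computed from scratch; this is a sketch that omits the tie-variance correction and the exact small-sample tables a full analysis would use:

```python
import math

def average_ranks(values):
    # rank all values, assigning tied values the average of their ranks
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1          # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def rank_sum_z(x, y):
    # z statistic for the Wilcoxon rank-sum test (normal approximation)
    n1, n2 = len(x), len(y)
    ranks = average_ranks(list(x) + list(y))
    w = sum(ranks[:n1])                     # rank sum of the first sample
    mean_w = n1 * (n1 + n2 + 1) / 2         # expected rank sum under H0
    sd_w = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return (w - mean_w) / sd_w
```

Ranking the pooled observations rather than comparing raw means is what makes the test robust for the small, possibly non-normal per-level score samples in this design.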

4.2. Participants

Twenty-six participants—15 men and 11 women—participated in this study. The average age was


20.27 years. All were fortuitously sampled from a population of local college students. The participants were randomly assigned to two equal groups of 13 (n1 = n2 = 13). However, one participant from the control group left many questions blank, so the data from this participant were discarded. Therefore, the control group had 12 data points, and the app group had 13 data points. All participants met the qualifying criterion of no prior coursework in statistics.

The app group was divided into two subgroups because of time conflicts of participants. App Group A (n1 = 8) had a mean score of 0.79 and a standard deviation of 0.16, while App Group B (n2 = 5) had a mean score of 0.69 and a standard deviation of 0.17. Before these data were combined, a two-sample t-test was used to test for significance at a level of .05. There was no significant difference between the mean overall quiz scores of the two app subgroups; therefore, the data from these two subgroups were combined to form a single app group of 13.

4.3. Equipment and Material

4.3.1. Mobile Device and Mobile App

Each member of the app group was shown how to solve example problems using the iPod touch mobile devices, loaned by Apple as part of their seeding program. The instructor used an iPad 2 device to project these examples on a screen. The mobile app used in this study was LearnStatistics.

4.3.2. Comprehension Quiz

A standardized comprehension quiz consisting of 14 questions was designed to assess students’ knowledge retention. Each question was mapped to one of the three lower levels of Bloom’s taxonomy—remember, understand, and apply (Anderson & Krathwohl, 2001). The quiz questions were similar to the example problems that were practiced during the lecture.

4.3.3. Grading Scale

In an effort to gauge the depth to which participants learned the normal distribution topic, a three-option grading scale was used. Students received 1, 0.5, or 0 points for correct, partial-credit, or incorrect answers, respectively. The final quiz average was determined by the total points earned over the total points possible (i.e., 14). These

scores were rounded to the nearest whole percent. For example, a student with eight correct questions, one partial-credit question, and five incorrect questions would receive a 61% after dividing 8.5 by 14.
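The grading arithmetic can be expressed as a one-line helper; this is purely illustrative, as the study's grading was done by hand:

```python
def quiz_percent(correct, partial, total_questions=14):
    # 1 point per correct answer, 0.5 per partial-credit answer,
    # rounded to the nearest whole percent
    return round((correct + 0.5 * partial) / total_questions * 100)
```

Here `quiz_percent(8, 1)` reproduces the 61% worked example (8.5 / 14 = 60.7%, rounded to 61%).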

Partial credit was given under the following circumstances: 1) On a two-part question, the participant got one answer correct and one answer incorrect. (This was only the case for Questions 2 and 8, which asked for the mean and standard deviation.) 2) The participant looked up the correct value in the z-table but failed to subtract it from 1 (e.g., when Question 5 asked for p > 6). 3) The participant wrote the correct formula and applied the correct statistical procedure but made an algebra error; for example, when dividing (70 − 65)/2.5, they got 2.5 instead of 2.
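The calculation behind the second and third circumstances, a z-score transformation followed by an upper-tail lookup, can be sketched as follows (the mean 65 and standard deviation 2.5 are taken from the algebra-error example above; pairing them with the bound 70 into one worked problem is this sketch's assumption):

```python
import math

def z_score(x, mu, sigma):
    # standardize: how many standard deviations x lies above the mean
    return (x - mu) / sigma

def upper_tail(x, mu, sigma):
    # P(X > x) = 1 - Phi(z); forgetting the "1 -" step is the
    # partial-credit error described in circumstance 2
    z = z_score(x, mu, sigma)
    phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return 1 - phi
```

With mean 65 and standard deviation 2.5, `z_score(70, 65, 2.5)` is 2.0, and `upper_tail(70, 65, 2.5)` is about 0.023.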

4.3.4. Postexperiment Survey

In the postexperiment survey, a 7-point Likert scale (i.e., strongly disagree = 1, neutral = 4, strongly agree = 7) was used to gauge participants’ comfort level with the lecture pace, confidence in their answers, perceived usefulness of LearnStatistics (app group only), and likelihood of future mobile app use.

4.4. Procedure

For both the control group and the app group, an overview of the study and an informed consent form were presented prior to any data collection. The control group did not have access to any iPod touch devices during the lecture and thus simulated a “typical” statistics course lecture. The instructor gave a pre-prepared 60-min lecture on the topic of the normal distribution followed by 15 min of pencil-and-paper examples on probability calculations and z-score transformations. At the conclusion of this lecture and example period, a standardized comprehension quiz was administered to participants to gauge their learning of the new material. They were allowed as much time as needed on this quiz.

The app group used iPod touch devices during the example period following the standard lecture. This simulated a course with technology-assisted learning tools for active learning. The same instructor gave an identical pre-prepared 60-min lecture about the normal distribution followed by a 15-min example period using the same examples. This example period was augmented with the mobile app. The instructor projected the LearnStatistics app on a screen using an iPad 2, while students followed along on their iPod touch devices. The instructor demonstrated each step of the app in real time. Figure 3 shows the setup during this experiment. At the conclusion of this lecture and example period, the same standardized comprehension quiz was administered. Since mobile apps were meant to supplement learning and not to assist with test taking in this study, participants did not have access to the LearnStatistics app during the quiz. After the quiz was completed, all participants filled out a postexperiment survey.

Figure 3 A student in the app group follows along with the instructor while solving example problems.

538 Human Factors and Ergonomics in Manufacturing & Service Industries DOI: 10.1002/hfm

5. RESULTS AND ANALYSIS

5.1. Overall Quiz Score

The overall mean quiz score for each participant was calculated as the total points gained divided by the total number of questions. This score, in terms of percentage correct, was 0.58 for the control group (n = 12), with a standard deviation of 0.22. The overall mean quiz score of the app group (n = 13) was 0.74, with a standard deviation of 0.17.

Figure 4 Average comprehension quiz scores by question.

After validating the normality, independence, and equal-variance assumptions, a two-sample t-test was used to compare overall mean quiz scores. The data show that the quiz score of the app group is significantly higher than that of the control group (t = 2.14, one-tailed p = .02).
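As a sketch of this comparison, the pooled-variance t statistic can be recomputed from the rounded summary statistics reported above. Because the means and standard deviations are rounded, the recomputed value (about 2.04) only approximates the reported t = 2.14; this is an illustration, not the study's actual analysis.

```python
from math import sqrt

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Two-sample t statistic with pooled variance (equal variances assumed)."""
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    return (m2 - m1) / sqrt(sp2 * (1 / n1 + 1 / n2))

# Rounded summary statistics from the text: control 0.58 (0.22), app 0.74 (0.17).
t = pooled_t(0.58, 0.22, 12, 0.74, 0.17, 13)
print(round(t, 2))  # approximately 2.04
```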

5.2. Quiz Scores by Individual Question

A further examination of individual questions was performed to determine whether there were significant differences between the three levels of Bloom's taxonomy evaluated in this study (i.e., remember, understand, apply). For each question, a percentage-correct value was calculated as the total points gained by all participants for that question divided by the total number of participants in the group. Figure 4 shows that the app group scored higher on each question and that the answers followed the same trend (i.e., similar relative frequency of correct/incorrect within each group). A side-by-side comparison of the percentage correct by question is shown in Table 2.

A test for significance was performed for each individual question. Using the SAS univariate procedure, the plot of residuals revealed that the data for all questions violated the normality assumption. As a result, a Wilcoxon rank-sum test was used as a nonparametric alternative to compare the means of individual questions between the two independent groups. The one-sided p values for the exact test are reported in Table 2. Although there is a trend for the app group to outperform the control group according to the means, none of the scores for the individual questions are significantly different from each other at an α level of .05. This might be due to the limited sample size in this study.
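The Wilcoxon rank-sum (Mann-Whitney) statistic used in this subsection can be illustrated with a short sketch. The per-student responses are not reproduced in the paper, so the function below is a generic, hypothetical implementation that uses midranks for tied scores:

```python
def rank_sum_U(x, y):
    """Mann-Whitney U for sample x versus y, with midranks for ties."""
    pooled = sorted(x + y)
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # average of 1-based ranks i+1 .. j
        i = j
    r1 = sum(ranks[v] for v in x)          # rank sum of the first sample
    return r1 - len(x) * (len(x) + 1) / 2

# U ranges from 0 (every x below every y) to len(x) * len(y) (every x above every y).
print(rank_sum_U([1, 2, 3], [4, 5, 6]))  # 0.0
```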



TABLE 2. Mean Comprehension Quiz Scores by Individual Question

| Question | Description | Bloom's Taxonomy | Control Group (n = 12) M (SD) | App Group (n = 13) M (SD) | % Mean Difference Between Groups | Wilcoxon Test Statistic (U) | Exact Test 1-Sided p-Value |
|---|---|---|---|---|---|---|---|
| Q1 | Report point probability | Remember | 0.50 (0.52) | 0.62 (0.51) | 24 | 147 | .43 |
| Q2 | Mean and standard deviation of standard normal distribution | Remember | 0.75 (0.45) | 0.96 (0.14) | 28 | 141 | .10 |
| Q3 | Calculate probability of a range | Apply | 0.75 (0.45) | 0.96 (0.14) | 28 | 141 | .10 |
| Q4 | Calculate probability of a range | Apply | 0.83 (0.39) | 0.88 (0.30) | 6 | 154 | .47 |
| Q5 | Calculate probability of a range | Apply | 0.58 (0.47) | 0.85 (0.24) | 47 | 133 | .09 |
| Q6 | Calculate probability of a range | Apply | 0.50 (0.52) | 0.85 (0.38) | 70 | 129 | .08 |
| Q7 | Calculate probability of a range | Apply | 0.79 (0.33) | 0.96 (0.14) | 22 | 135.5 | .11 |
| Q8 | Report mean and standard deviation of a distribution graph | Understand | 0.58 (0.36) | 0.65 (0.38) | 12 | 147 | .35 |
| Q9 | Calculate probability of a range | Apply | 0.33 (0.49) | 0.54 (0.43) | 64 | 136 | .10 |
| Q10 | Approximate percentage within 2σ | Understand | 0.67 (0.49) | 0.92 (0.28) | 37 | 136 | .14 |
| Q11 | Calculate probability of a range | Apply | 0.17 (0.39) | 0.38 (0.51) | 124 | 139 | .22 |
| Q12 | Calculate z score | Apply | 0.33 (0.44) | 0.42 (0.49) | 27 | 149 | .39 |
| Q13 | Draw a normal distribution | Understand | 0.63 (0.31) | 0.69 (0.25) | 10 | 148 | .41 |
| Q14 | Draw regions outside 2σ on graph | Understand | 0.67 (0.44) | 0.73 (0.26) | 9 | 155 | .44 |

Note: % mean difference = [(App Group Mean − Control Group Mean) / Control Group Mean] × 100.

5.3. Quiz Scores by Bloom's Taxonomy Learning Level

Because each question was mapped to one of the three lower levels of Bloom's taxonomy (i.e., remember, understand, and apply), learning effectiveness was compared for each of these three levels. The analysis is similar to that done by Wang et al. (2011). A SAS univariate procedure showed that the data violated the normality assumption. Therefore, a Wilcoxon rank-sum test was used to compare the average score between the control and app groups for the three lower levels of learning in Bloom's taxonomy. The independent variable is the presence or absence of the mobile app in the learning process. The dependent variable is the average score of the questions that fall into each of the three lower levels of learning. Three separate Wilcoxon rank-sum tests were performed, one for each learning level. Table 3 shows the results. The quiz scores for the "apply" level of learning are significantly different (p = .04) between the control group (M = 0.54, SD = 0.29) and the app group (M = 0.73, SD = 0.22). The scores for the other two learning levels, remember and understand, are not significantly different. The results suggest that the use of mobile apps



TABLE 3. Average Comprehension Quiz Scores by Bloom's Taxonomy Learning Level

| Bloom's Taxonomy Learning Level | Questions | Control Group (n = 12) M (SD) | App Group (n = 13) M (SD) | Wilcoxon Test Statistic (U) | Exact Test 1-Sided p-Value |
|---|---|---|---|---|---|
| Remember | Q1, Q2 | 0.63 (0.31) | 0.79 (0.29) | 135 | .11 |
| Understand | Q8, Q10, Q13, Q14 | 0.64 (0.18) | 0.75 (0.20) | 132 | .10 |
| Apply | Q3, Q4, Q5, Q6, Q7, Q9, Q11, Q12 | 0.54 (0.29) | 0.73 (0.22) | 124 | .04 |

positively influences students' abilities to apply the learned knowledge to solve problems and to derive correct calculations.
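The per-level averaging just described can be sketched using the question-to-level mapping in Table 3. The mapping below comes from the paper; the example student's scores are invented for illustration:

```python
# Question-to-level mapping from Table 3.
LEVELS = {
    "remember": ["Q1", "Q2"],
    "understand": ["Q8", "Q10", "Q13", "Q14"],
    "apply": ["Q3", "Q4", "Q5", "Q6", "Q7", "Q9", "Q11", "Q12"],
}

def level_averages(student):
    """Average score per Bloom level; student maps question -> points (0, 0.5, or 1)."""
    return {lvl: sum(student[q] for q in qs) / len(qs) for lvl, qs in LEVELS.items()}

# Invented example student (not real study data).
student = {"Q1": 1, "Q2": 0.5, "Q3": 1, "Q4": 1, "Q5": 0, "Q6": 1, "Q7": 1,
           "Q8": 0.5, "Q9": 0, "Q10": 1, "Q11": 0, "Q12": 0, "Q13": 1, "Q14": 1}
print(level_averages(student))
```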

5.4. Survey Responses

Both groups rated that they were not very familiar with the material prior to the study but felt that the instructor was effective at explaining the concepts and felt that they understood the examples. Twelve out of 13 students in the app group felt that mobile apps helped them understand the concepts more clearly (M = 5.36, SD = 1.37). Ten out of 13 students in the app group indicated that they plan to use mobile apps as a learning tool in the future (M = 5.54, SD = 1.24). The comparison of survey responses between the control and app groups again necessitated the use of the Wilcoxon rank-sum test due to violation of the normality assumption. Only one question was found to be significant at an α level of .05. The interest in learning the material among participants in the control group (M = 5.33, SD = 1.56) was higher (p = .04) than that of the app group (M = 4.69, SD = 1.08). The app group (M = 5.54, SD = 0.79) rated their confidence in learning the material quickly marginally higher (p = .07) than the control group (M = 4.33, SD = 2.06). Both groups reported similar levels of engagement in learning the materials (control group: M = 4.83, SD = 1.80; app group: M = 4.38, SD = 0.87). It is interesting to note that despite the app group's having lower interest in learning the materials than the control group, their overall quiz performance was better.

6. DISCUSSION

Overall, the results showed that the app group outperformed the control group on the comprehension quiz. This was expected, since mobile apps are a form of active learning and previous studies of active learning have shown similar trends (Felder & Brent, 2003).

The postexperiment survey showed that 12 out of 13 participants in the app group felt strongly that mobile apps contributed to their learning. This feeling of confidence in learning correlates with their increased quiz performance. That students attributed their increased performance to the use of the mobile app further supports the effectiveness of the app in engaging the learners.

So what makes the educational mobile app work? We can evaluate the app from two perspectives, according to the integrated conceptual model of learning. The first perspective is viewing the mobile app as a channel for delivering knowledge to its learners and evaluating the types of interaction the app provides to deliver that knowledge. With the mobile app, learners directly swipe on the touch screen of the mobile device to adjust the location of the left and right bounds on the x-axis, while simultaneously observing changes in the shaded area under the distribution curve. This interaction of direct manipulation to bring about changes, together with visualization of those changes (both in the shaded area on the graph and in the probability value in the formula), is a powerful tool for engaging students in an active learning process. Meanwhile, the process helps the learner establish a correct mental model of the normal distribution and the associated relationship between p values and z values, which might otherwise be an abstract concept to grasp.
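The bound-dragging interaction described above amounts to recomputing the probability mass between two x-values. A minimal sketch of that underlying calculation (not the LearnStatistics implementation) using the normal cumulative distribution function:

```python
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for X ~ Normal(mu, sigma)."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

def shaded_area(lower, upper, mu=0.0, sigma=1.0):
    """Probability mass between the left and right bounds (the shaded region)."""
    return normal_cdf(upper, mu, sigma) - normal_cdf(lower, mu, sigma)

# About 95% of the mass lies within two standard deviations (cf. quiz Q10).
print(round(shaded_area(-2, 2), 3))  # 0.954
```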

The second perspective is evaluating the relationship between the app design and the learner's learning style. Incorporating mobile apps into the classroom could enable an instructor to more effectively cater to the needs of students with diverse learning styles. The app presented active learners with the opportunity to follow along on their own devices, while allowing reflective learners the chance to internalize the changing conditions as they happen. By having the opportunity to experiment with different outcomes using an app that provides real-time feedback, students were able to more quickly understand concepts and draw relationships. Presenting both the visual content of the normal distribution graph and the verbal content of the probability formula and narratives helps to accommodate both visual and verbal learners. Involving direct manipulation through hand and touch accommodates an additional learning modality (kinesthetic learning) and could potentially get students more engaged. Many mobile devices are operated via direct manipulation of the user's touch input. In addition, many devices take input based on the user's movement and position via a built-in accelerometer. These features give mobile apps the potential to fully engage the kinesthetic modality of the learner as well. Because people with the sensing style prefer to learn with all of their senses, engaging more senses during learning is beneficial. Introducing mobile apps as an educational tool can help bridge the gap between traditional teaching styles, which mostly favor verbal and intuitive learners, and the prevailing engineering student learning styles of visual, sensing, and active. Overall, the learner's process of using a mobile app to find answers is a form of active learning, which has been shown to improve student engagement and learning (Felder & Brent, 2003).

7. CONCLUSION AND FUTURE STUDIES

This study compared the effectiveness of using an educational mobile app to assist in teaching the basic statistical concept of the normal distribution. The results indicate that the mobile app influenced student learning positively by increasing the overall quiz score and increasing the average scores for questions requiring application of knowledge. The postexperiment evaluation survey also showed overwhelmingly positive responses from students who used the mobile app while learning. All of this evidence is promising for introducing mobile apps as a supplementary educational tool.

The practical contribution of the current research is that it provides initial empirical evidence that mobile apps can serve as supplementary tools in education to enhance student learning of abstract statistical concepts. More research can be done in this area to gain a deeper understanding of how to design mobile apps for a better learning experience and better educational effectiveness.

The theoretical contribution of the current research is that we proposed an integrative conceptual model of learning to illustrate the relationship between knowledge, technology-assisted learning tools, the learner, and the learning that follows as a result. This model can serve as a framework for evaluating other technology-assisted learning tools. The model emphasizes the importance of an optimal design of the delivery method(s) of knowledge assisted by new technologies in helping students assimilate new knowledge. Designers of technology-assisted learning tools should not focus too much on the new technology as the solution but, rather, should carefully craft the user interaction enabled by the technology to effectively facilitate learning.

As an exploratory study examining this new form of educational tool, the current study has several limitations. First, we assumed that participants who had not taken a college statistics course did not know the normal distribution concepts and as a result would score poorly on a pretest if one were given. However, a comparison of pretest/posttest scores between the two groups would provide a better metric for evaluating the educational effectiveness of a mobile app than the quiz scores alone. Future studies should include a baseline performance evaluation through a pretest to eliminate any uncertainty about subject matter familiarity. Second, the types of quiz questions used were limited to basic concept questions and equation-based questions. It would be interesting to examine student performance on essay problems to help us understand the level of understanding as well as to identify any misconceptions the students may have. Third, the current study is limited in sample size and in the length of exposure to the mobile app in the lecture. Further research is needed to expand the current study to fully examine the effects of mobile apps on learning. It is also necessary to evaluate the impact of mobile apps in an actual course as opposed to a simulated environment. Students enrolled in a statistics course may have higher motivation to learn the presented materials. This research is meant as an inquiry into the overall effectiveness of mobile apps as a new educational tool. Though it has perhaps raised more questions than it has answered, it has laid the groundwork for future studies on this emerging concept of mobile apps in education.

Mobile apps are interactive and deliver results without the need of a computer, a textbook, or the Internet. In essence, when well designed, they appeal to today's generation of learners because of their portability and ease of use. Compared with other forms of active learning or technology-assisted learning, mobile apps offer a unique tool for learning. The average app user's preexisting relationship with apps and the platform of mobile devices allows such apps to be used as tools on hand at any time for today's student or working professional. The portability of these devices allows an app user immediate access whenever and wherever questions arise. Therefore, introducing mobile apps into learning could bring about great benefits of better conceptual understanding, higher performance, increased confidence, and higher motivation of students and, perhaps, eventually lead to higher retention and graduation rates among students. More research in this area will help to ensure that the greatest benefits of mobile apps are realized through thoughtful designs.

References

Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom's taxonomy of educational objectives (Complete edition). New York, NY: Longmans.

Bloom, B. S. (1956). Taxonomy of educational objectives: The classification of educational goals, by a committee of college and university examiners. London: Longmans.

Clough, G., Jones, A. C., McAndrew, P., & Scanlon, E. (2008). Informal learning with PDAs and smartphones. Journal of Computer Assisted Learning, 24, 359–371.

Coller, B. D., & Scott, M. J. (2009). Effectiveness of using a video game to teach a course in mechanical engineering. Computers & Education, 53(3), 900–912.

Evans, C., & Gibbons, N. (2007). The interactivity effect in multimedia learning. Computers & Education, 49(4), 1147–1160.

Felder, R. M., & Brent, R. (2003). Learning by doing. Chemical Engineering Education, 37(4), 282–283.

Felder, R. M., & Silverman, L. K. (1988). Learning and teaching styles in engineering education. Journal of Engineering Education, 78(7), 674–681.

Felder, R. M., & Spurlin, J. (2005). Applications, reliability and validity of the index of learning styles. International Journal of Engineering Education, 21(1), 103–112.

Gee, J. P. (2003). What video games have to teach us about learning and literacy. New York, NY: Longmans.

Holman, L. (2000). A comparison of computer-assisted instruction and classroom bibliographic instruction. American Library Association, 71(12), 116–119.

Jeng, Y., Wu, T., Huang, Y., Tan, Q., & Yang, S. (2010). The add-on impact of mobile applications in learning strategies: A review study. Educational Technology & Society, 13(3), 3–11.

Jung, C. G. (1971). Psychological types. Princeton, NJ: Princeton University Press.

McNaught, C., Grant, H., Fritz, P., Barton, J., McTigue, P., & Prosser, R. (1995). The effectiveness of computer-assisted learning in the teaching of quantitative volumetric analysis skills in a first-year university course. Journal of Chemical Education, 72(11), 1003–1007.

Merino, D. N., & Abel, K. D. (2003). Evaluating the effectiveness of computer tutorials versus traditional lecturing in accounting topics. Journal of Engineering Education, 92(2), 189–194.

Rapeepisarn, K., Wong, K. W., Fung, C. C., & Khine, M. S. (2008). The relationship between game genres, learning techniques and learning styles in educational computer games. Edutainment, LNCS, 5093, 497–508.

Starr, C. W., Manaris, B., & Stalvey, R. H. (2008). Bloom's taxonomy revisited: Specifying assessable learning objectives in computer science. SIGCSE Bulletin, 40(1), 261–265.

Waldheim, G. P. (1987). Understanding how students understand. Journal of Engineering Education, 77(5), 306–308.

Wang, P. Y., Vaughn, B., & Liu, M. (2011). The impact of animation interactivity on novice's learning of introductory statistics. Computers & Education, 56, 300–311.
