
BRIEF REPORTS

Audience Response Systems: Using “Clickers” to Enhance BSW Education

LAURIE A. SMITH
CSUSB School of Social Work, San Bernardino, California

HERB SHON
CSUSB School of Social Work, San Bernardino, California

ROWENA SANTIAGO
CSUSB Teaching Resource Center, San Bernardino, California

Among new technologies for enhancing classroom-based education are audience response systems (ARS), also known as “clickers.” These handheld devices record student responses to instructor questions and send them electronically to a receiver that tallies the responses. Summary results are then projected, usually as a graph. Instructors piloted the use of clickers in undergraduate social work research and practice courses. Instructor and student experiences with the clickers were examined by type of course, frequency of use, ease of use, perceived impact on learning, and use by students with disabilities. Instructors and students in both types of courses found the clickers beneficial. Minor differences by type of course were found. Some students with disabilities noted problems using the clickers. More use of clickers and research on their use in social work education is recommended along with continued attention to universal design in course preparation.

KEYWORDS audience response systems, clickers, social work education

Received December 1, 2010; revised February 24, 2011; accepted May 10, 2011.
Address correspondence to Laurie A. Smith, MSW, PhD, Professor and Director, CSUSB School of Social Work, 5500 University Parkway, San Bernardino, CA 92407. E-mail: [email protected]

Journal of Technology in Human Services, 29:120–132, 2011
Copyright © Taylor & Francis Group, LLC
ISSN: 1522-8835 print / 1522-8991 online
DOI: 10.1080/15228835.2011.587737


INTRODUCTION

Among relatively new technologies for enhancing classroom-based education are audience response systems (ARS), also known as classroom response systems (CRS) and, informally, as “clickers.” Clickers are handheld devices that, at minimum, allow all students in the classroom to electronically respond to questions posed by the instructor. Clicker responses are then tallied by a receiver and presented in graph form in tandem with any projected software (e.g., PowerPoint). This instant feedback allows the instructor to create an interactive learning environment by stimulating further discussion, providing clarification where necessary, or even reteaching concepts if clicker responses reflect poor understanding of the lesson. Opinion-based as well as factual questions can be posed.

Clickers are made by several companies and have varying features beyond the tallying and display of student responses. Clickers have been available for several years and some universities or individual instructors mandate that students purchase clickers. However, other than a description of using one manufacturer’s ARS system (Quinn, 2007) and a more recent study by Quinn (2010) on student evaluation of their clicker experience in a human behavior class, the social work literature lacks any reference to the use of and/or efficacy of ARS in social work education. As ARS are adopted by universities, it is important that their use be critically examined, especially in the typically smaller social work foundation courses versus larger science courses.

This paper presents findings from an exploratory study that involved the pilot testing of two different models of clickers in undergraduate social work research and practice courses, courses with contrasting learning expectations. It compares the acceptability and desirability of ARS from both student and instructor perspectives in these two courses. Findings from student surveys on how the clickers impacted their learning and their recommendations on further use of clickers are presented. Findings are compared by type of course and respondents’ age, and the impact of disabilities on clicker use is examined.

Background

An ARS gives every student in the classroom a way to respond instantly to a question raised by the instructor. The question is displayed (e.g., projected on a screen) or communicated in some way to the class along with potential answers to the question. The potential answers are typically lettered or numbered in a multiple choice format. A basic component of an ARS is a small handheld device for each student that has a numbered and/or lettered pad for entering the number or letter corresponding to their response choice. The handheld device transmits the entered data to a receiver that is capable of displaying aggregated responses on a projected screen, typically as part of a PowerPoint presentation.
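To make the tallying step concrete, here is a minimal, vendor-neutral Python sketch of the aggregation an ARS receiver performs: one multiple-choice response per student is counted and converted to percentages for display. It is a reader-added illustration of the concept, not any vendor’s actual software, and the function name and poll data in it are hypothetical.

```python
# Illustrative only: a vendor-neutral sketch of the ARS tally step described
# above (count one multiple-choice response per student, then summarize the
# counts as percentages for on-screen display).
from collections import Counter

def summarize_responses(responses, choices=("A", "B", "C", "D")):
    """Tally clicker responses and return the percentage choosing each option."""
    counts = Counter(responses)
    total = len(responses) or 1  # guard against an empty poll
    return {choice: 100 * counts.get(choice, 0) / total for choice in choices}

# Hypothetical poll of 20 students answering one multiple-choice question.
responses = ["A"] * 11 + ["B"] * 5 + ["C"] * 3 + ["D"]
for choice, pct in summarize_responses(responses).items():
    print(f"{choice}: {pct:.0f}%")  # A: 55%  B: 25%  C: 15%  D: 5%
```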

Questions posed may center on values clarification important for social work practice, such as “Could you support a client who wants an abortion?” Questions can also be factual and used to assess student learning in social work education content areas, such as “Which of these situations illustrates the working phase of group development?” Values questions allow anonymous expression of opinions and can lead to stimulating class discussion. Factual questions serve as a reality check for instructors and students as to whether content is being mastered.

The devices are sold by various vendors and costs per handheld unit vary, but are roughly comparable to a modestly priced textbook. Receivers are generally more costly, but some vendors may include one for free when a minimum number of clickers are purchased by students, when clickers are bundled with a textbook, or when a campus adopts or standardizes on a particular clicker brand. An Internet search using the term “audience response systems” yields literally pages of different vendors. Systems vary in their specific capabilities. In addition to anonymous summarization of participant responses, most devices are capable of registering students to a particular handheld unit so that taking attendance and quizzes can be conducted using the clicker and recorded for grading purposes. When linked to a particular student, some systems can show students their grades and particular items missed on quizzes. Further, these grades can be transferred to learning management systems such as Blackboard or Moodle. Some systems have the ability to analyze student responses by other characteristics such as gender, sometimes called “data slicing.” A recent development in ARS is cell-phone-based polling, in which students (or any audience) with texting capability can text responses that are collected and displayed through a web-based interface (Poll Everywhere, 2011). For the very latest information on systems and features available, an Internet search on audience response systems is recommended. Likewise, vendors may set prices by volume or other factors.

LITERATURE REVIEW

Literature on ARS in higher education includes descriptive narratives of how the ARS was used and the immediate effect in the classroom, usually in large science classes or when sensitive or controversial material is discussed. There is less literature on measurable effects on learning. Also, discussion of pedagogical theory on why ARS may be effective in classrooms is limited. Literature on each of these topics is examined here.

Literature that reported on the advantages of ARS indicates that they increase classroom engagement, interactivity, and participation (Collins, Moore, & Shaw-Kokot, 2007; Siau, Sheng, & Fui-Hoon, 2006). Most documented use has been in large lecture-style courses (MacGeorge et al., 2008). Common are science courses in general and physics courses specifically (Beatty, Gerace, Leonard, & Dufresne, 2006; Duncan, 2005; Herreid, 2006). Use for sensitive topics such as sexuality education has been documented (Fisher, 2006; Vail-Smith, Blumell, & Elmore, 2006) as well as use in psychology courses (Stowell & Nelson, 2007). In social work, Quinn (2010) reported on five themes that emerged from student evaluation of their clicker experience, and three of these themes were on pedagogical implications (increased participation in class discussions, fostering learning from others, and allowing for knowledge checks).

Kay and LeSage (2009) reviewed 64 papers on ARS published from 2000 to 2009. One focus of the review was on why ARS may be effective in the classroom. Benefits of using ARS were grouped as (a) benefits to the classroom environment, (b) benefits for learning, and (c) assessment benefits. Classroom environment benefits were increased attendance (if linked to final grade), increased attention (by providing an activity that requires participation), a nonjudgmental opportunity for students to share their opinions (due to anonymity), and increased engagement (ARS use requires active participation and ARS is perceived as entertaining). Learning benefits included increases in quality and quantity of classroom discussion (especially in a peer discussion format where initial votes were followed by discussion and revoting), and the opportunity to shape class sessions based on feedback on the degree to which students understand the content. Most importantly with regard to learning, Kay and LeSage found 16 articles in which anecdotal and experimental observations indicated that students using ARS outperformed students using customary formats (Kay & LeSage, 2009). The benefits for assessment of learning, which are similar to learning benefits, are that both instructors and students get feedback about the degree to which concepts are being understood and that ARS gives students feedback on their learning relative to other students.

Among disadvantages of ARS noted in Kay and LeSage’s (2009) review were technological glitches or complexity issues and the flexibility needed on the part of instructors to respond to poorly understood concepts. The latter was noted to reduce the amount of content that could be covered in class. Further, when response units are registered to specific users, some students resented feeling monitored (Kay & LeSage, 2009). Additionally, Morgan (2008) reports on possible disadvantages of clicker use as cited by both instructors and students (clickers can distract from learning, focus seems to be on technology rather than the material, questions are not very helpful), as well as studies that reported increased enjoyment but a decline in student engagement, and minimal increase in test scores and/or course grades, as a result of clicker use.

The existing literature indicates that ARS have the potential to positively impact the classroom environment and student learning. Further, advances in technology are constantly adding features to ARS as well as making them more ubiquitous, as in the example of using personal mobile phones for polling. Social work education could benefit from more information on the use of ARS in the typically smaller classes that cover basic social work foundation curriculum. Quinn (2007) provides a very thorough description of the process of using one manufacturer’s ARS in a Human Behavior course and a quantitative methods course. The findings were consistent with previous literature: students liked the clickers and felt they facilitated learning and discussion. Current literature has not explored how course content may affect satisfaction and learning with clickers, nor does it provide any analysis by gender, age, or other background factors such as disability. The issue of disability is particularly important as we strive to implement universal design so that instruction is planfully accessible for all students (Lightfoot & Gibson, 2005).

Accordingly, it is important to further investigate this promising technology for its potential to enhance learning among social work students. The specific research questions addressed in this exploratory study, as they apply to social work students, were: (a) Will social work students use clickers? (b) How easy or difficult are they for undergraduate social work students to use? (c) What did social work students like and dislike about using clickers? (d) What was their perceived impact on learning? (e) How does having a disability impact clicker use among social work students? (f) Does type of course, age, or gender affect perception of clickers? The two faculty members who used the clickers also discuss their experience with the ease of use of the clickers and their use in either an undergraduate social work research course or, in one case, a social work practice course.

METHOD

Participants

The participants in this study were undergraduate social work majors in three undergraduate social work courses: two sections of a practice course and one section of a research course. All students were given the opportunity to participate and all were provided with a consent form relevant to the surveying of their thoughts on the clickers. The survey protocol and the informed consent process were approved by the School of Social Work’s subcommittee of the Institutional Review Board of the university.

In all, 61 students participated in using the clickers and in responding to the survey on the use of the clickers in class. There was a 100% participation rate for both. The average age of the students was 32 years old and most were female (94%).


Survey Instrument

A very brief anonymous self-administered paper-and-pencil questionnaire was created for this project that consisted of nine questions focused on frequency of clicker use in class, ease of clicker use, and perceived clicker impact on learning. Six questions were close-ended questions with predetermined options, for example, “Using the clicker was . . . extremely easy, easy, neither easy nor difficult, difficult, extremely difficult.” There were two open-ended questions that asked students to explain what they liked and disliked about using the clickers. One question at the end asked students with any disability to note whether their disability affected clicker use.

Procedures

Two instructors used clickers from two vendors (iclicker and TurningPoint). However, the advanced features of one of the systems (e.g., linking students to their responses, linking grades to Blackboard) were not used, making the two systems quite similar in practice. Units were used anonymously and not registered to individual students. One system was used by one instructor with 20 students in one section of a research course. The other system was used by a second instructor with 30 students in one section of a practice course and 11 students in another section of the same course. The clickers were on loan to the instructors and used in two to three class sessions. Questions used in the research course included opinion questions to stimulate discussion, such as “Would you continue to participate in an interview longer in person or on the phone?” and factual questions to test understanding of concepts, such as “What type of research question is answered with a single-subject design?” An example of a question used in the practice course was “What is the role and responsibilities of social workers regarding work with at-risk populations?” Regardless of whether a question sought opinions or a specific correct answer, responses were structured as numbered or lettered choices. The survey described above was handed out in class when the trial use of the clickers was over.

Data Analysis

Frequency analyses were conducted for all close-ended questions. Chi-square analysis was then performed to test for differences by course. Correlation analysis was used to test for any age differences in ease of use, perception of impact on learning, and recommendation of ARS (there were too few males for gender analysis). All open-ended responses on what students liked and disliked about clickers and comments from students with disabilities were manually grouped by theme if possible. If student comments contained more than one theme, each was counted under the respective theme.
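As an illustration of the chi-square analysis just described, the short Python sketch below (a reader-added example using scipy.stats, not the authors’ analysis code) reproduces the recommendation-by-course test from the cell counts reported later in Table 1.

```python
# Illustrative reanalysis of the recommendation-by-course contingency table
# reported in Table 1; a sketch of the chi-square test described in the text,
# not the authors' own analysis script.
from scipy.stats import chi2_contingency

observed = [
    [15, 3, 2],   # research course: recommend highly, recommend, neutral
    [40, 1, 0],   # practice course: recommend highly, recommend, neutral
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.3f}")
# Output closely matches the values reported in Table 1
# (chi-square = 8.093, df = 2, p = .017). The age association (r = .27)
# would be tested analogously with scipy.stats.pearsonr on the raw age and
# ease-of-use ratings, which are not reproduced in the article.
```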


RESULTS

Student Survey

By counting completed surveys, it was determined that all 61 students had participated. All students responded to the questions on gender and age. On other questions, one to three students did not fill in an answer.

Regarding frequency of use in class, 98% (n = 60) said that they responded to every question when it was possible to use a clicker. Table 1 summarizes student responses to the close-ended questions on clickers. As the percentages indicate, the student response to the clickers was very positive. Students found the clickers either extremely easy (85%) or easy (15%) to use. Nearly all students (97%) felt that the clickers helped them learn the class material. Most students (90%) highly recommended use of the clickers.

Chi-square analysis by type of course (Table 1) found a statistically significant difference for the question regarding clicker impact on learning. Although the majority indicated it did help, students in the research course were more likely to say that the clicker did not help them learn the class material (10%) as compared to the practice course (3%). There was also a statistically significant difference on recommending clicker use (Table 1). The research course students were more likely to give a “neutral” or “recommend” rating rather than a “highly recommend” rating as compared to the practice course. No statistically significant differences by age were found in ratings of clicker effect on learning or in recommendations regarding clicker use. However, there was a statistically significant moderate correlation between ease of clicker use and age of student. As student age increased, ratings indicating more difficulty of use increased (r = .27, p < .05).

TABLE 1 Student Perceptions of Ease of Use, Impact on Learning, and Recommendations

                                     Total       Research    Practice
                                     N    %      N    %      N    %      df    p      χ²
Ease of clicker use                                                      1     .095   2.792
  Extremely easy                     51   85     14   74     37   90
  Easy                               9    15     5    26     9    15
Did clicker help you learn
  class material?                                                        1     .039   4.25*
  Yes                                56   97     17   90     39   100
  No                                 2    3      2    10     0    0
Recommendation on clicker use                                            2     .017   8.093*
  Recommend highly                   55   90     15   75     40   98
  Recommend                          4    7      3    15     1    2
  Neutral                            2    3      2    10     0    0

*p < .05.

Among students who wrote responses to the open-ended question on what they liked about using the clicker system, the most common theme (18 comments) was that the clickers enhanced student participation and interaction. Sample comments of this type were: “It allowed everyone to participate and have an active class” and “It helps because people don’t always speak up in class.” Almost as many comments (17) said that the clickers enhanced their enjoyment of the class, for example, “It made the class session fun and a wonderful learning environment,” “It was fantastic, it makes class fun.” A third theme (13 comments) was that students liked the anonymity of the clicker: “It makes you think and answer and not look like an idiot” and “It was also great that it was anonymous so no one felt bad if they got the question wrong.” The final two themes were both related to learning. One theme (13 comments) was that students liked that the clickers helped them identify their level of understanding: “I could see how my answers compared to my cohort” and “It helped me see where I needed to study . . . also it helped me recognize how I was doing.” The other theme (11 comments) was that use of the clickers enhanced understanding of course material: “It gives us a chance to review and discuss in case we are not fully understanding the lecture materials. It allows us the opportunity to better grasp key concepts and ideas.”

There were fewer comments (9 total) on what students did not like about the clickers. Most students wrote that there was nothing that they did not like about clicker use. Most of the dislikes (7) centered on technology issues and several addressed time delays. For example, one student commented that “There was a delay in the computer’s receiving ability” and one did not like “The fact that it took too long.” One student was concerned that the instructor could register clickers to individual students, thus removing the anonymity. Finally, one student thought that some students were participating superficially and said “Sometimes discussion wasn’t as informative. People seemed to simply ‘click’ and then allow the professor to do the talking.”

Regarding use by students with disabilities, four students (7%) revealed physical challenges such as blindness or bone or nerve damage. No students reported learning disabilities. Among the four students who reported physical disabilities, two noted difficulties using the clicker system: “it was a little hard to press the buttons” and “the power button automatically shuts off and I could not see the light, I was not sure if my answer was being recorded unless I asked someone if my light was still on.” The other two students with disabilities noted they had no difficulties using the clickers. All students with a disability indicated that the clickers helped them learn the class material. Three highly recommended clicker use and the fourth recommended clicker use.

Faculty Response

Faculty response to using the clickers was positive. Discussed here are the ease of use, impact on student discussion and interaction, and impact on student learning. Both instructors found the two systems fairly easy to use. However, the instructor with the model that had fewer features found it easier to use “right out of the box.” The interface with PowerPoint was intuitive and it was easy to create questions and display the results. The other instructor found that his clicker system required reading the user manual and several attempts before being able to successfully use it. Students using this system also seemed to need a bit of orientation to explain that answers needed to be selected as well as sent. The vision-impaired student in this class also needed additional assistance understanding the layout of the number/key pad.

Among the observations made over the class sessions was that all students appeared to be challenged to pay attention and engage in answering questions. Although a typical in-class question might be answered by some portion of the students, the clickers allowed every student to respond. Responses could then be further discussed, for example, “For those of you who answered ‘no,’ what was your thinking behind that?” It was possible after further discussion of this type to re-ask the question and see if responses had changed.

Student enthusiasm was evident; the classroom buzzed with excitement in the sessions in which the clickers were used. Students expressed that they looked forward to more clicker questions. The clicker units were on loan and students were disappointed that they would not be used in further class sessions.

Use of the clickers was very revealing for the instructors. When asking questions on concepts, getting feedback from all students provided a reality check on the level of students’ understanding. Often students’ understanding was much less than the instructor assumed. Sometimes the class session had to be slowed down when it became evident how tenuous a grasp students had on the content.

DISCUSSION

The survey results and observations in the classroom indicate that regardless of course type, clicker system type, age, or disability, students overwhelmingly enjoyed clickers, recommended clickers, and said they enhanced learning in the class. There was a slightly less enthusiastic response from students in the research course. The only two responses indicating that the clickers did not support learning came from the research course, and more of the “recommend” responses (versus highly recommend) came from the research course, as did the only neutral response regarding recommendation. It may be that the content of the research course is more difficult for students, so that the use of clickers, although very positive overall, has less of an effect on enhancing student learning. Instructor differences might also account for these slight differences.


There appeared to be an instant cultural congruency with the clickers for most students. They related easily to handheld devices that communicate information, as it was similar to what they do with cell phones. They also related to instant display of their responses, which occurs with their social networking sites such as Facebook and Twitter. They were very comfortable with the polling aspect of the clickers, which is what they see in television game shows or reality shows that involve voting.

New technologies are increasingly used for instruction at the high school level, including clickers, smart boards, and display of Internet-based information. Students coming directly from high school to college may find courses that do not use these interactive technologies boring. The finding that older students found clickers somewhat more difficult to use makes sense, as these students would not have come of age in this era of fast-paced technological change in personal communication technology. However, the finding that overall recommendation of clicker use and perceived impact on learning did not differ by age suggests that older students are quite adaptable to technology that enhances the classroom experience. They may simply need a bit more time to become comfortable with new technology.

From the perspective of the instructors, the ARS had very positive effects on interaction and participation in the classroom. The technology facilitates participation by all students. Students were excited as they waited for the next clicker question and results. There was an inherent drama in watching the vote tally climb to 100% and then seeing the results. Having alert, excited, and engaged students is satisfying. It intuitively makes sense that engaged students would learn more than disengaged students. Clearly, when compared to the low-tech method of asking for raised hands in response to a question, the ARS has the advantage of anonymity and a precise and quick count of student responses.

Limitations of this study include the use of nonprobability sampling, modest sample size, and use over several weeks versus a longer period of time. Further, the design was a brief survey rather than an experimental design. Instructor eagerness to use new technology may have influenced student perceptions. Nonetheless, the findings and the experience of the instructors reflected previous research. The most outstanding teaching benefit of clicker use was instant feedback on how well students understood concepts presented in the class. It was truly eye-opening to find out the degree to which students were uncertain or simply wrong about the material presented. Without the clickers, it is possible to maintain the illusion that the instructor is effectively transferring knowledge to all students.

There may be dangers in overusing the clickers. Questions remain whether the excitement seen in the classroom would be sustained or reduced as the novelty wore off. Because our clickers were loaners, we did not have the opportunity to try them over an extended time period. Further, there may be a negative reaction, as noted in the literature, if students are registered to specific clicker units. Although this would allow attendance and quiz taking to be done through the clicker, it also removes the benefits that students note regarding anonymously responding to questions.

The issue of the use of clickers by students with disabilities is critical to consider. It would be unfair to increase engagement and interaction for some students, but diminish it for students with disabilities. The few students in our study with disabilities were able to use clickers and reported enjoying their use. However, adaptations were made to assist them, such as reading questions and answer choices aloud or having another student confirm that their vote had been registered. Higher percentages of students are entering college with varied disabilities (Scott, McGuire, & Shaw, 2003), and such ad hoc approaches to assisting students with disabilities using clickers may prove unworkable. An alternative approach is to ensure that clickers are accessible by design. This approach is consistent with the principle of Universal Design for Learning, also known as Universal Design for Instruction, in which planning for course material to be accessible is done “up front” rather than in response to a specific disability (Scott et al., 2003; The Center for Universal Design, 1997). Some manufacturers are making clickers with features that make them accessible, so there may be models currently available or available in the near future that can meet a universal design standard (iclicker, 2010).

The use of ARS to enhance social work education appears promising from this exploratory research. Further research using stronger designs and investigating student, instructor, and ARS variables is needed to expand our understanding of how ARS affects learning and what variables may affect learning outcomes. Specifically, stronger research designs (experimental or quasi-experimental) with larger sample sizes and clicker use over longer time periods would clarify the extent to which ARS affects learning outcomes. With a larger sample size, variables that may impact acceptance and benefits of ARS such as age, disability, student GPA, and ease with technology could be fully analyzed. Examination of literature on learning to identify other student variables related to effective learning, such as learning styles and student engagement, when planning future studies is also recommended.

Current literature does not measure instructor variables, and this needs further research. The instructors here were seasoned faculty who readily adapted to new technology and were able to be flexible and responsive in their classroom teaching. Perhaps only those instructors who are experienced with teaching the subject material and are at ease with technology will be successful using ARS. Randomly assigning instructors to use ARS or having all faculty in a department use ARS would clarify instructor factors. Until studies go farther than documenting ARS use by a few innovators, who likely have unique characteristics, findings cannot be generalized. The question of how attitudes and learning may change with use of ARS over time should also be investigated. On our own campus, we do have anecdotal evidence of long-time users. There is less evidence of wide adoption of ARS even with supportive services available.

Comparisons of specific systems will prove difficult over the long run. Since this study was undertaken, the systems we used have already changed. Text-messaging-based polling, still fairly new, may be overtaken by smartphone applications, also known as “apps.” In fact, Poll Everywhere has just added a Twitter feature to its phone-based polling system (Poll Everywhere, 2011). Greater integration with instructional software such as Blackboard is also likely to occur. Those considering using ARS will have to investigate what is on the market at that time.

In our experience, clickers were well suited to relatively small (15–30 students) undergraduate social work classes. Even with a small group of students, clickers were an efficient way to find out what everyone is thinking, to keep students interested, and to teach more effectively. The differing subject matter of research versus practice had a limited effect, with the overwhelmingly positive student ratings only slightly dampened in the research course. Our experience with ARS in social work courses mirrored previous research showing that this technology enhances the classroom environment and contributes to student learning. Wider use of ARS and additional research on their use is recommended.

REFERENCES

Beatty, I. D., Gerace, W. J., Leonard, W. J., & Dufresne, R. J. (2006). Designing effective questions for classroom response system teaching. American Journal of Physics, 74(1), 31–39.

Collins, L. J., Moore, M. E., & Shaw-Kokot, J. (2007). Livening up the classroom: Using audience response systems to promote active learning. Medical Reference Services Quarterly, 26(1), 81–88.

Duncan, D. (2005). Clickers in the classroom: How to enhance science teaching using classroom response systems. New York, NY: Addison-Wesley.

Fisher, C. M. (2006). Automated classroom response systems: Implications for sexuality education and research. American Journal of Sexuality Education, 1(4), 23–31.

Herreid, C. F. (2006). “Clicker” cases: Introducing case study teaching into large classrooms. Journal of College Science Teaching, 36(2), 43–47.

iclicker. (2010). iclicker launches webclicker, fully accessible browser-based student response system. Retrieved from http://www.marketwire.com/press-release/iclicker-Launches-webclicker-Fully-Accessible-Browser-Based-Student-Response-System-1133071.htm

Kay, R. H., & LeSage, A. (2009). Examining the benefits and challenges of using Audience Response Systems: A review of the literature. Computers & Education, 53(3), 819–827.

Lightfoot, E., & Gibson, P. (2005). Universal instructional design: A new framework for accommodating students in social work courses. Journal of Social Work Education, 41(2), 269–277.

MacGeorge, E., Homan, S., Dunning, J., Elmore, D., Bodie, G., Evans, E., et al. (2008). Student evaluation of audience response technology in large lecture classes. Educational Technology Research & Development, 56(2), 125–145.

Morgan, R. K. (2008). Exploring the pedagogical effectiveness of clickers. InSight: A Journal of Scholarly Teaching, 3, 31–36.

Poll Everywhere. (2011). Retrieved February 23, 2011, from http://www.polleverywhere.com/

Quinn, A. (2010). An exploratory study of opinions on clickers and class participation from students of human behavior in the social environment. Journal of Human Behavior in the Social Environment, 20(6), 721–731.

Quinn, A. S. (2007). Audience Response System (clickers) by TurningPoint. Journal of Technology in Human Services, 25(3), 107–114.

Scott, S. S., McGuire, J. M., & Shaw, S. (2003). Universal design for instruction: A new paradigm for teaching adults in postsecondary education. Remedial and Special Education, 24(6), 369–379.

Siau, K., Sheng, H., & Fui-Hoon, N. F. (2006). Use of a classroom response system to enhance classroom interactivity. IEEE Transactions on Education, 49(3), 398–403.

Stowell, J. R., & Nelson, J. M. (2007). Benefits of electronic audience response systems on student participation, learning, and emotion. Teaching of Psychology, 34(4), 253–258.

The Center for Universal Design. (1997). The principles of universal design: Version 2.0. Raleigh, NC: North Carolina State University, Center for Universal Design.

Vail-Smith, K., Blumell, C., & Elmore, B. (2006). Using a “Classroom Response System” to improve active student participation in a large sexual health class. American Journal of Sexuality Education, 1(2), 47–54.
