
Hindawi Publishing Corporation, Advances in Human-Computer Interaction, Volume 2013, Article ID 619767, 9 pages. http://dx.doi.org/10.1155/2013/619767

Research Article: A Comparison of Field-Based and Lab-Based Experiments to Evaluate User Experience of Personalised Mobile Devices

Xu Sun1 and Andrew May2

1 Science and Engineering, University of Nottingham Ningbo, 199 Taikang East Road, Ningbo 315100, China
2 Loughborough Design School, Loughborough University, Ashby Road, Loughborough, Leicestershire LE11 3TU, UK

Correspondence should be addressed to Xu Sun; xu.sun@nottingham.edu.cn

Received 30 July 2012; Accepted 12 January 2013

Academic Editor: Caroline G. L. Cao

Copyright © 2013 X. Sun and A. May. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

There is a growing debate in the literature regarding the tradeoffs between lab and field evaluation of mobile devices. This paper presents a comparison of field-based and lab-based experiments to evaluate user experience of personalised mobile devices at large sports events. A lab experiment is recommended when the testing focus is on the user interface and application-oriented usability-related issues. However, the results suggest that a field experiment is more suitable for investigating a wider range of factors affecting the overall acceptability of the designed mobile service. Such factors include the system function and the effects of the actual usage context. Where open and relaxed communication is important (e.g., where participant groups are naturally reticent to communicate), this is more readily promoted by the use of a field study.

1. Introduction

Usability analysis of systems involving stationary computers has grown to be an established discipline within human-computer interaction. Established concepts, methodologies, and approaches in HCI are being challenged by the increasing focus on mobile applications. Real-world ethnographic studies have received relatively little attention within the HCI literature, and little specific effort has been spent on delivering solid design methodologies for mobile applications [1]. Researchers and practitioners have been encouraged to investigate further the criteria, methods, and data collection techniques for usability evaluation of mobile applications [2]. Lab-based experiments and field-based experiments are the methods most discussed in relation to evaluating a mobile application [2–4].

There has been considerable debate over whether interactions with mobile systems should be investigated in the field or in the more traditional laboratory environment. There seems to be an implicit assumption that the usability of a mobile application can only be properly evaluated in the field; see, for example, Kjeldskov and Stage [5]. Some argue that it is important that mobile applications are tested in realistic settings, since testing in a conventional usability lab is unlikely to find all problems that would occur in real mobile usage (e.g., [2, 6, 7]). For example, Christensen et al. [6] presented a study of how ethnographic fieldwork can be used to study children's mobility patterns via mobile phones. The authors consider that field studies make it possible to carry out analysis that can broaden and deepen understanding of people's everyday life. However, some authors have highlighted how ethnographic field experiments are time consuming, complicate data collection, reduce experimental control, or are unacceptably intrusive [1, 3, 5].

Laboratory experiments are generally not burdened with the problems that arise in field experiments, as the conditions for the experiment can be controlled and it is possible to employ facilities for the collection of high-quality data [5, 8]. However, Esbjörnsson et al. [9] and many other authors have argued that traditional laboratory experiments do not adequately simulate the context where mobile devices are used and also lack the desired ecological validity. This may lead to less valid data, where there is a potential disconnect between stated preferences, intentions, and actual experiences [4].

There are alternatives to field studies when assessing the impact of mobile devices. Adding contextual richness to laboratory settings through scenarios and context simulation can contribute to the realism of the experiment while maintaining the benefits of a controlled setting [3, 5, 10]. The extent to which simulated scenarios represent a real-life situation is a critical determinant of the validity of the usability experiment [11]. In addition, for mobile devices, two basic contextual factors which need to be considered are mobility and divided attention. To replicate real-world mobility within a lab setting, test participants have been asked to use a treadmill or walk on a specifically defined track (e.g., [12]). To replicate divided attention, a range of measures have been used in the past. For example, to assess the impact of information provision on drivers, a range of simulators have been used, from low-fidelity personal computer-based simulations [13] to high-fidelity simulators with large projection screens involving real dashboards [14]. These simulators recreate divided attention as well as enabling task performance measurement.

This paper reports a combination of a field-based and a lab-based evaluation carried out in order to assess the impact of personalisation on the user experience at sports events. This mixed approach enabled a comparison of the two evaluation methodologies and comment on their relative effectiveness for mobile users.

2. Background

2.1. Mobile Personalisation. Personalization techniques can be classified into three different categories [15]: rule-based filtering systems, content-filtering systems, and collaborative filtering systems. Some recent techniques used in collaborative filtering are based on data mining in order to infer personalisation rules or build personalisation models from large data sets.

The main aspects of personalization which are relevant within this paper are what is personalized and how this is achieved. This paper focuses on content personalization [16], that is, the tailoring of information within a particular node within the human-device navigation space. This form of personalization is based on the key assumption that the optimal content for an individual is dependent on contextual factors relating to the individual, the situation they are in, and the activities they are undertaking; these factors can be used as triggers for the adaptation of content for the individual [17] in order to enhance their user experience.
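To make the idea of contextually triggered content concrete, the following minimal Python sketch shows one plausible shape for such a filter. It is illustrative only: the names ContentItem, SpectatorContext, and personalise_content are hypothetical and are not taken from the prototypes described in this paper.

    from dataclasses import dataclass

    @dataclass
    class ContentItem:
        title: str
        sport: str           # e.g. "football" or "athletics"
        venue_zone: str      # part of the stadium the item relates to
        minutes_until: int   # time until the item becomes relevant

    @dataclass
    class SpectatorContext:
        preferred_sports: set
        seat_zone: str

    def personalise_content(items, ctx):
        """Keep items matching the spectator's stated preferences, then rank them
        by how soon they occur and whether they relate to the spectator's own zone."""
        relevant = [i for i in items if i.sport in ctx.preferred_sports]
        return sorted(relevant,
                      key=lambda i: (i.minutes_until, i.venue_zone != ctx.seat_zone))

In this reading, the spectator's preferences, their location, and the event timetable play the role of the contextual triggers discussed above.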

The personalization framework in this research contains four modules which (1) cooperate to perform the functions of classification of information, (2) collect relevant contextual factors, and (3) personalize content accordingly (Figure 1); a simplified code sketch of this pipeline follows the figure.

The overall context of this research (which enabled the field versus laboratory comparison described here) was the investigation of the benefits of personalization of mobile-based content. However, the research project also investigated the benefits/drawbacks of either the user or the system performing personalization of content. Some research favours user-initiated personalization and its focus on the natural intelligence of the user, while others found that system-initiated approaches were more effective for dynamic contexts [18].

Figure 1: Modules forming a mobile personalization application (the mobile personalisation system comprises content classification, context collection, content determination, and content presentation modules, linking the mobile service provider, the user and their context, and the user interface).
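Read as a pipeline, the modules in Figure 1 pass content from classification, through context collection and content determination, to presentation. The sketch below is a simplified illustration of that flow; the function names and dictionary keys are hypothetical rather than taken from the actual system.

    def classify_content(raw_items):
        # Content classification module: label each item (here simply by its sport).
        return [dict(item, category=item["sport"]) for item in raw_items]

    def collect_context(user):
        # Context collection module: gather preferences, location, and event progress.
        return {"preferences": user["preferences"],
                "location": user["location"],
                "event_progress": user["event_progress"]}

    def determine_content(items, context):
        # Content determination module: keep items matching the collected context.
        return [i for i in items if i["category"] in context["preferences"]]

    def present_content(items):
        # Content presentation module: order items for display on the device.
        return sorted(items, key=lambda i: i.get("start_time", 0))

    def personalisation_pipeline(raw_items, user):
        items = classify_content(raw_items)
        context = collect_context(user)
        return present_content(determine_content(items, context))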

2.2. User Experience. "User experience" is a broader concept than usability. As user experience affects the success of a product, studies of user experience should therefore be considered as an important part of the product development process [19]. The importance of user experience stems from mobile devices being personal objects, used by individuals with particular social and cultural norms, within an external context defined by their environment.

There is much interest in user experience from design, business, philosophy, anthropology, cognitive science, social science, and other disciplines. Among these, there have been some initial efforts to create theories of user experience. Rasmussen [20] argues that as society becomes more dynamic and integrated with technology, there is a need for a greater multidisciplinary approach in tackling human factors problems. Arhippainen and Tähti [21], in evaluating mobile application prototypes, describe five categories of influences on the user experience evoked through interaction with an application. These are user factors, social factors, cultural factors, context of use, and product (i.e., application) related factors. They also list specific attributes for each category, such as the age and emotional state of the user; habits and norms as cultural factors; the pressure of success and failure as social factors; time and place as context of use factors; and usability and size as product factors. Similarly, Hassenzahl and Tractinsky [22] define user experience as "a consequence of a user's internal state (predispositions, expectations, needs, motivation, mood, etc.), the characteristics of the designed system (e.g., complexity, purpose, usability, functionality, etc.), and the context (or the environment) within which the interaction occurs (e.g., organizational/social setting, meaningfulness of the activity, voluntariness of use, etc.)".

Despite the emerging importance of user experience, there are several barriers to using this concept as a key design objective, and hence to how it may be employed within either a field- or lab-based evaluation process. There is not yet a common definition of user experience because it is associated with a broad range of both "fuzzy" and dynamic concepts, for example, emotion, affect, experience, hedonic and aesthetic qualities [24]. There is also currently a lack of consensus regarding the constructs that comprise user experience and how they may be measured [25].

This research follows the approach taken by Arhippainen and Tähti [21] and Hassenzahl and Tractinsky [22] in considering user experience to comprise multiple components, namely user, social, usage context, cultural, and product. Consequently, multicomponent user experience was measured using 15 agree-disagree scales addressing the five components. This approach therefore considers user experience to be a formative construct that is measured in terms of its components [26].
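As an illustration of how a formative, multicomponent measure of this kind can be scored, the sketch below averages the 15 six-point items into five component scores and a simple overall index. The item-to-component grouping follows the Appendix; the function name and the use of an unweighted mean are assumptions for illustration, not the authors' scoring procedure.

    # Map each Likert item (numbered as in the Appendix) to its user experience component.
    COMPONENTS = {
        "user": [1, 2, 3],
        "social": [4, 5, 6],
        "usage_context": [7, 8, 9],
        "culture": [10, 11, 12],
        "product": [13, 14, 15],
    }

    def score_user_experience(responses):
        """responses maps item number (1-15) to a rating on the 1-6 scale."""
        scores = {name: sum(responses[i] for i in items) / len(items)
                  for name, items in COMPONENTS.items()}
        scores["overall"] = sum(scores.values()) / len(COMPONENTS)
        return scores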

3. Field-Based Evaluation of Content Personalisation

3.1. Aims. The first empirical study was a field experiment, the aims of which were to evaluate the impact of using a mobile device, and of personalising that device, for a spectator within a real sports environment. Field studies typically sacrifice some experimental control in order to maximise the ecological validity of the experiment.

3.2. Method

3.2.1. Setup. The study took place in a sports stadium during a competition involving local football clubs. This competition comprised fast-moving sporting action and a large gathering of spectators, most of whom were unfamiliar with each other. The user experience was therefore typical of that encountered during a large sports event. Information was broadcast to spectators over a public address system and shown on a large display screen in one corner of the stadium.

3.2.2. Experimental Scenarios. The successful use of scenarios takes into account the diversity of contexts encountered by spectators; these were derived from previous studies [27, 28]. Four scenarios were developed, including (1) checking the schedule of forthcoming matches and finding one of particular interest, (2) obtaining information on a particular player of interest, (3) reviewing the progress of the current match (dynamic information access), and (4) joining a "community" and participating in community-based activities in the stadium.

3.2.3. Prototypes. Prior to the experiments described in this paper, a total of seven field studies were undertaken with spectators at large sports events [27, 28]. These studies found that a large number of contextual factors influenced the design of service/information provision to a spectator, but that three had the greatest potential impact on the user experience. These were the sporting preferences of the spectator, their physical location in the stadium, and the event progress.

Following the previous studies, one paper prototype and two mobile prototypes (a personalised prototype and a nonpersonalised prototype) were developed that provided content to support the experimental scenarios above.

Both mobile prototypes were identical in terms of their functionality and visual design. The personalised prototype had the option to manually configure the presentation of information according to the key contextual factors, using the content filtering personalisation technique (a description of the technical development is outside the scope of this paper). With this prototype, users were asked to set their preferences relating to the sports types and athletes taking part from an extended tree menu structure (Figure 2). As a result, tailored event-based information (e.g., information on athletes) and event schedules were presented to the spectator (Figure 3). In contrast, the nonpersonalised mobile prototype did not require the user to set the personalisation attributes and, as a result, presented information and services applicable to a more general audience (Figure 4).

The personalised prototype also enabled users to assign themselves to virtual communities with common interests within the stadium, using a collaborative filtering technique based on their stated interests. This was via online chat and media sharing within groups defined by their personal preferences. The nonpersonalised mobile prototype enabled the same chat and media sharing, but within a larger group not differentiated according to personal interests.
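A minimal sketch of this kind of interest-based grouping is shown below. It is illustrative only: the prototype's collaborative filtering implementation is not described in this paper, and the greedy matching used here (spectators who share at least one declared interest join the same community) is an assumption.

    def assign_communities(spectators, min_shared=1):
        """spectators maps a user id to a set of declared interests."""
        communities = []   # each entry: {"interests": set, "members": list}
        for user, interests in spectators.items():
            for community in communities:
                if len(interests & community["interests"]) >= min_shared:
                    community["members"].append(user)
                    community["interests"] |= interests
                    break
            else:
                communities.append({"interests": set(interests), "members": [user]})
        return communities

    # Example: u1 and u2 share an interest in the 100 m and form one community; u3 forms another.
    groups = assign_communities({"u1": {"100m", "javelin"}, "u2": {"100m"}, "u3": {"football"}})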

In addition, a paper leaflet was prepared that was based on the information that a spectator would traditionally get during a real event from posters and programmes. It provided information on match schedules and players' profiles.

3.2.4. Participants. Eighteen participants were recruited by an external agency. Their ages ranged from 18 to 45 years (mean 28.5), with an equal gender split. A range of occupations were represented, including sales, journalist, engineer, teacher, secretary, and accountant; eight participants were university students. A recruitment criterion was that all participants should have undertaken personalisation of their ringtone, screen background, or shortcut keys at least once a week. In addition, all participants had watched a large sports event in an open stadium within the last six months.

3.2.5. User Experience Measurement. The key dependent variable was the user experience that resulted from using the prototypes. User experience was measured in terms of the multidimensional components described in Section 2. User experience was therefore rated by participants in a multicomponent assessment that was theoretically grounded and empirically derived (see the Appendix).

3.2.6. Procedure. A pilot study was used to check the timings, refine the data collection methods, and resolve any ambiguities with the instructions and data collection tools. This was carried out in a stadium.

At the beginning of the study, participants were given instruction on how to use the mobile prototype. Participants then undertook the scenario-based tasks using either the paper-based programme or one of the mobile prototypes. They then completed the 15-item questionnaire that differentiated the five components of user experience.

Figure 2: Setting personalisation preferences.

Figure 3: Obtaining personalised information on particular athletes.

Finally, they were interviewed in a semistructured format to discuss the experiment design in the field study. The study lasted around 60 minutes for each user.

3.2.7. Data Collection. A multiplicity of data collection methods was used within the study to enable a limited triangulation of subjective rating, verbal report, and observational data. A video camera was used to record participants' interactions with the mobile prototypes. Users were encouraged (but not required) to "think aloud" during the trial. A user assessment in relation to overall user experience was captured using six-point agree/disagree rating scales. Direct observation was also used, and informal interaction with the researcher was encouraged during the trial. A posttrial structured interview was video recorded.

Figure 4: Obtaining general information on all athletes.

3.2.8. Analysis of Data. For quantitative data, Friedman nonparametric tests for three related samples were calculated for the main within-subjects factors. Multiple paired comparisons were undertaken using the technique described in Siegel and Castellan [23, page 180] to take into account the increased likelihood of a type I error with multiple comparisons. The qualitative data (interview transcripts and concurrent verbal reports) and observational data were analysed using an affinity diagram technique [29] to collate and categorise this data.
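For readers who wish to reproduce this style of analysis, the sketch below shows a generic version of it in Python using SciPy; it is not the authors' original analysis script, and the example ratings are invented. It runs the Friedman test over the three related samples and then compares rank-sum differences against the Siegel and Castellan critical difference (ties are ignored for simplicity).

    import numpy as np
    from scipy.stats import friedmanchisquare, norm

    # One row per participant, one column per condition
    # (personalised, nonpersonalised, paper); values invented for illustration.
    ratings = np.array([[6, 4, 2]] * 9 + [[5, 4, 3]] * 9)   # N = 18 participants

    chi2, p = friedmanchisquare(ratings[:, 0], ratings[:, 1], ratings[:, 2])

    # Rank each participant's three ratings (1 = lowest) and sum ranks per condition.
    ranks = np.apply_along_axis(lambda row: row.argsort().argsort() + 1, 1, ratings)
    rank_sums = ranks.sum(axis=0)

    # Critical difference for k = 3 conditions and N = 18 participants at alpha = .05,
    # shared across the pairwise comparisons (Siegel and Castellan [23, page 180]).
    N, k, alpha = 18, 3, 0.05
    z = norm.ppf(1 - alpha / (k * (k - 1)))
    critical = z * np.sqrt(N * k * (k + 1) / 6)   # about 14.36 for these values

    for a, b in [(0, 1), (0, 2), (1, 2)]:
        diff = abs(rank_sums[a] - rank_sums[b])
        print(a, b, diff, "significant" if diff > critical else "not significant")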

3.3. Field Study Results. The field experiments generated large amounts of rich and grounded data in a relatively short time. For all tasks, the personalised mobile device consistently generated the highest user experience rating (see Table 1). The nonpersonalised device was consistently worse, but still an improvement over the control condition (the paper-based programme), which was consistently rated poor, a limitation also noted by Nilsson et al. [30] in their field observations of sports events.

To make sure that the user interface itself (rather than the personalization approach) was not unduly influencing the experiment outcome, the user-initiated interface was evaluated by calculating the percentage of tasks completed by participants and analysing user comments.


Table 1: The overall user experience assessment for each task. For each task the Friedman test (3 related samples) is given, followed by the multiple paired comparisons [23, page 180] (α = .05, N = 18).

Checking match schedules: χ²(2) = 31.3, P < .001
    |R_Personalized − R_Nonpersonalized| = 21.5 > Z = 14.36*
    |R_Personalized − R_paper| = 29.5 > Z = 14.36*
    |R_Nonpersonalized − R_paper| = 8 < Z = 14.36

Obtaining player information: χ²(2) = 34, P < .001
    |R_Personalized − R_Nonpersonalized| = 17 > Z = 14.36*
    |R_Personalized − R_paper| = 34 > Z = 14.36*
    |R_Nonpersonalized − R_paper| = 17 > Z = 14.36*

Reviewing match progress: χ²(2) = 30.6, P < .001
    |R_Personalized − R_Nonpersonalized| = 16.5 > Z = 14.36*
    |R_Personalized − R_paper| = 33 > Z = 14.36*
    |R_Nonpersonalized − R_paper| = 16.5 > Z = 14.36*

Building a community: χ²(2) = 32.1, P < .001
    |R_Personalized − R_Nonpersonalized| = 17 > Z = 14.36*
    |R_Personalized − R_paper| = 34 > Z = 14.36*
    |R_Nonpersonalized − R_paper| = 17 > Z = 14.36*

* Indicates a significant difference.
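For reference, the critical value Z = 14.36 that appears throughout Tables 1 and 2 is consistent with the Siegel and Castellan [23] procedure for comparing Friedman rank sums, where each tabulated difference is the absolute difference between two conditions' rank sums. With k = 3 conditions, N = 18 participants, and α = .05 shared across the k(k − 1) comparison bounds, the critical difference works out as follows (a reconstruction consistent with the tabulated values, not a formula quoted from the paper):

    Z = z_{\alpha/[k(k-1)]} \sqrt{N k (k+1) / 6}
      = z_{0.05/6} \sqrt{(18)(3)(4)/6}
      \approx 2.394 \times 6 \approx 14.36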

The experiment recorded approximately 18 hours of video, capturing the 18 subjects' interaction steps while completing each task. In summary, 95.5% of the tasks were completed successfully. For the 4.5% of unfinished tasks, 35 usability problems with the personalized mobile prototype were reported.

Participants in the field experiment stressed problems of mobile "use" rather than simply application "usability", and typically those problems were expressed in the language of the situation. For example, users were concerned about spending too much time personalizing the application during the event (detracting from the event itself) and about the font on the interface being too small to read in an open stadium under bright sunlight. The field study also identified issues of validity and precision of the data presented by the application. For example, users were concerned about the reliability of information provided by the prototypes after they found that some player information presented on the mobile device did not match the real events.

To the participants, the field experiment environment felt fairly informal, and the users talked freely about the use of the application and their feelings. Users expressed how the field experiment allowed them to feel relaxed and able to communicate with the researcher as they undertook the evaluation scenarios. Rather than focusing on interface issues, users generally expressed broader views and were able to give a wide range of evaluation-related information during the experiment, such as expressing contextually related requirements. A particular example of the kind of data generated during the field study was the need for the mobile content (and its delivery) to be highly integrated into the temporal flow of the sporting action.

Using a field experiment approach, it may be possible to obtain a higher level of "realism". However, this evaluation method is not easy to undertake. Experiments in the field are influenced by external factors such as the weather, and moreover it is more difficult to actually collect data from participants. Users were impacted by events happening in the field, such as noise and other disturbances. For example, some participants actually forgot that they were taking part in a research study: until prompted, they focused their attention on the competition happening in the stadium and were substantially distracted from the field experiment. Flexibility and pragmatism are needed to still collect useful data while not detracting unduly from the experience generated within the field setting.

4. Lab-Based Evaluation of Personalisation Approach

4.1. Aims. Whereas the previous section described a field-based study, this section describes a very similar lab study, this time centred around a multievent athletics meeting. The specific objectives of this study were to use a more controlled experimental setup to compare the user experience for a spectator at a large sports event under three conditions: (1) using paper-based (not mobile) content, (2) using a mobile prototype where personalisation parameters were set by the user, and (3) using a similar prototype where parameters were set automatically. Similar procedures and data collection methods enable a comparison of findings with the field-based study described in Section 3.

4.2. Method

4.2.1. Setup. This evaluation took place in a usability laboratory in the UK. The usability lab was set up to resemble a part of the sports stadium. To mimic the divided attention that would result from a spectator watching a sporting event, sports footage, including auditory output, was projected onto the front wall of the laboratory. A crowd scene was replicated on each of the two side walls. In contrast to typical mobile applications, such as tourist guides [31], a spectator is usually seated and relatively static at a sporting event. Therefore, mobility per se did not need to be incorporated into the experimental environment. A video camera was used to record users' interaction with the prototypes (Figure 5).


Figure 5: User study in a usability lab.

4.2.2. Experimental Scenarios. Five scenarios were developed based on the same key spectator activities as used in the previous field-based experiment. Four of the experimental tasks were the same as those employed in the field experiment. This lab-based study also employed a fifth task, which required the participant to select a suitable viewing angle for a mobile broadcast on the device. This enabled the participants to follow the sporting action from wherever they were in the stadium.

4.2.3. Prototypes. Two personalised mobile prototypes were developed using the content filtering and collaborative filtering personalisation techniques. They shared the same look and feel as those used in the field study described in Section 3.

One prototype enabled user-initiated personalisation of content using an extended menu structure, as before (see Figure 2). As a result, this prototype presented event information, such as athlete details and event schedules, based on the users' settings.

In contrast, the prototype with system-initiated personalisation did not require that participants set personalisation parameters. Personalised content was then presented automatically to the participant using the content and collaborative filtering personalisation techniques.

As for the field experiment, a control condition was included that was representative of a nondigital event programme that a spectator would typically have. This was a paper leaflet that provided information on competition times and athlete information.

4.2.4. Participants. A different cohort of eighteen participants took part in the study, again split equally male-female, aged between 18 and 38, and with various occupations. As in the previous field-based study, all participants had regular experience of personalising mobile devices and had attended a large sports event within the last six months.

4.2.5. Procedure and Analysis of Data. A pilot study was used to maximise the realism of the simulation and ensure that the data collection methods that were used during the field trial were effective within a laboratory setting. Two key changes were made: speakers were rearranged to broadcast audience noise, and the side projection of a video of the audience was replaced with large-scale posters. This enhanced the perceived social atmosphere within the laboratory without unnecessarily distracting the participant from the main projected view (the sporting action).

The procedure followed that used for the field experiment described in Section 3. Each task was completed in turn, with the three personalisation conditions counterbalanced across participants within each task. The design of the lab experiment was also discussed with participants at the end of the study. The study lasted approximately one hour for each participant.

As for the field study, the data comprised quantitative rating scale data, concurrent verbal reports, posttrial interview data, and observational data. This was analysed as described previously.

4.3. Lab Study Results. Across all tasks there were clear advantages to having personalised content delivered over a mobile device. The control condition (representing that present at current stadium environments) was significantly worse in all scenarios (see Table 2). Table 2 also indicates that, in terms of the impact on user experience, neither user- nor system-initiated personalisation emerged as a single best approach across the range of tasks studied.

The experiment recorded approximately 18 hours of video during the lab study. In summary, 92% of the tasks were completed successfully. For the 8% of unfinished tasks, 42 usability problems with the personalized mobile prototype were reported. In general, participants considered the user interface of both personalized mobile prototypes easy to use.

The laboratory study was relatively easy to undertake and collect data from. During the lab study, participants quickly revealed considerable information about how a spectator uses the prototypes. The lab environment also offered more control over the conditions of the experiment. In comparison to the field study, participants were more focused on the experiment being undertaken and were not influenced by external factors such as weather, noise, or other disturbances from the sporting environment.

There were some drawbacks to the lab experiment, including only a limited representation of the real world and greater uncertainty over the degree of generalization of results outside laboratory settings. This laboratory study tried to "bring" the large sporting event into the experiment by carefully setting up the lab to resemble a stadium, designing scenarios based on previous studies of context within large sports events, and involving users who were familiar with the usage context. It also addressed the issue of users' divided attention by requiring subjects to watch a sport event video, which was projected on the front wall of the lab room, while performing the scenario-based tasks with the mobile prototypes. As a result, participants were able to identify some context-related problems (e.g., the font was too small to read in an open stadium) during the lab experiment. In addition, participants expressed their concerns with using the personalized prototypes from contextual and social perspectives, including concerns over spending too much time personalizing the device during the event and therefore actually missing some of the sporting action.


Table 2: The overall user experience assessment for each task. For each task the Friedman test (3 related samples) is given, followed by the multiple paired comparisons [23, page 180] (α = .05, N = 18).

Follow sporting action: χ²(2) = 27.5, P < .001
    |R_system-initiated − R_user-initiated| = 15 > Z = 14.36*
    |R_system-initiated − R_paper| = 49 > Z = 14.36*
    |R_user-initiated − R_paper| = 34 > Z = 14.36*

Obtaining athlete information: χ²(2) = 29.4, P < .001
    |R_system-initiated − R_user-initiated| = 2 < Z = 14.36
    |R_system-initiated − R_paper| = 16 > Z = 14.36*
    |R_user-initiated − R_paper| = 18 > Z = 14.36*

Reviewing athletics event results: χ²(2) = 35.5, P < .001
    |R_system-initiated − R_user-initiated| = 17.5 > Z = 14.36*
    |R_system-initiated − R_paper| = 35 > Z = 14.36*
    |R_user-initiated − R_paper| = 17.5 > Z = 14.36*

Building a community: χ²(2) = 29.6, P < .001
    |R_system-initiated − R_user-initiated| = 6 < Z = 14.36
    |R_system-initiated − R_paper| = 24 > Z = 14.36*
    |R_user-initiated − R_paper| = 30 > Z = 14.36*

Checking event schedules: χ²(2) = 34.5, P < .001
    |R_system-initiated − R_user-initiated| = 16.5 > Z = 14.36*
    |R_system-initiated − R_paper| = 34.5 > Z = 14.36*
    |R_user-initiated − R_paper| = 18 > Z = 14.36*

* Indicates a significant result.

The lab experiment did not allow users to feel relaxed during the experimental procedure. Participants acted more politely during the study, and they pointed out that they were uncomfortable about expressing negative feelings about the applications. In one example, when interviewing a participant about aspects of his user experience, he generally stated that it was fine. However, when presented with the Emotion Cards (a group of cartoon faces that were used to help promote discussion of affective aspects of interaction), he tended to pick up one emotion card and talked a lot about concerns over the time and effort required to manually personalize the application, without feeling that he was being overly critical.

5. Discussion and Conclusion

In recent years there has been much debate on whether mobile applications should be evaluated in the field or in a traditional lab environment, with issues including users' behaviour [11], identification of usability problems [3], the experiment settings [5, 8, 32], and the communication with participants [33]. This research enabled a comparison between field and laboratory experiments based on similar users, mobile applications, and task-based scenarios. In particular, a similar user-initiated prototype was used in each, even though the field experiment took place at a football competition and the lab experiment recreated an athletics meeting.

The number of usability problems identified during both the field and lab experiments was similar (over all participants, when using the user-initiated prototype, there were 35 usability problems identified in the field setting and 42 found in the lab setting). These findings are consistent with those of Kjeldskov et al. [3], who specifically compared lab- and field-based usability results and found that the difference in effectiveness of these two approaches was nonsignificant in identifying most usability problems. Also, some context-related problems, such as the font being too small to read in an open stadium, were identified in both experiment settings. However, some key differences in the effectiveness of the field and laboratory approaches were found: the lab experiment identified problems related to the detail of the interface design, for example, the colours and icons on the interface; the field experiment identified issues of validity and precision of the data presented by the application when using the application in a stadium. The field experiment also stressed the problems of mobile "use" rather than simply application usability, and typically these problems were expressed in the language of the situation [11].

An analysis of positive versus negative behaviours [11] was undertaken. This data included verbal reports and rating scale data according to the user experience definitions. Accepting the limitations of a direct comparison, participants reacted more negatively in the laboratory setting when completing similar tasks (using the similar user-initiated personalisation approach). In the field, individuals were influenced by the atmosphere surrounding the sports event, and this resulted in an enhanced user experience. In addition, they focused more attention on the actual usage of personalisation on the mobile device instead of issues to do with the interface. The lab setting was less engaging than the field setting; participants were more likely to be critical, and in general they took longer to perform certain tasks by focusing (and commenting) on interface issues such as the fonts and colours used.

The field experiment was more difficult to conduct than the lab experiment, a point noted by many authors including Kjeldskov et al. [3] and Baillie [8]. Confounding factors were present, for example, variations in the weather and noise from other spectators. In addition, although it was desirable that participants engaged in the sporting action, spectators' foci of attention could not be controlled and were difficult to predict. Some participants were "distracted" from the experimental tasks, and this did not occur during the laboratory setting. The greater control possible with a laboratory study (as discussed by a range of authors, including [2, 5, 8]) was clearly demonstrated during these studies.

Where there is an interest in qualitative data, good communication between the researcher and participants is vital. The field experiment provided a more open and relaxed atmosphere for discourse. Users more freely discussed their use of the mobile applications and the underlying beliefs and attitudes that arose during the study. The field experiment helped ease communication tensions with the participants, as they felt they were not being directly examined. As well as generally promoting the generation of qualitative data, the field experiments encouraged the expression of broader as well as more contextually relevant views. An example is the identification of contextually dependent requirements, which occurred much less frequently during the lab-based studies.

Some suggestions for user impact assessment with mobile devices can be made based on the findings of this study. A lab experiment is recommended when the focus is on the user interface and device-oriented usability issues. In such cases, a well-designed lab study should provide the validity required while being easier, quicker, and cheaper to conduct. However, the results suggest that a field experiment is better suited for investigating a wider range of factors affecting the overall acceptability of mobile services, including system functions and the impact of usage contexts. Where open and relaxed communication is important (e.g., where participant groups are naturally reticent to communicate), this is more readily promoted by the use of a field study.

The natural tension between a deductive and an inductive research design was also apparent. This research in general was essentially deductive, since it set out to explain causal relationships between variables, operationalized concepts, controlled variables, and used structured and repeatable methods to collect data. However, the field study in particular also comprised an inductive element, as there was a desire to understand the research context and the meanings attached to events in order to help design the lab study. Van Elzakker et al. [2] underline how it is often desirable to combine approaches within the same study. The undertaking of a field experiment followed by a lab experiment was an attempt at a multiplicity of methods from an essentially deductive viewpoint. This recognised that the natural research process is often that of moving from a process of understanding to one of testing, whilst attempting to avoid the unsatisfactory middle ground of user evaluations that are divorced from any underlying research objectives.

Appendix

Likert Items Used to Assess User Experience

In all cases, participant responses were based on a six-point scale ranging from 1 (strongly disagree) to 6 (strongly agree).

User Aspect

(1) I feel happy using [ABC] during the event

(2) My expectations regarding my spectator experience in the stadium are met using [ABC]

(3) My needs as a spectator are taken into account using[ABC]

Social Aspect

(4) Using [ABC] helps me feel I am communicating and sharing information with others in the stadium

(5) The [ABC] helps me create enjoyable experiences within the stadium

(6) The [ABC] helps me share my experiences with others within the stadium

Usage Context Aspect

(7) The [ABC] provides me with help in the stadium while watching the sporting action

(8) The [ABC] provides me with information about other spectators in the stadium

(9) The [ABC] helps provide me with a good physical and social environment in the stadium

Culture Aspect

(10) The [ABC] helps me feel part of a group

(11) The [ABC] helps me promote my group image

(12) The [ABC] helps me interact with my group

Product Aspect

(13) The [ABC] is useful at the event

(14) The [ABC] is easy to learn how to use

(15) The [ABC] is easy to use

Acknowledgments

This work has been carried out as part of the Philips Research Programme on Lifestyle. The authors would like to thank all the participants, who were generous with their time during this study.

References

[1] D. Raptis, N. Tselios, and N. Avouris, "Context-based design of mobile applications for museums: a survey of existing practices," in Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices & Services, 2005.

[2] C. P. J. M. Van Elzakker, I. Delikostidis, and P. J. M. Van Oosterom, "Field-based usability evaluation methodology for mobile geo-applications," Cartographic Journal, vol. 45, no. 2, pp. 139–149, 2008.

[3] J. Kjeldskov, M. B. Skov, B. S. Als, and R. T. Høegh, "Is it worth the hassle? Exploring the added value of evaluating the usability of context-aware mobile systems in the field," in Proceedings of the 6th International Mobile Conference (HCI '04), Springer, 2004.

[4] X. Sun, D. Golightly, J. Cranwell, B. Bedwell, and S. Sharples, "Participant experiences of mobile device-based diary studies," International Journal of Mobile Human Computer Interaction, in press.

[5] J. Kjeldskov and J. Stage, "New techniques for usability evaluation of mobile systems," International Journal of Human-Computer Studies, vol. 60, no. 5-6, pp. 599–620, 2004.

[6] P. Christensen, M. Romero, M. Thomas, A. S. Nielsen, and H. Harder, "Children, mobility, and space: using GPS and mobile phone technologies in ethnographic research," Journal of Mixed Methods Research, vol. 5, no. 3, pp. 227–246, 2011.

[7] E. G. Coleman, "Ethnographic approaches to digital media," Annual Review of Anthropology, vol. 39, pp. 487–505, 2010.

[8] L. Baillie, "Future telecommunication: exploring actual use," in Proceedings of the International Conference on Human-Computer Interaction, IOS Press, 2003.

[9] M. Esbjörnsson, B. Brown, O. Juhlin, D. Normark, M. Östergren, and E. Laurier, "Watching the cars go round and round: designing for active spectating," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '06), pp. 1221–1224, New York, NY, USA, April 2006.

[10] X. Sun, "User requirements of personalized mobile applications at large sporting events," in Proceedings of the IADIS Multi Conference on Computer Science and Information Systems (MCCSIS '10), Freiburg, Germany, 2010.

[11] H. B. L. Duh, G. C. B. Tan, and V. H. H. Chen, "Usability evaluation for mobile device: a comparison of laboratory and field tests," in Proceedings of the 8th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '06), pp. 181–186, September 2006.

[12] A. Pirhonen, S. Brewster, and C. Holguin, "Gestural and audio metaphors as a means of control for mobile devices," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '02), pp. 291–298, New York, NY, USA, April 2002.

[13] R. Graham and C. Carter, "Comparison of speech input and manual control of in-car devices while on-the-move," in Proceedings of the 2nd Workshop on Human Computer Interaction with Mobile Devices (HCI 1999), Edinburgh, UK, 1999.

[14] J. Lai, K. Cheng, P. Green, and O. Tsimhoni, "On the road and on the web: comprehension of synthetic and human speech while driving," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '01), pp. 206–212, New York, NY, USA, April 2001.

[15] B. Mobasher, "Data mining for personalization," in The Adaptive Web: Methods and Strategies of Web Personalization, P. Brusilovsky, A. Kobsa, and W. Nejdl, Eds., pp. 1–46, Springer, Berlin, Germany, 2007.

[16] D. Wu, I. Im, M. Tremaine, K. Instone, and M. Turoff, "A framework for classifying personalization schemes used on e-commerce websites," in Proceedings of the 36th Hawaii International Conference on Systems Sciences (HICSS '03), p. 222b, Maui, Hawaii, USA, 2003.

[17] L. Norros, E. Kaasinen, J. Plomp, and P. Rama, Human-Technology Interaction Research and Design: VTT Roadmap, VTT Research Notes 2220, Espoo, Finland, 2003.

[18] E. Frias-Martinez, S. Y. Chen, and X. Liu, "Evaluation of a personalized digital library based on cognitive styles: adaptivity versus adaptability," International Journal of Information Management, vol. 29, no. 1, pp. 48–56, 2009.

[19] J. Dewey, Art as Experience, Perigee, New York, NY, USA, 1980.

[20] J. Rasmussen, "Human factors in a dynamic information society: where are we heading?" Ergonomics, vol. 43, no. 7, pp. 869–879, 2000.

[21] L. Arhippainen and M. Tähti, "Empirical evaluation of user experience in two adaptive mobile application prototypes," in Proceedings of the 2nd International Conference on Mobile and Ubiquitous Multimedia, Luleå, Sweden, 2003.

[22] M. Hassenzahl and N. Tractinsky, "User experience – a research agenda," Behaviour and Information Technology, vol. 25, no. 2, pp. 91–97, 2006.

[23] S. Siegel and N. J. Castellan, Nonparametric Statistics for the Behavioral Sciences, McGraw-Hill, New York, NY, USA, 1988.

[24] E. Law, V. Roto, A. P. O. S. Vermeeren, J. Kort, and M. Hassenzahl, "Towards a shared definition of user experience," in Proceedings of the 28th Annual Conference on Human Factors in Computing Systems (CHI '08), pp. 2395–2398, Florence, Italy, April 2008.

[25] E. L. C. Law and P. van Schaik, "Modelling user experience – an agenda for research and practice," Interacting with Computers, vol. 22, no. 5, pp. 313–322, 2010.

[26] C. H. Lin, P. J. Sher, and H. Y. Shih, "Past progress and future directions in conceptualizing customer perceived value," International Journal of Service Industry Management, vol. 16, no. 4, pp. 318–336, 2005.

[27] X. Sun and A. May, "Mobile personalisation at large sports events – user experience and mobile device personalisation," in Human-Computer Interaction 2007, vol. 11, Springer, Berlin, Germany, 2007.

[28] X. Sun and A. May, "The role of spatial contextual factors in mobile personalization at large sports events," Personal and Ubiquitous Computing, vol. 13, no. 4, pp. 293–302, 2009.

[29] J. Hackos and J. Redish, User and Task Analysis for Interface Design, John Wiley & Sons, New York, NY, USA, 1998.

[30] A. Nilsson, U. Nulden, and D. Olsson, "Spectator information support: exploring the context of distributed events," in Proceedings of the International ACM SIGGROUP Conference on Supporting Group Work, 2004.

[31] B. Oertel, K. Steinmüller, and M. Kuom, "Mobile multimedia services for tourism," in Information and Communication Technologies in Tourism 2002, K. W. Wöber, A. J. Frew, and M. Hitz, Eds., pp. 265–274, Springer, New York, NY, USA, 2002.

[32] D. D. Salvucci, "Predicting the effects of in-car interfaces on driver behavior using a cognitive architecture," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '01), pp. 120–127, New York, NY, USA, April 2001.

[33] A. Kaikkonen, A. Kekäläinen, M. Cankar, T. Kallio, and A. Kankainen, "Usability testing of mobile applications: a comparison between laboratory and field testing," Journal of Usability Studies, vol. 1, pp. 4–16, 2005.

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 2: Research Article A Comparison of Field-Based and …downloads.hindawi.com/journals/ahci/2013/619767.pdfbusiness, philosophy, anthropology, cognitive science, social science, and other

2 Advances in Human-Computer Interaction

laboratory settings through scenarios and context simula-tion can contribute to the realism of the experiment whilemaintaining the benefits of a controlled setting [3 5 10]The extent to which simulated scenarios represent a real-life situation is a critical determinant of the validity of theusability experiment [11] In addition for mobile devicestwo basic contextual factors which need to be consideredare mobility and divided attention To replicate real-worldmobility within a lab setting test participants have beenasked to use a treadmill or walk on a specifically definedtrack in a lab setting (eg [12]) To replicate divided atten-tion a range of measures have been used in the past Forexample to assess the impact of information provision ondrivers a range of simulators have been used from low-fidelity personal computer-based simulations [13] to high-fidelity simulators with large projection screens involving realdashboards [14] These simulators recreate divided attentionas well as enabling task performance measurement

This paper reports a combination of a field and lab-basedevaluation in order to assess the impact of personalisationon the user experience at sports events This mixed approachenabled a comparison of evaluation methodology and com-ments on their relative effectiveness for mobile users

2 Background

21 Mobile Personalisation Personalization techniques canbe classified into three different categories [15] rule-basedfiltering systems content-filtering systems and collaborativefiltering systems Some recent techniques used in collabo-rative filtering are based on data mining in order to inferpersonalisation rules or build personalisation models fromlarge data sets

The main aspects of personalization which are relevantwithin this paper are what is personalized and how thisis achieved This paper focuses on content personalization[16] that is the tailoring of information within a particularnode within the human-device navigation space This formof personalization is based on the key assumption that theoptimal content for an individual is dependent on contextualfactors relating to the individual the situation they are in andthe activities they are undertakingmdashthese factors can be usedas triggers for the adaptation of content for the individual [17]in order to enhance their user experience

The personalization framework in this research containsfour modules which (1) cooperate to perform the functionsof classification of information (2) collect relevant contextualfactors and (3) personalize content accordingly (Figure 1)

The overall context of this research (which enabled thefield versus laboratory comparison described here) was theinvestigation of the benefits of personalization of mobile-based contentHowever the research project also investigatedthe benefitsdrawbacks of either the user or the system per-forming personalization of content Some research favoursuser-initiated personalization and its focus on the naturalintelligence of the user while others found that system-initiated approachesweremore effective for dynamic contexts[18]

Figure 1: Modules forming a mobile personalization application. The mobile personalisation system comprises a content classification module, a context collection module, a content determination module, and a content presentation module, linking the mobile service provider, the user interface, and the user and their context.

2.2. User Experience. "User experience" is a broader concept than usability. As user experience affects the success of a product, studies of user experience should therefore be considered as an important part of the product development process [19]. The importance of user experience stems from mobile devices being personal objects, used by individuals with particular social and cultural norms, within an external context defined by their environment.

There is much interest in user experience from design, business, philosophy, anthropology, cognitive science, social science, and other disciplines. Among these, there have been some initial efforts to create theories of user experience. Rasmussen [20] argues that as society becomes more dynamic and integrated with technology, there is a need for a greater multidisciplinary approach in tackling human factors problems. Arhippainen and Tähti [21], in evaluating mobile application prototypes, describe five categories of influences on the user experience evoked through interaction with an application. These are user factors, social factors, cultural factors, context of use, and product (i.e., application) related factors. They also list specific attributes for each category, such as the age and emotional state of the user; habits and norms as cultural factors; the pressure of success and failure as social factors; time and place as context of use factors; and usability and size as product factors. Similarly, Hassenzahl and Tractinsky [22] define user experience as "a consequence of a user's internal state (predispositions, expectations, needs, motivation, mood, etc.), the characteristics of the designed system (e.g., complexity, purpose, usability, functionality, etc.), and the context (or the environment) within which the interaction occurs (e.g., organizational/social setting, meaningfulness of the activity, voluntariness of use, etc.)".

Despite the emerging importance of user experience, there are several barriers to using this concept as a key design objective, and hence to how it may be employed within either a field- or lab-based evaluation process. There is not yet a common definition of user experience, because it is associated with a broad range of both "fuzzy" and dynamic concepts, for example, emotion, affect, experience, hedonic quality, and aesthetics [24]. There is also currently a lack of consensus regarding the constructs that comprise user experience and how they may be measured [25].

This research follows the approach taken by Arhippainen and Tähti [21] and Hassenzahl and Tractinsky [22] in considering user experience to comprise multiple components, namely user, social, usage context, cultural, and product. Consequently, multicomponent user experience was measured using 15 agree-disagree scales addressing the five components. This approach therefore considers user experience to be a formative construct that is measured in terms of its components [26].
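Since the paper does not specify a scoring procedure, the following is purely an illustrative sketch of how scores for a formative construct of five components with three six-point items each (see the appendix) might be aggregated; the responses shown are invented.

```python
# Hypothetical responses of one participant to the 15 six-point
# agree/disagree items, grouped by the five user experience components.
components = ["user", "social", "usage context", "culture", "product"]
responses = {
    "user":          [5, 4, 5],
    "social":        [4, 4, 3],
    "usage context": [5, 5, 4],
    "culture":       [3, 4, 4],
    "product":       [6, 5, 5],
}

# Component score = mean of its three items; an overall formative index
# can then be reported alongside (not instead of) the component scores.
component_scores = {c: sum(v) / len(v) for c, v in responses.items()}
overall = sum(component_scores.values()) / len(component_scores)

for c in components:
    print(f"{c:>14}: {component_scores[c]:.2f}")
print(f"{'overall':>14}: {overall:.2f}")
```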

3. Field-Based Evaluation of Content Personalisation

3.1. Aims. The first empirical study was a field experiment, the aims of which were to evaluate the impact of using a mobile device, and of personalising that device, for a spectator within a real sports environment. Field studies typically sacrifice some experimental control in order to maximise the ecological validity of the experiment.

3.2. Method

3.2.1. Setup. The study took place in a sports stadium during a competition involving local football clubs. This competition comprised fast-moving sporting action and a large gathering of spectators, most of whom were unfamiliar with each other. The user experience was therefore typical of that encountered during a large sports event. Information was broadcast to spectators over a public address system and shown on a large display screen in one corner of the stadium.

3.2.2. Experimental Scenarios. The successful use of scenarios takes into account the diversity of contexts encountered by spectators; these were derived from previous studies [27, 28]. Four scenarios were developed, including: (1) checking the schedule of forthcoming matches and finding one of particular interest; (2) obtaining information on a particular player of interest; (3) reviewing the progress of the current match (dynamic information access); (4) joining a "community" and participating in community-based activities in the stadium.

3.2.3. Prototypes. Prior to the experiments described in this paper, a total of seven field studies were undertaken with spectators at large sports events [27, 28]. These studies found that a large number of contextual factors influenced the design of service/information provision to a spectator, but that three had the greatest potential impact on the user experience. These were the sporting preferences of the spectator, their physical location in the stadium, and the event progress.

Following the previous studies, one paper prototype and two mobile prototypes (a personalised prototype and a nonpersonalised prototype) were developed that provided content to support the experimental scenarios above.

Both mobile prototypes were identical in terms of their functionality and visual design. The personalised prototype had the option to manually configure the presentation of information according to the key contextual factors, using the content filtering personalisation technique (a description of the technical development is out of scope of this paper). With this prototype, users were asked to set their preferences relating to the sports types and athletes taking part from an extended tree menu structure (Figure 2). As a result, tailored event-based information (e.g., information on athletes) and event schedules were presented to the spectator (Figure 3). In contrast, the nonpersonalised mobile prototype did not require the user to set the personalisation attributes and, as a result, presented information and services applicable for a more general audience (Figure 4).

The personalised prototype also enabled users to assign themselves to virtual communities with common interests within the stadium, using a collaborative filtering technique based on their stated interests. This was via online chat and media sharing within groups defined by their personal preferences. The nonpersonalised mobile prototype enabled the same chat and media sharing, but within a larger group not differentiated according to personal interests.
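How the community assignment was implemented is not described in the paper; the sketch below shows one simple way spectators could be grouped into interest-based communities, here by Jaccard similarity of their stated interests. The names, interest sets, and threshold are hypothetical, and the approach is only one possible reading of "groups defined by their personal preferences".

```python
def jaccard(a, b):
    """Similarity between two interest sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def assign_communities(interests, threshold=0.5):
    """Greedily place each spectator into the first community whose
    seed member has sufficiently similar stated interests."""
    communities = []  # list of (seed_interests, [member names])
    for name, prefs in interests.items():
        for seed, members in communities:
            if jaccard(prefs, seed) >= threshold:
                members.append(name)
                break
        else:
            communities.append((prefs, [name]))
    return [members for _, members in communities]

spectators = {
    "anna":  {"sprints", "relay"},
    "ben":   {"sprints", "relay", "long jump"},
    "chris": {"football"},
}
print(assign_communities(spectators))  # [['anna', 'ben'], ['chris']]
```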

In addition, a paper leaflet was prepared that was based on the information that a spectator would traditionally get during a real event from posters and programmes. It provided information on match schedules and players' profiles.

3.2.4. Participants. Eighteen participants were recruited by an external agency. Their ages ranged from 18 to 45 years (mean 28.5), with an equal gender split. A range of occupations were represented, including sales, journalist, engineer, teacher, secretary, and accountant; eight participants were university students. A recruitment criterion was that all participants should have undertaken personalisation of their ringtone, screen background, or shortcut keys at least once a week. In addition, all participants had watched a large sports event in an open stadium within the last six months.

3.2.5. User Experience Measurement. The key dependent variable was the user experience that resulted from using the prototypes. User experience was measured in terms of the multidimensional components described in Section 2. User experience was therefore rated by participants in a multicomponent assessment that was theoretically grounded and empirically derived (see the appendix).

3.2.6. Procedure. A pilot study was used to check the timings, refine the data collection methods, and resolve any ambiguities with the instructions and data collection tools. This was carried out in a stadium.

At the beginning of the study, participants were given instruction on how to use the mobile prototype. Participants then undertook the scenario-based tasks using either the paper-based programme or one of the mobile prototypes. They then completed the 15-item questionnaire that differentiated the five components of user experience.

Figure 2: Setting personalisation preferences.

Figure 3: Obtaining personalised information on particular athletes.

Finally, they were interviewed in a semistructured format to discuss the experiment design in the field study. The study lasted around 60 minutes for each user.

3.2.7. Data Collection. A multiplicity of data collection methods was used within the study to enable a limited triangulation of subjective rating, verbal report, and observational data. A video camera was used to record participants' interactions with the mobile prototypes. Users were encouraged (but not required) to "think aloud" during the trial. A user assessment in relation to overall user experience was captured using six-point agree/disagree rating scales. Direct observation was also used, and informal interaction with the researcher was encouraged during the trial. A posttrial structured interview was video recorded.

Figure 4: Obtaining general information on all athletes.

3.2.8. Analysis of Data. For quantitative data, Friedman nonparametric tests for three related samples were calculated for the main within-subjects factors. Multiple paired comparisons were undertaken using the technique described in Siegel and Castellan [23, page 180] to take into account the increased likelihood of a Type I error with multiple comparisons. The qualitative data (interview transcripts and concurrent verbal reports) and observational data were analyzed using an affinity diagram technique [29] to collate and categorise this data.
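By way of illustration only (the authors do not describe their analysis scripts, and the ratings below are synthetic), a Friedman test followed by Siegel and Castellan style comparisons of rank sums could be computed along the following lines; the condition labels mirror the three field-study conditions.

```python
import numpy as np
from scipy import stats

# Hypothetical 6-point user experience ratings for N = 18 participants
# under the three conditions (columns: personalised, nonpersonalised, paper).
rng = np.random.default_rng(0)
ratings = np.column_stack([
    rng.integers(4, 7, 18),   # personalised prototype
    rng.integers(3, 6, 18),   # nonpersonalised prototype
    rng.integers(1, 4, 18),   # paper programme
])

# Friedman test for three related samples.
chi2, p = stats.friedmanchisquare(*ratings.T)
print(f"Friedman chi2(2) = {chi2:.2f}, p = {p:.4f}")

# Multiple paired comparisons on rank sums (Siegel & Castellan, 1988):
# a pair differs significantly if |R_u - R_v| exceeds
# z_(alpha / k(k-1)) * sqrt(N * k * (k + 1) / 6).
N, k = ratings.shape
rank_sums = stats.rankdata(ratings, axis=1).sum(axis=0)
alpha = 0.05
z_crit = stats.norm.ppf(1 - alpha / (k * (k - 1)))
critical_diff = z_crit * np.sqrt(N * k * (k + 1) / 6)   # = 14.36 for N=18, k=3
labels = ["personalised", "nonpersonalised", "paper"]
for i in range(k):
    for j in range(i + 1, k):
        diff = abs(rank_sums[i] - rank_sums[j])
        sig = "significant" if diff > critical_diff else "n.s."
        print(f"{labels[i]} vs {labels[j]}: diff = {diff:.1f}, "
              f"critical = {critical_diff:.2f} ({sig})")
```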

3.3. Field Study Results. The field experiments generated large amounts of rich and grounded data in a relatively short time. For all tasks, the personalised mobile device consistently generated the highest user experience rating (see Table 1). The nonpersonalised device was consistently worse, but still an improvement over the control condition (paper-based programme), which was consistently rated poor, a limitation also noted by Nilsson et al. [30] in their field observations of sports events.

To make sure that the user interface itself (rather than the personalization approach) was not unduly influencing the experiment outcome, the user-initiated interface was evaluated by calculating the percentage of tasks completed by participants and analysing user comments.


Table 1: The overall user experience assessment for each task (Friedman tests for 3 related samples; multiple paired comparisons follow Siegel and Castellan [23, page 180], α = 0.05; N = 18 throughout).

Checking match schedules: χ²(2) = 31.3, P < 0.01.
  |R_Personalized − R_Nonpersonalized| = 21.5 > Z = 14.36*
  |R_Personalized − R_Paper| = 29.5 > Z = 14.36*
  |R_Nonpersonalized − R_Paper| = 8 < Z = 14.36

Obtaining player information: χ²(2) = 34, P < 0.01.
  |R_Personalized − R_Nonpersonalized| = 17 > Z = 14.36*
  |R_Personalized − R_Paper| = 34 > Z = 14.36*
  |R_Nonpersonalized − R_Paper| = 17 > Z = 14.36*

Reviewing match progress: χ²(2) = 30.6, P < 0.01.
  |R_Personalized − R_Nonpersonalized| = 16.5 > Z = 14.36*
  |R_Personalized − R_Paper| = 33 > Z = 14.36*
  |R_Nonpersonalized − R_Paper| = 16.5 > Z = 14.36*

Building a community: χ²(2) = 32.1, P < 0.01.
  |R_Personalized − R_Nonpersonalized| = 17 > Z = 14.36*
  |R_Personalized − R_Paper| = 34 > Z = 14.36*
  |R_Nonpersonalized − R_Paper| = 17 > Z = 14.36*

* Indicates a significant difference.
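For readers checking the tables, the critical value Z = 14.36 in Tables 1 and 2 is consistent with the standard Siegel and Castellan [23] comparison of rank sums after a Friedman test; the following is a plausible reconstruction (it is not stated explicitly in the paper). With N = 18 participants and k = 3 conditions,

$$
|R_u - R_v| > z_{\alpha/[k(k-1)]}\,\sqrt{\frac{N\,k\,(k+1)}{6}}
            = z_{0.05/6}\,\sqrt{\frac{18 \cdot 3 \cdot 4}{6}}
            \approx 2.394 \times 6 = 14.36 .
$$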

The experiment recorded approximately 18 hours of video, capturing the 18 subjects' interaction steps while completing each task. In summary, 95.5% of the tasks were completed successfully. For the 4.5% of unfinished tasks, 35 usability problems with the personalized mobile prototype were reported.

Participants in the field experiment stressed problems of mobile "use" rather than simply application "usability", and typically those problems were expressed in the language of the situation. For example, users were concerned about spending too much time personalizing the application during the event (detracting from the event itself) and about the font on the interface being too small to read in an open stadium under bright sunlight. The field study also identified issues of validity and precision of the data presented by the application. For example, users were concerned about the reliability of information provided by the prototypes after they found that some player information presented on the mobile device did not match the real events.

To the participants, the field experiment environment felt fairly informal, and the users talked freely about the use of the application and their feelings. Users expressed how the field experiment allowed them to feel relaxed and able to communicate with the researcher as they undertook the evaluation scenarios. Rather than focusing on interface issues, users generally expressed broader views and were able to give a wide range of evaluation-related information during the experiment, such as expressing contextually related requirements. A particular example of the kind of data generated during the field study was the need for the mobile content (and its delivery) to be highly integrated into the temporal flow of the sporting action.

Using a field experiment approach, it may be possible to obtain a higher level of "realism". However, this evaluation method is not easy to undertake. Experiments in the field are influenced by external factors such as the weather, and moreover it is more difficult to actually collect data from participants. Users were impacted by events happening in the field, such as noise and other disturbances. For example, some participants actually forgot that they were taking part in a research study; until prompted, they focused their attention on the competition happening in the stadium and were substantially distracted from the field experiment. Flexibility and pragmatism are needed to still collect useful data while not detracting unduly from the experience generated within the field setting.

4. Lab-Based Evaluation of Personalisation Approach

4.1. Aims. Whereas the previous section has described a field-based study, this section describes a very similar lab study, this time centred around a multievent athletics meeting. The specific objectives of this study were to use a more controlled experimental setup to compare the user experience for a spectator at a large sports event under three conditions: (1) using paper-based (not mobile) content; (2) using a mobile prototype where personalisation parameters were set by the user; (3) using a similar prototype where parameters were set automatically. Similar procedures and data collection methods enable a comparison of findings with the field-based study described in Section 3.

4.2. Method

4.2.1. Setup. This evaluation took place in a usability laboratory in the UK. The usability lab was set up to resemble a part of the sports stadium. To mimic the divided attention that would result from a spectator watching a sporting event, sports footage, including auditory output, was projected onto the front wall of the laboratory. A crowd scene was replicated on each of the two side walls. In contrast to typical mobile applications such as tourist guides [31], a spectator is usually seated and relatively static at a sporting event. Therefore, mobility per se did not need to be incorporated into the experimental environment. A video camera was used to record users' interaction with the prototypes (Figure 5).


Figure 5: User study in a usability lab.

4.2.2. Experimental Scenarios. Five scenarios were developed, based on the same key spectator activities as used in the previous field-based experiment. Four of the experimental tasks were the same as those employed in the field experiment. This lab-based study also employed a fifth task, which required the participant to select a suitable viewing angle for a mobile broadcast on the device. This enabled the participants to follow the sporting action from wherever they were in the stadium.

4.2.3. Prototypes. Two personalised mobile prototypes were developed, using the content filtering and collaborative filtering personalisation techniques. They shared the same look and feel as those used in the field study described in Section 3.

One prototype enabled user-initiated personalisation of content using an extended menu structure, as before (see Figure 2). As a result, this prototype presented event information, such as athlete details and event schedules, based on the users' settings.

In contrast, the prototype with system-initiated personalisation did not require that participants set personalisation parameters. Personalised content was then presented automatically to the participant, using the content and collaborative filtering personalisation techniques.

As for the field experiment, a control condition was included that was representative of a nondigital event programme that a spectator would typically have. This was a paper leaflet that provided information on competition times and athlete information.

4.2.4. Participants. A different cohort of eighteen participants took part in the study, again split equally male-female, aged between 18 and 38, and with various occupations. As in the previous field-based study, all participants had regular experience of personalising mobile devices and had attended a large sports event within the last six months.

4.2.5. Procedure and Analysis of Data. A pilot study was used to maximise the realism of the simulation and to ensure that the data collection methods used during the field trial were effective within a laboratory setting. Two key changes were made: speakers were rearranged to broadcast audience noise, and the side projection of a video of the audience was replaced with large-scale posters. This enhanced the perceived social atmosphere within the laboratory without unnecessarily distracting the participant from the main projected view (the sporting action).

The procedure followed that used for the field experiment described in Section 3. Each task was completed in turn, with the three personalisation conditions counterbalanced across participants within each task. The design of the lab experiment was also discussed with participants at the end of the study. The study lasted approximately one hour for each participant.
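The exact ordering scheme used is not reported; counterbalancing three conditions across eighteen participants can, for example, be done by cycling through the six possible condition orders, as in this purely illustrative sketch.

```python
from itertools import permutations

conditions = ["paper", "user-initiated", "system-initiated"]

# With only three conditions there are six possible orders; cycling them
# across 18 participants uses each order exactly three times.
orders = list(permutations(conditions))
schedule = [orders[p % len(orders)] for p in range(18)]

for participant, order in enumerate(schedule, start=1):
    print(f"P{participant:02d}: {' -> '.join(order)}")
```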

As for the field study, the data comprised quantitative rating scale data, concurrent verbal reports, posttrial interview data, and observational data. This was analysed as described previously.

4.3. Lab Study Results. Across all tasks, there were clear advantages to having personalised content delivered over a mobile device. The control condition (representing that present in current stadium environments) was significantly worse in all scenarios (see Table 2). Table 2 also indicates that, in terms of the impact on user experience, neither user- nor system-initiated personalisation emerged as a single best approach across the range of tasks studied.

The experiment recorded approximately 18 hours of video during the lab study. In summary, 92% of the tasks were completed successfully. For the 8% of unfinished tasks, 42 usability problems with the personalized mobile prototype were reported. In general, participants considered the user interface of both personalized mobile prototypes easy to use.

The laboratory study was relatively easy to undertake and to collect data from. During the lab study, participants quickly revealed considerable information about how a spectator uses the prototypes. The lab environment also offered more control over the conditions for the experiment. In comparison to the field study, participants were more focused on the experiment being undertaken and were not influenced by external factors such as weather, noise, or other disturbances from the sporting environment.

There were some drawbacks to the lab experiment, including only a limited representation of the real world and greater uncertainty over the degree of generalization of results outside laboratory settings. This laboratory study tried to "bring" the large sporting event into the experiment by carefully setting up the lab to resemble a stadium, designing scenarios based on previous studies of context within large sports events, and involving users who were familiar with the usage context. It also addressed the issue of users' divided attention by requiring subjects to watch a sports event video, projected on the front wall of the lab room, while performing the scenario-based tasks with the mobile prototypes. As a result, participants were able to identify some context-related problems (e.g., the font was too small to read in an open stadium) during the lab experiment. In addition, participants expressed their concerns with using the personalized prototypes from contextual and social perspectives, including concerns over spending too much time personalizing the device during the event and therefore actually missing some of the sporting action.


Table 2: The overall user experience assessment for each task (Friedman tests for 3 related samples; multiple paired comparisons follow Siegel and Castellan [23, page 180], α = 0.05; N = 18 throughout).

Follow sporting action: χ²(2) = 27.5, P < 0.01.
  |R_System-initiated − R_User-initiated| = 15 > Z = 14.36*
  |R_System-initiated − R_Paper| = 49 > Z = 14.36*
  |R_User-initiated − R_Paper| = 34 > Z = 14.36*

Obtaining athlete information: χ²(2) = 29.4, P < 0.01.
  |R_System-initiated − R_User-initiated| = 2 < Z = 14.36
  |R_System-initiated − R_Paper| = 16 > Z = 14.36*
  |R_User-initiated − R_Paper| = 18 > Z = 14.36*

Reviewing athletics event results: χ²(2) = 35.5, P < 0.01.
  |R_System-initiated − R_User-initiated| = 17.5 > Z = 14.36*
  |R_System-initiated − R_Paper| = 35 > Z = 14.36*
  |R_User-initiated − R_Paper| = 17.5 > Z = 14.36*

Building a community: χ²(2) = 29.6, P < 0.01.
  |R_System-initiated − R_User-initiated| = 6 < Z = 14.36
  |R_System-initiated − R_Paper| = 24 > Z = 14.36*
  |R_User-initiated − R_Paper| = 30 > Z = 14.36*

Checking event schedules: χ²(2) = 34.5, P < 0.01.
  |R_System-initiated − R_User-initiated| = 16.5 > Z = 14.36*
  |R_System-initiated − R_Paper| = 34.5 > Z = 14.36*
  |R_User-initiated − R_Paper| = 18 > Z = 14.36*

* Indicates a significant result.

The lab experiment did not allow users to feel relaxed during the experimental procedure. Participants acted more politely during the study, and they pointed out that they were uncomfortable about expressing negative feelings about the applications. In one example, when a participant was interviewed about aspects of his user experience, he generally stated that it was fine. However, when presented with the Emotion Cards (a group of cartoon faces that were used to help promote discussion of affective aspects of interaction), he tended to pick up one emotion card and talked a lot about concerns over the time and effort required to manually personalize the application, without feeling that he was being overly critical.

5. Discussion and Conclusion

In recent years, there has been much debate on whether mobile applications should be evaluated in the field or in a traditional lab environment, covering issues including users' behaviour [11], the identification of usability problems [3], the experimental settings [5, 8, 32], and communication with participants [33]. This research enabled a comparison between field and laboratory experiments based on similar users, mobile applications, and task-based scenarios. In particular, a similar user-initiated prototype was used in each, even though the field experiment took place at a football competition and the lab experiment recreated an athletics meeting.

The number of usability problems identified during both the field and lab experiments was similar (over all participants, when using the user-initiated prototype, there were 35 usability problems identified in the field setting and 42 found in the lab setting). These findings are consistent with those of Kjeldskov et al. [3]. They specifically compared lab- and field-based usability results and found that the difference in effectiveness of these two approaches was nonsignificant in identifying most usability problems. Also, some context-related problems, such as the font being too small to read in an open stadium, were identified in both experimental settings. However, some key differences in the effectiveness of the field and laboratory approaches were found: the lab experiment identified problems related to the detail of the interface design, for example, the colours and icons on the interface; the field experiment identified issues of validity and precision of the data presented by the application when using the application in a stadium. The field experiment also stressed the problems of mobile "use" rather than simply application usability, and typically these problems were expressed in the language of the situation [11].

An analysis of positive versus negative behaviours [11] was undertaken. This data included verbal reports and rating scale data according to the user experience definitions. Accepting the limitations of a direct comparison, participants reacted more negatively in the laboratory setting when completing similar tasks (using the similar user-initiated personalisation approach). In the field, individuals were influenced by the atmosphere surrounding the sports event, and this resulted in an enhanced user experience. In addition, they focused more attention on the actual usage of personalisation on the mobile device, instead of issues to do with the interface. The lab setting was less engaging than the field setting; participants were more likely to be critical, and in general they took longer to perform certain tasks by focusing (and commenting) on interface issues such as the fonts and colours used.

The field experiment was more difficult to conduct than the lab experiment, a point noted by many authors including Kjeldskov et al. [3] and Baillie [8]. Confounding factors were present, for example, variations in the weather and noise from other spectators. In addition, although it was desirable that participants engaged in the sporting action, spectators' foci of attention could not be controlled and were difficult to predict.


Some participants were "distracted" from the experimental tasks, and this did not occur in the laboratory setting. The greater control possible with a laboratory study (as discussed by a range of authors including [2, 5, 8]) was clearly demonstrated during these studies.

Where there is an interest in qualitative data, good communication between the researcher and participants is vital. The field experiment provided a more open and relaxed atmosphere for discourse. Users more freely discussed their use of the mobile applications and the underlying beliefs and attitudes that arose during the study. The field experiment helped to reduce communication tensions with the participants, as they felt they were not being directly examined. As well as generally promoting the generation of qualitative data, the field experiments encouraged the expression of broader as well as more contextually relevant views. An example is the identification of contextually dependent requirements, which occurred much less frequently during the lab-based studies.

Some suggestions for user impact assessment with mobile devices can be made based on the findings of this study. A lab experiment is recommended when the focus is on the user interface and device-oriented usability issues. In such cases, a well-designed lab study should provide the validity required, while being easier, quicker, and cheaper to conduct. However, the results suggest that a field experiment is better suited to investigating a wider range of factors affecting the overall acceptability of mobile services, including system functions and the impact of usage contexts. Where open and relaxed communication is important (e.g., where participant groups are naturally reticent to communicate), this is more readily promoted by the use of a field study.

The natural tension between a deductive and an inductive research design was also apparent. This research in general was essentially deductive, since it set out to explain causal relationships between variables, operationalized concepts, controlled variables, and used structured and repeatable methods to collect data. However, the field study in particular also comprised an inductive element, as there was a desire to understand the research context and the meanings attached to events in order to help design the lab study. Van Elzakker et al. [2] underline how it is often desirable to combine approaches within the same study. The undertaking of a field experiment followed by a lab experiment was an attempt at a multiplicity of methods from an essentially deductive viewpoint. This recognised that the natural research process is often that of moving from a process of understanding to one of testing, whilst attempting to avoid the unsatisfactory middle ground of user evaluations that are divorced from any underlying research objectives.

Appendix

Likert Items Used to Assess User Experience

In all cases, participant responses were based on a six-point scale ranging from 1 (strongly disagree) to 6 (strongly agree).

User Aspect

(1) I feel happy using [ABC] during the event.
(2) My expectations regarding my spectator experience in the stadium are met using [ABC].
(3) My needs as a spectator are taken into account using [ABC].

Social Aspect

(4) Using [ABC] helps me feel I am communicating and sharing information with others in the stadium.
(5) The [ABC] helps me create enjoyable experiences within the stadium.
(6) The [ABC] helps me share my experiences with others within the stadium.

Usage Context Aspect

(7) The [ABC] provides me with help in the stadium while watching the sporting action.
(8) The [ABC] provides me with information about other spectators in the stadium.
(9) The [ABC] helps provide me with a good physical and social environment in the stadium.

Culture Aspect

(10) The [ABC] helps me feel part of a group.
(11) The [ABC] helps me promote my group image.
(12) The [ABC] helps me interact with my group.

Product Aspect

(13) The [ABC] is useful at the event.
(14) The [ABC] is easy to learn how to use.
(15) The [ABC] is easy to use.

Acknowledgments

This work has been carried out as part of the Philips Research Programme on Lifestyle. The authors would like to thank all the participants who were generous with their time during this study.

References

[1] D. Raptis, N. Tselios, and N. Avouris, "Context-based design of mobile applications for museums: a survey of existing practices," in Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices & Services, 2005.

[2] C. P. J. M. Van Elzakker, I. Delikostidis, and P. J. M. Van Oosterom, "Field-based usability evaluation methodology for mobile geo-applications," Cartographic Journal, vol. 45, no. 2, pp. 139–149, 2008.

[3] J. Kjeldskov, M. B. Skov, B. S. Als, and R. T. Høegh, "Is it worth the hassle? Exploring the added value of evaluating the usability of context-aware mobile systems in the field," in Proceedings of the 6th International Mobile Conference (HCI '04), Springer, 2004.

[4] X. Sun, D. Golightly, J. Cranwell, B. Bedwell, and S. Sharples, "Participant experiences of mobile device-based diary studies," International Journal of Mobile Human Computer Interaction, in press.

[5] J. Kjeldskov and J. Stage, "New techniques for usability evaluation of mobile systems," International Journal of Human-Computer Studies, vol. 60, no. 5-6, pp. 599–620, 2004.

[6] P. Christensen, M. Romero, M. Thomas, A. S. Nielsen, and H. Harder, "Children, mobility, and space: using GPS and mobile phone technologies in ethnographic research," Journal of Mixed Methods Research, vol. 5, no. 3, pp. 227–246, 2011.

[7] E. G. Coleman, "Ethnographic approaches to digital media," Annual Review of Anthropology, vol. 39, pp. 487–505, 2010.

[8] L. Baillie, "Future telecommunication: exploring actual use," in Proceedings of the International Conference on Human-Computer Interaction, IOS Press, 2003.

[9] M. Esbjörnsson, B. Brown, O. Juhlin, D. Normark, M. Östergren, and E. Laurier, "Watching the cars go round and round: designing for active spectating," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '06), pp. 1221–1224, New York, NY, USA, April 2006.

[10] X. Sun, "User requirements of personalized mobile applications at large sporting events," in Proceedings of the IADIS Multi Conference on Computer Science and Information Systems (MCCSIS '10), Freiburg, Germany, 2010.

[11] H. B. L. Duh, G. C. B. Tan, and V. H. H. Chen, "Usability evaluation for mobile device: a comparison of laboratory and field tests," in Proceedings of the 8th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '06), pp. 181–186, September 2006.

[12] A. Pirhonen, S. Brewster, and C. Holguin, "Gestural and audio metaphors as a means of control for mobile devices," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '02), pp. 291–298, New York, NY, USA, April 2002.

[13] R. Graham and C. Carter, "Comparison of speech input and manual control of in-car devices while on-the-move," in Proceedings of the 2nd Workshop on Human Computer Interaction with Mobile Devices (HCI '99), Edinburgh, UK, 1999.

[14] J. Lai, K. Cheng, P. Green, and O. Tsimhoni, "On the road and on the web? Comprehension of synthetic and human speech while driving," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '01), pp. 206–212, New York, NY, USA, April 2001.

[15] B. Mobasher, "Data mining for personalization," in The Adaptive Web: Methods and Strategies of Web Personalization, P. Brusilovsky, A. Kobsa, and W. Nejdl, Eds., pp. 1–46, Springer, Berlin, Germany, 2007.

[16] D. Wu, I. Im, M. Tremaine, K. Instone, and M. Turoff, "A framework for classifying personalization schemes used on e-commerce websites," in Proceedings of the 36th Hawaii International Conference on Systems Sciences (HICSS '03), p. 222b, Maui, Hawaii, USA, 2003.

[17] L. Norros, E. Kaasinen, J. Plomp, and P. Rämä, Human-Technology Interaction Research and Design: VTT Roadmap, VTT Research Notes 2220, Espoo, Finland, 2003.

[18] E. Frias-Martinez, S. Y. Chen, and X. Liu, "Evaluation of a personalized digital library based on cognitive styles: adaptivity versus adaptability," International Journal of Information Management, vol. 29, no. 1, pp. 48–56, 2009.

[19] J. Dewey, Art as Experience, Perigee, New York, NY, USA, 1980.

[20] J. Rasmussen, "Human factors in a dynamic information society: where are we heading?" Ergonomics, vol. 43, no. 7, pp. 869–879, 2000.

[21] L. Arhippainen and M. Tähti, "Empirical evaluation of user experience in two adaptive mobile application prototypes," in Proceedings of the 2nd International Conference on Mobile and Ubiquitous Multimedia, Luleå, Sweden, 2003.

[22] M. Hassenzahl and N. Tractinsky, "User experience – a research agenda," Behaviour and Information Technology, vol. 25, no. 2, pp. 91–97, 2006.

[23] S. Siegel and N. J. Castellan, Nonparametric Statistics for the Behavioral Sciences, McGraw-Hill, New York, NY, USA, 1988.

[24] E. Law, V. Roto, A. P. O. S. Vermeeren, J. Kort, and M. Hassenzahl, "Towards a shared definition of user experience," in Proceedings of the 28th Annual Conference on Human Factors in Computing Systems (CHI '08), pp. 2395–2398, Florence, Italy, April 2008.

[25] E. L. C. Law and P. van Schaik, "Modelling user experience – an agenda for research and practice," Interacting with Computers, vol. 22, no. 5, pp. 313–322, 2010.

[26] C. H. Lin, P. J. Sher, and H. Y. Shih, "Past progress and future directions in conceptualizing customer perceived value," International Journal of Service Industry Management, vol. 16, no. 4, pp. 318–336, 2005.

[27] X. Sun and A. May, "Mobile personalisation at large sports events – user experience and mobile device personalisation," in Human-Computer Interaction 2007, vol. 11, Springer, Berlin, Germany, 2007.

[28] X. Sun and A. May, "The role of spatial contextual factors in mobile personalization at large sports events," Personal and Ubiquitous Computing, vol. 13, no. 4, pp. 293–302, 2009.

[29] J. Hackos and J. Redish, User and Task Analysis for Interface Design, John Wiley & Sons, New York, NY, USA, 1998.

[30] A. Nilsson, U. Nuldén, and D. Olsson, "Spectator information support: exploring the context of distributed events," in Proceedings of the International ACM SIGGROUP Conference on Supporting Group Work, 2004.

[31] B. Oertel, K. Steinmüller, and M. Kuom, "Mobile multimedia services for tourism," in Information and Communication Technologies in Tourism 2002, K. W. Wöber, A. J. Frew, and M. Hitz, Eds., pp. 265–274, Springer, New York, NY, USA, 2002.

[32] D. D. Salvucci, "Predicting the effects of in-car interfaces on driver behavior using a cognitive architecture," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '01), pp. 120–127, New York, NY, USA, April 2001.

[33] A. Kaikkonen, A. Kekäläinen, M. Cankar, T. Kallio, and A. Kankainen, "Usability testing of mobile applications: a comparison between laboratory and field testing," Journal of Usability Studies, vol. 1, pp. 4–16, 2005.

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 3: Research Article A Comparison of Field-Based and …downloads.hindawi.com/journals/ahci/2013/619767.pdfbusiness, philosophy, anthropology, cognitive science, social science, and other

Advances in Human-Computer Interaction 3

objective and hence how they may be employed withineither a field or lab-based evaluation processThere is not yet acommon definition of user experience because it is associatedwith a broad range of both ldquofuzzyrdquo and dynamic concepts forexample emotion affect experience hedonic and aesthetics[24] There is also currently a lack of consensus regarding theconstructs that comprise user experience and how they maybe measured [25]

This research follows the approach taken by Arhippainenand Tahti [21] and Hassenzahl and Tractinsky [22] in con-sidering user experience to comprise multiple componentsnamely user social usage context cultural and productConsequently multicomponent user experience was mea-sured using 15 agree-disagree scales addressing the five com-ponents This approach therefore considers user experienceto be a formative construct that is measured in terms of itscomponents [26]

3 Field-Based Evaluation ofContent Personalisation

31 Aims The first empirical study was a field experimentthe aims of which were to evaluate the impact of using amobile device and personalising that device for a spectatorwithin a real sports environment Field studies typicallysacrifice some experimental control in order to maximise theecological validity of the experiment

32 Method

321 Setup The study took place in a sports stadium duringa competition involving local football clubsThis competitioncomprised fast-moving sporting action and a large gatheringof spectators most of whomwere unfamiliar with each otherThe user experience was therefore typical of that encounteredduring a large sports event Information was broadcast tospectators over a public address system and shown on a largedisplay screen in one corner of the stadium

322 Experimental Scenarios Thesuccessful use of scenariostakes into account the diversity of contexts encountered byspectators these were derived from previous studies [27 28]Four scenarios were developed including (1) checking theschedule of forthcoming matches and finding one of particu-lar interest (2) obtaining information on a particular playerof interest (3) reviewing the progress of the current match(dynamic information access) (4) joining a ldquocommunityrdquo andparticipating in community-based activities in the stadium

323 Prototypes Prior to the experiments described in thispaper a total of seven field studies were undertaken withspectators at large sports events [27 28] These studiesfound that a large number of contextual factors influencedthe design of serviceinformation provision to a spectatorbut that three had the greatest potential impact on theuser experience These were the sporting preferences of thespectator their physical location in the stadium and the eventprogress

Following the previous studies one paper prototypeand two mobile prototypes (a personalised prototype anda nonpersonalised prototype) were developed that providedcontent to support the experimental scenarios above

Both mobile prototypes were identical in terms of theirfunctionality and visual design The personalised prototypehad the option to manually configure the presentation ofinformation according to the key contextual factors usingthe content filtering personalisation technique (descriptionof the technical development is out of scope of this paper)With this prototype users were asked to set their preferencesrelating to the sports types and athletes taking part from anextended tree menu structure (Figure 2) As a result tailoredevent-based information (eg information on athletes) andevent schedules were presented to the spectator (Figure 3)In contrast the nonpersonalised mobile prototype did notrequire the user to set the personalisation attributes and asa result presented information and services applicable for amore general audience (Figure 4)

The personalised prototype also enabled users to assignthemselves to virtual communities with common interestswithin the stadium using collaborative filtering techniquebased on their stated interests This was via online chatand media sharing within groups defined by their personalpreferences The nonpersonalised mobile prototype enabledthe same chat and media sharing but within a larger groupnot differentiated according to personal interests

In addition a paper leaflet was prepared that was basedon the information that a spectator would traditionally getduring a real event from posters and programs It providedinformation on match schedules and playersrsquo profiles

324 Participants Eighteen participants were recruited byan external agency Their ages ranged from 18 to 45 yearsmean 285 with an equal gender split A range of occupa-tions were represented including sales journalist engineerteacher secretary and accountant eight participants wereuniversity students A recruitment criterion was that allparticipants should have undertaken personalisation of theirringtone screen background or shortcut keys at least once aweek In addition all participants had watched a large sportsevent in an open stadium within the last six months

325 User Experience Measurement The key dependentvariable was the user experience that resulted from usingthe prototypes User experience was measured in terms ofthe multidimensional components described in Section 2User experience was therefore rated by participants in amulticomponent assessment that was theoretically groundedand empirically derived (see the appendix)

326 Procedure A pilot study was used to check the timingsrefine the data collection methods and resolve any ambigui-ties with the instructions and data collection tools This wascarried out in a stadium

At the beginning of the study participants were giveninstruction on how to use the mobile prototype Participantsthen undertook the scenario-based tasks using either the

4 Advances in Human-Computer Interaction

Figure 2 Setting personalisation preferences

Figure 3Obtaining personalised information on particular athletes

paper-based programme or one of the mobile prototypesThey then completed the 15-item questionnaire that differen-tiated the five components of user experience

Finally they were interviewed in a semistructured formatto discuss the experiment design in the field study The studylasted around 60 minutes for each user

327 Data Collection Amultiplicity of data collectionmeth-ods was used within the study to enable a limited triangu-lation of subjective rating verbal report and observational

Figure 4 Obtaining general information on all athletes

data A video camera was used to record their interactionswith the mobile prototypes Users were encouraged (but notrequired) to ldquothink aloudrdquo during the trial A user assessmentin relation to overall user experience was captured using six-point agreedisagree rating scales Direct observation wasalso used and informal interaction with the researcher wasencouraged during the trial A posttrial structured interviewwas video recorded

328 Analysis of Data For quantitative data Friedmannonparametric tests for three related samples were calcu-lated for the main within-subjects factors Multiple pairedcomparisons were undertaken using the technique describedin Siegel and Castellan [23 page 180] to take into accountthe increased likelihood of a type I error with multiplecomparisons The qualitative data (interview transcripts andconcurrent verbal reports) and observational data were ana-lyzed using an affinity diagram technique [29] to collate andcategorise this data

33 Field StudyResults Thefield experiments generated largeamounts of rich and grounded data in relatively short timeFor all tasks the personalised mobile device consistentlygenerated the highest user experience rating (see Table 1)Thenonpersonalised device was consistently worse but still animprovement over the control condition (paper-based pro-gramme) The control condition (paper-based programme)was consistently rated poormdasha limitation also noted byNilsson et al [30] in their field observations of sports events

To make sure that the user interface itself (rather thanthe personalization approach) was not majorly influencingthe experiment outcome the user-initiated interface wasevaluated by calculating the percentage of tasks completed byparticipants and analysing user comments The experiment

Advances in Human-Computer Interaction 5

Table 1 The overall user experience assessment for each task

Task Friedman (3 related samples) Multiple paired comparisons [23 page 180] (120572 = 05)

Checking match schedules 1205942(2) = 313119875 lt 001

119873 = 18 10038161003816100381610038161003816119877Personalized minus 119877Nonpersonalized

10038161003816100381610038161003816= 215 gtZ = 1436lowast

119873 = 18 10038161003816100381610038161003816119877Personalized minus 119877paper

10038161003816100381610038161003816= 295 gtZ = 1436lowast

119873 = 18 10038161003816100381610038161003816119877Nonpersonalized minus 119877paper

10038161003816100381610038161003816= 8 ltZ = 1436

Obtaining player information 1205942(2) = 34119875 lt 001

119873 = 18 10038161003816100381610038161003816119877Personalized minus 119877Nonpersonalized

10038161003816100381610038161003816= 17 gtZ = 1436lowast

119873 = 18 10038161003816100381610038161003816119877Personalized minus 119877paper

10038161003816100381610038161003816= 34 gtZ = 1436lowast

119873 = 18 10038161003816100381610038161003816119877Nonpersonalized minus 119877paper

10038161003816100381610038161003816= 17 gtZ = 1436lowast

Reviewing match progress 1205942(2) = 306119875 lt 001

119873 = 18 10038161003816100381610038161003816119877Personalized minus 119877Nonpersonalized

10038161003816100381610038161003816= 165 gtZ = 1436lowast

119873 = 18 10038161003816100381610038161003816119877Personalized minus 119877paper

10038161003816100381610038161003816= 33 gtZ = 1436lowast

119873 = 18 10038161003816100381610038161003816119877Nonpersonalized minus 119877paper

10038161003816100381610038161003816= 165 gtZ = 1436lowast

Building a community 1205942(2) = 321119875 lt 001

119873 = 18 10038161003816100381610038161003816119877Personalized minus 119877Nonpersonalized

10038161003816100381610038161003816= 17 gtZ = 1436lowast

119873 = 18 10038161003816100381610038161003816119877Personalized minus 119877paper

10038161003816100381610038161003816= 34 gtZ = 1436lowast

119873 = 18 10038161003816100381610038161003816119877Nonpersonalized minus 119877paper

10038161003816100381610038161003816= 17 gtZ = 1436lowast

lowast

Indicates a significant difference

recorded approximately 18 hours of video capturing the 18subjectsrsquo interaction steps while completing each task Insummary 955 of the tasks were completed successfully Forthe 45 of unfinished tasks 35 usability problems with thepersonalized mobile prototype were reported

Participants in the field experiment stressed problemsof mobile ldquouserdquo rather than simply application ldquousabilityrdquoand typically those problems were expressed in the languageof the situation For example users were concerned aboutspending toomuch time personalizing the application duringthe event (detracting from the event itself) and the font onthe interface being too small to read in an open stadiumunder bright sunlight The field study also identified issues ofvalidity and precision of the data presented by the applicationFor example users were concerned about the reliability ofinformation provided by the prototypes after they found thatsome player information presented on the mobile device didnot match with the real events

To the participants the field experiment environmentfelt fairly informal and the users talked freely about theuse of the application and their feelings Users expressedhow the field experiment allowed them to feel relaxed andable to communicate with the researcher as they undertookthe evaluation scenarios Rather than focusing on interfaceissues users generally expressed broader views and wereable to give a wide range of evaluation-related informa-tion during the experiment such as expressing contextuallyrelated requirements A particular example of the kind of datagenerated during the field study was the need for the mobilecontent (and its delivery) to be highly integrated into thetemporal flow of the sporting action

Using a field experiment approach it may be possible toobtain a higher level of ldquorealismrdquo However this evaluationmethod is not easy to undertake Experiments in the fieldare influenced by external factors such as the weather andmoreover it is more difficult to actually collect data fromparticipants Users were impacted by events happening in thefield such as noise and other disturbances For example someparticipants actually forgot that they were taking part in a

research studymdashuntil prompted they focused their attentionon the competition happening in the stadium and weresubstantially distracted from the field experiment Flexibilityand pragmatism are needed to still collect useful data whilenot detracting unduly from the experience generated withinthe field setting

4 Lab-Based Evaluation ofPersonalisation Approach

41 Aims Whereas the previous section has described a field-based study this section describes a very similar lab studythis time centred around a multievent athletics meeting Thespecific objectives of this study were to use a more controlledexperimental setup to compare the user experience for aspectator at a large sports event under three conditions (1)using paper-based (not mobile) content (2) using a mobileprototype where personalisation parameters were set by theuser (3) using a similar prototype where parameters wereset automatically Similar procedures and data collectionmethods enable a comparison of findings with the field-basedstudy described in Section 3

42 Method

421 Setup This evaluation took place in a usability labo-ratory in the UK The usability lab was set up to resemble apart of the sports stadium To mimic the divided attentionthat would result from a spectator watching a sporting eventsports footage including auditory output was projectedonto the front wall of the laboratory A crowd scene wasreplicated on each of the two side walls In contrast totypical mobile applications such as tourist guides [31] aspectator is usually seated and relatively static at a sportingevent Therefore mobility per se did not need to be incor-porated into the experimental environment A video camerawas used to record usersrsquo interaction with the prototypes(Figure 5)

6 Advances in Human-Computer Interaction

Figure 5 User study in a usability lab

422 Experimental Scenarios Five scenarios were devel-oped based on the same key spectator activities as used in theprevious field-based experiment Four of the experimentaltasks were the same as those employed in the field experi-ment This lab-based study also employed a fifth task whichrequired the participant to select a suitable viewing angle for amobile broadcast on the deviceThis enabled the participantsto follow the sporting action from wherever they were in thestadium

4.2.3. Prototypes. Two personalised mobile prototypes were developed using content filtering and collaborative filtering personalisation techniques. They shared the same look and feel as those used in the field study described in Section 3.

One prototype enabled user-initiated personalisation of content using an extended menu structure, as before (see Figure 2). As a result, this prototype presented event information, such as athlete details and event schedules, based on the users' settings.

In contrast, the prototype with system-initiated personalisation did not require participants to set personalisation parameters. Personalised content was instead presented automatically to the participant using the content and collaborative filtering personalisation techniques.
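The paper does not give implementation details of the filtering used by the prototypes; the following is only a minimal, hypothetical Python sketch of the kind of user-based collaborative filtering that system-initiated personalisation of event content could rely on. The function and data names (recommend, event_ratings) are illustrative and are not taken from the prototypes.

```python
import numpy as np

def recommend(ratings: np.ndarray, user: int, top_n: int = 3) -> list:
    """ratings: spectators x content items; 0 means the item has not been rated/viewed."""
    scores = np.zeros(ratings.shape[1])
    for other in range(ratings.shape[0]):
        if other == user:
            continue
        co_rated = (ratings[user] > 0) & (ratings[other] > 0)   # items both spectators rated
        if not co_rated.any():
            continue
        a, b = ratings[user, co_rated], ratings[other, co_rated]
        similarity = float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
        scores += similarity * ratings[other]                    # similarity-weighted votes
    scores[ratings[user] > 0] = -np.inf                          # do not re-recommend seen items
    return list(np.argsort(scores)[::-1][:top_n])

# Example: push the two most relevant unseen athlete/event items to spectator 0.
event_ratings = np.array([[5, 0, 3, 0, 1],
                          [4, 2, 3, 0, 0],
                          [0, 5, 0, 4, 2]])
print(recommend(event_ratings, user=0, top_n=2))
```

A content filtering component would complement this by matching item attributes (e.g., athlete or event type) against a stored user profile; the sketch covers only the collaborative element.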

As for the field experiment, a control condition was included that was representative of the nondigital event programme that a spectator would typically have. This was a paper leaflet that provided competition times and athlete information.

4.2.4. Participants. A different cohort of eighteen participants took part in the study, again split equally male-female, aged between 18 and 38, and with various occupations. As in the previous field-based study, all participants had regular experience of personalising mobile devices and had attended a large sports event within the last six months.

4.2.5. Procedure and Analysis of Data. A pilot study was used to maximise the realism of the simulation and to ensure that the data collection methods used during the field trial were effective within a laboratory setting. Two key changes were made: speakers were rearranged to broadcast audience noise, and the side projection of a video of the audience was replaced with large-scale posters. This enhanced the perceived social atmosphere within the laboratory without unnecessarily distracting the participant from the main projected view (the sporting action).

The procedure followed that used for the field experiment described in Section 3. Each task was completed in turn, with the three personalisation conditions counterbalanced across participants within each task. The design of the lab experiment was also discussed with participants at the end of the study. The study lasted approximately one hour for each participant.

As for the field study, the data comprised quantitative rating scale data, concurrent verbal reports, posttrial interview data, and observational data. This was analysed as described previously.
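As a rough illustration of that analysis (a Friedman test over the three related conditions, followed by Siegel and Castellan's multiple paired comparisons on the condition rank sums [23, page 180]), the sketch below shows how such figures could be computed with SciPy. The ratings matrix is placeholder data, not the study data; with N = 18 participants, k = 3 conditions, and α = 0.05, the critical rank-sum difference works out at roughly 14.36, the threshold quoted in Table 2.

```python
import numpy as np
from scipy.stats import friedmanchisquare, norm, rankdata

rng = np.random.default_rng(0)
ratings = rng.integers(1, 7, size=(18, 3))   # 18 participants x 3 conditions, 6-point scale (placeholder)

# Friedman test for three related samples.
chi2, p = friedmanchisquare(ratings[:, 0], ratings[:, 1], ratings[:, 2])

# Rank each participant's three ratings (ties receive mean ranks), then sum ranks per condition.
ranks = np.apply_along_axis(rankdata, 1, ratings)
rank_sums = ranks.sum(axis=0)

# Two conditions differ if |R_u - R_v| >= z_(alpha/(k*(k-1))) * sqrt(N * k * (k + 1) / 6).
N, k, alpha = 18, 3, 0.05
critical = norm.ppf(1 - alpha / (k * (k - 1))) * np.sqrt(N * k * (k + 1) / 6)   # ~14.36

print(f"Friedman chi2(2) = {chi2:.2f}, p = {p:.4f}")
print(f"rank sums = {rank_sums}, critical difference = {critical:.2f}")
print("condition 1 vs condition 2 significant:", abs(rank_sums[0] - rank_sums[1]) >= critical)
```

Under these assumptions the critical difference is z(0.05/6) x sqrt(18 x 3 x 4 / 6), approximately 2.39 x 6 = 14.36.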

4.3. Lab Study Results. Across all tasks there were clear advantages to having personalised content delivered over a mobile device. The control condition (representing that present in current stadium environments) was significantly worse in all scenarios (see Table 2). Table 2 also indicates that, in terms of the impact on user experience, neither user- nor system-initiated personalisation emerged as a single best approach across the range of tasks studied.

The experiment recorded approximately 18 hours of video during the lab study. In summary, 92% of the tasks were completed successfully. For the 8% of unfinished tasks, 42 usability problems with the personalized mobile prototype were reported. In general, participants considered the user interface of both personalized mobile prototypes easy to use.

The laboratory study was relatively easy to undertake and to collect data from. During the lab study, participants quickly revealed considerable information about how a spectator uses the prototypes. The lab environment also offered more control over the conditions of the experiment. In comparison to the field study, participants were more focused on the experiment being undertaken and were not influenced by external factors such as weather, noise, or other disturbances from the sporting environment.

There were some drawbacks to the lab experiment, including only a limited representation of the real world and greater uncertainty over the degree to which results generalize outside laboratory settings. This laboratory study tried to "bring" the large sporting event into the experiment by carefully setting up the lab to resemble a stadium, designing scenarios based on previous studies of context within large sports events, and involving users who were familiar with the usage context. It also addressed the issue of users' divided attention by requiring subjects to watch a sports event video, projected on the front wall of the lab room, while performing the scenario-based tasks with the mobile prototypes. As a result, participants were able to identify some context-related problems (e.g., the font was too small to read in an open stadium) during the lab experiment. In addition, participants expressed their concerns with using the personalized prototypes from contextual and social perspectives, including concerns over spending too much time personalizing the device during the event and therefore actually missing some of the sporting action.


Table 2: The overall user experience assessment for each task (Friedman tests for three related samples, N = 18; multiple paired comparisons after Siegel and Castellan [23, page 180] at α = 0.05, critical difference 14.36).

Follow sporting action: χ²(2) = 27.5, P < 0.001.
|R_system-initiated − R_user-initiated| = 15 > 14.36*
|R_system-initiated − R_paper| = 49 > 14.36*
|R_user-initiated − R_paper| = 34 > 14.36*

Obtaining athlete information: χ²(2) = 29.4, P < 0.001.
|R_system-initiated − R_user-initiated| = 2 < 14.36
|R_system-initiated − R_paper| = 16 > 14.36*
|R_user-initiated − R_paper| = 18 > 14.36*

Reviewing athletics event results: χ²(2) = 35.5, P < 0.001.
|R_system-initiated − R_user-initiated| = 17.5 > 14.36*
|R_system-initiated − R_paper| = 35 > 14.36*
|R_user-initiated − R_paper| = 17.5 > 14.36*

Building a community: χ²(2) = 29.6, P < 0.001.
|R_system-initiated − R_user-initiated| = 6 < 14.36
|R_system-initiated − R_paper| = 24 > 14.36*
|R_user-initiated − R_paper| = 30 > 14.36*

Checking event schedules: χ²(2) = 34.5, P < 0.001.
|R_system-initiated − R_user-initiated| = 16.5 > 14.36*
|R_system-initiated − R_paper| = 34.5 > 14.36*
|R_user-initiated − R_paper| = 18 > 14.36*

*Indicates a significant result.

The lab experiment did not allow users to feel relaxed during the experimental procedure. Participants acted more politely during the study, and they pointed out that they were uncomfortable about expressing negative feelings about the applications. In one example, a participant interviewed about aspects of his user experience generally stated that it was fine. However, when presented with the Emotion Cards (a group of cartoon faces used to help promote discussion of affective aspects of interaction), he picked up one card and talked at length about concerns over the time and effort required to manually personalize the application, without feeling that he was being overly critical.

5. Discussion and Conclusion

In recent years there has been much debate on whether mobile applications should be evaluated in the field or in a traditional lab environment, covering issues including users' behaviour [11], identification of usability problems [3], the experimental settings [5, 8, 32], and communication with participants [33]. This research enabled a comparison between field and laboratory experiments based on similar users, mobile applications, and task-based scenarios. In particular, a similar user-initiated prototype was used in each, even though the field experiment took place at a football competition and the lab experiment recreated an athletics meeting.

The number of usability problems identified during the field and lab experiments was similar (over all participants, when using the user-initiated prototype, there were 35 usability problems identified in the field setting and 42 found in the lab setting). These findings are consistent with those of Kjeldskov et al. [3], who specifically compared lab- and field-based usability results and found that the difference in the effectiveness of the two approaches was nonsignificant in identifying most usability problems. Also, some context-related problems, such as the font being too small to read in an open stadium, were identified in both experimental settings. However, some key differences in the effectiveness of the field and laboratory approaches were found: the lab experiment identified problems related to the detail of the interface design, for example, the colours and icons on the interface; the field experiment identified issues of validity and precision of the data presented by the application when used in a stadium. The field experiment also stressed the problems of mobile "use" rather than simply application usability, and typically these problems were expressed in the language of the situation [11].

An analysis of positive versus negative behaviours [11] was undertaken. This data included verbal reports and rating scale data according to the user experience definitions. Accepting the limitations of a direct comparison, participants reacted more negatively in the laboratory setting when completing similar tasks (using the similar user-initiated personalisation approach). In the field, individuals were influenced by the atmosphere surrounding the sports event, and this resulted in an enhanced user experience. In addition, they focused more attention on the actual usage of personalisation on the mobile device instead of issues to do with the interface. The lab setting was less engaging than the field setting: participants were more likely to be critical, and in general they took longer to perform certain tasks because they focused (and commented) on interface issues such as the fonts and colours used.

The field experiment was more difficult to conduct than the lab experiment, a point noted by many authors, including Kjeldskov et al. [3] and Baillie [8]. Confounding factors were present, for example, variations in the weather and noise from other spectators. In addition, although it was desirable that participants engaged in the sporting action, spectators' foci of attention could not be controlled and were difficult to predict.


Some participants were "distracted" from the experimental tasks, and this did not occur in the laboratory setting. The greater control possible with a laboratory study (as discussed by a range of authors, including [2, 5, 8]) was clearly demonstrated during these studies.

Where there is an interest in qualitative data, good communication between the researcher and participants is vital. The field experiment provided a more open and relaxed atmosphere for discourse. Users more freely discussed their use of the mobile applications and the underlying beliefs and attitudes that arose during the study. The field experiment eased communication tensions with the participants, as they felt they were not being directly examined. As well as generally promoting the generation of qualitative data, the field experiments encouraged the expression of broader and more contextually relevant views. An example is the identification of contextually dependent requirements, which occurred much less frequently during the lab-based studies.

Some suggestions for user impact assessment with mobile devices can be made based on the findings of this study. A lab experiment is recommended when the focus is on the user interface and device-oriented usability issues. In such cases, a well-designed lab study should provide the validity required while being easier, quicker, and cheaper to conduct. However, the results suggest that a field experiment is better suited to investigating a wider range of factors affecting the overall acceptability of mobile services, including system functions and the impact of usage contexts. Where open and relaxed communication is important (e.g., where participant groups are naturally reticent to communicate), this is more readily promoted by the use of a field study.

The natural tension between a deductive and an inductive research design was also apparent. This research in general was essentially deductive, since it set out to explain causal relationships between variables, operationalized concepts, controlled variables, and used structured and repeatable methods to collect data. However, the field study in particular also comprised an inductive element, as there was a desire to understand the research context and the meanings attached to events in order to help design the lab study. Van Elzakker et al. [2] underline how it is often desirable to combine approaches within the same study. The undertaking of a field experiment followed by a lab experiment was an attempt at a multiplicity of methods from an essentially deductive viewpoint. This recognised that the natural research process is often that of moving from a process of understanding to one of testing, whilst attempting to avoid the unsatisfactory middle ground of user evaluations that are divorced from any underlying research objectives.

Appendix

Likert Items Used to Assess User Experience

In all cases, participant responses were based on a six-point scale ranging from 1 (strongly disagree) to 6 (strongly agree).

User Aspect

(1) I feel happy using [ABC] during the event

(2) My expectations regarding my spectator experience in the stadium are met using [ABC]

(3) My needs as a spectator are taken into account using [ABC]

Social Aspect

(4) Using [ABC] helps me feel I am communicating and sharing information with others in the stadium

(5) The [ABC] helps me create enjoyable experiences within the stadium

(6) The [ABC] helps me share my experiences with others within the stadium

Usage Context Aspect

(7) The [ABC] provides me with help in the stadium while watching the sporting action

(8) The [ABC] provides me with information about other spectators in the stadium

(9) The [ABC] helps provide me with a good physical and social environment in the stadium

Culture Aspect

(10) The [ABC] helps me feel part of a group

(11) The [ABC] helps me promote my group image

(12) The [ABC] helps me interact with my group

Product Aspect

(13) The [ABC] is useful at the event

(14) The [ABC] is easy to learn how to use

(15) The [ABC] is easy to use

Acknowledgments

This work has been carried out as part of the Philips Research Programme on Lifestyle. The authors would like to thank all the participants who were generous with their time during this study.

References

[1] D. Raptis, N. Tselios, and N. Avouris, "Context-based design of mobile applications for museums: a survey of existing practices," in Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices & Services, 2005.
[2] C. P. J. M. Van Elzakker, I. Delikostidis, and P. J. M. Van Oosterom, "Field-based usability evaluation methodology for mobile geo-applications," Cartographic Journal, vol. 45, no. 2, pp. 139–149, 2008.
[3] J. Kjeldskov, M. B. Skov, B. S. Als, and R. T. Høegh, "Is it worth the hassle? Exploring the added value of evaluating the usability of context-aware mobile systems in the field," in Proceedings of the 6th International Mobile HCI Conference (MobileHCI '04), Springer, 2004.
[4] X. Sun, D. Golightly, J. Cranwell, B. Bedwell, and S. Sharples, "Participant experiences of mobile device-based diary studies," International Journal of Mobile Human Computer Interaction, in press.
[5] J. Kjeldskov and J. Stage, "New techniques for usability evaluation of mobile systems," International Journal of Human Computer Studies, vol. 60, no. 5-6, pp. 599–620, 2004.
[6] P. Christensen, M. Romero, M. Thomas, A. S. Nielsen, and H. Harder, "Children, mobility, and space: using GPS and mobile phone technologies in ethnographic research," Journal of Mixed Methods Research, vol. 5, no. 3, pp. 227–246, 2011.
[7] E. G. Coleman, "Ethnographic approaches to digital media," Annual Review of Anthropology, vol. 39, pp. 487–505, 2010.
[8] L. Baillie, "Future telecommunication: exploring actual use," in Proceedings of the International Conference on Human-Computer Interaction, IOS Press, 2003.
[9] M. Esbjörnsson, B. Brown, O. Juhlin, D. Normark, M. Östergren, and E. Laurier, "Watching the cars go round and round: designing for active spectating," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '06), pp. 1221–1224, New York, NY, USA, April 2006.
[10] X. Sun, "User requirements of personalized mobile applications at large sporting events," in Proceedings of the IADIS Multi Conference on Computer Science and Information Systems (MCCSIS '10), Freiburg, Germany, 2010.
[11] H. B. L. Duh, G. C. B. Tan, and V. H. H. Chen, "Usability evaluation for mobile device: a comparison of laboratory and field tests," in Proceedings of the 8th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '06), pp. 181–186, September 2006.
[12] A. Pirhonen, S. Brewster, and C. Holguin, "Gestural and audio metaphors as a means of control for mobile devices," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '02), pp. 291–298, New York, NY, USA, April 2002.
[13] R. Graham and C. Carter, "Comparison of speech input and manual control of in-car devices while on-the-move," in Proceedings of the 2nd Workshop on Human Computer Interaction with Mobile Devices (HCI '99), Edinburgh, UK, 1999.
[14] J. Lai, K. Cheng, P. Green, and O. Tsimhoni, "On the road and on the web? Comprehension of synthetic and human speech while driving," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '01), pp. 206–212, New York, NY, USA, April 2001.
[15] B. Mobasher, "Data mining for personalization," in The Adaptive Web: Methods and Strategies of Web Personalization, P. Brusilovsky, A. Kobsa, and W. Nejdl, Eds., pp. 1–46, Springer, Berlin, Germany, 2007.
[16] D. Wu, I. Im, M. Tremaine, K. Instone, and M. Turoff, "A framework for classifying personalization schemes used on e-commerce websites," in Proceedings of the 36th Hawaii International Conference on System Sciences (HICSS '03), p. 222b, Maui, Hawaii, USA, 2003.
[17] L. Norros, E. Kaasinen, J. Plomp, and P. Rämä, Human-Technology Interaction Research and Design: VTT Roadmap, VTT Research Notes 2220, Espoo, Finland, 2003.
[18] E. Frias-Martinez, S. Y. Chen, and X. Liu, "Evaluation of a personalized digital library based on cognitive styles: adaptivity versus adaptability," International Journal of Information Management, vol. 29, no. 1, pp. 48–56, 2009.
[19] J. Dewey, Art as Experience, Perigee, New York, NY, USA, 1980.
[20] J. Rasmussen, "Human factors in a dynamic information society: where are we heading?" Ergonomics, vol. 43, no. 7, pp. 869–879, 2000.
[21] L. Arhippainen and M. Tähti, "Empirical evaluation of user experience in two adaptive mobile application prototypes," in Proceedings of the 2nd International Conference on Mobile and Ubiquitous Multimedia, Luleå, Sweden, 2003.
[22] M. Hassenzahl and N. Tractinsky, "User experience: a research agenda," Behaviour and Information Technology, vol. 25, no. 2, pp. 91–97, 2006.
[23] S. Siegel and N. J. Castellan, Nonparametric Statistics for the Behavioral Sciences, McGraw-Hill, New York, NY, USA, 1988.
[24] E. Law, V. Roto, A. P. O. S. Vermeeren, J. Kort, and M. Hassenzahl, "Towards a shared definition of user experience," in Proceedings of the 28th Annual Conference on Human Factors in Computing Systems (CHI '08), pp. 2395–2398, Florence, Italy, April 2008.
[25] E. L. C. Law and P. van Schaik, "Modelling user experience: an agenda for research and practice," Interacting with Computers, vol. 22, no. 5, pp. 313–322, 2010.
[26] C. H. Lin, P. J. Sher, and H. Y. Shih, "Past progress and future directions in conceptualizing customer perceived value," International Journal of Service Industry Management, vol. 16, no. 4, pp. 318–336, 2005.
[27] X. Sun and A. May, "Mobile personalisation at large sports events: user experience and mobile device personalisation," in Human-Computer Interaction 2007, vol. 11, Springer, Berlin, Germany, 2007.
[28] X. Sun and A. May, "The role of spatial contextual factors in mobile personalization at large sports events," Personal and Ubiquitous Computing, vol. 13, no. 4, pp. 293–302, 2009.
[29] J. Hackos and J. Redish, User and Task Analysis for Interface Design, John Wiley & Sons, New York, NY, USA, 1998.
[30] A. Nilsson, U. Nuldén, and D. Olsson, "Spectator information support: exploring the context of distributed events," in Proceedings of the International ACM SIGGROUP Conference on Supporting Group Work, 2004.
[31] B. Oertel, K. Steinmüller, and M. Kuom, "Mobile multimedia services for tourism," in Information and Communication Technologies in Tourism 2002, K. W. Wöber, A. J. Frew, and M. Hitz, Eds., pp. 265–274, Springer, New York, NY, USA, 2002.
[32] D. D. Salvucci, "Predicting the effects of in-car interfaces on driver behavior using a cognitive architecture," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '01), pp. 120–127, New York, NY, USA, April 2001.
[33] A. Kaikkonen, A. Kekäläinen, M. Cankar, T. Kallio, and A. Kankainen, "Usability testing of mobile applications: a comparison between laboratory and field testing," Journal of Usability Studies, vol. 1, pp. 4–16, 2005.



Participants in the field experiment stressed problemsof mobile ldquouserdquo rather than simply application ldquousabilityrdquoand typically those problems were expressed in the languageof the situation For example users were concerned aboutspending toomuch time personalizing the application duringthe event (detracting from the event itself) and the font onthe interface being too small to read in an open stadiumunder bright sunlight The field study also identified issues ofvalidity and precision of the data presented by the applicationFor example users were concerned about the reliability ofinformation provided by the prototypes after they found thatsome player information presented on the mobile device didnot match with the real events

To the participants the field experiment environmentfelt fairly informal and the users talked freely about theuse of the application and their feelings Users expressedhow the field experiment allowed them to feel relaxed andable to communicate with the researcher as they undertookthe evaluation scenarios Rather than focusing on interfaceissues users generally expressed broader views and wereable to give a wide range of evaluation-related informa-tion during the experiment such as expressing contextuallyrelated requirements A particular example of the kind of datagenerated during the field study was the need for the mobilecontent (and its delivery) to be highly integrated into thetemporal flow of the sporting action

Using a field experiment approach it may be possible toobtain a higher level of ldquorealismrdquo However this evaluationmethod is not easy to undertake Experiments in the fieldare influenced by external factors such as the weather andmoreover it is more difficult to actually collect data fromparticipants Users were impacted by events happening in thefield such as noise and other disturbances For example someparticipants actually forgot that they were taking part in a

research studymdashuntil prompted they focused their attentionon the competition happening in the stadium and weresubstantially distracted from the field experiment Flexibilityand pragmatism are needed to still collect useful data whilenot detracting unduly from the experience generated withinthe field setting

4 Lab-Based Evaluation ofPersonalisation Approach

41 Aims Whereas the previous section has described a field-based study this section describes a very similar lab studythis time centred around a multievent athletics meeting Thespecific objectives of this study were to use a more controlledexperimental setup to compare the user experience for aspectator at a large sports event under three conditions (1)using paper-based (not mobile) content (2) using a mobileprototype where personalisation parameters were set by theuser (3) using a similar prototype where parameters wereset automatically Similar procedures and data collectionmethods enable a comparison of findings with the field-basedstudy described in Section 3

42 Method

421 Setup This evaluation took place in a usability labo-ratory in the UK The usability lab was set up to resemble apart of the sports stadium To mimic the divided attentionthat would result from a spectator watching a sporting eventsports footage including auditory output was projectedonto the front wall of the laboratory A crowd scene wasreplicated on each of the two side walls In contrast totypical mobile applications such as tourist guides [31] aspectator is usually seated and relatively static at a sportingevent Therefore mobility per se did not need to be incor-porated into the experimental environment A video camerawas used to record usersrsquo interaction with the prototypes(Figure 5)

6 Advances in Human-Computer Interaction

Figure 5 User study in a usability lab

422 Experimental Scenarios Five scenarios were devel-oped based on the same key spectator activities as used in theprevious field-based experiment Four of the experimentaltasks were the same as those employed in the field experi-ment This lab-based study also employed a fifth task whichrequired the participant to select a suitable viewing angle for amobile broadcast on the deviceThis enabled the participantsto follow the sporting action from wherever they were in thestadium

423 Prototypes Two personalised mobile prototypes weredeveloped using content filtering and collaborative filteringpersonalisation techniqueThey shared the same look and feelas those used in the field study described in Section 3

One prototype enabled user-initiated personalisation ofcontent using an extended menu structure as before (seeFigure 2) As a result this prototype presented event infor-mation such as athlete details and event schedules based onthe usersrsquo settings

In contrast the prototype with system-initiated personal-isation did not require that participants set personalisationparameters Personalised content was then presented auto-matically to the participant using the content and collabora-tive filtering personalisation technique

As for the field experiment a control condition wasincluded that was representative of a nondigital event pro-gramme that a spectator would typically have This was apaper leaflet that provided information on competition timesand athlete information

424 Participants A different cohort of eighteen partici-pants took part in the study again split equally male-femaleaged between 18 and 38 and with various occupations Asin the previous field-based study all participants had regularexperience of personalising mobile devices and had attendeda large sports event within the last six months

425 Procedure and Analysis of Data A pilot study wasused to maximise the realism of the simulation and ensurethat the data collection methods that were used during thefield trial were effective within a laboratory setting Two keychanges were made rearrangement of speakers to broadcastaudience noise the side projection of a video of the audiencewas replaced with large-scale posters This enhanced theperceived social atmosphere within the laboratory without

unnecessarily distracting the participant from the mainprojected view (the sporting action)

The procedure followed that used for the field experimentdescribed in Section 3 Each task was completed in turnwith the three personalisation conditions counterbalancedacross participants within each task The design of the labexperiment was also discussed with participants at the end ofthe study The study lasted approximately one hour for eachparticipant

As for field study the data comprised quantitative ratingscale data concurrent verbal reports and posttrial interviewdata and observational data This was analysed as describedpreviously

43 Lab Study Results Across all tasks there were clearadvantages to having personalised content delivered overa mobile device The control condition (representing thatpresent at current stadium environments) was significantlyworse in all scenarios (see Table 2) Table 2 also indicatesthat in terms of the impact on user experience neither usernor system-initiated personalisation emerged as a single bestapproach across the range of tasks studied

The experiment recorded approximately 18 hours of videoduring the lab study In summary 92 of the tasks werecompleted successfully For the 8 of unfinished tasks 42usability problems with the personalized mobile prototypewere reported In general participants considered the userinterface of both personalized mobile prototypes easy to use

The laboratory study was relatively easy to undertake andcollect data from During the lab study participants quicklyrevealed considerable information about how a spectator usesthe prototypes The lab environment also offered more con-trol over the conditions for the experiment In comparisonto the field study participants were more focused on theexperiment being undertaken and were not influenced byexternal factors such asweather noise or others disturbancesfrom the sporting environment

There were some drawbacks to the lab experimentincluding only a limited representation of the real worldand greater uncertainty over the degree of generalization ofresults outside laboratory settingsThis laboratory study triedto ldquobringrdquo the large sporting event into the experiment bycarefully setting up the lab to resemble a stadium designingscenarios based on previous studies of context within largesports events and involving users who were familiar with theusage context It also addressed the issue of usersrsquo dividedattention by requiring subjects to watch a sport event videowhich was projected on the front wall of the lab roomwhile performing the scenario-based tasks with the mobileprototypes As a result participants were able to identifysome context related problems (eg the font was too smallto read in an open stadium) during the lab experiment Inaddition participants expressed their concerns with usingthe personalized prototypes from contextual and social per-spectives including concerns over spending too much timepersonalizing the device during the event and thereforeactually missing some of the sporting action

Advances in Human-Computer Interaction 7

Table 2 The overall user experience assessment for each task

Task Friedman (3 related samples) Multiple paired comparisons [23 page 180] (120572 = 05)

Follow sporting action119873 = 18 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877use-initiated

10038161003816100381610038161003816= 15 gtZ = 1436lowast

1205942(2) = 275 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877paper

10038161003816100381610038161003816= 49 gtZ = 1436lowast

119875 lt 001 119873 = 18 10038161003816100381610038161003816119877user-initiated minus 119877paper

10038161003816100381610038161003816= 34 gtZ = 1436lowast

Obtaining athlete information119873 = 18 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877use-initiated

10038161003816100381610038161003816= 2 ltZ = 1436

1205942(2) = 294 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877paper

10038161003816100381610038161003816= 16 gtZ = 1436lowast

119875 lt 001 119873 = 18 10038161003816100381610038161003816119877user-initiated minus 119877paper

10038161003816100381610038161003816= 18 gtZ = 1436lowast

Reviewing athletics event results119873 = 18 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877use-initiated

10038161003816100381610038161003816= 175 gtZ = 1436lowast

1205942(2) = 355 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877paper

10038161003816100381610038161003816= 35 gtZ = 1436lowast

119875 lt 001 119873 = 18 10038161003816100381610038161003816119877user-initiated minus 119877paper

10038161003816100381610038161003816= 175 gtZ = 1436lowast

Building a community119873 = 18 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877use-initiated

10038161003816100381610038161003816= 6 ltZ = 1436

1205942(2) = 296 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877paper

10038161003816100381610038161003816= 24 gtZ = 1436lowast

119875 lt 001 119873 = 18 10038161003816100381610038161003816119877user-initiated minus 119877paper

10038161003816100381610038161003816= 30 gtZ = 1436lowast

Checking event schedules119873 = 18 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877use-initiated

10038161003816100381610038161003816= 165 gtZ = 1436lowast

1205942(2) = 345 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877paper

10038161003816100381610038161003816= 345 gtZ = 1436lowast

119875 lt 001 119873 = 18 10038161003816100381610038161003816119877user-initiated minus 119877paper

10038161003816100381610038161003816= 18 gtZ = 1436lowast

lowast

Indicates a significant result

The lab experiment did not allow users to feel relaxedduring the experimental procedure Participants acted morepolitely during the study and they pointed out that they wereuncomfortable about expressing negative feelings about theapplications In one example when interviewing a participantabout aspects of his user experience he generally stated thatit was fine However when presented with the EmotionCards(a group of cartoon faces that were used to help promotediscussion of affective aspects of interaction) he tended topick up one emotion card and talked a lot about concernsover the time and effort required to manually personalize theapplication without feeling that he was being overly critical

5 Discussion and Conclusion

In recent years there has been much debate on whethermobile applications should be evaluated in the field or in atraditional lab environment issues including usersrsquo behaviour[11] identification of usability problems [3] the experimentsettings [5 8 32] the communication with participants[33] This research enabled a comparison between fieldand laboratory experiments based on similar users mobileapplications and task-based scenarios In particular a similaruser-initiated prototype was used in each even though thefield experiment took place at a football competition and thelab experiment recreated an athletics meeting

The number of usability problems identified during boththe field and lab experiments was similar (over all partic-ipants when using the user-initiated prototype there were35 usability problems identified in the field setting and 42found in the lab setting) These findings are consistent withthose of Kjeldskov et al [3] They specifically compared laband field-based usability results and found that the differencein effectiveness of these two approaches was nonsignificant

in identifying most usability problems Also some contextrelated problems such as the font being too small to read inan open stadium were identified in both experiment settingsHowever some key differences in the effectiveness of the fieldand laboratory approaches were found the lab experimentidentified problems related to the detail of the interfacedesign for example the colours and icons on the interfacethe field experiment identified issues of validity and precisionof the data presented by the application when using theapplication in a stadium The field experiment also stressedthe problems of mobile ldquouserdquo rather than simply applicationusability and typically these problems were expressed in thelanguage of the situation [11]

An analysis of positive versus negative behaviours [11] wasundertakenThis data included verbal reports and rating scaledata according to the user experience definitions Acceptingthe limitations of a direct comparison participants reactedmore negatively in the laboratory setting when completingsimilar tasks (using the similar user-initiated personalisationapproach) In the field individuals were influenced by theatmosphere surrounding the sports event and this resulted inan enhanced user experience In addition they focused moreattention on the actual usage of personalisation on themobiledevice instead of issues to do with the interface The labsetting was less engaging than the field setting participantsweremore likely to be critical and in general they took longerto perform certain tasks by focusing (and commenting) oninterface issues such as fonts and colours used

The field experiment was more difficult to conduct thanthe lab experiment a point noted by many authors includingKjeldskov et al [3] and Baillie [8] Confounding factors werepresent for example variations in theweather and noise fromother spectators In addition although it was desirable thatparticipants engaged in the sporting action spectatorsrsquo foci ofattention could not be controlled and was difficult to predict

8 Advances in Human-Computer Interaction

Some participants were ldquodistractedrdquo from the experimentaltasks and this did not occur during the laboratory settingThe greater control possible with a laboratory study (asdiscussed by a range of authors including [2 5 8])was clearlydemonstrated during these studies

Where there is an interest in qualitative data goodcommunication between the researcher and participants isvital The field experiment provided a more open and relaxedatmosphere for discourse Users more freely discussed theiruse of the mobile applications their underlying beliefs andattitudes that arose during the study The field experimenthelped the communication tensions with the participants asthey felt they were not being directly examined As well asgenerally promoting the generation of qualitative data thefield experiments encouraged the expression of broader aswell as more contextually relevant views An example is theidentification of contextually dependent requirements whichoccurred much less frequently during the lab-based studies

Some suggestions for user impact assessment withmobiledevices can be made based on the findings of this study Alab experiment is recommended when the focus is on theuser interface and device-oriented usability issues In suchcases a well-designed lab study should provide the validityrequired while being easier quicker and cheaper to conductHowever the results suggest that a field experiment is bettersuited for investigating a wider range of factors affectingthe overall acceptability of mobile services including systemfunctions and impact of usage contexts Where open andrelaxed communication is important (eg where participantgroups are naturally reticent to communicate) this is morereadily promoted by the use of a field study

The natural tension between a deductive and inductiveresearch design was also apparent This research in generalwas essentially deductive since it set out to explain causalrelationships between variables operationalized conceptscontrolled variables and used structured and repeatablemethods to collect data However the field study in particularalso comprised an inductive element as there was a desire tounderstand the research context and the meanings attachedto events in order to help design the lab study Van Elzakkeret al [2] underline how it is often desirable to combineapproaches within the same study The undertaking of a fieldexperiment followed by a lab experiment was an attemptat multiplicity of methods from an essentially deductiveviewpoint This recognised that the natural research processis often that of moving from a process of understanding toone of testing whilst attempting to avoid the unsatisfactorymiddle ground of user evaluations that are divorced from anyunderlying research objectives

Appendix

Likert Items Used to Assess User Experience

In all cases participant responses were based on a six-pointscale ranging from 1 (strongly disagree) to 6 (strongly agree)

User Aspect

(1) I feel happy using [ABC] during the event

(2) My expectations regarding my spectator experiencein the stadium are met using [ABC]

(3) My needs as a spectator are taken into account using[ABC]

Social Aspect

(4) Using [ABC] helps me feel I am communicatingand sharing information with others in the stadium

(5) The [ABC] helps me create enjoyable experienceswithin the stadium

(6) The [ABC] helps me share my experiences withothers within the stadium

Usage Context Aspect

(7) The [ABC] provides me with help in the stadiumwhile watching the sporting action

(8) The [ABC] provides me with information aboutother spectators in the stadium

(9) The [ABC] helps provide me with a good physicaland social environment in the stadium

Culture Aspect

(10) The [ABC] helps me feel part of a group(11) The [ABC] helps me promote my group image(12) The [ABC] helps me interact with my group

Product Aspect

(13) The [ABC] is useful at the event(14) The [ABC] is easy to learn how to use(15) The [ABC] is easy to use

Acknowledgments

This work has been carried out as part of the Philips ResearchProgramme on Lifestyle The authors would like to thank allthe participants who were generous with their time duringthis study

References

[1] D Raptis N Tselios and N Avouris ldquoContext-based designof mobile applications for museums a survey of existingpracticesrdquo in Proceedings of the 7th International Conference onHuman Computer Interaction with Mobile Devices amp Services2005

[2] C P J M Van Elzakker I Delikostidis and P J M VanOosterom ldquoField-based usability evaluation methodology formobile geo-applicationsrdquo Cartographic Journal vol 45 no 2pp 139ndash149 2008

[3] J Kjeldskov M B Skov B S Als and R T Hoslashegh ldquoIs it worththe hassle Exploring the added value of evaluating the usabilityof context-aware mobile systems in the Fieldrdquo in Proceedingsof the 6th International Mobile Conference (HCI rsquo04) Springer2004

Advances in Human-Computer Interaction 9

[4] X Sun D Golightly J Cranwelly B Bedwell and S SharplesldquoParticipant experiences of mobile device-based diary studiesrdquoInternational Journal ofMobile HumanComputer Interaction Inpress

[5] J Kjeldskov and J Stage ldquoNew techniques for usability eval-uation of mobile systemsrdquo International Journal of HumanComputer Studies vol 60 no 5-6 pp 599ndash620 2004

[6] P Christensen M Romero M Thomas A S Nielsen and HHarder ldquoChildren mobility and space using GPS and mobilephone technologies in ethnographic researchrdquo Journal of MixedMethods Research vol 5 no 3 pp 227ndash246 2011

[7] E G Coleman ldquoEthnographic approaches to digital mediardquoAnnual Review of Anthropology vol 39 pp 487ndash505 2010

[8] L Baillie ldquoFuture Telecommunication exploring actual userdquoin Proceedings of the International Conference on Human-Computer Interaction IOS press 2003

[9] M Esbjornsson B Brown O Juhlin D Normark MOstergren and E Laurier ldquoWatching the cars go round andround designing for active spectatingrdquo in Proceedings of theConference on Human Factors in Computing Systems (CHI rsquo06)pp 1221ndash1224 New York NY USA April 2006

[10] X Sun ldquoUser requirements of personalized mobile applicationsat large sporting eventsrdquo in Proceedings of the IADIS Multi Con-ference on Computer Science and Information Systems(MCCSISrsquo10) Freiburg Germany 2010

[11] H B L Duh G C B Tan and V H H Chen ldquoUsabilityevaluation for mobile device a comparison of laboratory andfield testsrdquo in Proceedings of the 8th International Conference onHuman-Computer Interaction with Mobile Devices and Services(MobileHCI rsquo06) pp 181ndash186 September 2006

[12] A Pirhonen S Brewster and C Holguin ldquoGestural andaudio metaphors as a means of control for mobile devicesrdquo inProceedings of the Conference on Human Factors in ComputingSystems (CHI rsquo02) pp 291ndash298 NewYork NY USA April 2002

[13] R Graham and C Carter ldquoComparison of speech input andmanual control of in-car devices while on-the-moverdquo in Pro-ceedings of the 2nd Workshop on Human Computer Interactionwith Mobile Devices (HCI rsquo1999) Edinburgh UK 1999

[14] J Lai K Cheng P Green andO Tsimhoni ldquoOn the road and onthe web Comprehension of synthetic and human speech whiledrivingrdquo in Proceedings of the Conference on Human Factors inComputing Systems (CHI rsquo01) pp 206ndash212 New York NY USAApril 2001

[15] B Mobasher ldquoData mining for personalizationrdquo in The Adap-tive Web Methods and Strategies of Web Personalization PBrusilovsky A Kobsa and W Nejdl Eds pp 1ndash46 SpringerBerlin Germany 2007


Table 1: The overall user experience assessment for each task.

Checking match schedules. Friedman (3 related samples): χ²(2) = 31.3, P < 0.01. Multiple paired comparisons [23, page 180] (α = 0.05), N = 18:
|R_Personalized − R_Nonpersonalized| = 21.5 > Z = 14.36*
|R_Personalized − R_Paper| = 29.5 > Z = 14.36*
|R_Nonpersonalized − R_Paper| = 8 < Z = 14.36

Obtaining player information. Friedman (3 related samples): χ²(2) = 34, P < 0.01. Multiple paired comparisons (α = 0.05), N = 18:
|R_Personalized − R_Nonpersonalized| = 17 > Z = 14.36*
|R_Personalized − R_Paper| = 34 > Z = 14.36*
|R_Nonpersonalized − R_Paper| = 17 > Z = 14.36*

Reviewing match progress. Friedman (3 related samples): χ²(2) = 30.6, P < 0.01. Multiple paired comparisons (α = 0.05), N = 18:
|R_Personalized − R_Nonpersonalized| = 16.5 > Z = 14.36*
|R_Personalized − R_Paper| = 33 > Z = 14.36*
|R_Nonpersonalized − R_Paper| = 16.5 > Z = 14.36*

Building a community. Friedman (3 related samples): χ²(2) = 32.1, P < 0.01. Multiple paired comparisons (α = 0.05), N = 18:
|R_Personalized − R_Nonpersonalized| = 17 > Z = 14.36*
|R_Personalized − R_Paper| = 34 > Z = 14.36*
|R_Nonpersonalized − R_Paper| = 17 > Z = 14.36*

* Indicates a significant difference.
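As an aside for readers who want to reproduce this style of analysis, the short sketch below runs a Friedman test over three related samples and computes the Siegel and Castellan [23] critical difference for pairwise rank-sum comparisons; with N = 18, k = 3, and α = 0.05 this critical difference is approximately 14.36, matching the Z value in Tables 1 and 2. The ratings used here are random placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import friedmanchisquare, norm, rankdata

# Placeholder ratings: 18 participants x 3 conditions
# (personalized, nonpersonalized, paper); NOT the study's data.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 7, size=(18, 3)).astype(float)

# Friedman test over the three related samples (one column per condition).
chi2, p = friedmanchisquare(ratings[:, 0], ratings[:, 1], ratings[:, 2])

# Siegel & Castellan critical difference for pairwise rank-sum comparisons:
# |R_u - R_v| >= z * sqrt(N * k * (k + 1) / 6), with z taken at
# alpha / (k * (k - 1)) to correct for the k(k-1)/2 two-sided comparisons.
N, k, alpha = ratings.shape[0], ratings.shape[1], 0.05
z = norm.ppf(1 - alpha / (k * (k - 1)))
critical_diff = z * np.sqrt(N * k * (k + 1) / 6)   # about 14.36 for N=18, k=3

# Rank sums per condition (ranks computed within each participant).
rank_sums = np.array([rankdata(row) for row in ratings]).sum(axis=0)

print(f"chi2 = {chi2:.2f}, p = {p:.4f}, critical difference = {critical_diff:.2f}")
print("rank sums per condition:", rank_sums)
```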

The experiment recorded approximately 18 hours of video capturing the 18 subjects' interaction steps while completing each task. In summary, 95.5% of the tasks were completed successfully. For the 4.5% of unfinished tasks, 35 usability problems with the personalized mobile prototype were reported.

Participants in the field experiment stressed problems of mobile "use" rather than simply application "usability", and typically those problems were expressed in the language of the situation. For example, users were concerned about spending too much time personalizing the application during the event (detracting from the event itself) and the font on the interface being too small to read in an open stadium under bright sunlight. The field study also identified issues of validity and precision of the data presented by the application. For example, users were concerned about the reliability of information provided by the prototypes after they found that some player information presented on the mobile device did not match with the real events.

To the participants, the field experiment environment felt fairly informal, and the users talked freely about the use of the application and their feelings. Users expressed how the field experiment allowed them to feel relaxed and able to communicate with the researcher as they undertook the evaluation scenarios. Rather than focusing on interface issues, users generally expressed broader views and were able to give a wide range of evaluation-related information during the experiment, such as expressing contextually related requirements. A particular example of the kind of data generated during the field study was the need for the mobile content (and its delivery) to be highly integrated into the temporal flow of the sporting action.

Using a field experiment approach, it may be possible to obtain a higher level of "realism". However, this evaluation method is not easy to undertake. Experiments in the field are influenced by external factors such as the weather, and moreover it is more difficult to actually collect data from participants. Users were impacted by events happening in the field, such as noise and other disturbances. For example, some participants actually forgot that they were taking part in a research study: until prompted, they focused their attention on the competition happening in the stadium and were substantially distracted from the field experiment. Flexibility and pragmatism are needed to still collect useful data while not detracting unduly from the experience generated within the field setting.

4. Lab-Based Evaluation of Personalisation Approach

4.1. Aims. Whereas the previous section described a field-based study, this section describes a very similar lab study, this time centred around a multievent athletics meeting. The specific objectives of this study were to use a more controlled experimental setup to compare the user experience for a spectator at a large sports event under three conditions: (1) using paper-based (not mobile) content; (2) using a mobile prototype where personalisation parameters were set by the user; (3) using a similar prototype where parameters were set automatically. Similar procedures and data collection methods enable a comparison of findings with the field-based study described in Section 3.

4.2. Method

4.2.1. Setup. This evaluation took place in a usability laboratory in the UK. The usability lab was set up to resemble a part of the sports stadium. To mimic the divided attention that would result from a spectator watching a sporting event, sports footage including auditory output was projected onto the front wall of the laboratory. A crowd scene was replicated on each of the two side walls. In contrast to typical mobile applications such as tourist guides [31], a spectator is usually seated and relatively static at a sporting event. Therefore, mobility per se did not need to be incorporated into the experimental environment. A video camera was used to record users' interaction with the prototypes (Figure 5).


Figure 5: User study in a usability lab.

4.2.2. Experimental Scenarios. Five scenarios were developed based on the same key spectator activities as used in the previous field-based experiment. Four of the experimental tasks were the same as those employed in the field experiment. This lab-based study also employed a fifth task, which required the participant to select a suitable viewing angle for a mobile broadcast on the device. This enabled the participants to follow the sporting action from wherever they were in the stadium.

4.2.3. Prototypes. Two personalised mobile prototypes were developed using content filtering and collaborative filtering personalisation techniques. They shared the same look and feel as those used in the field study described in Section 3.

One prototype enabled user-initiated personalisation of content using an extended menu structure, as before (see Figure 2). As a result, this prototype presented event information, such as athlete details and event schedules, based on the users' settings.

In contrast, the prototype with system-initiated personalisation did not require that participants set personalisation parameters. Personalised content was then presented automatically to the participant using the content and collaborative filtering personalisation techniques.
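The paper does not include implementation detail for the prototypes' personalisation engines. Purely as an illustration of how a content-filtering score and a collaborative-filtering score could be blended to rank event content, consider the following sketch; all identifiers (content_score, collaborative_score, rank_items) and the data shapes are hypothetical and not taken from the prototypes.

```python
import numpy as np

def content_score(user_prefs, item_features):
    # Cosine similarity between a spectator's stated preferences and an
    # item's feature vector (e.g., favourite team, event type).
    num = np.dot(user_prefs, item_features)
    den = np.linalg.norm(user_prefs) * np.linalg.norm(item_features) or 1.0
    return num / den

def collaborative_score(ratings, user_idx, item_idx):
    # Mean rating given to this item by other spectators who rated it
    # (a deliberately simple stand-in for collaborative filtering).
    others = np.delete(ratings[:, item_idx], user_idx)
    rated = others[others > 0]
    return rated.mean() / ratings.max() if rated.size else 0.0

def rank_items(user_prefs, item_features, ratings, user_idx, w=0.5):
    # Blend the two signals and return item indices, best first.
    scores = [
        w * content_score(user_prefs, item_features[i])
        + (1 - w) * collaborative_score(ratings, user_idx, i)
        for i in range(item_features.shape[0])
    ]
    return np.argsort(scores)[::-1]
```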

As for the field experiment, a control condition was included that was representative of a nondigital event programme that a spectator would typically have. This was a paper leaflet that provided information on competition times and athlete information.

4.2.4. Participants. A different cohort of eighteen participants took part in the study, again split equally male-female, aged between 18 and 38, and with various occupations. As in the previous field-based study, all participants had regular experience of personalising mobile devices and had attended a large sports event within the last six months.

4.2.5. Procedure and Analysis of Data. A pilot study was used to maximise the realism of the simulation and to ensure that the data collection methods used during the field trial were effective within a laboratory setting. Two key changes were made: the speakers were rearranged to broadcast audience noise, and the side projection of a video of the audience was replaced with large-scale posters. This enhanced the perceived social atmosphere within the laboratory without unnecessarily distracting the participant from the main projected view (the sporting action).

The procedure followed that used for the field experiment described in Section 3. Each task was completed in turn, with the three personalisation conditions counterbalanced across participants within each task. The design of the lab experiment was also discussed with participants at the end of the study. The study lasted approximately one hour for each participant.
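The exact counterbalancing scheme is not spelled out in the text. One simple way to counterbalance three conditions across eighteen participants, shown only as an illustration, is to cycle through all six orderings so that each condition occupies each serial position equally often; the condition labels follow the study, while the code itself is hypothetical.

```python
from itertools import permutations

CONDITIONS = ["paper", "user-initiated", "system-initiated"]

def counterbalanced_orders(n_participants):
    # Cycle through all 3! = 6 orderings so that, over 18 participants,
    # each condition appears equally often in each serial position.
    orders = list(permutations(CONDITIONS))
    return [orders[i % len(orders)] for i in range(n_participants)]

for pid, order in enumerate(counterbalanced_orders(18), start=1):
    print(f"Participant {pid:2d}: {' -> '.join(order)}")
```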

As for the field study, the data comprised quantitative rating scale data, concurrent verbal reports, posttrial interview data, and observational data. These data were analysed as described previously.

4.3. Lab Study Results. Across all tasks there were clear advantages to having personalised content delivered over a mobile device. The control condition (representing that currently present in stadium environments) was significantly worse in all scenarios (see Table 2). Table 2 also indicates that, in terms of the impact on user experience, neither user- nor system-initiated personalisation emerged as a single best approach across the range of tasks studied.

The experiment recorded approximately 18 hours of video during the lab study. In summary, 92% of the tasks were completed successfully. For the 8% of unfinished tasks, 42 usability problems with the personalized mobile prototype were reported. In general, participants considered the user interface of both personalized mobile prototypes easy to use.

The laboratory study was relatively easy to undertake and to collect data from. During the lab study, participants quickly revealed considerable information about how a spectator uses the prototypes. The lab environment also offered more control over the conditions for the experiment. In comparison to the field study, participants were more focused on the experiment being undertaken and were not influenced by external factors such as weather, noise, or other disturbances from the sporting environment.

There were some drawbacks to the lab experiment, including only a limited representation of the real world and greater uncertainty over the degree of generalization of results outside laboratory settings. This laboratory study tried to "bring" the large sporting event into the experiment by carefully setting up the lab to resemble a stadium, designing scenarios based on previous studies of context within large sports events, and involving users who were familiar with the usage context. It also addressed the issue of users' divided attention by requiring subjects to watch a sport event video, which was projected on the front wall of the lab room, while performing the scenario-based tasks with the mobile prototypes. As a result, participants were able to identify some context-related problems (e.g., the font was too small to read in an open stadium) during the lab experiment. In addition, participants expressed their concerns with using the personalized prototypes from contextual and social perspectives, including concerns over spending too much time personalizing the device during the event and therefore actually missing some of the sporting action.


Table 2: The overall user experience assessment for each task.

Follow sporting action. Friedman (3 related samples): χ²(2) = 27.5, P < 0.01. Multiple paired comparisons [23, page 180] (α = 0.05), N = 18:
|R_system-initiated − R_user-initiated| = 15 > Z = 14.36*
|R_system-initiated − R_paper| = 49 > Z = 14.36*
|R_user-initiated − R_paper| = 34 > Z = 14.36*

Obtaining athlete information. Friedman (3 related samples): χ²(2) = 29.4, P < 0.01. Multiple paired comparisons (α = 0.05), N = 18:
|R_system-initiated − R_user-initiated| = 2 < Z = 14.36
|R_system-initiated − R_paper| = 16 > Z = 14.36*
|R_user-initiated − R_paper| = 18 > Z = 14.36*

Reviewing athletics event results. Friedman (3 related samples): χ²(2) = 35.5, P < 0.01. Multiple paired comparisons (α = 0.05), N = 18:
|R_system-initiated − R_user-initiated| = 17.5 > Z = 14.36*
|R_system-initiated − R_paper| = 35 > Z = 14.36*
|R_user-initiated − R_paper| = 17.5 > Z = 14.36*

Building a community. Friedman (3 related samples): χ²(2) = 29.6, P < 0.01. Multiple paired comparisons (α = 0.05), N = 18:
|R_system-initiated − R_user-initiated| = 6 < Z = 14.36
|R_system-initiated − R_paper| = 24 > Z = 14.36*
|R_user-initiated − R_paper| = 30 > Z = 14.36*

Checking event schedules. Friedman (3 related samples): χ²(2) = 34.5, P < 0.01. Multiple paired comparisons (α = 0.05), N = 18:
|R_system-initiated − R_user-initiated| = 16.5 > Z = 14.36*
|R_system-initiated − R_paper| = 34.5 > Z = 14.36*
|R_user-initiated − R_paper| = 18 > Z = 14.36*

* Indicates a significant result.

The lab experiment did not allow users to feel relaxed during the experimental procedure. Participants acted more politely during the study, and they pointed out that they were uncomfortable about expressing negative feelings about the applications. In one example, a participant interviewed about aspects of his user experience generally stated that it was fine. However, when presented with the Emotion Cards (a group of cartoon faces used to help promote discussion of affective aspects of interaction), he tended to pick up one emotion card and talked at length about concerns over the time and effort required to manually personalize the application, without feeling that he was being overly critical.

5. Discussion and Conclusion

In recent years there has been much debate on whether mobile applications should be evaluated in the field or in a traditional lab environment, with issues including users' behaviour [11], identification of usability problems [3], the experimental settings [5, 8, 32], and communication with participants [33]. This research enabled a comparison between field and laboratory experiments based on similar users, mobile applications, and task-based scenarios. In particular, a similar user-initiated prototype was used in each, even though the field experiment took place at a football competition and the lab experiment recreated an athletics meeting.

The number of usability problems identified during both the field and lab experiments was similar (over all participants, when using the user-initiated prototype, there were 35 usability problems identified in the field setting and 42 found in the lab setting). These findings are consistent with those of Kjeldskov et al. [3], who specifically compared lab- and field-based usability results and found that the difference in effectiveness of these two approaches was nonsignificant in identifying most usability problems. Also, some context-related problems, such as the font being too small to read in an open stadium, were identified in both experimental settings. However, some key differences in the effectiveness of the field and laboratory approaches were found: the lab experiment identified problems related to the detail of the interface design, for example, the colours and icons on the interface; the field experiment identified issues of validity and precision of the data presented by the application when using the application in a stadium. The field experiment also stressed the problems of mobile "use" rather than simply application usability, and typically these problems were expressed in the language of the situation [11].

An analysis of positive versus negative behaviours [11] was undertaken. This data included verbal reports and rating scale data, coded according to the user experience definitions. Accepting the limitations of a direct comparison, participants reacted more negatively in the laboratory setting when completing similar tasks (using the similar user-initiated personalisation approach). In the field, individuals were influenced by the atmosphere surrounding the sports event, and this resulted in an enhanced user experience. In addition, they focused more attention on the actual usage of personalisation on the mobile device instead of issues to do with the interface. The lab setting was less engaging than the field setting: participants were more likely to be critical, and in general they took longer to perform certain tasks by focusing (and commenting) on interface issues such as the fonts and colours used.
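For illustration only (the published paper does not give its coding scheme), a tally of positively and negatively coded remarks per setting could be produced along the following lines; the example remarks, labels, and function name are hypothetical.

```python
from collections import Counter

# Hypothetical coded verbal reports: (setting, polarity) pairs.
coded_remarks = [
    ("field", "positive"), ("field", "positive"), ("field", "negative"),
    ("lab", "negative"), ("lab", "negative"), ("lab", "positive"),
]

def polarity_by_setting(remarks):
    # Count positive and negative remarks for each evaluation setting.
    counts = Counter(remarks)
    settings = {s for s, _ in remarks}
    return {
        s: {
            "positive": counts[(s, "positive")],
            "negative": counts[(s, "negative")],
        }
        for s in settings
    }

print(polarity_by_setting(coded_remarks))
```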

The field experiment was more difficult to conduct than the lab experiment, a point noted by many authors including Kjeldskov et al. [3] and Baillie [8]. Confounding factors were present, for example, variations in the weather and noise from other spectators. In addition, although it was desirable that participants engaged in the sporting action, spectators' foci of attention could not be controlled and were difficult to predict.


Some participants were "distracted" from the experimental tasks, and this did not occur in the laboratory setting. The greater control possible with a laboratory study (as discussed by a range of authors, including [2, 5, 8]) was clearly demonstrated during these studies.

Where there is an interest in qualitative data, good communication between the researcher and participants is vital. The field experiment provided a more open and relaxed atmosphere for discourse. Users more freely discussed their use of the mobile applications and the underlying beliefs and attitudes that arose during the study. The field experiment also eased communication tensions with the participants, as they felt they were not being directly examined. As well as generally promoting the generation of qualitative data, the field experiment encouraged the expression of broader as well as more contextually relevant views. An example is the identification of contextually dependent requirements, which occurred much less frequently during the lab-based studies.

Some suggestions for user impact assessment with mobile devices can be made based on the findings of this study. A lab experiment is recommended when the focus is on the user interface and device-oriented usability issues. In such cases, a well-designed lab study should provide the validity required while being easier, quicker, and cheaper to conduct. However, the results suggest that a field experiment is better suited for investigating a wider range of factors affecting the overall acceptability of mobile services, including system functions and the impact of usage contexts. Where open and relaxed communication is important (e.g., where participant groups are naturally reticent to communicate), this is more readily promoted by the use of a field study.

The natural tension between a deductive and an inductive research design was also apparent. This research was, in general, essentially deductive, since it set out to explain causal relationships between variables, operationalized concepts, controlled variables, and used structured and repeatable methods to collect data. However, the field study in particular also comprised an inductive element, as there was a desire to understand the research context and the meanings attached to events in order to help design the lab study. Van Elzakker et al. [2] underline how it is often desirable to combine approaches within the same study. The undertaking of a field experiment followed by a lab experiment was an attempt at a multiplicity of methods from an essentially deductive viewpoint. This recognised that the natural research process often moves from a process of understanding to one of testing, whilst attempting to avoid the unsatisfactory middle ground of user evaluations that are divorced from any underlying research objectives.

Appendix

Likert Items Used to Assess User Experience

In all cases, participant responses were based on a six-point scale ranging from 1 (strongly disagree) to 6 (strongly agree).

User Aspect

(1) I feel happy using [ABC] during the event.

(2) My expectations regarding my spectator experience in the stadium are met using [ABC].

(3) My needs as a spectator are taken into account using [ABC].

Social Aspect

(4) Using [ABC] helps me feel I am communicating and sharing information with others in the stadium.

(5) The [ABC] helps me create enjoyable experiences within the stadium.

(6) The [ABC] helps me share my experiences with others within the stadium.

Usage Context Aspect

(7) The [ABC] provides me with help in the stadium while watching the sporting action.

(8) The [ABC] provides me with information about other spectators in the stadium.

(9) The [ABC] helps provide me with a good physical and social environment in the stadium.

Culture Aspect

(10) The [ABC] helps me feel part of a group.

(11) The [ABC] helps me promote my group image.

(12) The [ABC] helps me interact with my group.

Product Aspect

(13) The [ABC] is useful at the event.

(14) The [ABC] is easy to learn how to use.

(15) The [ABC] is easy to use.
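As an illustrative aid only (not part of the original instrument), the sketch below shows how responses to these fifteen six-point items could be aggregated into the five aspect scores used above; the item-to-aspect grouping follows the numbering in this appendix, while the function name, variable names, and example data are hypothetical.

```python
# Map the 15 Likert items (rated 1-6) onto the five user experience aspects.
ASPECTS = {
    "user": [1, 2, 3],
    "social": [4, 5, 6],
    "usage_context": [7, 8, 9],
    "culture": [10, 11, 12],
    "product": [13, 14, 15],
}

def aspect_scores(responses):
    """responses: dict mapping item number (1-15) to a rating in 1..6."""
    scores = {}
    for aspect, items in ASPECTS.items():
        ratings = [responses[i] for i in items]
        scores[aspect] = sum(ratings) / len(ratings)  # mean rating per aspect
    return scores

# Example: one participant's hypothetical ratings for a single condition.
example = {i: r for i, r in enumerate([5, 4, 5, 3, 4, 4, 5, 2, 3, 4, 3, 4, 6, 5, 5], start=1)}
print(aspect_scores(example))
```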

Acknowledgments

This work has been carried out as part of the Philips Research Programme on Lifestyle. The authors would like to thank all the participants who were generous with their time during this study.

References

[1] D. Raptis, N. Tselios, and N. Avouris, "Context-based design of mobile applications for museums: a survey of existing practices," in Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices & Services, 2005.

[2] C. P. J. M. Van Elzakker, I. Delikostidis, and P. J. M. Van Oosterom, "Field-based usability evaluation methodology for mobile geo-applications," Cartographic Journal, vol. 45, no. 2, pp. 139–149, 2008.

[3] J. Kjeldskov, M. B. Skov, B. S. Als, and R. T. Høegh, "Is it worth the hassle? Exploring the added value of evaluating the usability of context-aware mobile systems in the field," in Proceedings of the 6th International Mobile HCI Conference (MobileHCI '04), Springer, 2004.


[4] X. Sun, D. Golightly, J. Cranwell, B. Bedwell, and S. Sharples, "Participant experiences of mobile device-based diary studies," International Journal of Mobile Human Computer Interaction, in press.

[5] J. Kjeldskov and J. Stage, "New techniques for usability evaluation of mobile systems," International Journal of Human Computer Studies, vol. 60, no. 5-6, pp. 599–620, 2004.

[6] P. Christensen, M. Romero, M. Thomas, A. S. Nielsen, and H. Harder, "Children, mobility, and space: using GPS and mobile phone technologies in ethnographic research," Journal of Mixed Methods Research, vol. 5, no. 3, pp. 227–246, 2011.

[7] E. G. Coleman, "Ethnographic approaches to digital media," Annual Review of Anthropology, vol. 39, pp. 487–505, 2010.

[8] L. Baillie, "Future telecommunication: exploring actual use," in Proceedings of the International Conference on Human-Computer Interaction, IOS Press, 2003.

[9] M. Esbjörnsson, B. Brown, O. Juhlin, D. Normark, M. Östergren, and E. Laurier, "Watching the cars go round and round: designing for active spectating," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '06), pp. 1221–1224, New York, NY, USA, April 2006.

[10] X. Sun, "User requirements of personalized mobile applications at large sporting events," in Proceedings of the IADIS Multi Conference on Computer Science and Information Systems (MCCSIS '10), Freiburg, Germany, 2010.

[11] H. B. L. Duh, G. C. B. Tan, and V. H. H. Chen, "Usability evaluation for mobile device: a comparison of laboratory and field tests," in Proceedings of the 8th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '06), pp. 181–186, September 2006.

[12] A. Pirhonen, S. Brewster, and C. Holguin, "Gestural and audio metaphors as a means of control for mobile devices," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '02), pp. 291–298, New York, NY, USA, April 2002.

[13] R. Graham and C. Carter, "Comparison of speech input and manual control of in-car devices while on-the-move," in Proceedings of the 2nd Workshop on Human Computer Interaction with Mobile Devices (HCI '99), Edinburgh, UK, 1999.

[14] J. Lai, K. Cheng, P. Green, and O. Tsimhoni, "On the road and on the web? Comprehension of synthetic and human speech while driving," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '01), pp. 206–212, New York, NY, USA, April 2001.

[15] B. Mobasher, "Data mining for personalization," in The Adaptive Web: Methods and Strategies of Web Personalization, P. Brusilovsky, A. Kobsa, and W. Nejdl, Eds., pp. 1–46, Springer, Berlin, Germany, 2007.

[16] D. Wu, I. Im, M. Tremaine, K. Instone, and M. Turoff, "A framework for classifying personalization schemes used on e-commerce websites," in Proceedings of the 36th Hawaii International Conference on System Sciences (HICSS '03), p. 222b, Maui, Hawaii, USA, 2003.

[17] L. Norros, E. Kaasinen, J. Plomp, and P. Rama, Human-Technology Interaction Research and Design: VTT Roadmap, VTT Research Notes 2220, Espoo, Finland, 2003.

[18] E. Frias-Martinez, S. Y. Chen, and X. Liu, "Evaluation of a personalized digital library based on cognitive styles: adaptivity versus adaptability," International Journal of Information Management, vol. 29, no. 1, pp. 48–56, 2009.

[19] J. Dewey, Art as Experience, Perigee, New York, NY, USA, 1980.

[20] J. Rasmussen, "Human factors in a dynamic information society: where are we heading?" Ergonomics, vol. 43, no. 7, pp. 869–879, 2000.

[21] L. Arhippainen and M. Tähti, "Empirical evaluation of user experience in two adaptive mobile application prototypes," in Proceedings of the 2nd International Conference on Mobile and Ubiquitous Multimedia, Luleå, Sweden, 2003.

[22] M. Hassenzahl and N. Tractinsky, "User experience - a research agenda," Behaviour and Information Technology, vol. 25, no. 2, pp. 91–97, 2006.

[23] S. Siegel and N. J. Castellan, Nonparametric Statistics for the Behavioral Sciences, McGraw-Hill, New York, NY, USA, 1988.

[24] E. Law, V. Roto, A. P. O. S. Vermeeren, J. Kort, and M. Hassenzahl, "Towards a shared definition of user experience," in Proceedings of the 28th Annual Conference on Human Factors in Computing Systems (CHI '08), pp. 2395–2398, Florence, Italy, April 2008.

[25] E. L. C. Law and P. van Schaik, "Modelling user experience - an agenda for research and practice," Interacting with Computers, vol. 22, no. 5, pp. 313–322, 2010.

[26] C. H. Lin, P. J. Sher, and H. Y. Shih, "Past progress and future directions in conceptualizing customer perceived value," International Journal of Service Industry Management, vol. 16, no. 4, pp. 318–336, 2005.

[27] X. Sun and A. May, "Mobile personalisation at large sports events - user experience and mobile device personalisation," in Human-Computer Interaction 2007, vol. 11, Springer, Berlin, Germany, 2007.

[28] X. Sun and A. May, "The role of spatial contextual factors in mobile personalization at large sports events," Personal and Ubiquitous Computing, vol. 13, no. 4, pp. 293–302, 2009.

[29] J. Hackos and J. Redish, User and Task Analysis for Interface Design, John Wiley & Sons, New York, NY, USA, 1998.

[30] A. Nilsson, U. Nulden, and D. Olsson, "Spectator information support: exploring the context of distributed events," in Proceedings of the International ACM SIGGROUP Conference on Supporting Group Work, 2004.

[31] B. Oertel, K. Steinmüller, and M. Kuom, "Mobile multimedia services for tourism," in Information and Communication Technologies in Tourism 2002, K. W. Wöber, A. J. Frew, and M. Hitz, Eds., pp. 265–274, Springer, New York, NY, USA, 2002.

[32] D. D. Salvucci, "Predicting the effects of in-car interfaces on driver behavior using a cognitive architecture," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '01), pp. 120–127, New York, NY, USA, April 2001.

[33] A. Kaikkonen, A. Kekäläinen, M. Cankar, T. Kallio, and A. Kankainen, "Usability testing of mobile applications: a comparison between laboratory and field testing," Journal of Usability Studies, vol. 1, pp. 4–16, 2005.

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 6: Research Article A Comparison of Field-Based and …downloads.hindawi.com/journals/ahci/2013/619767.pdfbusiness, philosophy, anthropology, cognitive science, social science, and other

6 Advances in Human-Computer Interaction

Figure 5 User study in a usability lab

422 Experimental Scenarios Five scenarios were devel-oped based on the same key spectator activities as used in theprevious field-based experiment Four of the experimentaltasks were the same as those employed in the field experi-ment This lab-based study also employed a fifth task whichrequired the participant to select a suitable viewing angle for amobile broadcast on the deviceThis enabled the participantsto follow the sporting action from wherever they were in thestadium

423 Prototypes Two personalised mobile prototypes weredeveloped using content filtering and collaborative filteringpersonalisation techniqueThey shared the same look and feelas those used in the field study described in Section 3

One prototype enabled user-initiated personalisation ofcontent using an extended menu structure as before (seeFigure 2) As a result this prototype presented event infor-mation such as athlete details and event schedules based onthe usersrsquo settings

In contrast the prototype with system-initiated personal-isation did not require that participants set personalisationparameters Personalised content was then presented auto-matically to the participant using the content and collabora-tive filtering personalisation technique

As for the field experiment a control condition wasincluded that was representative of a nondigital event pro-gramme that a spectator would typically have This was apaper leaflet that provided information on competition timesand athlete information

424 Participants A different cohort of eighteen partici-pants took part in the study again split equally male-femaleaged between 18 and 38 and with various occupations Asin the previous field-based study all participants had regularexperience of personalising mobile devices and had attendeda large sports event within the last six months

425 Procedure and Analysis of Data A pilot study wasused to maximise the realism of the simulation and ensurethat the data collection methods that were used during thefield trial were effective within a laboratory setting Two keychanges were made rearrangement of speakers to broadcastaudience noise the side projection of a video of the audiencewas replaced with large-scale posters This enhanced theperceived social atmosphere within the laboratory without

unnecessarily distracting the participant from the mainprojected view (the sporting action)

The procedure followed that used for the field experimentdescribed in Section 3 Each task was completed in turnwith the three personalisation conditions counterbalancedacross participants within each task The design of the labexperiment was also discussed with participants at the end ofthe study The study lasted approximately one hour for eachparticipant

As for field study the data comprised quantitative ratingscale data concurrent verbal reports and posttrial interviewdata and observational data This was analysed as describedpreviously

43 Lab Study Results Across all tasks there were clearadvantages to having personalised content delivered overa mobile device The control condition (representing thatpresent at current stadium environments) was significantlyworse in all scenarios (see Table 2) Table 2 also indicatesthat in terms of the impact on user experience neither usernor system-initiated personalisation emerged as a single bestapproach across the range of tasks studied

The experiment recorded approximately 18 hours of videoduring the lab study In summary 92 of the tasks werecompleted successfully For the 8 of unfinished tasks 42usability problems with the personalized mobile prototypewere reported In general participants considered the userinterface of both personalized mobile prototypes easy to use

The laboratory study was relatively easy to undertake andcollect data from During the lab study participants quicklyrevealed considerable information about how a spectator usesthe prototypes The lab environment also offered more con-trol over the conditions for the experiment In comparisonto the field study participants were more focused on theexperiment being undertaken and were not influenced byexternal factors such asweather noise or others disturbancesfrom the sporting environment

There were some drawbacks to the lab experimentincluding only a limited representation of the real worldand greater uncertainty over the degree of generalization ofresults outside laboratory settingsThis laboratory study triedto ldquobringrdquo the large sporting event into the experiment bycarefully setting up the lab to resemble a stadium designingscenarios based on previous studies of context within largesports events and involving users who were familiar with theusage context It also addressed the issue of usersrsquo dividedattention by requiring subjects to watch a sport event videowhich was projected on the front wall of the lab roomwhile performing the scenario-based tasks with the mobileprototypes As a result participants were able to identifysome context related problems (eg the font was too smallto read in an open stadium) during the lab experiment Inaddition participants expressed their concerns with usingthe personalized prototypes from contextual and social per-spectives including concerns over spending too much timepersonalizing the device during the event and thereforeactually missing some of the sporting action

Advances in Human-Computer Interaction 7

Table 2 The overall user experience assessment for each task

Task Friedman (3 related samples) Multiple paired comparisons [23 page 180] (120572 = 05)

Follow sporting action119873 = 18 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877use-initiated

10038161003816100381610038161003816= 15 gtZ = 1436lowast

1205942(2) = 275 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877paper

10038161003816100381610038161003816= 49 gtZ = 1436lowast

119875 lt 001 119873 = 18 10038161003816100381610038161003816119877user-initiated minus 119877paper

10038161003816100381610038161003816= 34 gtZ = 1436lowast

Obtaining athlete information119873 = 18 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877use-initiated

10038161003816100381610038161003816= 2 ltZ = 1436

1205942(2) = 294 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877paper

10038161003816100381610038161003816= 16 gtZ = 1436lowast

119875 lt 001 119873 = 18 10038161003816100381610038161003816119877user-initiated minus 119877paper

10038161003816100381610038161003816= 18 gtZ = 1436lowast

Reviewing athletics event results119873 = 18 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877use-initiated

10038161003816100381610038161003816= 175 gtZ = 1436lowast

1205942(2) = 355 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877paper

10038161003816100381610038161003816= 35 gtZ = 1436lowast

119875 lt 001 119873 = 18 10038161003816100381610038161003816119877user-initiated minus 119877paper

10038161003816100381610038161003816= 175 gtZ = 1436lowast

Building a community119873 = 18 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877use-initiated

10038161003816100381610038161003816= 6 ltZ = 1436

1205942(2) = 296 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877paper

10038161003816100381610038161003816= 24 gtZ = 1436lowast

119875 lt 001 119873 = 18 10038161003816100381610038161003816119877user-initiated minus 119877paper

10038161003816100381610038161003816= 30 gtZ = 1436lowast

Checking event schedules119873 = 18 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877use-initiated

10038161003816100381610038161003816= 165 gtZ = 1436lowast

1205942(2) = 345 119873 = 18 1003816100381610038161003816

1003816119877system-initiated minus 119877paper

10038161003816100381610038161003816= 345 gtZ = 1436lowast

119875 lt 001 119873 = 18 10038161003816100381610038161003816119877user-initiated minus 119877paper

10038161003816100381610038161003816= 18 gtZ = 1436lowast

lowast

Indicates a significant result

The lab experiment did not allow users to feel relaxedduring the experimental procedure Participants acted morepolitely during the study and they pointed out that they wereuncomfortable about expressing negative feelings about theapplications In one example when interviewing a participantabout aspects of his user experience he generally stated thatit was fine However when presented with the EmotionCards(a group of cartoon faces that were used to help promotediscussion of affective aspects of interaction) he tended topick up one emotion card and talked a lot about concernsover the time and effort required to manually personalize theapplication without feeling that he was being overly critical

5 Discussion and Conclusion

In recent years there has been much debate on whethermobile applications should be evaluated in the field or in atraditional lab environment issues including usersrsquo behaviour[11] identification of usability problems [3] the experimentsettings [5 8 32] the communication with participants[33] This research enabled a comparison between fieldand laboratory experiments based on similar users mobileapplications and task-based scenarios In particular a similaruser-initiated prototype was used in each even though thefield experiment took place at a football competition and thelab experiment recreated an athletics meeting

The number of usability problems identified during boththe field and lab experiments was similar (over all partic-ipants when using the user-initiated prototype there were35 usability problems identified in the field setting and 42found in the lab setting) These findings are consistent withthose of Kjeldskov et al [3] They specifically compared laband field-based usability results and found that the differencein effectiveness of these two approaches was nonsignificant

in identifying most usability problems Also some contextrelated problems such as the font being too small to read inan open stadium were identified in both experiment settingsHowever some key differences in the effectiveness of the fieldand laboratory approaches were found the lab experimentidentified problems related to the detail of the interfacedesign for example the colours and icons on the interfacethe field experiment identified issues of validity and precisionof the data presented by the application when using theapplication in a stadium The field experiment also stressedthe problems of mobile ldquouserdquo rather than simply applicationusability and typically these problems were expressed in thelanguage of the situation [11]

An analysis of positive versus negative behaviours [11] wasundertakenThis data included verbal reports and rating scaledata according to the user experience definitions Acceptingthe limitations of a direct comparison participants reactedmore negatively in the laboratory setting when completingsimilar tasks (using the similar user-initiated personalisationapproach) In the field individuals were influenced by theatmosphere surrounding the sports event and this resulted inan enhanced user experience In addition they focused moreattention on the actual usage of personalisation on themobiledevice instead of issues to do with the interface The labsetting was less engaging than the field setting participantsweremore likely to be critical and in general they took longerto perform certain tasks by focusing (and commenting) oninterface issues such as fonts and colours used

The field experiment was more difficult to conduct thanthe lab experiment a point noted by many authors includingKjeldskov et al [3] and Baillie [8] Confounding factors werepresent for example variations in theweather and noise fromother spectators In addition although it was desirable thatparticipants engaged in the sporting action spectatorsrsquo foci ofattention could not be controlled and was difficult to predict

8 Advances in Human-Computer Interaction

Some participants were ldquodistractedrdquo from the experimentaltasks and this did not occur during the laboratory settingThe greater control possible with a laboratory study (asdiscussed by a range of authors including [2 5 8])was clearlydemonstrated during these studies

Where there is an interest in qualitative data goodcommunication between the researcher and participants isvital The field experiment provided a more open and relaxedatmosphere for discourse Users more freely discussed theiruse of the mobile applications their underlying beliefs andattitudes that arose during the study The field experimenthelped the communication tensions with the participants asthey felt they were not being directly examined As well asgenerally promoting the generation of qualitative data thefield experiments encouraged the expression of broader aswell as more contextually relevant views An example is theidentification of contextually dependent requirements whichoccurred much less frequently during the lab-based studies

Some suggestions for user impact assessment withmobiledevices can be made based on the findings of this study Alab experiment is recommended when the focus is on theuser interface and device-oriented usability issues In suchcases a well-designed lab study should provide the validityrequired while being easier quicker and cheaper to conductHowever the results suggest that a field experiment is bettersuited for investigating a wider range of factors affectingthe overall acceptability of mobile services including systemfunctions and impact of usage contexts Where open andrelaxed communication is important (eg where participantgroups are naturally reticent to communicate) this is morereadily promoted by the use of a field study

The natural tension between a deductive and inductiveresearch design was also apparent This research in generalwas essentially deductive since it set out to explain causalrelationships between variables operationalized conceptscontrolled variables and used structured and repeatablemethods to collect data However the field study in particularalso comprised an inductive element as there was a desire tounderstand the research context and the meanings attachedto events in order to help design the lab study Van Elzakkeret al [2] underline how it is often desirable to combineapproaches within the same study The undertaking of a fieldexperiment followed by a lab experiment was an attemptat multiplicity of methods from an essentially deductiveviewpoint This recognised that the natural research processis often that of moving from a process of understanding toone of testing whilst attempting to avoid the unsatisfactorymiddle ground of user evaluations that are divorced from anyunderlying research objectives

Appendix

Likert Items Used to Assess User Experience

In all cases participant responses were based on a six-pointscale ranging from 1 (strongly disagree) to 6 (strongly agree)

User Aspect

(1) I feel happy using [ABC] during the event

(2) My expectations regarding my spectator experiencein the stadium are met using [ABC]

(3) My needs as a spectator are taken into account using[ABC]

Social Aspect

(4) Using [ABC] helps me feel I am communicatingand sharing information with others in the stadium

(5) The [ABC] helps me create enjoyable experienceswithin the stadium

(6) The [ABC] helps me share my experiences withothers within the stadium

Usage Context Aspect

(7) The [ABC] provides me with help in the stadiumwhile watching the sporting action

(8) The [ABC] provides me with information aboutother spectators in the stadium

(9) The [ABC] helps provide me with a good physicaland social environment in the stadium

Culture Aspect

(10) The [ABC] helps me feel part of a group(11) The [ABC] helps me promote my group image(12) The [ABC] helps me interact with my group

Product Aspect

(13) The [ABC] is useful at the event(14) The [ABC] is easy to learn how to use(15) The [ABC] is easy to use

Acknowledgments

This work has been carried out as part of the Philips ResearchProgramme on Lifestyle The authors would like to thank allthe participants who were generous with their time duringthis study

References

[1] D Raptis N Tselios and N Avouris ldquoContext-based designof mobile applications for museums a survey of existingpracticesrdquo in Proceedings of the 7th International Conference onHuman Computer Interaction with Mobile Devices amp Services2005

[2] C P J M Van Elzakker I Delikostidis and P J M VanOosterom ldquoField-based usability evaluation methodology formobile geo-applicationsrdquo Cartographic Journal vol 45 no 2pp 139ndash149 2008

[3] J Kjeldskov M B Skov B S Als and R T Hoslashegh ldquoIs it worththe hassle Exploring the added value of evaluating the usabilityof context-aware mobile systems in the Fieldrdquo in Proceedingsof the 6th International Mobile Conference (HCI rsquo04) Springer2004

Advances in Human-Computer Interaction 9

[4] X Sun D Golightly J Cranwelly B Bedwell and S SharplesldquoParticipant experiences of mobile device-based diary studiesrdquoInternational Journal ofMobile HumanComputer Interaction Inpress

[5] J Kjeldskov and J Stage ldquoNew techniques for usability eval-uation of mobile systemsrdquo International Journal of HumanComputer Studies vol 60 no 5-6 pp 599ndash620 2004

[6] P Christensen M Romero M Thomas A S Nielsen and HHarder ldquoChildren mobility and space using GPS and mobilephone technologies in ethnographic researchrdquo Journal of MixedMethods Research vol 5 no 3 pp 227ndash246 2011

[7] E G Coleman ldquoEthnographic approaches to digital mediardquoAnnual Review of Anthropology vol 39 pp 487ndash505 2010

[8] L Baillie ldquoFuture Telecommunication exploring actual userdquoin Proceedings of the International Conference on Human-Computer Interaction IOS press 2003

[9] M Esbjornsson B Brown O Juhlin D Normark MOstergren and E Laurier ldquoWatching the cars go round andround designing for active spectatingrdquo in Proceedings of theConference on Human Factors in Computing Systems (CHI rsquo06)pp 1221ndash1224 New York NY USA April 2006

[10] X Sun ldquoUser requirements of personalized mobile applicationsat large sporting eventsrdquo in Proceedings of the IADIS Multi Con-ference on Computer Science and Information Systems(MCCSISrsquo10) Freiburg Germany 2010

[11] H B L Duh G C B Tan and V H H Chen ldquoUsabilityevaluation for mobile device a comparison of laboratory andfield testsrdquo in Proceedings of the 8th International Conference onHuman-Computer Interaction with Mobile Devices and Services(MobileHCI rsquo06) pp 181ndash186 September 2006

[12] A Pirhonen S Brewster and C Holguin ldquoGestural andaudio metaphors as a means of control for mobile devicesrdquo inProceedings of the Conference on Human Factors in ComputingSystems (CHI rsquo02) pp 291ndash298 NewYork NY USA April 2002

[13] R Graham and C Carter ldquoComparison of speech input andmanual control of in-car devices while on-the-moverdquo in Pro-ceedings of the 2nd Workshop on Human Computer Interactionwith Mobile Devices (HCI rsquo1999) Edinburgh UK 1999

[14] J Lai K Cheng P Green andO Tsimhoni ldquoOn the road and onthe web Comprehension of synthetic and human speech whiledrivingrdquo in Proceedings of the Conference on Human Factors inComputing Systems (CHI rsquo01) pp 206ndash212 New York NY USAApril 2001

[15] B Mobasher ldquoData mining for personalizationrdquo in The Adap-tive Web Methods and Strategies of Web Personalization PBrusilovsky A Kobsa and W Nejdl Eds pp 1ndash46 SpringerBerlin Germany 2007


Table 2: The overall user experience assessment for each task.

Follow sporting action:
Friedman (3 related samples): N = 18, χ²(2) = 27.5, P < 0.01.
Multiple paired comparisons [23, page 180] (α = 0.05):
|R_system-initiated - R_user-initiated| = 15 > Z = 14.36*
|R_system-initiated - R_paper| = 49 > Z = 14.36*
|R_user-initiated - R_paper| = 34 > Z = 14.36*

Obtaining athlete information:
Friedman (3 related samples): N = 18, χ²(2) = 29.4, P < 0.01.
Multiple paired comparisons [23, page 180] (α = 0.05):
|R_system-initiated - R_user-initiated| = 2 < Z = 14.36
|R_system-initiated - R_paper| = 16 > Z = 14.36*
|R_user-initiated - R_paper| = 18 > Z = 14.36*

Reviewing athletics event results:
Friedman (3 related samples): N = 18, χ²(2) = 35.5, P < 0.01.
Multiple paired comparisons [23, page 180] (α = 0.05):
|R_system-initiated - R_user-initiated| = 17.5 > Z = 14.36*
|R_system-initiated - R_paper| = 35 > Z = 14.36*
|R_user-initiated - R_paper| = 17.5 > Z = 14.36*

Building a community:
Friedman (3 related samples): N = 18, χ²(2) = 29.6, P < 0.01.
Multiple paired comparisons [23, page 180] (α = 0.05):
|R_system-initiated - R_user-initiated| = 6 < Z = 14.36
|R_system-initiated - R_paper| = 24 > Z = 14.36*
|R_user-initiated - R_paper| = 30 > Z = 14.36*

Checking event schedules:
Friedman (3 related samples): N = 18, χ²(2) = 34.5, P < 0.01.
Multiple paired comparisons [23, page 180] (α = 0.05):
|R_system-initiated - R_user-initiated| = 16.5 > Z = 14.36*
|R_system-initiated - R_paper| = 34.5 > Z = 14.36*
|R_user-initiated - R_paper| = 18 > Z = 14.36*

* Indicates a significant result.
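The critical value Z = 14.36 used in Table 2 is consistent with the multiple-comparison rule for rank sums following a Friedman test given by Siegel and Castellan [23, page 180]. Assuming that procedure, with N = 18 participants, k = 3 personalisation conditions, and α = 0.05, the critical difference between two rank sums R_u and R_v works out as

\[
|R_u - R_v| \ge z_{\alpha/[k(k-1)]}\sqrt{\frac{Nk(k+1)}{6}}
= z_{0.05/6}\sqrt{\frac{18 \times 3 \times 4}{6}}
\approx 2.394 \times 6 \approx 14.36 .
\]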

The lab experiment did not allow users to feel relaxed during the experimental procedure. Participants acted more politely during the study, and they pointed out that they were uncomfortable about expressing negative feelings about the applications. In one example, when a participant was interviewed about aspects of his user experience, he simply stated that everything was fine. However, when presented with the Emotion Cards (a set of cartoon faces used to help prompt discussion of affective aspects of interaction), he picked up one card and spoke at length about his concerns over the time and effort required to manually personalise the application, without feeling that he was being overly critical.

5. Discussion and Conclusion

In recent years there has been much debate on whether mobile applications should be evaluated in the field or in a traditional lab environment, covering issues including users' behaviour [11], the identification of usability problems [3], the experiment settings [5, 8, 32], and the communication with participants [33]. This research enabled a comparison between field and laboratory experiments based on similar users, mobile applications, and task-based scenarios. In particular, a similar user-initiated prototype was used in each, even though the field experiment took place at a football competition and the lab experiment recreated an athletics meeting.

The number of usability problems identified during both the field and lab experiments was similar (over all participants, when using the user-initiated prototype, 35 usability problems were identified in the field setting and 42 in the lab setting). These findings are consistent with those of Kjeldskov et al. [3], who specifically compared lab- and field-based usability results and found that the difference in effectiveness of the two approaches was nonsignificant in identifying most usability problems. Also, some context-related problems, such as the font being too small to read in an open stadium, were identified in both experiment settings. However, some key differences in the effectiveness of the field and laboratory approaches were found: the lab experiment identified problems related to the detail of the interface design, for example, the colours and icons on the interface; the field experiment identified issues of validity and precision of the data presented by the application when it was used in a stadium. The field experiment also stressed the problems of mobile "use" rather than simply application usability, and typically these problems were expressed in the language of the situation [11].

An analysis of positive versus negative behaviours [11] was undertaken. These data included verbal reports and rating scale data organised according to the user experience definitions. Accepting the limitations of a direct comparison, participants reacted more negatively in the laboratory setting when completing similar tasks (using the similar user-initiated personalisation approach). In the field, individuals were influenced by the atmosphere surrounding the sports event, and this resulted in an enhanced user experience. In addition, they focused more attention on the actual usage of personalisation on the mobile device instead of issues to do with the interface. The lab setting was less engaging than the field setting: participants were more likely to be critical and, in general, they took longer to perform certain tasks because they focused (and commented) on interface issues such as the fonts and colours used.

The field experiment was more difficult to conduct than the lab experiment, a point noted by many authors including Kjeldskov et al. [3] and Baillie [8]. Confounding factors were present, for example, variations in the weather and noise from other spectators. In addition, although it was desirable that participants engaged in the sporting action, spectators' foci of attention could not be controlled and were difficult to predict. Some participants were "distracted" from the experimental tasks, and this did not occur in the laboratory setting. The greater control possible with a laboratory study (as discussed by a range of authors including [2, 5, 8]) was clearly demonstrated during these studies.

Where there is an interest in qualitative data, good communication between the researcher and participants is vital. The field experiment provided a more open and relaxed atmosphere for discourse: users more freely discussed their use of the mobile applications and the underlying beliefs and attitudes that arose during the study. The field experiment helped reduce communication tension with the participants, as they felt they were not being directly examined. As well as generally promoting the generation of qualitative data, the field experiments encouraged the expression of broader as well as more contextually relevant views. An example is the identification of contextually dependent requirements, which occurred much less frequently during the lab-based studies.

Some suggestions for user impact assessment with mobile devices can be made based on the findings of this study. A lab experiment is recommended when the focus is on the user interface and device-oriented usability issues. In such cases, a well-designed lab study should provide the validity required while being easier, quicker, and cheaper to conduct. However, the results suggest that a field experiment is better suited for investigating a wider range of factors affecting the overall acceptability of mobile services, including system functions and the impact of usage contexts. Where open and relaxed communication is important (e.g., where participant groups are naturally reticent to communicate), this is more readily promoted by the use of a field study.

The natural tension between a deductive and an inductive research design was also apparent. This research was essentially deductive, since it set out to explain causal relationships between variables, operationalized concepts, controlled variables, and used structured and repeatable methods to collect data. However, the field study in particular also comprised an inductive element, as there was a desire to understand the research context and the meanings attached to events in order to help design the lab study. Van Elzakker et al. [2] underline how it is often desirable to combine approaches within the same study. The undertaking of a field experiment followed by a lab experiment was an attempt at a multiplicity of methods from an essentially deductive viewpoint. This recognised that the natural research process is often one of moving from understanding to testing, whilst attempting to avoid the unsatisfactory middle ground of user evaluations that are divorced from any underlying research objectives.

Appendix

Likert Items Used to Assess User Experience

In all cases, participant responses were based on a six-point scale ranging from 1 (strongly disagree) to 6 (strongly agree). An illustrative analysis sketch is given after the item list.

User Aspect

(1) I feel happy using [ABC] during the event.

(2) My expectations regarding my spectator experience in the stadium are met using [ABC].

(3) My needs as a spectator are taken into account using [ABC].

Social Aspect

(4) Using [ABC] helps me feel I am communicating and sharing information with others in the stadium.

(5) The [ABC] helps me create enjoyable experiences within the stadium.

(6) The [ABC] helps me share my experiences with others within the stadium.

Usage Context Aspect

(7) The [ABC] provides me with help in the stadium while watching the sporting action.

(8) The [ABC] provides me with information about other spectators in the stadium.

(9) The [ABC] helps provide me with a good physical and social environment in the stadium.

Culture Aspect

(10) The [ABC] helps me feel part of a group.

(11) The [ABC] helps me promote my group image.

(12) The [ABC] helps me interact with my group.

Product Aspect

(13) The [ABC] is useful at the event.

(14) The [ABC] is easy to learn how to use.

(15) The [ABC] is easy to use.
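As an illustration only, the sketch below shows how ratings gathered on this six-point scale could be compared across the three conditions (system-initiated, user-initiated, and paper-based) using the Friedman test and the pairwise rank-sum rule reported in Table 2. The data layout, variable names, and randomly generated placeholder responses are assumptions made for the sketch; it is not the authors' analysis code.

# Illustrative sketch only: Friedman test plus the Siegel & Castellan [23]
# multiple-comparison rule used in Table 2. The (N x 3) layout and the
# randomly generated six-point responses are assumptions, not study data.
import numpy as np
from scipy import stats

N, k = 18, 3
rng = np.random.default_rng(0)
ratings = rng.integers(1, 7, size=(N, k))  # placeholder 1-6 Likert responses

# Friedman test across the three related samples (conditions).
chi2, p = stats.friedmanchisquare(*ratings.T)

# Rank the k conditions within each participant, then sum ranks per condition.
ranks = np.apply_along_axis(stats.rankdata, 1, ratings)
rank_sums = ranks.sum(axis=0)

# Critical difference for all pairwise rank-sum comparisons at alpha = 0.05
# (Siegel & Castellan, 1988, page 180): about 14.36 for N = 18, k = 3.
alpha = 0.05
z = stats.norm.ppf(1 - alpha / (k * (k - 1)))
critical = z * np.sqrt(N * k * (k + 1) / 6)

print(f"Friedman chi2(2) = {chi2:.2f}, p = {p:.4f}, critical difference = {critical:.2f}")
for u, v in [(0, 1), (0, 2), (1, 2)]:
    diff = abs(rank_sums[u] - rank_sums[v])
    verdict = "significant" if diff > critical else "not significant"
    print(f"|R_{u} - R_{v}| = {diff:.1f} -> {verdict}")

Running the sketch prints the Friedman statistic and, for each pair of conditions, whether the rank-sum difference exceeds the critical value of 14.36.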

Acknowledgments

This work has been carried out as part of the Philips Research Programme on Lifestyle. The authors would like to thank all the participants, who were generous with their time during this study.

References

[1] D. Raptis, N. Tselios, and N. Avouris, "Context-based design of mobile applications for museums: a survey of existing practices," in Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices & Services, 2005.

[2] C. P. J. M. Van Elzakker, I. Delikostidis, and P. J. M. Van Oosterom, "Field-based usability evaluation methodology for mobile geo-applications," Cartographic Journal, vol. 45, no. 2, pp. 139–149, 2008.

[3] J. Kjeldskov, M. B. Skov, B. S. Als, and R. T. Høegh, "Is it worth the hassle? Exploring the added value of evaluating the usability of context-aware mobile systems in the field," in Proceedings of the 6th International Mobile Conference (HCI '04), Springer, 2004.

[4] X. Sun, D. Golightly, J. Cranwell, B. Bedwell, and S. Sharples, "Participant experiences of mobile device-based diary studies," International Journal of Mobile Human Computer Interaction, in press.

[5] J. Kjeldskov and J. Stage, "New techniques for usability evaluation of mobile systems," International Journal of Human Computer Studies, vol. 60, no. 5-6, pp. 599–620, 2004.

[6] P. Christensen, M. Romero, M. Thomas, A. S. Nielsen, and H. Harder, "Children, mobility, and space: using GPS and mobile phone technologies in ethnographic research," Journal of Mixed Methods Research, vol. 5, no. 3, pp. 227–246, 2011.

[7] E. G. Coleman, "Ethnographic approaches to digital media," Annual Review of Anthropology, vol. 39, pp. 487–505, 2010.

[8] L. Baillie, "Future telecommunication: exploring actual use," in Proceedings of the International Conference on Human-Computer Interaction, IOS Press, 2003.

[9] M. Esbjörnsson, B. Brown, O. Juhlin, D. Normark, M. Östergren, and E. Laurier, "Watching the cars go round and round: designing for active spectating," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '06), pp. 1221–1224, New York, NY, USA, April 2006.

[10] X. Sun, "User requirements of personalized mobile applications at large sporting events," in Proceedings of the IADIS Multi Conference on Computer Science and Information Systems (MCCSIS '10), Freiburg, Germany, 2010.

[11] H. B. L. Duh, G. C. B. Tan, and V. H. H. Chen, "Usability evaluation for mobile device: a comparison of laboratory and field tests," in Proceedings of the 8th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '06), pp. 181–186, September 2006.

[12] A. Pirhonen, S. Brewster, and C. Holguin, "Gestural and audio metaphors as a means of control for mobile devices," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '02), pp. 291–298, New York, NY, USA, April 2002.

[13] R. Graham and C. Carter, "Comparison of speech input and manual control of in-car devices while on-the-move," in Proceedings of the 2nd Workshop on Human Computer Interaction with Mobile Devices (HCI '1999), Edinburgh, UK, 1999.

[14] J. Lai, K. Cheng, P. Green, and O. Tsimhoni, "On the road and on the web: comprehension of synthetic and human speech while driving," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '01), pp. 206–212, New York, NY, USA, April 2001.

[15] B. Mobasher, "Data mining for personalization," in The Adaptive Web: Methods and Strategies of Web Personalization, P. Brusilovsky, A. Kobsa, and W. Nejdl, Eds., pp. 1–46, Springer, Berlin, Germany, 2007.

[16] D. Wu, I. Im, M. Tremaine, K. Instone, and M. Turoff, "A framework for classifying personalization schemes used on e-commerce websites," in Proceedings of the 36th Hawaii International Conference on Systems Sciences (HICSS '03), p. 222b, Maui, Hawaii, USA, 2003.

[17] L. Norros, E. Kaasinen, J. Plomp, and P. Rama, Human-Technology Interaction Research and Design: VTT Roadmap, VTT Research Notes 2220, Espoo, Finland, 2003.

[18] E. Frias-Martinez, S. Y. Chen, and X. Liu, "Evaluation of a personalized digital library based on cognitive styles: adaptivity versus adaptability," International Journal of Information Management, vol. 29, no. 1, pp. 48–56, 2009.

[19] J. Dewey, Art as Experience, Perigee, New York, NY, USA, 1980.

[20] J. Rasmussen, "Human factors in a dynamic information society: where are we heading?" Ergonomics, vol. 43, no. 7, pp. 869–879, 2000.

[21] L. Arhippainen and M. Tahti, "Empirical evaluation of user experience in two adaptive mobile application prototypes," in Proceedings of the 2nd International Conference on Mobile and Ubiquitous Multimedia, Lulea, Sweden, 2003.

[22] M. Hassenzahl and N. Tractinsky, "User experience – a research agenda," Behaviour and Information Technology, vol. 25, no. 2, pp. 91–97, 2006.

[23] S. Siegel and N. J. Castellan, Nonparametric Statistics for the Behavioral Sciences, McGraw-Hill, New York, NY, USA, 1988.

[24] E. Law, V. Roto, A. P. O. S. Vermeeren, J. Kort, and M. Hassenzahl, "Towards a shared definition of user experience," in Proceedings of the 28th Annual Conference on Human Factors in Computing Systems (CHI '08), pp. 2395–2398, Florence, Italy, April 2008.

[25] E. L. C. Law and P. van Schaik, "Modelling user experience – an agenda for research and practice," Interacting with Computers, vol. 22, no. 5, pp. 313–322, 2010.

[26] C. H. Lin, P. J. Sher, and H. Y. Shih, "Past progress and future directions in conceptualizing customer perceived value," International Journal of Service Industry Management, vol. 16, no. 4, pp. 318–336, 2005.

[27] X. Sun and A. May, "Mobile personalisation at large sports events – user experience and mobile device personalisation," in Human-Computer Interaction 2007, vol. 11, Springer, Berlin, Germany, 2007.

[28] X. Sun and A. May, "The role of spatial contextual factors in mobile personalization at large sports events," Personal and Ubiquitous Computing, vol. 13, no. 4, pp. 293–302, 2009.

[29] J. Hackos and J. Redish, User and Task Analysis for Interface Design, John Wiley & Sons, New York, NY, USA, 1998.

[30] A. Nilsson, U. Nulden, and D. Olsson, "Spectator information support: exploring the context of distributed events," in Proceedings of the International ACM SIGGROUP Conference on Supporting Group Work, 2004.

[31] B. Oertel, K. Steinmuller, and M. Kuom, "Mobile multimedia services for tourism," in Information and Communication Technologies in Tourism 2002, K. W. Wober, A. J. Frew, and M. Hitz, Eds., pp. 265–274, Springer, New York, NY, USA, 2002.

[32] D. D. Salvucci, "Predicting the effects of in-car interfaces on driver behavior using a cognitive architecture," in Proceedings of the Conference on Human Factors in Computing Systems (CHI '01), pp. 120–127, New York, NY, USA, April 2001.

[33] A. Kaikkonen, A. Kekalainen, M. Cankar, T. Kallio, and A. Kankainen, "Usability testing of mobile applications: a comparison between laboratory and field testing," Journal of Usability Studies, vol. 1, pp. 4–16, 2005.
