
Chamberlain, Alan and Bødker, Mads and Hazzard, Adrian and McGookin, David and De Roure, David and Willcox, Pip and Papangelis, Konstantinos (2017) Audio technology and mobile human-computer interaction: from space and place, to social media, music, composition and creation. International Journal of Mobile Human Computer Interaction, 9 (4). pp. 25-40. ISSN 1942-3918

Access from the University of Nottingham repository: http://eprints.nottingham.ac.uk/45081/8/Audio-Technology-and-Mobile-Human-Computer-Interaction--From-Space-and-Place-to-Social-Media-Music-Composition-and-Creation.pdf

Copyright and reuse:

The Nottingham ePrints service makes this work by researchers of the University of Nottingham available open access under the following conditions.

This article is made available under the University of Nottingham End User licence and may be reused according to the conditions of the licence. For more details see: http://eprints.nottingham.ac.uk/end_user_agreement.pdf

A note on versions:

The version presented here may differ from the published version or from the version of record. If you wish to cite this item you are advised to consult the publisher’s version. Please see the repository url above for details on accessing the published version and note that access may require a subscription.

For more information, please contact [email protected]

DOI: 10.4018/IJMHCI.2017100103

International Journal of Mobile Human Computer Interaction, Volume 9 • Issue 4 • October-December 2017

Copyright © 2017, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.

Audio Technology and Mobile Human Computer Interaction: From Space and Place, to Social Media, Music, Composition and Creation

Alan Chamberlain, School of Computer Science, University of Nottingham, Nottingham, UK

Mads Bødker, Department of IT Management, Copenhagen Business School, Copenhagen, Denmark

Adrian Hazzard, School of Computer Science, University of Nottingham, Nottingham, UK

David McGookin, Department of Computer Science, Aalto University, Helsinki, Finland

David De Roure, Oxford e-Research Centre, University of Oxford, Oxford, UK

Pip Willcox, Bodleian Libraries, University of Oxford, Oxford, UK

Konstantinos Papangelis, Computer Science and Software Engineering, Xi’an Jiaotong-Liverpool University, Suzhou, China

ABSTRACT

Audio-based mobile technology is opening up a range of new interactive possibilities. This paper brings some of those possibilities to light by offering a range of perspectives based in this area. It is not only the technical systems that are developing; novel approaches to the design and understanding of audio-based mobile systems are also evolving, offering new perspectives on interaction and design and supporting the application of such systems in areas such as the humanities.

KEYWORDS

Design, Ethnography, HCI, Humanities, Interaction, Location, Mobile

1. INTRODUCTION

This article represents a series of research projects and approaches that expand upon an initial workshop, "Audio in Place", which took place in 2016 (Chamberlain et al., 2016). The key theme of the workshop was audio and its relationship to mobile HCI (Human-Computer Interaction). The approach of the workshop was not to restrict submissions to one particular area, but to encourage papers from a multiplicity of domains. The research presented in this paper aims to extend our understanding of audio-based mobile HCI, to provide a platform for discussion and debate, and to encourage approaches to understanding technology that are at the forefront of mobile HCI.

Mobile devices (in particular smartphones), as is often stated, are now commonplace, affordable and used by large numbers of people on a global scale. Their portability, adaptability and 'always on' capability mean that they are often the "go to" technology for developing, testing and researching geo-locative interactive systems. However, recently there has been a growth in even more affordable


technologies; systems such as the Arduino and BeagleBone are but two of the many small form-factor hardware systems available. Often these systems are adaptable, easily repurposed and can be used in conjunction with other existing technologies, making them an attractive proposition for people wanting to develop portable, exploratory audio-based systems. For a fuller exposition see McPherson et al. (2016) and Moreale et al. (2017).

However, the technology is only one part of the story. Being able to take technology out into the world opens up a plethora of interactive possibilities: different ways to understand and interact with one's surroundings, new ways to interact with each other, and the chance to develop different approaches to understanding such interactions. Of course, the work in this article primarily focuses upon audio interaction, but in doing so it has enabled people to offer a wide range of perspectives, ranging from autoethnographic methods for mobile system design (Eslambolchilar et al., 2016), through to location-based interaction, and design approaches that use the past as a way to design, re-imagine and develop technologies today (De Roure et al., 2016a).

2. RECORDING SOUNDS, MAKING MEANINGS

In 'Acoustic Territories' (2010), LaBelle makes a case for understanding sounds as part of a "charged spatiality" (xviii): a social and emotional topology implicated in the emergence of a sense (and politics) of place. Sounds, LaBelle suggests, are never private, but must be understood as a relational force that somehow weaves a space together while at the same time 'unhinging' it in unforeseen ways by association, forming personal memories and emotions but also networks of social connections, cultural meanings and so on. In this section, we suggest that in order to imagine and build mobile interactions around localized sound, we need methods that address the opportunities in this field and that are attentive to the possible emotional and subjective values inscribed in the consumption as well as the production of localized sound.

As such, it suggests that practices of sound recording on mobile devices (as well as the ensuing use of recorded sounds for the purposes of distributing, sharing or mapping) offer relatively untapped modes of engagement. Particularly as urban and rural architectures, infrastructures and mobile digital media become intertwined to form meaning and experiences (under the general motifs of ubiquitous computing, the Internet of Things, or ambient intelligence), we see a need for methods that can help in exploring some of the possible ways in which tangible places and 'intangible' content such as sound can interact. Beyond (or arguably before) any technical challenges associated with building accessible and easy-to-use systems that allow people to record and share sounds using a mobile device comes the question of what values such systems might fulfil or what purposes they might serve.

We suggest that autoethnographies can play a part in an effort to connect sound recording to mobile HCI. Specifically, we explore possible connections between digital media, space and 'meaning making', suggesting how autoethnographies (Bødker and Chamberlain, 2016) might help in discovering and articulating design opportunities in the merging of mobile digital devices and meaningful placemaking practices.

3. AUTOETHNOGRAPHY

Using an auto-ethnographically inspired design approach to locative media goes beyond the conventional "implications for design" (Button, 2000); it is more than a "scenic" study (Dourish, 2006) or an 'imposed' analytic. We suggest that it can be used as a method of imagination, a lens through which we can discover social as well as personal potentials in design. It can be associated with a self-design agenda that focuses on the potential of self-experience and articulation for the designer's work.

Autoethnographies are 'personal' accounts of cultural experiences. They rely on the ability of the ethnographer to reflectively connect an autobiographical account with broader cultural formations (Ellis, 2000). As autobiographical renderings, they are particularly open to 'vulnerable' narratives,


narratives that favour emotional and affective encounters with ourselves engaged in the world. As a method, we find that writing narratives in the style of an autoethnography is capable of teasing out significant and emotionally poignant stories and loose implications that are otherwise difficult to recount. They are ways of seeing ourselves engaged in a culture, and as such they can be ways of recognizing (and rendering) experiences that have particular emotional resonances or strong, embodied and affective impacts that traditional 'analytical' ethnographies (or indeed other methods) may neither be receptive to nor have access to.

Consider for instance how a workshop participant (M) recounts a particular sound that triggers a surge of nostalgia and visceral memory:

In my imagination the sounds affix themselves more readily to a broader childhood landscape of flat, overcast marshlands. Maybe driving through it, sitting in the backseat of my parents’ car, or to the repetitiveness of the wet landscape?

He wishes that the sound:

… could seep out of the ground there too, that other people might share in my experience of the short musical sequence in what has, for me, become the ‘proper’ setting. Perhaps the music is also meaningful to other people?

The imagined connection of a sound with a place is arguably private, but experiencing such connections between sound and place can be positively communal, a notion deliberated in Truax's work on what he calls 'acoustic communities' (Truax, 2001). Communal sounds "are usually acoustically rich and may even have musical value, and therefore they acquire their significance in the soundscape through their ability to make a strong imprint on the mind, an imprint that embodies the entire context of the community". M's rendering of the emotional effect of a sound seems private at first, but later points to the possibility of sounds as part of a shared process of making meaning.

Another narrative suggests how another workshop participant wanted to share a significant sonic experience with other people:

I know it’s odd but I’d like to share those kinds of things about the place that I live now. Just leave an audio trail, a story or a thought that people might come across in the ether. There’s a Celtic hill fort next to the town that we walk over and I’d like to leave things for other people that I know to respond to, I have plans to use the defenses of the fort, they are layered like giant steps designed by some ancient architect.

Sounds emplaced and localizable in the landscape, as imagined in the above example, might come to work as traces of an emergent heritage landscape, part of an ongoing engagement in remembering or making sense of place.

Autoethnography, as an example of a highly self-reflective and emotionally charged approach, is a method that seems more personally relevant than those typically associated with more system-based design approaches. Traditional methods are less sensitive to the ways that the fleeting sense of listening and the ensuing emotions, sense of relationships, belonging, memory and meaning come together. Our method relates to the way that we, personally, imagine the potential of what might fall under more technical terms such as "field recording" and "localized media" on mobile devices to represent and develop memory-scapes with respect to the way that we experience the world around us. In this way, the method supports a process of self-design where the gap between personal reflections, emotions or desires and design can be narrowed.


A number of tools of varying complexity exist to drive rapid experimentation with the concepts that emerge from autoethnographies. Using littleBits (www.littlebits.cc), small wearable and mobile prototypes that include sound and a variety of wireless sensors can be built. Raspberry Pi mini-computers (https://www.raspberrypi.org) can be used to build sound streaming devices (e.g. those often used for the Locus Sonus soundmap, see http://locusonus.org/soundmap/051/). Recho (http://www.recho.org), a commercial iPhone app, allows users to leave behind short sounds tied to a location. Percussa AudioCubes (https://www.percussa.com/what-are-audiocubes/) allow a user to tangibly interact with different sounds. We have used AudioCubes in workshops to facilitate discussions about sound, tangible interaction and mapping.

Further, we might supplement the standard autoethnographic medium (writing) with more focused renderings of the experience of field recording or consuming location-based sounds on mobile devices. This may point to further alternative ways in which users of mobile devices can 'tune in' to places and generate new collective and emergent meanings in place.

4. MUSICAL LANDSCAPES: FORMING RELATIONSHIPS BETWEEN THE PHYSICAL AND DIGITAL

'Locative media' originated in an effort to distinguish the emerging body of creative work undertaken with new mobile technologies from the commercial impetus of location-based services (Tuters 2004). There are numerous examples of locative media experiences that explore a range of digital interactions, such as: augmented reality, where digital media is overlaid onto the real world (Izadi et al. 2002; Vlahakis et al. 2002; Zimmermann and Lorenz 2005; Gildersleeve 2015); mediascapes, a term coined by HP Labs to describe site-specific locative experiences, as authored by their Mscape tool (Behrendt 2005; Reid et al. 2005); context-aware, often used to describe works where a range of contextual data is used other than purely positioning technologies (Yu et al. 2008; Elliott and Tomlinson 2006; Abowd et al. 1996); mixed-reality, to describe the combination of real-world activity with online activity (Benford and Giannachi 2011; Crabtree et al. 2004; Galani 2003; Flintham et al. 2003); and, specifically in relation to location-based games, the term pervasive games (Benford et al. 2003; Chamberlain et al. 2010; Waern, Montola, and Stenros 2009). We now turn our attention to locative media, with a specific focus on locative music.

'Mapping' constitutes the central function of most locative media experiences, "making data geographically specific or placing a digital object in space" (Hemment 2006, 350). The latter half of this quote acknowledges Frauke Behrendt's placed sound experiences, a class from her taxonomy of mobile music (Behrendt 2010). Placed sounds describe the triggering of digital audio at predefined locales, typically realised via GPS positioning and GPS-enabled mobile devices. Early adopters of these technological developments (i.e. Rueb 2002; Kirisits et al. 2008; Tanaka 2008; Blistein 2015; Collier 2015; Gaye et al. 2006; Bryan-Kinns et al. 2009; Reid et al. 2005; Chamberlain et al. 2010; Benford et al. 2003) were "asking what can be experienced now that could not be experienced before … a new understanding of the relation between physical and digital" (Hemment 2006). These practitioners noted that placing new, novel audio experiences into spaces functioned as a producer of content and also a stage upon which the content is delivered (Tanaka 2008).

In this section, we place a specific focus on forming relationships between physical space and the construction of digital musical soundtracks, proposing that the physical assembly of a chosen location, with all its naturally occurring and built artefacts, can serve as a structural blueprint from which a music composition can be created and placed. We draw upon a key example to illustrate this approach, as presented in the following.

We use an example of a large-scale musical soundtrack to accompany visitors at the Yorkshire Sculpture Park, North Yorkshire, in the United Kingdom, as charted in (Hazzard, Benford, and Burnett 2015; Hazzard et al. 2015). The Yorkshire Sculpture Park is a cultural heritage park set in the Yorkshire countryside. It is formed from a range of diverse and distinct landscaping, from formal gardens to


open parkland. Sculpture exhibits are distributed throughout the park and visitors are free to roam, explore and interact with them in their own time. The soundtrack was a GPS-driven experience using smartphones and headphones: it was composed and presented via a large number of musical cells and layers that could be triggered by GPS as visitors explored the park, creating one single continuous musical arrangement. Original music was used throughout, composed specifically for the experience. The intention of the soundtrack was similar to that of a film soundtrack, whose role is to provide a supportive accompaniment, to reflect the narrative rather than dominate it.

In order to compose music for such a large location, with indeterminate duration and non-linear arrangement (i.e. dependent on the route and walking pace of each visitor), a coherent method to structure the music was required. To achieve this we sought to establish relationships between the physical attributes of the park and key musical features in the soundtrack. Scrutinizing the park revealed the following key attributes: (1) sculpture exhibits placed in specific locales; (2) a patchwork of physical regions demarcated by transitions in landscaping (e.g. formal gardens to expanses of grassed areas to rolling fields to constrained paths); (3) the placement of walkways, hedgerows and trees; and (4) buildings. This three-dimensional assembly of interconnected physical attributes was used to directly structure the soundtrack's musical features, at both a high and a low level.

First, we describe the high-level approach, termed musical landscapes. One of the main attributes of the park was the patchwork of physical regions made distinct by different types of landscaping and the attributes that bounded them. In response, our soundtrack was formed from a series of musical sections, or movements, overlaid directly onto this patchwork structure. Thus, a distinct musical section was heard within the bounds of each physical region. Figure 1 marks the collection of regions, as illustrated by different blocks of colour that spanned the park. The white areas in Figure 1 represent the location of sculpture exhibits. In practice, the boundary lines of these regions acted as GPS trigger zones, so a visitor's transition across a boundary line is accompanied by a transition from the currently playing musical section to that of the new region. The rationale for the size and shape of these regions, alongside the placement of neighbouring boundary lines, can be described in four ways:

Figure 1. Musical landscapes - patchwork of regions with key centres marked


• Constrained boundaries: Drawn in reference to fixed physical structures that impede user access, i.e., walls, fences, hedgerows, buildings; areas that a user cannot cross;

• Implied boundaries: Drawn in reference to physical attributes, such as structures or objects, that can be traversed but imply a transition, i.e., gates, changes of landscape such as formal gardens into rural parkland or from grassed areas onto a concrete walkway;

• Negotiated boundaries: Boundaries not defined by any physical structure, but rather drawn in response, or negotiation, to another adjacent negotiated boundary;

• Unconstrained boundaries: Not drawn to any physical attribute or in relation to any other boundary.

The majority of the regions are formed by 'implied' boundary types, thus defined in response to existing physical attributes.
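The boundary-crossing mechanism outlined above (region outlines acting as GPS trigger zones that switch the active musical section) can be sketched in code. This is a minimal illustration rather than the system deployed in the park: the ray-casting test, the region names and the transition callback are our own assumptions.

```python
# Hypothetical sketch: musical regions as polygons of (lat, lon) vertices.
# A visitor crossing a boundary line triggers a transition between sections.

def point_in_polygon(point, polygon):
    """Ray-casting test: is the (lat, lon) point inside the polygon?"""
    lat, lon = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        lat_i, lon_i = polygon[i]
        lat_j, lon_j = polygon[j]
        if (lon_i > lon) != (lon_j > lon):
            crossing_lat = (lat_j - lat_i) * (lon - lon_i) / (lon_j - lon_i) + lat_i
            if lat < crossing_lat:
                inside = not inside
        j = i
    return inside

def current_region(position, regions):
    """Name of the region containing the position, or None."""
    for name, polygon in regions.items():
        if point_in_polygon(position, polygon):
            return name
    return None

class RegionTracker:
    """Fires the transition callback whenever the visitor changes region."""

    def __init__(self, regions, on_transition):
        self.regions = regions
        self.on_transition = on_transition  # e.g. crossfade to the new section
        self.active = None

    def update(self, position):
        region = current_region(position, self.regions)
        if region != self.active:
            self.on_transition(self.active, region)
            self.active = region
```

On each GPS fix, `update` is called; moving from one region to another yields a single `(old, new)` transition, mirroring the crossfade from the current musical section to that of the new region.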

From a musical perspective, a set of compositional techniques was then used to render each musical section distinct, especially from those in neighbouring regions. Each section was composed in a different but complementary key centre (see Figure 2), and new thematic ideas were employed within each region, alongside distinct musical textures (i.e. combinations of instrument sounds). This meant that as visitors walked over these physical transitions they were similarly accompanied by musical transitions.

With musical landscapes depicting our high-level musical structure (transitions between sections), we then considered a lower level of musical interaction that takes place within the regions themselves: specifically, how to accompany with music a visitor's trajectory into and out of interaction with sculpture exhibits. We termed this process musical trajectories. We drew inspiration from Fosh et al. (2013), whose 'trajectory of interaction' urges designers of such visitor experiences to contemplate four key phases of interaction with artefacts or points of interest, namely approach, arrive, engage and depart. In reply to this, we composed musical interactions for our regions that followed these four phases. Figure 2 updates our illustration to show a number of additional trigger zones placed within the majority of the regions. Typically, they fan out from the location of the sculpture exhibits to

Figure 2. Musical trajectories through regions


the outer boundary of a region. Each of these 'internal' trigger zones enacts a further adjustment to the music played in that region. The following describes our musical approach:

• Approach: As a visitor walks towards a sculpture they anticipate the forthcoming interaction; consequently, music was composed to foreshadow the moment of arrival by gradually increasing the musical intensity (i.e. crescendo) upon entry into each successive trigger zone;

• Arrive: This crescendo is then resolved at the moment of arrival (entry into a final trigger zone encompassing an exhibit). Resolution is typically enacted by a key change and a final push of musical dynamic;

• Engage: The moment of arrival then quickly gives way to a reduction of musical intensity and activity, to provide space for the visitor to interact with the exhibit without distraction;

• Depart: As visitors withdraw from the exhibit and head back out through the trigger zones, a different musical treatment is presented, one that maintains a low level of musical activity to prepare the visitor for the next approach phase encountered.
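The four phases above can be sketched as a small computation over concentric trigger zones that fan out from a sculpture. The zone radii, the phase-inference rule and the intensity curve below are illustrative assumptions, not the composition logic actually deployed.

```python
# Illustrative sketch of the approach/arrive/engage/depart phases using
# concentric trigger zones around an exhibit. Radii, phase inference and
# the intensity mapping are invented for the example.

def zone_index(distance_m, radii_m):
    """Index of the innermost zone containing the visitor (0 = exhibit zone)."""
    for i, r in enumerate(radii_m):
        if distance_m <= r:
            return i
    return None  # outside all trigger zones for this exhibit

def phase(prev_index, index):
    """Infer the musical phase from the current zone and direction of travel."""
    if index is None:
        return "outside"
    if index == 0:
        # Entering the innermost zone resolves the crescendo ("arrive");
        # remaining there gives way to low-intensity "engage" music.
        return "arrive" if prev_index not in (0, None) else "engage"
    if prev_index is None or index < prev_index:
        return "approach"  # moving inward: build the crescendo
    return "depart"        # moving outward: keep musical activity low

def intensity(index, n_zones):
    """Crescendo mapping: inner zones play at higher musical intensity."""
    if index is None:
        return 0.0
    return 1.0 - index / n_zones
```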

The techniques of musical landscapes and musical trajectories enabled the design, composition and authoring of a locative musical experience that exhibits close and intertwined relationships between the physical and the digital. The flow through the physical park is synchronised with the course of the accompanying soundtrack, at both a high and a low level of musical detail. This approach enabled us to consider the physical form of the park in a new light, whilst gracefully guiding and integrating a coherent structure into the composition of a large-scale musical work. We have not entered into discussion regarding the experiential impact of music or audio when placed upon artefacts or locales. Rather, we explored how multiple fragments of digital audio can be composed, arranged and placed to reflect and complement the interconnectedness of attributes contained at a specific location. The footings of this approach are transferable to the design of locative experiences that deliver relationships between the physical and digital.

5. MOVING BEYOND LOCATION: SOCIAL MEDIA AND DATA AS CONTEXT IN MOBILE AUDIO SYSTEMS

Location-based audio experiences, where audio is geographically located within the world, are a common and prevalent technique for presenting audio in place. In its most basic form, a database of audio files is associated with a set of geographic coordinates, and a file is presented to a user when he or she enters an activation zone (usually within a few metres) of its geographic coordinates. Whilst other location and proximity sensing technologies have been used, this basic approach has been a common technique from the earliest work (Bederson, 1995) to amongst the latest (Ciolfi & McLoughlin, 2012). Significant technical advances have been made, moving from straightforward playback of audio (Bederson, 1995) to the use of virtual spatialized auditory environments that can present multiple overlapping sounds and vary the sounds presented based on the user's distance from the geographic source (Vazquez-Alvarez, Oakley, & Brewster, 2011). However, the audio itself is often fixed (existing as a pre-created file on the user's mobile device), manually curated and fixed to a geographic location. This manual creation means that users have a "canned" experience, where the same audio is triggered in the same physical location. This works well for many of the, usually cultural heritage, applications where location-based audio is employed to augment and enhance existing physical places. As Smith (2011) notes: "heritage is not the historic monument, archaeological site, or museum artefact, but rather the activities that occur at them". The embodied experience (Ciolfi, 2015) of location-based audio is well placed to fulfil this role, contextualising the built environment.
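The "basic form" described above, a database of audio clips keyed by coordinates and triggered inside an activation zone of a few metres, can be sketched as follows. The clip names, coordinates and the 10 m radius are illustrative assumptions.

```python
# Minimal sketch of coordinate-triggered audio: clips fire when the user's
# position falls within an activation radius of their geographic anchor.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def triggered_clips(user_lat, user_lon, clips, radius_m=10.0):
    """Return the clips whose activation zone contains the user's position."""
    return [name for name, (lat, lon) in clips.items()
            if haversine_m(user_lat, user_lon, lat, lon) <= radius_m]
```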

For example, Reid et al. (Reid, Geelhoed, Hull, Cater, & Clayton, 2005) evaluated a location-based audio play about riots that had taken place in a city square. Audio vignettes were triggered as users strolled around the square, being played in contextually relevant places (e.g. the sound of rioters


plundering buildings was played near the buildings they referred to). Such work assumes that the augmented experience is "visited", with the user interacting with the audio-augmented environment as their primary purpose, before switching it off and doing something else. There is also no assumption that the user would revisit the experience (in much the same way as he or she may only visit a museum exhibition once), given its static nature.

With increasing amounts of user-generated content being tagged with the location of its creation, and an increasing focus on community participation in the creation of cultural heritage (Han, Shih, Rosson, & Carroll, 2014), where users create geo-located content about the history of their local area, we argue it is both valuable and necessary to consider how location-based audio could act as a ubiquitous means to gain new insight into everyday environments. With content being dynamic, there is value in more continuous use, with the auditory environment becoming secondary to the user's main activity, allowing, for example, the user to gain an understanding of the 'heritage' of a place (Smith, 2011) as a by-product of another activity. For example, going for a jog could be a way to learn about what is going on in the community via the same sort of location-based audio system as studied by Reid et al. (2005).

Doing so, however, raises new issues and challenges. We briefly present two examples of our own work that are designed to be used in this way: PULSE (McGookin & Brewster, 2012a), an auditory display to present geotagged Twitter messages in situ; and Explore, a cultural heritage 'app' that incorporates seasonal (winter and summer) variations into its content. We outline how reflections on both highlight challenges for the dynamic audio display of content to help contextualise the environment.

5.1. Pulse

PULSE (McGookin & Brewster, 2012b) was designed to allow users to gain a 'vibe' - an intrinsic understanding of the people, places and activities around their current location - derived from the Twitter social media service (www.twitter.com). PULSE ran as an 'app' on an iPhone, with a passive user interaction: interaction was driven solely by a user's physical movement through their environment. As users moved, PULSE downloaded public Twitter messages ('tweets') generated by any user in the current location. Periodically, PULSE would select the closest tweet, synthesize it using a high-quality text-to-speech engine and insert it into a virtual 3D auditory environment. Thus users heard tweets as whispered conversations in passing. These audio tweets were augmented with metadata (such as the rate of message generation or trends in message topic) based on the immediate geographical area. These were presented to users either implicitly (such as mapping the time between spoken message presentations to the rate at which tweets were being created in the area) or explicitly (by mapping these onto an ambient water soundscape). Open earbuds were used to allow merging between the virtual and real auditory environments. We evaluated PULSE over a two-week period in Edinburgh, UK. A group of five local residents were asked to use each of the two versions of PULSE as they wished over a five-day period. In line with our earlier discussion, PULSE was used as a secondary activity, being integrated within other activities, rather than participants setting time aside to use it.
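Two of the mechanisms described above (selecting the geographically closest tweet to speak next, and implicitly mapping the local tweet rate to the pacing of spoken messages) might be sketched as below. The interval bounds are invented for illustration; PULSE's actual parameters are not given here.

```python
# Hedged sketch of two PULSE-style mechanisms: nearest-tweet selection and
# rate-to-pacing mapping. Bounds and field names are our own assumptions.

def presentation_interval(tweets_per_minute, min_s=5.0, max_s=60.0):
    """Busier areas give shorter gaps (seconds) between spoken tweets."""
    if tweets_per_minute <= 0:
        return max_s
    return max(min_s, min(max_s, 60.0 / tweets_per_minute))

def closest_tweet(user_pos, tweets):
    """Nearest geotagged tweet; a flat-earth squared distance is adequate
    over short urban distances."""
    def dist_sq(tweet):
        return (tweet["lat"] - user_pos[0]) ** 2 + (tweet["lon"] - user_pos[1]) ** 2
    return min(tweets, key=dist_sq) if tweets else None
```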

5.2. Explore

Explore (McGookin et al., 2017) runs as an Android 'app', designed to provide both triggered and ambient audio about cultural heritage on a visit to the Finnish recreational island of Seurasaari in Helsinki. Although the island is rich in cultural and natural heritage, people visit for leisure and recreation. Whilst it contains an open-air museum, this is only part of the island's history. Visitors do not visit solely to attend the open-air museum (which is only open a few days of the year); rather they visit to experience the place and its nature. Our goal is to help inform them of the history and expand their curiosity towards the island when visiting. Our work is based on existing audio-based cultural heritage (such as (Ciolfi & McLoughlin, 2012; McGookin, Vazquez-Alvarez, Brewster, & Bergstrom-Lehtovirta, 2012)), but unlike these, users "dip" in and out of content as it becomes available


and as a natural part of the experience of visiting the place. Explore presents standard notifications as a user comes near geo-located content (historical images, audio vignettes or videos). Explore also supports seasonality of content, as the temperature can vary from -20 to +20 degrees centigrade. Two sets of content, presenting a winter and a summer perspective, could be switched between by participants. A field study involving forty-five visitors to the island was conducted to evaluate Explore.

5.3. Future Challenges

Based on the results of both studies, we outline issues and challenges that should be further considered when applying location-based audio more widely.

5.3.1. The World Changes

In Explore, participants could switch between either the current season (summer) or out-of-season content (winter), whilst the users of PULSE heard tweets generated over the previous 24 hours. The temporal relationship between when the content was located and when it was heard was important. Whilst participants were open to viewing visual content in Explore that reflected the content out of season, such as the current environment in the winter, audio content, such as vignettes of historical individuals, jarred when heard out of season. Similarly, with PULSE, participants identified tweets they heard that didn't correlate with the current environment. For example, a tweet generated about going to the park because it was a sunny day, created when the environment was sunny, jarred with a participant hearing it in the park a few hours later when it was raining. Yet other messages with a similar gap between creation and hearing did not do this. As audio is always heard as an integration into the existing environment, rather than being clearly different to it (e.g. viewing visual content on a screen), it may be that participants are more sensitive to audio incongruence. How the dynamic elements of the environment (morning to evening, weather conditions etc.) affect the relevance of content, and how this should be incorporated into mobile audio work, is an underdeveloped area.

5.3.2. Filtering and Prioritising Content

In Explore, participants found it annoying when a high density of notifications was presented in quick succession. This was common in areas where there was a lot of content. In the work described by Han et al. (2014), it is likely that some places would have high levels of content, whilst others would be lower. How to prioritise and filter this content to avoid user annoyance is important. However, our work in PULSE shows that content viewed as mundane, or low priority, can be interesting because it is presented in situ via audio, with several instances of mundane, everyday messages highlighted because of this. Whilst not overloading the user is important, audio can change the nature of what a user may consider most insightful.

5.3.3. Considering Interaction as a Secondary Task

Both PULSE and Explore were used as ancillary to the user's main activity. In Explore, the previously mentioned notification density was highlighted as an issue, whilst with PULSE the ambient audio to represent metadata about the current area was either "tuned out" by participants or could distract them from their primary tasks (e.g. one participant noted how he left a supermarket without collecting his change). Pervasive and immersive audio experiences (e.g. (McGookin et al., 2012; Vazquez-Alvarez et al., 2011)), where interaction is the primary experience, do not work here: users will only interact for short periods of time. Whilst we used notifications, these did not help immerse users in the content. Work supporting these shorter interactions in a more immersive way is important. A potential solution might be to build on the work of Fosh et al. (Fosh, Benford, Reeves, Koleva, & Brundell, 2013) in considering trajectories to create short immersive experiences that transition quickly into and out of the audio content, without disruption of the primary activity.


5.3.4. The Concurrency of Places

Participants in PULSE were local, using it in familiar locations. However, in many cases it supported the discovery of new places and events in familiar environments; in particular, awareness of social activities or groups of people with interests divergent to their own: "You've got the 'God I've such a hangover this morning' ones. Which are a bit boring, but also interesting in a way, cause it reminds you of a social group of people". Another noted a message from a performer in the Edinburgh festival that was taking place during our study: "This guy who said it was two and a half hours till his show started. That was quite interesting. Seeing it from the other side." In this way, PULSE was able to illustrate how different communities (or 'places') could exist concurrently in the same space. User understanding of these familiar environments was changed and provoked, highlighting other communities of practice, rather than them being "perceptually hidden" due to familiarity. Our use of a passive auditory display allowed this to occur serendipitously, with participants 'stumbling' upon new insights during their regular activities, rather than having to explicitly consider looking for them.

6. EXPERIMENTING WITH THE PAST: FICTIONAL FUTURES AS MATERIAL FOR COMPOSITION AND DESIGN

We have conducted a series of experiments, under the banner of "numbers into notes", in which the mathematics of the early 19th century is used to generate music that is guided by human and machine interaction. Our approach plays at once into generative design and into alternative histories of algorithms and mechanisms. We describe a scenario in which devices placed in an interactive space blur the traditional boundaries of composer, performer and audience, and simultaneously question the creative boundaries of human and machine.

Our work is re-imagining prospective and theoretical technologies of Ada Lovelace's time today, inspiring new responses and discoveries relating to music practice and the philosophy and history of technology. By co-designing digital and digital-physical artefacts to interpret the writing and times of Lovelace and Babbage, we enable critical reflection and re-interpretation. We frame this approach as experimental humanities: through making, through prototyping and co-design, we close-read the thought processes Lovelace and Babbage recorded, and we can point to paths in the development of computing and programming that were not taken, digitally reaching beyond what was practicable in the nineteenth century.

6.1. Historical Context

Charles Babbage gave a seminar about his proposed steam-powered computer, the Analytical Engine, at the University of Turin in 1840. Originally translated into French for publication, in 1842 Ada Lovelace translated it into English and added extensive notes. Famously these notes contain the first algorithm designed for execution by a general-purpose computing machine, which is often celebrated as the first computer program. They also contain two important ideas which have been taken up in subsequent academic debate and inform our work:

• "Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent." (note A in Menabrea, 1843). This is often cited within the computers and music domain;

• "The Analytical Engine has no pretensions to originate anything. It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths. Its province is to assist us to making available what we are already acquainted with." (note G in Menabrea, 1843; Lovelace's italics). This is cited in debates around computers and creativity, in particular by Turing in the "imitation game" paper (Turing, 1950) and, in turn, Margaret Boden in her "Lovelace questions" (Boden, 1990).


6.2. Experiments

December 2015 saw the 200th anniversary of Lovelace's birth, and a major symposium was held to mark the occasion. This included the discussion of a thought experiment: had Lovelace lived longer, and had Babbage successfully built the Analytical Engine, what might have happened in pursuit of Lovelace's musical observation? (Howard & De Roure, 2015)

As part of this experiment we hypothesized a workflow involving the Analytical Engine: the machine runs a parameterized program to generate a number sequence, and parts of this sequence are then given to different instruments. We demonstrated this with an emulator and then developed an interactive Web Audio tool so that, without specialist knowledge, anyone can simply map integer sequences for musical expression. The interactive tool, a single page web application, provides several algorithms which illustrate the mathematics of the early 19th century (http://demeter.oerc.ox.ac.uk/NumbersIntoNotes). Inspired by the use of punched cards in the Jacquard loom and the proposed Analytical Engine, we generated virtual 'piano rolls' and a MIDI output that can be imported into other tools for editing or performance.
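The general shape of this workflow can be sketched in a few lines. The following is a minimal illustration rather than the Numbers into Notes tool's own code: it generates an integer sequence with mathematics that would have been familiar in the early 19th century (the Fibonacci recurrence), reduces it modulo 12, and maps the residues to MIDI note numbers; the function names and the choice of middle C (MIDI 60) as a base note are our assumptions for the example.

```python
# Sketch of the "numbers into notes" idea (not the original tool's code):
# generate an integer sequence, reduce it modulo 12, map residues to MIDI pitches.

def fibonacci(n):
    """Return the first n Fibonacci numbers: 1, 1, 2, 3, 5, ..."""
    seq, a, b = [], 1, 1
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b
    return seq

def to_pitches(sequence, base_note=60, modulus=12):
    """Map each integer to a MIDI note: base_note (middle C) plus its residue."""
    return [base_note + (x % modulus) for x in sequence]

notes = to_pitches(fibonacci(8))
print(notes)  # [61, 61, 62, 63, 65, 68, 61, 69]
```

Different slices of the same sequence, or different moduli, can then be handed to different instruments, which is the step the piano-roll and MIDI export support.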

The next part of our thought experiment was to ask "what would Lovelace do today?" The proposed Analytical Engine architecture was impressive even by today's standards; for example, 167 binary bits would be required today to achieve its 50 decimal digit arithmetic. However, precision aside, today the algorithms run readily (and in real time) on the processors of open source Arduino hardware, such as an 8-bit microcontroller, or 32-bit ARM.
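The 167-bit figure can be checked directly: representing 50 decimal digits requires ⌈50 · log₂ 10⌉ bits, since each decimal digit carries log₂ 10 ≈ 3.32 bits of information.

```python
# Bits needed to represent 50 decimal digits: ceil(50 * log2(10)).
import math

bits = math.ceil(50 * math.log2(10))
print(bits)  # 167
```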

Our first experiment was to build the Arduino Analytical Engine. With a slave audio processor (VS1053 [1]) and 3W amplifier to drive speakers, we replicated the Numbers into Notes tool as a small standalone "music engine".

Five of these devices were constructed, with a variety of sensors to influence the parameters of the algorithms: our prototypes used infrared remote controls and ultrasound proximity measures. These can be located in an interactive space. Here we exploit a musical outcome of our earlier work: multiple sequences played alongside each other generate harmonies, and this effect is richer when sequences are played at different but geometrically related tempos (e.g. 120, 60, 30 beats per minute). The resulting harmony sequences are experienced according to the location and orientation of the listener relative to the devices.
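Why geometrically related tempos work can be seen from the note onsets themselves. The sketch below (an illustration, not the devices' firmware) computes onset times at 120, 60 and 30 bpm: because the tempos form a 2:1 geometric series, every onset of a slower voice coincides exactly with an onset of each faster voice, so simultaneous pitches (harmonies) occur at regular, nested intervals.

```python
# Illustration: onset times (seconds) of voices at geometrically related tempos.
def onsets(bpm, seconds):
    """Onset times of a voice at the given tempo within the first `seconds`."""
    beat = 60.0 / bpm  # seconds per beat
    return [round(i * beat, 3) for i in range(int(seconds / beat) + 1)]

fast, mid, slow = onsets(120, 4), onsets(60, 4), onsets(30, 4)
# Every slow onset coincides with a mid onset, and every mid onset with a fast one.
print(set(slow) <= set(mid) <= set(fast))  # True
```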

The multiple deployed devices are effectively coupled by the humans situated in the space, and the resulting composition is emergent. We could approach that as humans entering the space and composing an "experience" for other humans who follow them, going in and interacting, creating the locative audio interactively and asynchronously. The experience is therefore socially and mathematically co-constituted. Now we can return to Lovelace's note G and ask whether the computer originates the music. Furthermore, in our demonstration at MobileHCI (De Roure & Willcox, 2016) we recognized that no space is free from signals, as we experienced the interactions of multiple audio and infrared devices in the same physical space.

What would happen with more Arduinos, and how might they be configured? To explore this we created a simulation using the NetLogo agent-based simulation environment. The simulation enables us to explore new design features and increasing scale. For example, the simulated Arduinos (turtles) can 'hear' each other and respond, there are multiple types of Arduino, and they can be programmed to move.
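The flavour of such an agent model can be conveyed with a toy sketch. This is our own illustration, not the authors' NetLogo model: each agent "hears" neighbours within a fixed radius and adopts the pitch of its nearest audible neighbour, so local interactions propagate musical material through the space while isolated agents are unaffected.

```python
# Toy agent model (illustration, not the NetLogo simulation): agents adopt the
# pitch of their nearest neighbour within hearing range; updates are synchronous.
import math

class Agent:
    def __init__(self, x, y, pitch):
        self.x, self.y, self.pitch = x, y, pitch

def step(agents, radius=2.0):
    """One synchronous update: each agent copies its nearest audible neighbour."""
    new_pitches = []
    for a in agents:
        heard = [b for b in agents if b is not a
                 and math.dist((a.x, a.y), (b.x, b.y)) <= radius]
        if heard:
            nearest = min(heard, key=lambda b: math.dist((a.x, a.y), (b.x, b.y)))
            new_pitches.append(nearest.pitch)
        else:
            new_pitches.append(a.pitch)  # out of earshot: unchanged
    for a, p in zip(agents, new_pitches):
        a.pitch = p

agents = [Agent(0, 0, 60), Agent(1, 0, 64), Agent(10, 10, 67)]
step(agents)
print([a.pitch for a in agents])  # [64, 60, 67]
```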

6.2.1. Closing Thoughts – Mobile Music-Boxes

The 1800s saw mechanical music machines of various descriptions, with the devices producing the music positioned in the home or public space. In the intervening years, we have moved towards making recordings of performances and replaying these; indeed, much of today's recording industry works on this basis. However, we are now able to create digital-physical simulations of music boxes and automata, and to explore how these might have developed.


7. CONCLUSION

This paper has presented a series of perspectives that focused upon mobile HCI and audio. Each of these perspectives, although different, has highlighted the breadth of research in this growing area, each with their own challenges and issues. As mobile technologies, communication infrastructure, and mobile audio input/output mechanisms develop, new opportunities become more apparent. As mobile HCI moves into 'new' territory it becomes apparent that what are needed are new approaches and methods that allow researchers to both understand and design interactive systems. One of the key lessons learnt from the workshop was not to underestimate the research areas and fields that are looking for ways to develop and understand new ways of using audio-based mobile technologies. With this in mind, the future of mobile interaction will offer an exciting range of opportunities across many fields of research.

ACKNOWLEDGMENT

This research (researchers Alan Chamberlain, Adrian Hazzard, David De Roure and Pip Willcox) was supported through the following EPSRC project: Fusing Semantic and Audio Technologies for Intelligent Music Production and Consumption (EP/L019981/1) and (by De Roure and Willcox) Transforming Musicology, funded by the UK Arts and Humanities Research Council (AHRC) under grant AH/L006820/1 in the Digital Transformations programme.


REFERENCES

Abowd, G. D., Atkeson, C. G., Hong, J., Long, S., Kooper, R., & Pinkerton, M. (1996). Cyberguide: A Mobile Context-Aware Tour Guide. Baltzer Journals, September 2.

Bederson, B. B. (1995). Audio augmented reality: a prototype automated tour guide. In Conference companion on Human factors in computing systems. New York, NY, USA: ACM. doi:10.1145/223355.223526

Behrendt, F. (2005). From Calling a Cloud to Finding the Missing Track: Artistic Approaches to Mobile Music. Proc. of 2nd International Workshop on Mobile Music Technology (Vol. 5).

Behrendt, F. (2010). Mobile Sound: Media Art in Hybrid Spaces [PhD Dissertation]. University of Sussex.

Benford, S., Anastasi, R., Flintham, M., Drozd, A., Crabtree, A., Greenhalgh, C., & Row-Farr, J. et al. (2003). Coping with Uncertainty in a Location-Based Game. IEEE Pervasive Computing, 2(3), 34–41. doi:10.1109/MPRV.2003.1228525

Benford, S., & Giannachi, G. (2011). Performing Mixed Reality. The MIT Press.

Blistein, J. (2015). Swedish Band Make Album Available Exclusively in the Woods. Rolling Stone, May 28.

Boden, M. A. (1990). The Creative Mind: Myths and Mechanisms. London: Weidenfield and Nicholson.

Bødker, M., & Chamberlain, A. (2016, June 12-15). Affect Theory and Autoethnography in Ordinary Information Systems. Proceedings of 2016 European Conference on Information Systems ECIS '16, Istanbul, Turkey.

Bryan-Kinns, N., Airantzis, D., Angus, A., Fencott, R., Lane, G., Lesage, F., Marshall, J., Martin, K., Roussos, G., & Taylor, J. (2009). Sensory Threads: Perceiving the Imperceptible. In Intelligent Environments (pp. 404–410).

Button, G. (2000). The ethnographic tradition and design. Design Studies, 21(4), 319–332. doi:10.1016/S0142-694X(00)00005-3

Chamberlain, A., Bødker, M., Hazzard, A., & Benford, S. (2016). Audio in Place: Media, Mobility and HCI - Creating Meaning in Space. Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI '16. ACM Press.

Chamberlain, A., Rowland, D., Foster, J., & Giannachi, G. (2010). Riders Have Spoken: Replaying and Archiving Pervasive Performances. Leonardo, 43(1), 90–91. doi:10.1162/leon.2010.43.1.90

Ciolfi, L. (2015). Embodiment and Place Experience in Heritage Technology Design. The International Handbooks of Museum Studies. doi:10.1002/9781118829059.wbihms319

Ciolfi, L., & McLoughlin, M. (2012). Designing for Meaningful Visitor Engagement at a Living History Museum. Proceedings of the 7th Nordic Conference on Human-Computer Interaction: Making Sense Through Design (pp. 69–78). New York, NY, USA: ACM. doi:10.1145/2399016.2399028

Collier, D. (2015). Bluebrain – Mobile Music. Retrieved from http://davidcollier.ie/bluebrain-listen-to-the-light/

Crabtree, A., Benford, S., Rodden, T., Greenhalgh, C., Flintham, M., Anastasi, R., & Tandavanitj, N. et al. (2004). Orchestrating a Mixed Reality Game 'on the Ground'. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 391–398). ACM.

De Roure, D., & Willcox, P. (2016). Numbers in places: creative interventions in musical space & time. Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct - MobileHCI '16 (pp. 1059–1063). ACM. doi:10.1145/2957265.2964199

De Roure, D., Willcox, P., & Chamberlain, A. (2016a). Experimental Digital Humanities: Creative interventions in algorithmic composition on a hypothetical mechanical computer. DMRN+11: Digital Music Research Network, December.

Dourish, P. (2006). Implications for design. Proceedings of CHI 2006. ACM Press.

Ellis, C. S., & Bochner, A. (2000). Autoethnography, Personal Narrative, Reflexivity: Researcher as Subject. In N. Denzin & Y. Lincoln (Eds.), The Handbook of Qualitative Research. Sage.


Eslambolchilar, P., Bødker, M., & Chamberlain, A. (2016). Ways of Walking: Understanding Walking's Implications for the Design of Handheld Technology via a Humanistic Ethnographic Approach. Human Technology: An Interdisciplinary Journal on Humans in ICT Environments.

Flintham, M., Benford, B., Anastasi, R., Hemmings, T., Crabtree, A., Greenhalgh, G., & Row-Farr, J. et al. (2003). Where on-Line Meets on the Streets: Experiences with Mobile Mixed Reality Games. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 569–576). ACM. doi:10.1145/642611.642710

Fosh, L., Benford, S., Reeves, S., Koleva, B., & Brundell, P. (2013). See Me, Feel Me, Touch Me, Hear Me: Trajectories and Interpretation in a Sculpture Garden. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 149–158). ACM. doi:10.1145/2470654.2470675

Galani, A. (2003). Mixed Reality Museum Visits: Using New Technologies to Support Co-Visiting for Local and Remote Visitors. Museological Review, 10.

Gaye, L., Holmquist, L. E., Behrendt, F., & Tanaka, A. (2006). Mobile Music Technology: Report on an Emerging Community. Proceedings of the 2006 Conference on New Interfaces for Musical Expression (pp. 22–25). IRCAM – Centre Pompidou.

Gildersleeve, Z. (2015). An Exploration of Mobile Augmented Reality. Retrieved from http://sww.zachgildersleeve.com/downloads/finalReport.pdf

Greg, E. T., & Tomlinson, B. (2006). Personal Soundtrack: Context-Aware Playlists That Adapt to User Pace. In CHI '06 Extended Abstracts on Human Factors in Computing Systems (pp. 736–741). ACM.

Han, K., Shih, P. C., Rosson, M. B., & Carroll, J. M. (2014). Enhancing Community Awareness of and Participation in Local Heritage with a Mobile Application. Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing (pp. 1144–1155). New York, NY, USA: ACM. doi:10.1145/2531602.2531640

Hazzard, A., Benford, S., & Burnett, G. (2015). Sculpting a Mobile Musical Soundtrack. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM.

Hazzard, A., Benford, S., Chamberlain, A., & Greenhalgh, C. (2015). Considering Musical Structure in Location-Based Experiences. Proceedings of NIME '15.

Hemment, D. (2006). Locative Arts. Leonardo, 39(4), 348–355. doi:10.1162/leon.2006.39.4.348

Howard, E., & De Roure, D. (2015). Turning numbers into notes. In Proceedings of the Ada Lovelace Symposium 2015 - Celebrating 200 Years of a Computer Visionary. ACM Press.

Izadi, S., Fraser, M., Benford, S., Flintham, M., Greenhalgh, C., Rodden, T., & Schnädelbach, H. (2002). Citywide: Supporting Interactive Digital Experiences across Physical Space. Personal and Ubiquitous Computing, 6(4), 290–298. doi:10.1007/s007790200030

Kirisits, N., Behrendt, F., Gaye, L., & Tanaka, A. (2008). Creative Interactions – The Mobile Music Workshops 2004-2008. Die Angewandte Press.

LaBelle, B. (2006). Background Noise: Perspectives on Sound Art. Continuum Books.

Maanen, J. (1988). Tales of the Field. Chicago: Chicago University Press.

McGookin, D., & Brewster, S. (2012a). PULSE: the design and evaluation of an auditory display to provide a social vibe. Proceedings of CHI 2012 (pp. 1263–1272). New York, NY, USA: ACM. doi:10.1145/2207676.2208580

McGookin, D., & Brewster, S. (2012b). PULSE: the design and evaluation of an auditory display to provide a social vibe. Proc. CHI 2012 (pp. 1263–1272). Austin, TX, USA: ACM. doi:10.1145/2207676.2208580

McGookin, D., Tahiro, K., Vaittinen, T., Kytö, M., Monastero, B., & Vasquez, J. C. (2017). Exploring Seasonality in Mobile Cultural Heritage. Proceedings of CHI '17. ACM Press.

McGookin, D., Vazquez-Alvarez, Y., Brewster, S., & Bergstrom-Lehtovirta, J. (2012). Shaking the dead: multimodal location based experiences for un-stewarded archaeological sites. In Proc. NordiCHI 2012 (pp. 199–208). New York, NY, USA: ACM. doi:10.1145/2399016.2399048


McPherson, A., Chamberlain, A., Hazzard, A., McGrath, S., & Benford, S. (2016, June 4-8). Designing for Exploratory Play with a Hackable Digital Musical Instrument. Proceedings of Designing Interactive Systems, DIS '16, Brisbane, Australia. ACM Press. doi:10.1145/2901790.2901831

Menabrea, L. F. (1843). Sketch of the Analytical Engine invented by Charles Babbage, Esq. In R. Taylor (Ed.), Scientific Memoirs (Vol. 3). London, UK: Richard and John E. Taylor.

Morreale, F., Moro, G., Chamberlain, A., Benford, S., & McPherson, A. (2017). Building a Maker Community Around an Open Hardware Platform. Proc. of CHI '17, Denver, USA.

Reid, J., Cater, K., Fleuriot, C., & Hull, R. (2005). Experience Design Guidelines for Creating Situated Mediascapes. Mobile and Media Systems Laboratory, HP Laboratories Bristol.

Reid, J., Geelhoed, E., Hull, R., Cater, C., & Clayton, B. (2005). Parallel worlds: immersion in location-based experiences. Proceedings of CHI '05 (pp. 1733–1736). Portland, USA: ACM.

Rueb, T. (2002). Sonic Space-Time: Sound Installation and Secondary Orality. Consciousness Reframed. Retrieved from http://locative.articule.net/wp-content/uploads/2013/06/TRuebSonicSpaceTimeSoundInstallationAndSecondaryOrality.pdf

Smith, L. (2011). The Doing of heritage: heritage as performance. In Performing Heritage: Research, Practice and Innovation in Museum Theatre and Live Interpretation (pp. 69–81).

Tanaka, A. (2008). Visceral Mobile Music Systems. In Transdisciplinary Digital Art. Sound, Vision and the New Screen (pp. 155–170). Springer. doi:10.1007/978-3-540-79486-8_15

Truax, B. (2001). Acoustic Communication (2nd ed.). Westport, CT: Greenwood Press.

Turing, A. M. (1950). Computing Machinery and Intelligence. Mind, 59(236), 433–460. doi:10.1093/mind/LIX.236.433

Tuters, M. (2004). The Locative Utopia. Transcultural Mapping Locative Reader.

Vazquez-Alvarez, Y., Oakley, I., & Brewster, S. (2011). Auditory display design for exploration in mobile audio-augmented reality. Personal and Ubiquitous Computing. doi:10.1007/s00779-011-0459-0

Vlahakis, V., Ioannidis, M., Karigiannis, J., Tsotros, M., Gounaris, M., Stricker, D., & Almeida, L. et al. (2002). Archeoguide: An Augmented Reality Guide for Archaeological Sites. IEEE Computer Graphics and Applications, 22(5), 52–60. doi:10.1109/MCG.2002.1028726

Waern, A., Montola, M., & Stenros, J. (2009). The Three-Sixty Illusion: Designing for Immersion in Pervasive Games. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1549–1558). ACM. doi:10.1145/1518701.1518939

Yu, Z., Zhou, X., Yu, Z., Hyuk Park, J., & Ma, J. (2008). iMuseum: A Scalable Context-Aware Intelligent Museum System. Computer Communications, 31(18), 4376–4382. doi:10.1016/j.comcom.2008.05.004

Zimmermann, A., & Lorenz, A. (2005). Creating Audio-augmented Environments. International Journal of Pervasive Computing and Communications, 1(1), 31–42. doi:10.1108/17427370580000111


Alan Chamberlain is a Senior Research Fellow in the Mixed Reality Lab, Computer Science at the University of Nottingham. He has published numerous papers on many aspects of Human Computer Interaction. His research interests relate to real-world problems as articulated by various communities of practice and relate to Human Computer Interaction, Ethnography, Action Research, Participatory Design, Digital Arts/Humanities and User Engagement.

Mads Bødker is Associate Professor at the Department of IT Management, Copenhagen Business School. His work combines human-computer interaction and user experience with ideas from the arts, philosophy, and anthropology.

Adrian Hazzard is a Research Fellow in the Mixed Reality Lab, School of Computer Science, University of Nottingham. Adrian has a background in music performance and composition. The main focus of his current research is in interactive musical experiences from both a compositional and experiential viewpoint.

David McGookin is an Assistant Professor at Aalto University Finland in Human-Computer Interaction. His work focuses on multi-sensory interaction in mobile and nomadic environments. In particular, he focuses on how individuals engage with geo-tagged media in their immediate environments through different senses.

David De Roure is Professor of e-Research and Director of the Oxford e-Research Centre. He has strategic responsibility for Digital Humanities at Oxford and directed the national Digital Social Research programme for ESRC, for whom he is now a strategic adviser. His personal research is in Computational Musicology, Web Science, and Internet of Things. He is a frequent speaker and writer on digital scholarship and the future of scholarly communications.

Pip Willcox is the Head of the Centre for Digital Scholarship at the Bodleian Libraries, University of Oxford, and a Senior Researcher at the Oxford e-Research Centre. She co-directs the Digital Humanities at Oxford Summer School and convenes its introductory workshop strand. With a background in textual editing and book history, her current work investigates the intersection between the material and the digital, exploring the experimental humanities.

Konstantinos Papangelis is a Lecturer at Xi'an Jiaotong-Liverpool University (XJTLU) and an honorary Lecturer at the University of Liverpool. His research addresses emerging problems and issues that revolve around socio-technical phenomena and focuses on the intersections between new media technologies and social practices. To this end, he conducts research that spans the areas of ethno(techno)methodology, interaction design, computer supported cooperative work, user experience, and user-centered design.