
Privacy, Security, Surveillance and Regulation

Charles D. Raab, Professorial Fellow, School of Social and Political Science, University of Edinburgh
Director of CRISP (Centre for Research into Information, Surveillance and Privacy)
Turing Fellow, Alan Turing Institute

Presented in the Professional Issues Course, School of Informatics, University of Edinburgh, 24 October 2017
© Charles D. Raab 2017

Outline of Lecture
• What is privacy?
• What is security?
• How are they related?
• What is surveillance?
• How does it relate to data science?
• What kinds of regulation?
• Privacy by design and default
• Data science and data
• Internet research
• Ethics, codes, and standards
• Bibliography

Individual Privacy
• Philosophy, social sciences, law: no single definition or conceptualisation
• Seven types (Finn et al., 2013): privacy of:
  – the person
  – behaviour and action
  – communication
  – data and image
  – thoughts and feelings
  – location and space
  – association
• Other types (Wright and Raab, 2014)
• Context-dependent (Nissenbaum, 2010)
• Conventional privacy paradigm: individualistic, classical liberal, rights-oriented only (Bennett and Raab, 2006)

Individual Privacy's Value
• Deontological (right/wrong action) and consequentialist (right/wrong consequences)
• Privacy is an individual right:
  – fundamental but not absolute (Raab, 2017)
  – 'Everyone has the right to respect for his or her private and family life, home and communications.' (Charter of Fundamental Rights of the EU, Article 7)
  – serves selfhood, autonomy, dignity, but also sociality
• Privacy's importance goes beyond that to the individual; a crucial underpinning of:
  – interpersonal relationships
  – society itself
  – the workings of a democratic political system
• When privacy is protected, the fabric of society, the functioning of political processes and the exercise of important freedoms are protected. When eroded, society and the polity are also harmed; privacy protection is both an individual and a public interest

Privacy and its Social Value (Regan, 1995)
• Common value: all have a common interest in the right to privacy but may differ on the specific content of their privacy or what they think sensitive
• Public value: privacy instrumentally valuable to a democratic political system, e.g., for freedom of speech and association, and for setting boundaries to the state's exercise of power
• Collective value: economistic conception of privacy's value as a collective, non-excludible good that cannot be divided and that cannot be efficiently provided by the market
• Many other writers on how privacy works in society and social relations (Goffman, many works; Westin, 1967; Altman, 1975; Solove, 2008; Schoeman, 1992; Bygrave, 2002; Goold, 2009; Steeves, 2009; Raab, 2014, 2012; …)
• Society, not just the individual, is better off when privacy exists
• Based on understanding privacy's importance for society, social and political relationships; not only for individual rights or values

But What About Security?
• Whatever 'privacy' means, it is not the only important value in policy-making, and not the only public-interest value
• Security is also a fundamental right: 'Everyone has the right to liberty and security of person.' (Charter of Fundamental Rights of the EU, Article 6)
• Security (national and other) seems now to be the over-riding value, facing terrorism, crime, many kinds of adverse event
• Does this inevitably lead to a (tendentious) 'privacy v. public interest/security/etc.' construct?
• What else can be said about the relationship between privacy and security? (discussed later)
But what is 'security'?

Some Definitions of 'Security'
• '[T]he condition (perceived or confirmed) of an individual, a community, an organisation, a societal institution, a state, and their assets (such as goods, infrastructure), to be protected against danger or threats such as criminal activity, terrorism or other deliberate or hostile acts, disasters (natural and man-made).' (adopted by CEN BT/WG 161 on Protection and Security of the Citizen, January 2005; cited in Martí Sempere, 2010: 6)
• '[A] fundamental good without which societies cannot prosper.' (Martí Sempere, 2010: 2; emphasis in original)
• 'The concept of security has for too long been interpreted narrowly: as security of territory from external aggression, or as protection of national interests in foreign policy or as global security from the threat of a nuclear holocaust. It has been related more to nation-states than to people…. Forgotten were the legitimate concerns of ordinary people who sought security in their daily lives. For many of them, security symbolized protection from the threat of disease, hunger, unemployment, crime, social conflict, political repression and environmental hazards… In the final analysis, human security is a child who did not die, a disease that did not spread, a job that was not cut, an ethnic tension that did not explode in violence, a dissident who was not silenced. Human security is not a concern with weapons – it is a concern with human life and dignity. … Human security is people-centred.' (UNDP, Human Development Report 1994: 22-23)

‘Security’inTechnicalDiscourse•  ‘Computersecurity,alsoknownascybersecurityorITsecurity,istheprotec<onof

computersystemsfromthethesanddamagetotheirhardware,sosware,orinforma<on,aswellasfromdisrup<onormisdirec<onoftheservicestheyprovide’(Wikipedia,quo<ngGasser1988,p.3)

•  Thisrefersonlytoonemeaningof‘security’•  Thisrefersonlytoonesourceofprivacyviola<on•  Dataprotec<on(informa<onprivacy)principlesincludethiskindofsecurity:

‘[Personaldatashallbe](f)processedinamannerthatensuresappropriatesecurityofthepersonaldata,includingprotec<onagainstunauthorisedorunlawfulprocessingandagainstaccidentalloss,destruc<onordamage,usingappropriatetechnicalororganisa<onalmeasures’(EUGeneralDataProtec<onRegula<on,Ar<cle5(1)(f))

Computer/cyber/ITsecuritycanprotectprivacy,butisonlypartofthewholestoryComputerscien>sts(andrelatedspecialists)shouldthereforethinkoutsidetheirbox

tothewiderlegalandethicalframeofreferencefor‘security’ 8
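Article 5(1)(f)'s 'appropriate technical or organisational measures' are something computer scientists can implement directly. A minimal illustrative sketch, assuming Python and the third-party cryptography package (the record contents and key handling are invented for illustration): encrypting a personal-data record at rest, so that loss of the stored ciphertext alone does not disclose the data.

```python
# Illustrative sketch of one technical measure in the spirit of GDPR Art. 5(1)(f):
# symmetric encryption of a personal-data record before it is stored.
# Assumes the third-party 'cryptography' package; key management is only indicated.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, hold the key in a key-management service, apart from the data
cipher = Fernet(key)

record = b"Jane Doe, 1984-05-12, clinic attendance notes"   # hypothetical personal data
token = cipher.encrypt(record)                              # ciphertext: what is written to disk or a database

assert cipher.decrypt(token) == record                      # only the key holder recovers the record
```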

Security: Types
• Information security: to protect information and information systems from unauthorised access, modification or disruption; computer security
• Physical security: to safeguard the physical characteristics and properties of systems, spaces, objects and human beings
• Political security: protection of acquired rights, established institutions/structures and recognized policy choices
• Socio-Economic security: economic measures to safeguard individuals
• Cultural security: to safeguard the permanence of traditional schemas of language, culture, associations, identity and religious practices
• Environmental security: to provide safety from environmental dangers caused by natural or human processes
• Radical uncertainty security: to provide safety from exceptional and rare violence/threats not deliberately inflicted by an external or internal agent but which can still threaten drastically to degrade the quality of life
• Human security: to cope with various threats in the daily lives of people
• National security: to protect the integrity of sovereign state territory and assets
(Source: partly drawn from PRISMS FP7 project, Deliverable 2.1: Preliminary report on current developments and trends regarding technologies for security and privacy, 28 February 2013: 11-12)

Security: Dimensions and Dilemmas
• As with privacy, many ways of understanding this
  – Individual or personal security; security of personal data
  – Collective security at many levels beyond the individual: international, national, local, neighbourhood, social group; security of systems
  – Objective security: probabilities of risk
  – Subjective security: feelings of (in)security
• Which (if any) of these should prevail, and how can they be reconciled?
• 'A man's home is his castle': privacy and liberties/freedoms can be regarded in some respects as valuable because of the security and safety – not least, of personal data – they provide for individuals, groups and societies (cf. Liberty and Security in a Changing World: 14; Raab 2014)
If so, the relationship between privacy and security is far more complex and cannot be glossed over by a rhetoric of the 'opposed' rights or values of security and privacy

Conflict Between Privacy and Security?
• '[t]he realm of rights, private choice, self-interest, and entitlement … [versus] corollary social responsibilities and commitments to the common good … [their neglect has] negative consequences such as the deterioration of public safety …' (Etzioni 1999: 195)
But what does this construction ignore?
Does this construction have any practical effect?

Intelligence and Security Committee of Parliament: Call for Evidence (2013)
• 'In addition to considering whether the current statutory framework governing access to private communications remains adequate, the Committee is also considering the appropriate balance between our individual right to privacy and our collective right to security.'
• Rhetorical and imprecise, impeding deeper understanding of what is at stake for the individual, society and the state
• Three difficulties (Raab, 2017):
  – 'privacy'
  – 'security'
  – 'national security v. personal privacy' framing

Former UK Foreign Secretary's View
Philip Hammond, Intelligence and Security Speech at the Royal United Services Institute, 10 March 2015
'We are after all, all of us in our private lives, individuals who seek privacy for ourselves and our families, as well as citizens who demand protection by our government from those who would harm us. So we are right to question the powers required by our agencies – and particularly by GCHQ – to monitor private communications in order to do their job. But we should not lose sight of the vital balancing act between the privacy we desire and the security we need.' (emphasis added)

Review Group on Intelligence and Communications Technologies
Liberty and Security in a Changing World (12/12/13)
'We suggest careful consideration of the following principles: [pp. 14-16]
'1. The United States Government must protect, at once, two different forms of security: national security and personal privacy.
'In the American tradition, the word "security" has had multiple meanings. In contemporary parlance, it often refers to national security or homeland security. One of the government's most fundamental responsibilities is to protect this form of security, broadly understood. At the same time, the idea of security refers to a quite different and equally fundamental value, captured in the Fourth Amendment to the United States Constitution: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated...". Both forms of security must be protected.'

'Balance'?
• Conventional privacy paradigm: 'balancing' as policy aim (but thumb on scale)
• Problems with 'balance' (e.g., Loader and Walker, 2007; Waldron, 2003; Dworkin, 1977; Zedner, 2009; Raab, 1999; RUSI, 2013; Anderson, 2013; others)
• 'The idea of "balancing" has an important element of truth, but it is also inadequate and misleading. It is tempting to suggest that the underlying goal is to achieve the right "balance" between the two forms of security [national security and personal privacy]. … But some safeguards are not subject to balancing at all. In a free society, public officials should never engage in surveillance in order to punish their political enemies; to restrict freedom of speech or religion; to suppress legitimate criticism and dissent; to help their preferred companies or industries; to provide domestic companies with an unfair competitive advantage; or to benefit or burden members of groups defined in terms of religion, ethnicity, race, and gender.' (Review Group on Intelligence and Communications Technologies, Liberty and Security in a Changing World (12/12/13))

'(National) Security v. Personal Privacy'?
• 'How much security should we give up to protect privacy?' is rarely asked
• Assumptions about risk, equilibrium and a common metric for weighing are not clear and doubtfully warranted
• Can we know and agree how much (and whose) privacy should or should not outweigh how much (and whose) security?
• 'Balancing' is silent about the method by which a balance can be determined and challenged, and about who is to determine it
• Whether 'balance' is a noun or a verb, and refers to a method or to its outcome, is often ambiguous; legal case decisions are one source for understanding, and perhaps disputing, the weighing process and the arguments used, for instance about necessity and proportionality
• Remains to be seen how these understandings can be disseminated in the much more closed conditions of the intelligence and security service/law enforcement where strategic and operational decisions have to be made, and also brought to bear in their oversight and scrutiny

PRISMS Project: Selected Survey Findings
• Both privacy and security important to people
• People do not value security and privacy in terms of a 'trade-off'
• No significant relationship between people's valuation of privacy and valuation of security
• Significant correlation between valuation of personal and general security

Security and Privacy: Affinities (Raab, 2014, 2012)
• Privacy itself is a security value, often promoted as such:
  – protective, defensive, precautionary, risk-aversion value
  – in face of technologically assisted policy initiatives
  – in society driven by counter-terrorism, law-enforcement, preoccupation with personal safety
  – provides secure refuge for individuals and groups
  – for inward-looking purposes
  – for external sociality and participation
  – guarding against spatial or informational encroachments
• Privacy advocates (often fear-driven) invoke precautionary principle, criticising state security policies and surveillance technologies
• 'Privacy impact assessment' based on precautionary risk-minimisation
• 'Securitisation' of information or systems in interest of privacy (e.g., encryption)
• Both privacy and security of society or state can therefore be seen as two 'takes' on public interest, changing the nature of the argument

Surveillance: Types
• watching (eyes and cameras)
• listening (ears and electronic devices)
• locating/tracking
• detecting/sensing
• personal data monitoring ('dataveillance')
• data analytics ('big data')
All have potential or actual impact on ethical and social values, including privacy; but what's that? (see earlier)
All used for purposes of security; but what's that? (see earlier)
All subject to regulation; but how?

From Computer Science to 'Data Science'
• Data science: extracting knowledge or insights from data
• Much of the data are personal data
• Much of the personal data are gathered through surveillance
• Much surveillance uses technologies designed for that purpose
• Much of data science uses technologies and processes designed for extracting knowledge and insights
Does this require regulation? What and how?

Regulatory Instruments
• Laws and regulatory agencies
• Codes of practice/ethics/standards
• Privacy-enhancing technologies (PETs)
• Privacy by design (and default) (PbD)
• Public awareness
• Training requirements for data users
These instruments relate to the protection of personal data, not to all forms of surveillance if personal data are not 'processed' (collected, stored, etc.)

Privacy/Data Protection by Design and by Default (1)
EU, General Data Protection Regulation (2016)
Article 25 – Data protection by design and by default
1. Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.
2. The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual's intervention to an indefinite number of natural persons.
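A minimal illustrative sketch of the two measures Article 25 names, pseudonymisation and data minimisation 'by default', in standard-library Python; the field names, purpose and key are invented for illustration:

```python
# Illustrative sketch of pseudonymisation and data minimisation "by default"
# in the spirit of GDPR Article 25; names, fields and key are hypothetical.
import hashlib
import hmac

SECRET_KEY = b"held-by-the-controller-apart-from-the-dataset"   # hypothetical key

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed pseudonym (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Minimisation by default: only the fields necessary for the stated purpose are kept.
FIELDS_NEEDED_FOR_PURPOSE = {"year_of_birth", "postcode_district"}

def minimise(record: dict) -> dict:
    return {k: v for k, v in record.items() if k in FIELDS_NEEDED_FOR_PURPOSE}

raw = {"name": "Jane Doe", "email": "jane@example.org",
       "year_of_birth": 1984, "postcode_district": "EH8"}

stored = minimise(raw)
stored["subject_id"] = pseudonymise(raw["email"])   # pseudonym stored instead of the e-mail address
print(stored)
```

The point of the 'default' is that a caller gets the minimised, pseudonymised record unless more is deliberately and justifiably requested, not the other way round.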

Privacy/Data Protection by Design and by Default (2)
EU, General Data Protection Regulation (2016), Recital 78
The protection of the rights and freedoms of natural persons with regard to the processing of personal data require that appropriate technical and organisational measures be taken to ensure that the requirements of this Regulation are met. In order to be able to demonstrate compliance with this Regulation, the controller should adopt internal policies and implement measures which meet in particular the principles of data protection by design and data protection by default. Such measures could consist, inter alia, of minimising the processing of personal data, pseudonymising personal data as soon as possible, transparency with regard to the functions and processing of personal data, enabling the data subject to monitor the data processing, enabling the controller to create and improve security features. When developing, designing, selecting and using applications, services and products that are based on the processing of personal data or process personal data to fulfil their task, producers of the products, services and applications should be encouraged to take into account the right to data protection when developing and designing such products, services and applications and, with due regard to the state of the art, to make sure that controllers and processors are able to fulfil their data protection obligations. The principles of data protection by design and by default should also be taken into consideration in the context of public tenders.

Privacy, Security, and (D)PbD
• Information security is also (part of) information privacy, provided through technological means
• Designing-in, and defaulting to, privacy is to provide a collective good to be enjoyed by all who use the technology or system, not a good to be chosen as an 'extra' by the individual who happens to care about privacy
• '[M]any technologies and information systems exacerbate social differences…. This social division is likely to happen unless privacy's collective value is explicitly recognized in organizational practice and built into the construction of information and communications technologies and systems. However, this value could be subverted if some people were better able than others to buy protective information technologies for their own use, in keeping with the individualist paradigm. This would be the information society's equivalent of "gated communities".' (Bennett and Raab, 2006: 41-2)
• This further underlines the affinity between privacy and security, whether individual or collective
• It brings equality into view as a neglected dimension of these debates

Ethical Robotics?: BS 8611:2016
• 'This British Standard gives guidance on the identification of potential ethical harm and provides guidelines on safe design, protective measures and information for the design and application of robots'
• 'Ethical hazards are broader than physical hazards. Most physical hazards have associated psychological hazards due to fear and stress. Thus, physical hazards imply ethical hazards and safety design features are part of ethical design. Safety elements are covered by safety standards; this British Standard is concerned with ethical elements'
• 'Examples of ethical harm include stress, embarrassment, anxiety, addiction, discomfort, deception, humiliation, being disregarded. This might be experienced in relation to a person's gender, race, religion, age, disability, poverty or many other factors'

'Facebook reveals news feed experiment to control emotions'
'Protests over secret study involving 689,000 users in which friends' postings were moved to influence moods' (Robert Booth, The Guardian, Monday, 30 June 2014)
'In a study with academics from Cornell and the University of California, Facebook filtered users' news feeds – the flow of comments, videos, pictures and web links posted by other people in their social network. One test reduced users' exposure to their friends' "positive emotional content", resulting in fewer positive posts of their own. Another test reduced exposure to "negative emotional content" and the opposite happened.
'James Grimmelmann, professor of law at Maryland University, said Facebook had failed to gain "informed consent" as defined by the US federal policy for the protection of human subjects, which demands explanation of the purposes of the research and the expected duration of the subject's participation, a description of any reasonably foreseeable risks and a statement that participation is voluntary. "This study is a scandal because it brought Facebook's troubling practices into a realm – academia – where we still have standards of treating people with dignity and serving the common good," he said.'

Dataveillance: Profiling
• Analysis of data on (e.g.) drug use, crime, migrants, asylum-seekers, welfare fraud, consumption history, internet behaviour, credit history, education records, health, etc.
• Potentially beneficial, potentially harmful, for individuals or society
• Identifies or creates groups, categories, individuals
• Predicts behaviour
• Decisions based on profiles
• False positives, false negatives
• 'Social sorting': discrimination, social exclusion/inclusion
• Used by states/public authorities, law enforcers, businesses; researchers
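These errors are where profiling does its damage. A minimal illustrative sketch, with invented scores and ground truth, of how a decision threshold applied to profile scores produces false positives and false negatives, the raw material of 'social sorting':

```python
# Illustrative only: hypothetical fraud-risk scores from a profiling model,
# hypothetical ground truth, and a decision threshold applied to the profile.
scores = {"claimant_A": 0.92, "claimant_B": 0.40, "claimant_C": 0.81, "claimant_D": 0.15}
actually_fraudulent = {"claimant_A"}
THRESHOLD = 0.75

flagged = {person for person, score in scores.items() if score >= THRESHOLD}

false_positives = flagged - actually_fraudulent   # wrongly flagged, e.g. a benefit wrongly withheld
false_negatives = actually_fraudulent - flagged   # wrongly cleared
print(sorted(false_positives), sorted(false_negatives))
```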

Internet Research: What is it? (1)
'This document uses the following working definitions:
Internet research encompasses inquiry that:
(a) utilizes the internet to collect data or information, e.g., through online interviews, surveys, archiving, or automated means of data scraping;
(b) studies how people use and access the internet, e.g., through collecting and observing activities or participating on social network sites, listservs, websites, blogs, games, virtual worlds, or other online environments or contexts;
(c) utilizes or engages in data processing, analysis, or storage of datasets, databanks, and/or repositories available via the [internet]
(d) studies software, code, and internet technologies
(e) examines the design or structures of systems, interfaces, pages, and elements
(f) employs visual and textual analysis, semiotic analysis, content analysis, or other methods of analysis to study the web and/or internet-facilitated images, writings, and media forms.
(g) studies large scale production, use, and regulation of the internet by governments, industries, corporations, and military forces.'
Final Copy: Ethical Decision-Making and Internet Research: Recommendations from the AoIR Ethics Committee. Approved by the Ethics Working Committee, 08/2012. Endorsed by the AoIR Executive Committee, 09/2012. Approved by the AoIR general membership, 12/2012.

Internet Research: What is it? (2)
'The internet is a social phenomenon, a tool, and also a (field) site for research. Depending on the role the internet plays in the research project or how it is conceptualized by the researcher, different epistemological, logistical and ethical considerations will come into play. The term "Internet" originally described a network of computers that made possible the decentralized transmission of information. Now, the term serves as an umbrella for innumerable technologies, devices, capacities, uses, and social spaces. Within these technologies, many ethical and methodological issues arise and as such, internet research calls for new models of ethical evaluation and consideration. Because the types of interaction and information transmission made possible by the internet vary so widely, researchers find it necessary to define the concept more narrowly within individual studies. This is complicated by the fact that studies of and on the internet cut across all academic disciplines.'
Final Copy: Ethical Decision-Making and Internet Research: Recommendations from the AoIR Ethics Committee. Approved by the Ethics Working Committee, 08/2012. Endorsed by the AoIR Executive Committee, 09/2012. Approved by the AoIR general membership, 12/2012.

Internet Research Ethics (IRE) (1)
'IRE is defined as the analysis of ethical issues and application of research ethics principles as they pertain to research conducted on and in the Internet. Internet-based research, broadly defined, is research which utilizes the Internet to collect information through an online tool, such as an online survey; studies about how people use the Internet, e.g., through collecting data and/or examining activities in or on any online environments; and/or, uses of online datasets, databases, or repositories.'
Ess, Charles and the Association of Internet Researchers Ethics Working Committee, 2002, 'Ethical Decision-Making and Internet Research: Recommendations from the AoIR Ethics Working Committee', quoted in 'Internet Research Ethics', Stanford Encyclopedia of Philosophy, 2012

Internet Research Ethics (IRE) (2)
'The multiple disciplines already long engaged in human subjects research (medicine, sociology, anthropology, psychology, communication) have established ethical guidelines intended to assist researchers and those charged with ensuring that research on human subjects follows both legal requirements and ethical practices. But with research involving the Internet – where individuals increasingly share personal information on platforms with porous and shifting boundaries, where both the spread and aggregation of data from disparate sources is increasingly the norm, and where web-based services, and their privacy policies and terms of service statements, morph and evolve rapidly – the ethical frameworks and assumptions traditionally used by researchers and REBs are frequently challenged.'
Stanford Encyclopedia of Philosophy, 2012

Article 29 Data Protection Working Party: WP203 (Opinion 03/2013 on Purpose Limitation)
'Under the current framework [EU Data Protection Directive 95/46/EC], it is up to each Member State to specify what safeguards may be considered as appropriate. This specification is typically provided in legislation, which could be precise (e.g. national census or other official statistics) or more general (most other kinds of statistics or research). In the latter case, this leaves room for professional codes of conduct and/or further guidance released by the competent data protection authorities.'

Codes, Statements, Etc.: Mainly General
BSA (British Sociological Association) 2002
ASA (American Sociological Association) 1999/2008
BPS (British Psychological Society) 2013
PSA (Political Studies Association) n.d. (1990s)
SRA (Social Research Association) 2003
AAAS (American Association for the Advancement of Science) 2014
MRS (Market Research Society) 2014
AoIR (Association of Internet Researchers) 2002/2012
UKRIO (UK Research Integrity Office) 2009 [adopted by the University of Edinburgh]
etc.

UKRIO: Code of Practice for Research (2009)
High-level template
No mention of privacy (does mention personal data)
No mention of Internet
No mention of social media
But…
'3.7.1 Organisations and researchers should make sure that any research involving human participants, human material or personal data complies with all legal and ethical requirements and other applicable guidelines. Appropriate care should be taken when research projects involve: vulnerable groups, such as the very old, children or those with mental illness; and covert studies or other forms of research which do not involve full disclosure to participants. The dignity, rights, safety and well-being of participants must be the primary consideration in any research study. Research should be initiated and continued only if the anticipated benefits justify the risks involved.'

UKRIO: Code of Practice for Research (2009)
'3.7.3 Organisations and researchers should ensure the confidentiality and security of: personal data relating to human participants in research; and human material involved in research projects.'
'3.7.10 Researchers on projects involving human subjects must satisfy themselves that participants are enabled, by the provision of adequate accurate information in an appropriate form through suitable procedures, to give informed consent, having particular regard to the needs and capacities of vulnerable groups, such as the very old, children and those with mental illness.'

University of Edinburgh College of Humanities and Social Science Research Ethics Framework, May 2008
• High-level principles
• Mentions dignity
• Mentions consent
• 'The storage, processing and disposal of information about individuals who are research subjects must meet legal requirements, including the individual's explicit written consent to the proposed holding and use of the data. Individuals' right to access and correct information held about them should also be explained.'
But 'explicit written consent' is not part of the UK Data Protection Act 1998, Schedule 3, even for processing 'sensitive' personal data; nor is it part of the EU General Data Protection Regulation

However…
'But as online research takes place in a range of new venues (email, chatrooms, web pages, various forms of "instant messaging," MUDs and MOOs, USENET newsgroups, audio/video exchanges, etc.) – researchers, research subjects, and those charged with research oversight will often encounter ethical questions and dilemmas that are not directly addressed in extant statements and guidelines. In addition, both the great variety of human inter/actions observable online and the clear need to study these inter/actions in interdisciplinary ways have thus engaged researchers and scholars in disciplines beyond those traditionally involved in human subjects research: for example, researching the multiple uses of texts and graphics images in diverse Internet venues often benefits from approaches drawn from art history, literary studies, etc. This interdisciplinary approach to research leads, however, to a central ethical difficulty: the primary assumptions and guiding metaphors and analogies – and thus the resulting ethical codes – can vary sharply from discipline to discipline, especially as we shift from the social sciences (which tend to rely on medical models and law for human subjects' protections) to the humanities (which stress the agency and publicity of persons as artists and authors).'
Charles Ess and the AoIR ethics working committee, 'Ethical decision-making and Internet research: Recommendations from the AoIR ethics working committee', approved by AoIR, November 27, 2002, www.aoir.org/reports/ethics.pdf

Therefore…
• Need to review and revise ethical and legal principles, codes and guidance for research using 'big data'/analytics; is this happening?
• Need to recognise that, especially where principles, codes and guidelines leave off, judgement must be exercised because conflicting rights and interests are involved; no 'tick-boxes'
• Judgement is needed about the justification of 'big data' research; its limits; its (un)intended consequences; its risks; its legality; its ethics
(How) can researchers be trained to exercise judgement of this kind?

Bibliography (1)
• Altman, I. (1975), The Environment and Social Behavior: Privacy, Personal Space, Territory, Crowding, Monterey, CA: Brooks/Cole Publishing Co.
• Anderson, D. (2015), A Question of Trust: Report of the Investigatory Powers Review [Anderson Report], London: The Stationery Office.
• Bauman, Z. (2006), Liquid Fear, Cambridge: Polity Press.
• Bennett, C. and Raab, C. (2006), The Governance of Privacy: Policy Instruments in Global Perspective, Cambridge, MA: The MIT Press.
• Bygrave, L. (2002), Data Protection Law: Approaching its Rationale, Logic and Limits, The Hague: Kluwer Law International.
• Dworkin, R. (1977), Taking Rights Seriously, London: Duckworth.
• Etzioni, A. (1999), The Limits of Privacy, New York, NY: Basic Books.
• European Union (2016), General Data Protection Regulation.
• Finn, R., Wright, D. and Friedewald, M. (2013), 'Seven Types of Privacy', pp. 3-32 in S. Gutwirth, R. Leenes, P. De Hert, et al. (eds.), European Data Protection: Coming of Age?, Springer.

Bibliography (2)
• Gasser, M. (1988), Building a Secure Computer System, Van Nostrand Reinhold.
• Goffman, E. (many works)
• Goold, B. (2009), 'Surveillance and the Political Value of Privacy', Amsterdam Law Forum 1(4). http://amsterdamlawforum.org.
• Intelligence and Security Committee of Parliament (2015), Privacy and Security: A Modern and Transparent Legal Framework, London: The Stationery Office.
• Loader, I. and Walker, N. (2007), Civilizing Security, Cambridge: Cambridge University Press.
• Martí Sempere, C. (2010), 'The European Security Industry: A Research Agenda', Economics of Security Working Paper 29, Berlin: Economics of Security.
• Nissenbaum, H. (2010), Privacy in Context: Technology, Policy, and the Integrity of Social Life, Stanford, CA: Stanford University Press.
• Raab, C. (1999), 'From Balancing to Steering: New Directions for Data Protection', pp. 68-93 in C. Bennett and R. Grant (eds.), Visions of Privacy: Policy Approaches for the Digital Age, Toronto: University of Toronto Press.
• Raab, C. (2012), 'Privacy, Social Values and the Public Interest', pp. 129-151 in A. Busch and J. Hofmann (eds.), 'Politik und die Regulierung von Information' ['Politics and the Regulation of Information'], Politische Vierteljahresschrift Sonderheft 46, Baden-Baden: Nomos Verlagsgesellschaft.

Bibliography (3)
• Raab, C. (2014), 'Privacy as a Security Value', pp. 39-58 in D. W. Schartum, L. Bygrave and A. G. B. Bekken (eds.), Jon Bing: En Hyllest / A Tribute, Oslo: Gyldendal; available at: http://bigdataandprivacy.org/wpcontent/uploads/2014/08/Raab_PrivacySecurityValue.pdf
• Raab, C. (2017), 'Security, Privacy and Oversight', pp. 77-102 in Andrew Neal (ed.), Security in a Small Nation: Scotland, Democracy, Politics, Cambridge: Open Book Publishers.
• Regan, P. (1995), Legislating Privacy: Technology, Social Values, and Public Policy, Chapel Hill, NC: University of North Carolina Press.
• Review Group on Intelligence and Communications Technologies (2013), Liberty and Security in a Changing World, Washington, DC.
• Royal United Services Institute for Defence and Security Studies (2015), A Democratic Licence to Operate: Report of the Independent Surveillance Review, London: Royal United Services Institute for Defence and Security Studies.
• Schoeman, F. (1992), Privacy and Social Freedom, Cambridge: Cambridge University Press.
• Solove, D. (2008), Understanding Privacy, Cambridge, MA: Harvard University Press.
• Steeves, V. (2009), 'Reclaiming the Social Value of Privacy', pp. 191-208 in I. Kerr, V. Steeves and C. Lucock (eds.), Lessons From the Identity Trail: Privacy, Anonymity and Identity in a Networked Society, Oxford: Oxford University Press.

Bibliography (4)
• The President's Review Group on Intelligence and Communications Technologies (2013), Liberty and Security in a Changing World – Report and Recommendations of The President's Review Group on Intelligence and Communications Technologies, December 12, 2013, Washington, DC.
• United Nations Development Programme (1994), Human Development Report 1994, New York, NY: Oxford University Press.
• Waldron, J. (2003), 'Security and Liberty: The Image of Balance', The Journal of Political Philosophy, vol. 11, pp. 191-210.
• Westin, A. (1967), Privacy and Freedom, New York, NY: Atheneum.
• Wright, D. and Raab, C. (2014), 'Privacy Principles, Risks and Harms', International Review of Law, Computers & Technology, Vol. 28, No. 3, pp. 277-298.
• Zedner, L. (2009), Security, London: Routledge.