
Page 1: Privacy for Pervasive Computing Slides based on jasonh/courses/ubicomp-sp2007

Privacy for Pervasive Computing

Slides based on http://www.cs.cmu.edu/~jasonh/courses/ubicomp-sp2007/

Page 2

Why Care About Privacy? End-User Perspective

• Protection from spam, identity theft, mugging
• Discomfort over surveillance
– Lack of trust in work environments
– Might affect performance, mental health
– May contribute to a feeling of lack of control over life
• Starting over
– Something stupid you did as a kid
• Creativity and freedom to experiment
– Protection from total societies
– Room for each person to develop individually
• Lack of adoption of ubicomp tech

Everyday risks vs. extreme risks, by source:
• Stalkers, muggers: well-being, personal safety
• Employers: over-monitoring, discrimination, reputation
• Friends, family: over-protection, social obligations, embarrassment
• Government: civil liberties

Page 3

The Fundamental Tension

• Ubicomp envisions
– lots of sensors for gathering data
– rich world models describing people, places, things
– pervasive networks for sharing
• This data can be used for good and for bad

Examples: Find Friends, Smart Homes, Smart Stores

Page 4

Why Care? Designer and App Developer Perspective

• Privacy is the most obvious problem outsiders raise about ubicomp

Page 5

• “Do I wear badges? No way. I am completely against wearing badges. I don't want management to know where I am. No. I think the people who made them should be taken out and shot... it is stupid to think that they should research badges because it is technologically interesting. They (badges) will be used to track me around. They will be used to track me around in my private life. They make me furious.”

• Ubicomp “might lead directly to a future of safe, efficient, soulless, and merciless universal surveillance” – Rheingold

Why Care? Designer and App Developer Perspective

Page 6

What is Privacy?

• No standard definition, many different perspectives

• Different kinds of privacy
– Bodily, territorial, communication, information

Page 7

What is Information Privacy?

• Many different philosophical views on info privacy
– Different views -> different values -> different designs
– Note that these are not necessarily mutually exclusive

Page 8

Principles vs. Common Interest

• Principled view -> privacy as a fundamental right
– Embodied by constitutions, longstanding legal precedent
– Government not given the right to monitor people
• Common interest -> privacy with respect to the common good
– Emphasizes positive, pragmatic effects for society
• Examples: national ID cards, mandatory HIV testing

Page 9

Self-determination vs. Personal Privacy

• Self-determination (aka data protection)
– Arose due to the increasing number of databases in the 1970s
– “Privacy is the claim of individuals, groups or institutions to determine for themselves when, how, and to what extent information about them is communicated to others” (Westin)
– Led to Fair Information Practices (more shortly)
– More about the individual with respect to government and organizations
• Personal privacy
– How I express myself to others and control access to myself
– More about the individual with respect to other individuals
• Examples:
– Cell phone communication, instant messaging, Facebook

Page 10

Privacy as Solitude

• “The right to be let alone”
• People tend to devise strategies “to restrict their own accessibility to others while simultaneously seeking to maximize their ability to reach people” (Darrah et al. 2001)
• Example:
– Spam protection, undesired social obligations
• Ubicomp:
– Being able to turn the system off, invisible mode

Page 11

Privacy as Anonymity

• Hidden among a crowd
• Example:
– Web proxy to hide actual web traffic
• Ubicomp:
– Location anonymity
– “a person” vs. “Asian person” vs. “Jason Hong”
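One common way to get location anonymity, spatial cloaking, can be sketched in a few lines. This is an illustration, not something from the slides: the grid-cell size and coordinates are invented. The idea is to report only a coarse grid cell instead of exact coordinates, moving a disclosure from “Jason Hong” toward “a person around here.”

```python
# Sketch of location anonymity via spatial cloaking: snap a position to a
# coarse grid cell so a user is hidden among everyone in the same cell.
# The cell size (500 m) is an illustrative assumption.

CELL_METERS = 500

def cloak(x_m, y_m, cell=CELL_METERS):
    """Snap a position (in meters) to the corner of its grid cell."""
    return (x_m // cell * cell, y_m // cell * cell)

# Two different people in the same neighborhood report the same cell.
print(cloak(120, 480))  # (0, 0)
print(cloak(430, 60))   # (0, 0)
```

A larger cell hides the user among more people but makes location-based services (like Find Friends) less useful, which is exactly the tension these slides describe.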

Page 12

Other Views on Privacy

• Transparent Society (Brin 1998)
– Multi-way flow of info (vs. one-way to governments or corporations)
• Don’t care
– I’ve got nothing to hide
– We’ve always adapted
– “You have zero privacy anyway. Get over it.”
• Fundamentalist
– Don’t understand the tech
– Don’t trust others to do the right thing
• Pragmatist
– Cost-benefit
– Communitarian benefit to society as well as the individual

Page 13

Why is Privacy Hard?

• Hard to define until something bad happens
– “Well, of course I didn’t mean to share that”
• Risks not always obvious
– Burglars went to airports to collect license plates
– Credit info used by kidnappers in South America
• Comfort changes with time and/or experience
• Cause and effect may be far apart in time and space
• Malleable depending on situation
– Still use credit cards to buy online
– Benefit outweighs cost

Page 14

Why is Privacy Hard?

• Getting easier to store data
– Think of embarrassing facts from long ago (Google remembers)
• Hard to predict the effect of disclosure
– Hard to tell what companies (e.g., credit card issuers, Amazon) are doing
• Market incentives not aligned
• Easy to misinterpret
– Went to a drug rehabilitation clinic; why?
• Bad data can be hard to fix
– Sen. Ted Kennedy on the TSA watch list

Page 15

Fair Information Practices (FIPs)

• US Privacy Act of 1974 (based on the work by Alan Westin)

• Based on Self-determination / Data Protection view

• Set of principles stating how organizations should handle personal information

• Note: many variants of FIPs

Page 16

Fair Information Practices (FIPs)

• Openness and transparency
– No secret record keeping
• Individual participation
– Individuals can see and correct their records
• Collection limitation
– Data collection should be proportional to its purpose
• Data quality
– Data should be relevant to their purpose and up-to-date
• Use limitation
– Data should be used only for their specified purpose, by authorized parties
• Reasonable security
– Security based on the sensitivity of the data collected
• Accountability
– Data keepers must be accountable for compliance with the other principles

Page 17

Adapting FIPs for Ubicomp

• Presents a method for analyzing ubicomp systems
– Assumes designers are trying to do “the right thing”
• Versus evil people actively trying to intrude
• Main areas of innovation and system design
– Notice
– Choice and consent
– Anonymity and pseudonymity
– Proximity and locality
– Adequate security
– Access and recourse

Page 18

Adapting FIPs for Ubicomp

• Notice
– Physical beacons beaming out P3P policies
• P3P (Platform for Privacy Preferences) was developed for privacy notification and configuration on the Web
– Personal system that logs policies
• Issues
– Overwhelmed by notifications?
– Understandability of notifications?
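The beacon-plus-personal-log idea can be sketched concretely. This is a hypothetical illustration: the message fields (purpose, retention, recipients) loosely echo P3P vocabulary but are heavily simplified, and all names are invented. The `new_since_last_visit` check also hints at one answer to the "overwhelmed by notifications" issue: only alert the user about policies they have not seen before.

```python
import json
import time

def make_policy_beacon(sensor_id, purpose, retention_days, recipients):
    """Build the notice a sensor would periodically broadcast.
    Field names are simplified stand-ins for a real P3P policy."""
    return json.dumps({
        "sensor": sensor_id,
        "purpose": purpose,
        "retention_days": retention_days,
        "recipients": recipients,
    })

class PolicyLog:
    """Personal system that logs every policy the user's device hears."""
    def __init__(self):
        self.seen = {}

    def record(self, beacon_msg):
        policy = json.loads(beacon_msg)
        self.seen[policy["sensor"]] = (time.time(), policy)

    def new_since_last_visit(self, beacon_msg):
        """Alert only for sensors not already logged, to avoid
        overwhelming the user with repeated notifications."""
        policy = json.loads(beacon_msg)
        return policy["sensor"] not in self.seen

log = PolicyLog()
msg = make_policy_beacon("cam-lobby-1", "security", 30, ["building ops"])
print(log.new_since_last_visit(msg))  # True: first encounter, notify the user
log.record(msg)
print(log.new_since_last_visit(msg))  # False: already logged, stay quiet
```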

Page 19

Adapting FIPs for Ubicomp

• Choice and consent
– Need a way to confirm that a person has consented
– Can digitally sign a “contract” notification

• Issues
– How can people specify their policies?
– Can policies match what people really want?
– How to make people aware of auto-accepts?
– What if people don’t have a real choice?
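The signed consent "contract" can be sketched as follows. This is an assumption-laden illustration, not the slides' design: an HMAC over the contract stands in for a real public-key signature, and the key handling and contract format are invented for brevity.

```python
import hmac
import hashlib
import json

# Illustrative only: in practice the user's device would hold a private
# signing key; here a shared secret and HMAC stand in for a signature.
USER_KEY = b"device-secret"

def sign_consent(user_id, policy, key=USER_KEY):
    """Produce a canonical consent contract and an authentication tag."""
    contract = json.dumps({"user": user_id, "policy": policy}, sort_keys=True)
    tag = hmac.new(key, contract.encode(), hashlib.sha256).hexdigest()
    return contract, tag

def verify_consent(contract, tag, key=USER_KEY):
    """Later, confirm that this exact contract was consented to."""
    expected = hmac.new(key, contract.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

contract, tag = sign_consent("alice", {"sensor": "cam-lobby-1", "use": "security"})
print(verify_consent(contract, tag))        # True: consent is confirmable later
print(verify_consent(contract + "x", tag))  # False: a tampered contract is rejected
```

Note that the mechanism only proves *that* something was accepted; the issues on this slide (auto-accepts, no real choice) are about whether that acceptance was meaningful.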

Page 20

Adapting FIPs for Ubicomp

• Anonymity and pseudonymity
– Try to eliminate any trace of identity
– Or use a disposable identifier not linked to the actual identity

• Issues
– What kinds of services can be offered anonymously?
– Business models for anonymous services?
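A disposable identifier can be sketched with a salted hash. This is one possible construction, assumed for illustration: a fresh random salt per session means the same device presents a different, unlinkable pseudonym each time, and discarding the salt makes the mapping back to the real identity impractical to reverse.

```python
import hashlib
import secrets

def disposable_id(real_id: str) -> str:
    """One unlinkable pseudonym per session: a salted hash of the real
    identifier, with the fresh salt discarded after use."""
    salt = secrets.token_hex(8)
    return hashlib.sha256((salt + real_id).encode()).hexdigest()[:16]

a = disposable_id("alice-phone")
b = disposable_id("alice-phone")
print(a != b)  # True (with overwhelming probability): sessions are unlinkable
```

The business-model issue on this slide follows directly: a service that cannot link sessions also cannot build a profile, bill per account, or personalize, which is often exactly what it monetizes.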

Page 21

Adapting FIPs for Ubicomp

• Proximity
– Limit the behavior of smart objects based on proximity
• Ex. “Record voice only if owner nearby”
– Simple mental model, though it could be hard to implement
– Weakness: could be easy to subvert

• Locality
– Information tied to the places where it was collected
– Require physical proximity to query
– Weakness: limits some utility (ex. find friend)
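The proximity rule "record voice only if owner nearby" reduces to a distance check. The sketch below is illustrative: positions, the 3-meter threshold, and the function names are all assumptions, and the slide's caveat applies, since a subverted position report defeats the rule entirely.

```python
import math

NEARBY_METERS = 3.0  # assumed threshold for "owner nearby"

def owner_nearby(owner_pos, mic_pos, threshold=NEARBY_METERS):
    """True if the owner is within the threshold distance of the microphone."""
    return math.dist(owner_pos, mic_pos) <= threshold

def maybe_record(owner_pos, mic_pos):
    """Enforce the proximity rule: the mic runs only with the owner present."""
    if owner_nearby(owner_pos, mic_pos):
        return "recording"
    return "muted"

print(maybe_record((0.0, 0.0), (1.0, 2.0)))    # recording: owner ~2.2 m away
print(maybe_record((0.0, 0.0), (10.0, 10.0)))  # muted: owner too far away
```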

Page 22

Adapting FIPs for Ubicomp

• Access and recourse
– How to know what the system knows about you?
– What mechanisms for recourse?
• Use limitation, access, repudiation, etc.
• E.g., privacy-aware data mining

• Adequate security
– Does security solve privacy? Not really
– Ubicomp’s challenges (less capable devices)
– Principle of proportionality: what to make secure is proportional to its value

Page 23

Unpacking “Privacy” for a Networked World

Slides based on http://www.cs.cmu.edu/~jasonh/courses/ubicomp-sp2007/

Palen & Dourish, CHI 2003

Page 24

Overview

• Palen & Dourish present a model for privacy
– Based on theory by social psychologist Irwin Altman (1975/1977)
– Concept of privacy as a dynamic, dialectic process
• Privacy management as a balancing act
– Multiple factors govern these social interactions
– Case studies involving technology illustrate the model

Page 25

Traditional approach in HCI

• Design of pervasive and mobile systems
– Interactions with systems impact personal privacy
– New technologies introduce novel situations
• Relationship between privacy and technology
– Considered outside of static, rule-based systems
– Draw on earlier concepts to better understand privacy in new situations involving information technology

Page 26

Privacy regulation theory

• Altman sees privacy between individuals:
– “As a dialectic process, privacy regulation is conditioned by our own expectations and experiences, and by those of others with whom we interact.”
– “As a dynamic process, privacy is understood to be under continuous negotiation and management, with the boundary that distinguishes privacy and publicity refined according to circumstance.”
(Palen and Dourish, 2003)

Page 27

Privacy management

• Privacy as a social negotiation:
– “Privacy management is a process of give and take between and among technical and social entities—from individuals to groups to institutions—in ever-present and natural tension with the simultaneous need for publicity. Our central concern is with how this process is conducted in the presence of information technology.”
(Palen and Dourish, 2003)

Page 28

Altman’s model: limitations

• For managing personal access in interactions:
– Circumstance = f(local physical environment, audience, social status, task or objective, motivation and intention, information technology)
• Information technology changed the view on disclosure (physical space vs. unknown distance), identity, and time boundaries
– IT changed the concept of “conventional” circumstance
• Privacy outside physicality: when digital information flows outside physical and temporal constraints, it changes the way privacy is regulated

Page 29

Disclosure boundary

• Participation in the social world requires selective disclosure of personal info
– Bumper stickers, letters to the editor, sitting in sidewalk cafes, walking down public streets, etc.
• People seek to maintain both a personal life and a public face
– Managing privacy means attending to both of these desires
• Enter IT: deliberate vs. non-deliberate disclosure (e.g., online shopping vs. Google search)
• The tension between privacy and publicity is influenced by identity and temporal concerns (next slides)

Page 30

Identity boundary

• Boundary between self and other
– Extends beyond the spatial extent of the body: a social phenomenon
– Affiliation and allegiance make “self” complicated
• Inclusion and exclusion of “self” and “other” are continually enacted in and through one’s actions in the world
• Recipient design phenomenon: one’s actions and utterances are designed with respect to specific others
– Different times, different others: professionals, students, fellow bus riders, etc.
• Reflexive interpretability of action (one’s ability to assess how one’s actions appear to others) drives privacy management in the IT world
• Technologically mediated “interaction” is less effective
– Representation is impoverished; indicators of the boundary between privacy and publicity are not clear
– Information persistence complicates the problem

Page 31

Temporality boundary

• Tension between past, present, and future
• Disclosed information can persist
• Active privacy control and management must be seen in the context of a temporal sequence
• The relevance of permanence and impermanence can constrain, undermine, or modify regulatory behavior

“The Disclosure (privacy vs. publicity), Identity (self vs. other) and Temporality (past - future) boundaries, and the tensions that occur with their negotiation, are the primary features of our framework. They demonstrate that privacy regulation is a dynamic, dialectic, negotiated affair. Technology itself does not directly support or interfere with personal privacy; rather it destabilizes the delicate and complex web of regulatory practices.”

Page 32

Genres of Disclosure

• Genres of disclosure: socially constructed patterns of privacy management
– Involve disclosure, identity, and temporal boundaries
• A socially constructed genre has both structural properties of communication and social patterns of expectation/response
– Encounters between representational forms (people/action) and social practice (setting expectations around representations)

Page 33

Systems support different understandings

• Family Intercom (seamless communication using active badges at home): genre mismatch (home vs. workplace); e.g., a 6-year-old kid vs. parents, a 16-year-old vs. a sibling?
• Shared calendars: could reveal explicit patterning and sequencing of information (e.g., the layoff example) – disclosure
• Active badges: researchers vs. administrative staff (identity)
• Cell phones: the boundary between self and other is destabilized (the caller can’t control the receiver’s physical space)
• Instant messaging: tension at the temporal boundary over messages that can be stored for future use (e.g., conventional IM plus Facebook, Twitter)

Page 34

Conclusion

• When considering privacy concerns raised by the development of new technology, the whole social and institutional setting in which the technology is deployed must be considered

• Need to pay attention to the historical continuity of practice (privacy regulation)

• Privacy management is a balancing act: resolving tensions between people and/or within an individual’s own conflicting needs

• Privacy management is an active process carried out with the help of technology; we need to be as responsible for what we make possible (enable) as for what we make real (use)