Privacy and Anonymity

CS 236: Advanced Computer Security
Peter Reiher
May 6, 2008
Lecture 6, Spring 2008




Groups for This Week

1. Golita Benoodi, Darrell Carbajal, Sean MacIntyre

2. Andrew Castner, Chien-Chia Chen, Chia-Wei Chang

3. Hootan Nikbakht, Ionnis Pefkianakis, Peter Peterson

4. Yu Yuan Chen, Michael Cohen, Zhen Huang

5. Jih-Chung Fan, Vishwa Goudar, Michael Hall

6. Abishek Jain, Nikolay Laptev, Chen-Kuei Lee

7. Chieh-Ning Lien, Jason Liu, Kuo-Yen Lo

8. Min-Hsieh Tsai, Peter Wu, Faraz Zahabian


Outline

• Privacy vs. security?

• Data privacy issues

• Network privacy issues

• Some privacy solutions


What Is Privacy?

• The ability to keep certain information secret

• Usually one’s own information

• But also information that is “in your custody”

• Includes ongoing information about what you’re doing

Privacy and Computers

• Much sensitive information currently kept on computers

– Which are increasingly networked

• Often stored in large databases

– Huge repositories of privacy time bombs

• We don’t know where our information is

Privacy and Our Network Operations

• Lots of stuff goes on over the Internet

– Banking and other commerce

– Health care

– Romance and sex

– Family issues

– Personal identity information

• We used to regard this stuff as private

– Is it private any more?

Threats to Computer Privacy

• Cleartext transmission of data

• Poor security allows remote users to access our data

• Sites we visit can save information on us

– Multiple sites can combine information

• Governmental snooping

• Location privacy

• Insider threats in various places


Privacy Vs. Security

• Best friends?

• Or deadly rivals?

• Does good security kill all privacy?

• Does good privacy invalidate security?

• Are they orthogonal?

• Or can they just get along?


Conflicts Between Privacy and Security

• Many security issues are based on strong authentication

– Which means being very sure who someone is

• If we’re very sure who’s doing what, we can track everything

• Many privacy approaches based on obscuring what you’re doing

Some Specific Privacy Problems

• Poorly secured databases that are remotely accessible

– Or are stored on hackable computers

• Data mining by companies we interact with

• Eavesdropping on network communications by governments

• Insiders improperly accessing information

• Cell phone/mobile computer-based location tracking


Data Privacy Issues

• My data is stored somewhere

– Can I control who can use it/see it?

• Can I even know who’s got it?

• How do I protect a set of private data?

– While still allowing some use?

• Will data mining divulge data “through the back door”?

Personal Data

• Who owns data about you?

• What if it’s real personal data?

– Social security number, DoB, your DNA record?

• What if it’s data someone gathered about you?

– Your Google history or shopping records

– Does it matter how they got it?


Controlling Personal Data

• Currently impossible

• Once someone has your data, they do what they want with it

• Is there some way to change that?

• E.g., to wrap data in a way that prevents its misuse?

• Could TPM help?


Tracking Your Data

• How could I find out who knows my passport number?

• Can I do anything that prevents my data from going certain places?

– With cooperation of those who have it?

– Without their cooperation?

– Via legal means?


Protecting Data Sets

• If my company legitimately holds a bunch of personal data,

• What can I/should I do to protect it?

– Given that I probably also need to use it?

• If I fail, how do I know that?

– And what remedies do I have?

Options for Protecting Data

• Careful system design

• Limited access to the database

– Networked or otherwise

• Full logging and careful auditing

• Using only encrypted data

– Must it be decrypted?

– If so, how to protect the data and the keys?


Data Mining and Privacy

• Data mining allows users to extract models from databases

– Based on aggregated information

• Often data mining allowed when direct extraction isn’t

• Unless handled carefully, attackers can use mining to deduce record values


Insider Threats and Privacy

• Often insiders need access to private data

– Under some circumstances

• But they might abuse that access

• How can we determine when they misbehave?

• What can we do?


Network Privacy

• Mostly issues of preserving privacy of data flowing through network

• Start with encryption

– With good encryption, data values not readable

• So what’s the problem?


Traffic Analysis Problems

• Sometimes desirable to hide that you’re talking to someone else

• That can be deduced even if the data itself cannot

• How can you hide that?

– In the Internet of today?

Location Privacy

• Mobile devices often communicate while on the move

• Often providing information about their location

– Perhaps detailed information

– Maybe just hints

• This can be used to track our movements


Implications of Location Privacy Problems

• Anyone with access to location data can know where we go

• Allowing government surveillance

• Or a private detective following your moves

• Or a maniac stalker figuring out where to ambush you . . .


Some Privacy Solutions

• The Scott McNealy solution

– “Get over it.”

• Anonymizers

• Onion routing

• Privacy-preserving data mining

• Preserving location privacy

• Handling insider threats via optimistic security

Anonymizers

• Network sites that accept requests of various kinds from outsiders

• Then submit those requests

– Under their own or fake identity

• Responses returned to the original requestor

• A NAT box is a poor man’s anonymizer
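The request flow above can be sketched as a tiny in-process model (all class and variable names here are invented; a real anonymizer is a network proxy). Note the identity table the anonymizer must keep in order to route responses back: that table is exactly the who's-who record whose dangers the next slide describes.

```python
import itertools

class Anonymizer:
    """Toy anonymizer: re-issues each request under its own identity
    and keeps a private table mapping requests back to real clients,
    so the responder only ever sees the anonymizer."""

    def __init__(self, send):
        self._send = send          # send(request_id, request) -> response
        self._ids = itertools.count()
        self._who = {}             # request_id -> real client (the secret!)

    def submit(self, client, request):
        rid = next(self._ids)
        self._who[rid] = client              # only we know who asked
        response = self._send(rid, request)  # goes out under our identity
        requester = self._who.pop(rid)       # relay the answer back
        return requester, response
```

Whoever controls `_who`, even briefly, can deanonymize every request in flight.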


The Problem With Anonymizers

• The entity running the anonymizer knows who’s who

• Either it can use that information itself

• Or it can be fooled/compelled/hacked into divulging it to others

• Generally not a reliable source of real anonymity

Onion Routing

• Meant to handle the issue of people knowing who you’re talking to

• Basic idea is to conceal sources and destinations

• By sending lots of crypto-protected packets between lots of places

• Each packet goes through multiple hops


A Little More Detail

• A group of nodes agree to be onion routers

• Users obtain crypto keys for those nodes

• Plan is that many users send many packets through the onion routers

– Concealing who’s really talking


Sending an Onion-Routed Packet

• Encrypt the packet using the destination’s key

• Wrap that with another packet to another router

– Encrypted with that router’s key

• Iterate a bunch of times
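A minimal sketch of that layering, with a toy XOR "cipher" standing in for real encryption (actual onion routing systems such as Tor use proper public-key and symmetric cryptography); all router names and keys are made up for illustration.

```python
import base64
import json

def toy_encrypt(key: str, plaintext: str) -> str:
    # Placeholder "cipher" (XOR with a repeating key) -- NOT real
    # cryptography, used only to make the layering visible.
    k = key.encode()
    data = plaintext.encode()
    out = bytes(b ^ k[i % len(k)] for i, b in enumerate(data))
    return base64.b64encode(out).decode()

def toy_decrypt(key: str, ciphertext: str) -> str:
    k = key.encode()
    data = base64.b64decode(ciphertext)
    return bytes(b ^ k[i % len(k)] for i, b in enumerate(data)).decode()

def wrap_onion(message: str, path):
    # path: [(node_name, key), ...] from the first router to the final
    # destination.  Build layers inside-out: the innermost layer is
    # encrypted for the destination, and each outer layer tells one
    # router who the next hop is.
    packet, next_hop = message, None
    for name, key in reversed(path):
        packet = toy_encrypt(key, json.dumps({"next": next_hop,
                                              "payload": packet}))
        next_hop = name
    return packet

def unwrap_layer(key: str, packet: str):
    # What one router does: strip its own layer, learn the next hop.
    layer = json.loads(toy_decrypt(key, packet))
    return layer["next"], layer["payload"]

# Demo: two routers plus a destination; each hop peels exactly one layer.
path = [("r1", "key1"), ("r2", "key2"), ("dest", "keyD")]
pkt = wrap_onion("hello", path)
for name, key in path:
    next_hop, pkt = unwrap_layer(key, pkt)
```

No single router sees both the original sender and the cleartext: each one can only remove its own layer and learn the next hop.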

In Diagram Form

[Diagram: the Source sends packets through several onion routers to the Destination]

What’s Really in the Packet

[Diagram: the message wrapped in nested encrypted layers, one per hop]

Delivering the Message

[Diagram: each router strips one layer and forwards the packet to the next hop]


What’s Been Achieved?

• No improper party read the message

• Nobody knows who sent the message

– Except the receiver

• Nobody knows who received the message

– Except the sender

• Assuming you got it all right


Issues for Onion Routing

• Proper use of keys

• Traffic analysis

• Overheads

– Multiple hops

– Multiple encryptions


Privacy-Preserving Data Mining

• Allow users access to aggregate statistics

• But don’t allow them to deduce individual statistics

• How to stop that?


Approaches to Privacy for Data Mining

• Perturbation

– Add noise to sensitive value

• Blocking

– Don’t let aggregate query see sensitive value

• Sampling

– Randomly sample only part of data
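The perturbation idea can be sketched for a counting query. One well-known way to calibrate the noise (not named on the slide) is the Laplace mechanism from differential privacy: a count changes by at most 1 when one record is added or removed, so Laplace noise with scale 1/ε hides any individual's contribution. All names and the ε value here are illustrative.

```python
import random

def noisy_count(records, predicate, epsilon=0.5):
    """Answer "how many records satisfy predicate?" with noise added.

    Perturbation sketch: a count has sensitivity 1, so Laplace noise
    with scale 1/epsilon masks any single individual's presence while
    keeping the aggregate roughly accurate.
    """
    true_count = sum(1 for r in records if predicate(r))
    # A Laplace(0, b) sample is the difference of two independent
    # Exponential(mean=b) samples; here b = 1/epsilon.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise
```

Averaged over many queries the answers track the true count, but any single answer could plausibly have come from a database with or without a given person in it.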


Preserving Location Privacy

• Can we prevent people from knowing where we are?

• Given that we carry mobile communications devices

• And that we might want location-specific services ourselves


Location-Tracking Services

• Services that get reports on our mobile device’s position

– Probably sent from that device

• Often useful

– But sometimes we don’t want them turned on

• So, turn them off then


But . . .

• What if we turn it off just before entering a “sensitive area”?

• And turn it back on right after we leave?

• Might someone deduce that we spent the time in that area?

• Very probably


Handling Location Inferencing

• Need to obscure that a user probably entered a particular area

• Can reduce update rate

– Reducing certainty of travel

• Or bundle together areas

– Increasing uncertainty of which was entered
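Both ideas can be sketched in a few lines: snap reported positions to a coarse grid (bundling areas together) and suppress updates that arrive too soon after the previous one (reducing the update rate). The cell size and interval are made-up parameters, not values from the lecture.

```python
def cloak(lat, lon, cell_deg=0.05):
    # Spatial cloaking: snap a position to the corner of a coarse grid
    # cell, so many nearby locations report the same value.
    return (round(lat // cell_deg * cell_deg, 6),
            round(lon // cell_deg * cell_deg, 6))

class RateLimitedReporter:
    # Temporal cloaking: release an update at most every `min_interval`
    # seconds, reducing certainty about the path travelled in between.
    def __init__(self, min_interval=300):
        self.min_interval = min_interval
        self._last = None

    def maybe_report(self, t, lat, lon):
        if self._last is not None and t - self._last < self.min_interval:
            return None               # suppressed update
        self._last = t
        return cloak(lat, lon)
```

Coarser cells and longer intervals give more privacy but make location-based services less useful, which is the trade-off the slide describes.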


Privacy in the Face of Insider Threats

• A real problem

• Recent UCLA medical center case of workers who peeked at celebrity records

– They had the right to look at records

– But only for the appropriate reasons

• Generally, how do we preserve privacy when insider needs some access?


One Approach

• A better access control model

• Don’t give insider full access

• Only allow access when he really needs it

• But how do you tell?

• Could use optimistic access control


Optimistic Access Control

• Don’t normally allow access to protected stuff

– Even if worker sometimes needs it

• On need, require worker to request exceptional action

– And keep a record that it happened

• Audit log of exceptional actions later
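A minimal sketch of that mechanism (class, method, and field names are invented): access is granted on explicit request, never silently, and every grant is appended to an audit log for later review.

```python
from datetime import datetime, timezone

class OptimisticAccessControl:
    """Sketch of optimistic access control: grant sensitive accesses on
    request, but record each one so an auditor can review them later."""

    def __init__(self):
        self.audit_log = []

    def request_access(self, who, record_id, reason):
        # Optimistic: grant now, but remember that it happened.
        self.audit_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "who": who,
            "record": record_id,
            "reason": reason,
        })
        return True

    def flag_suspicious(self, is_reasonable):
        # is_reasonable: a human- or machine-supplied check; entries
        # failing it go to an authority for follow-up.
        return [e for e in self.audit_log if not is_reasonable(e)]
```

In the nurse scenario that follows, a lookup of a patient under treatment passes the audit while the celebrity lookup gets flagged.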


A Sample Scenario

• Nurse X needs to access medical records of patient Y

• Nurse X doesn’t have ordinary access

• She requests exceptional access

• Which is granted

– And a record is made of her request


Checking the Scenario

• An auditor (human or otherwise) examines all requests for sensitive access

• If a request is “not reasonable,” reports it to an authority

• In this case, nurse X acted properly

– Since patient Y was being treated


How About Misbehavior?

• Nurse X requests medical information on celebrity Z

• The request is logged

• The authority finds it bogus

– Good question: How?

• Nurse X’s ass is fired


How Does This Help?

• Every access to a sensitive record is logged

– Well, it could have been, anyway

• Worker is alerted to the sensitive nature of his actions

• There’s a built-in mechanism for validating access


Will It Be Too Cumbersome?

• Depends

• On frequency of requests

• And auditor’s ability to identify illegitimate accesses

• Also requires reasonable remediation method

• And strong authentication