8/12/2019 FPF Privacy Papers 2013
1/29
Privacy Papers for Policy Makers 2013
The publication of Privacy Papers for Policy Makers was supported by
AT&T, Microsoft, and GMAC.
January 1st, 2014
We are delighted to provide you with FPF's fourth annual Privacy Papers for Policy Makers, representing cutting-edge research and analytical work on a variety of important privacy topics. The featured papers analyze current and emerging privacy issues and propose solutions or offer free analysis that could lead to new approaches in privacy law. Academics, privacy advocates and Chief Privacy Officers on FPF's Advisory Board reviewed all submitted papers, emphasizing clarity, practicality and overall utility as the most important criteria for selection.
We received many excellent submissions from scholars on both sides of the Atlantic, and we
believe our Advisory Board has chosen a diverse and thought-provoking collection of papers.
Additionally, two of the papers were recipients of the IAPP award for best papers presented at
the 2013 Privacy Law Scholars Conference.
We hope this relevant and timely scholarship helps inform policy makers in Congress, at the FTC, and in other federal and state agencies as they address privacy issues. This compilation is also being provided to policy makers abroad.
We want to thank AT&T, Microsoft and GMAC for their special support of this project. And
thank you for your interest in exploring new ways to think about privacy.
Sincerely yours,
Christopher Wolf, Founder and Co-Chair
Jules Polonetsky, Executive Director and Co-Chair
Future of Privacy Forum Advisory Board
Alessandro Acquisti
Associate Professor of Information Technology and Public Policy at the Heinz College, Carnegie Mellon University
Ellen Agress
Senior Vice President and Deputy General Counsel, News Corporation
Annie I. Antón
Professor and Chair, Georgia Tech School of Interactive Computing
Jonathan Avila
Chief Privacy Officer, Walmart
Stephen Balkam
Chief Executive Officer, Family Online Safety Institute
Kenneth A. Bamberger
Professor of Law, Berkeley School of Law
Lael Bellamy
Chief Privacy Officer, The Weather Channel
Elise Berkower
Associate General Counsel, Privacy, The Nielsen Company
Debra Berlyn
President, Consumer Policy Solutions
Joan (Jodie) Z. Bernstein
Counsel, Kelley Drye & Warren, LLP and former director of the Bureau of Consumer Protection at the Federal Trade Commission
Michael Blum
General Counsel, Quantcast
Bruce Boyden
Assistant Professor of Law, Marquette University Law School
Allen Brandt
Corporate Counsel, Data Privacy & Protection, Graduate Management Admission Council (GMAC)
Justin Brookman
Director, Consumer Privacy, Center for Democracy & Technology
Stuart N. Brotman
Stuart N. Brotman Communications
J. Beckwith Burr
Deputy General Counsel and Chief Privacy Officer, Neustar
James M. Byrne
Chief Privacy Officer, Lockheed Martin Corporation
Ryan Calo
Assistant Professor, University of Washington School of Law; Affiliate Scholar, Stanford Center for Internet and Society
Anna-Lisa Corrales
General Counsel and Secretary, Jaguar Land Rover North America, LLC and Jaguar Land Rover Canada ULC
Dr. Ann Cavoukian
Information and Privacy Commissioner of Ontario
Brian Chase
General Counsel, Foursquare Labs, Inc.
Danielle Citron
Professor of Law, University of Maryland Law School
Allison Cohen
Managing Counsel, Toyota
Maureen Cooney
Head of Privacy, Sprint
Lorrie Faith Cranor
Associate Professor of Computer Science and Engineering, Carnegie Mellon University
Mary Culnan
Professor Emeritus, Bentley University
Simon Davies
Founder, Privacy International
Kim Dawson
Senior Director of Privacy, Nordstrom, Inc.
Michelle De Mooy
Senior Associate, National Priorities, Consumer Action
Elizabeth Denham
Information and Privacy Commissioner for British Columbia
Michelle Dennedy
Chief Privacy Officer, McAfee, Inc.
Benjamin Edelman
Assistant Professor, Harvard Business School
Erin Egan
Chief Privacy Officer, Policy, Facebook
Keith Enright
Senior Corporate Counsel, Google
Leigh Feldman
Chief Privacy Counsel, American Express
Alex Fowler
Global Privacy & Public Policy Lead, Mozilla
Eric Friedberg
Co-President, Stroz Friedberg
Christine Frye
Senior Vice President, Chief Privacy Officer, Bank of America
Arkadi Gerney
Senior Fellow, Center for American Progress
Julie Gibson
Global Privacy Program Leader, The Procter & Gamble Company
Jennifer Barrett Glasgow
Chief Privacy Officer, Acxiom
Scott Goss
Senior Privacy Counsel, Qualcomm
Kimberly Gray
Chief Privacy Officer, IMS Health
Sean Hanley
Director of Compliance, Zynga Game Network, Inc.
Pamela Jones Harbour
Former Federal Trade Commissioner; Partner, Fulbright & Jaworski LLP
Woodrow Hartzog
Assistant Professor, Cumberland School of Law, Samford University and Affiliate Scholar, The Center for Internet & Society at Stanford Law School
Eric Heath
Director of Legal Global Privacy, LinkedIn
Rita S. Heimes
Clinical Professor and Director, Center for Law and Innovation, University of Maine School of Law
Megan Hertzler
Director of Information Governance, Xcel Energy
Michael Ho
Chief Executive Officer, Bering Media
David Hoffman
Director of Security Policy and Global Privacy Officer, Intel
Lara Kehoe Hoffman
Privacy and Data Security Counsel, Autodesk
Marcia Hoffman
Staff Attorney, Electronic Frontier Foundation
Chris Hoofnagle
Director, Berkeley Center for Law & Technology's information privacy programs and senior fellow to the Samuelson Law, Technology & Public Policy Clinic
Jane Horvath
Director of Global Privacy, Apple, Inc.
Sandra R. Hughes
Chief Executive Officer and President, Sandra Hughes Strategies, Ltd.
Brian Huseman
Director, Public Policy, Amazon
Jeff Jarvis
Associate Professor; Director of the Interactive Program, Director of the Tow-Knight Center for Entrepreneurial Journalism at the City University of New York
David Kahan
General Counsel, Jumptap
Ian Kerr
Canada Research Chair in Ethics, Law & Technology, University of Ottawa, Faculty of Law
Bill Kerrigan
Chief Executive Officer, Abine, Inc.
Stephen Kline
Senior Counsel, Privacy and Regulatory Matters, Omnicom Media Group
Anne Klinefelter
Associate Professor of Law, Director of the Law Library, University of North Carolina
Fernando Laguarda
Vice President, External Affairs and Policy Counselor, Time Warner Cable
Barbara Lawler
Chief Privacy Officer, Intuit
Adam Lehman
Chief Operating Officer and General Manager, Lotame Solutions
Gerard Lewis
Senior Counsel and Chief Privacy Officer, Comcast
Chris Libertelli
Head of Global Public Policy, Netflix
Harry Lightsey
Executive Director, Federal Affairs, General Motors
Chris Lin
Executive Vice President, General Counsel and Chief Privacy Officer, comScore, Inc.
Brendon Lynch
Chief Privacy Officer, Microsoft
Mark MacCarthy
Vice President of Public Policy, The Software & Information Industry Association
Larry Magid
Co-Founder and Co-Director, ConnectSafely
Wendy Mantel
Privacy & IP Counsel, Hulu
Debbie Matties
Vice President, Privacy, CTIA-The Wireless Association
Michael McCullough
Vice President, Enterprise Information Management and Privacy, Macy's Inc.
William McGeveran
Associate Professor, University of Minnesota Law School
Terry McQuay
President, Nymity, Inc.
Scott Meyer
Chief Executive Officer, Evidon
Doug Miller
Global Privacy Leader, AOL, Inc.
Maggie Mobley
General Counsel and Chief Privacy Officer, Carrier IQ
Marcus Morissette
Privacy Counsel, eBay
Saira Nayak
Director of Policy, TRUSTe
Jill Nissen
Principal and Founder, Nissen Consulting
Lina Ornelas
General Director for Privacy Self-Regulation, Federal Institute for Access to Information and Data Protection Mexico
Kimberley Overs
Assistant General Counsel, Pfizer, Inc.
Harriet Pearson
Partner, Hogan Lovells US, LLP
Christina Peters
Senior Counsel, Security and Privacy, IBM
Robert Quinn
Chief Privacy Officer and Senior Vice President for Federal Regulatory, AT&T
MeMe Rasmussen
VP, Chief Privacy Officer, Associate General Counsel, Adobe Systems
Katie Ratt
Executive Counsel, Privacy Policy and Strategy, The Walt Disney Company
Joel R. Reidenberg
Professor of Law, Fordham University School of Law
Neil Richards
Professor of Law, Washington University Law School
Shirley Rooker
President, Call for Action
Mike Sands
President and Chief Executive Officer, BrightTag
Patrick Sayler
Chief Executive Officer, Gigya, Inc.
Russell Schrader
Chief Privacy Officer and Associate General Counsel Global Enterprise Risk, Visa, Inc.
Paul Schwartz
Professor of Law, University of California-Berkeley School of Law
Evan Selinger, Ph.D.
Associate Professor, Philosophy Department, Rochester Institute of Technology (RIT); MAGIC Center Head of Research Communications, Community & Ethics, RIT; Fellow, Institute for Ethics and Emerging Technology
Ho Shin
General Counsel, Millennial Media
Meredith Sidewater
Senior Vice President and General Counsel, Lexis Nexis Risk Solutions
Al Silipigni
Senior Vice President, Chief Privacy Officer, HSBC
Dale Skivington
Chief Privacy Officer, Dell
Will Smith
Chief Executive Officer, Euclid, Inc.
Daniel Solove
Professor of Law, George Washington University Law School
Cindy Southworth
Vice President of Development & Innovation, National Network to End Domestic Violence (NNEDV)
JoAnn Stonier
SVP and Global Privacy & Data Protection Officer, MasterCard, Inc.
Lior Jacob Strahilevitz
Sidley Austin Professor of Law, University of Chicago Law School
Greg Stuart
Chief Executive Officer, Mobile Marketing Association
Chris Sundermeier
General Counsel, Chief Privacy Officer, Reputation.com
Peter Swire
Nancy J. & Lawrence P. Huang Professor, Scheller College of Business, Georgia Institute of Technology
Omer Tene
Associate Professor, College of Management School of Law, Rishon Le Zion, Israel
Adam Thierer
Senior Research Fellow, Mercatus Center, George Mason University
Anne Toth
Trustworks Privacy Advisors
Catherine Tucker
Mark Hyman, Jr. Career Development Professor and Associate Professor of Management Science, Sloan School of Management, MIT
David C. Vladeck
Professor, Georgetown University; Former Director of the Bureau of Consumer Protection, Federal Trade Commission (FTC)
Hilary Wandall
Chief Privacy Officer, Merck & Co., Inc.
Daniel J. Weitzner
Co-Director, MIT CSAIL Decentralized Information Group; W3C Technology and Society Policy Director; Former Deputy Chief Technology Officer, The White House Office of Science and Technology Policy
Yael Weinman
Vice President, Global Privacy and General Counsel, Information Technology Industry Council
Robert Yonaitis
Senior Vice President of Engineering and Chief Research Scientist, AvePoint, Inc.
Karen Zacharia
Chief Privacy Officer, Verizon Communications, Inc.
Michael Zimmer
Assistant Professor in the School of Information Studies, University of Wisconsin-Milwaukee
Oracle
United Health Group
Yahoo!
(As of December 31, 2013)
Table of Contents
Digital Market Manipulation, by M. Ryan Calo*
Facing Real-Time Identification in Mobile Apps & Wearable Computers, by Yana Welinder
A Framework for Benefit-Cost Analysis in Digital Privacy Debates, by Adam Thierer
The FTC and the New Common Law of Privacy, by Daniel J. Solove and Woodrow Hartzog*
Information Privacy in the Cloud, by Paul M. Schwartz
Obscurity by Design, by Woodrow Hartzog and Frederic D. Stutzman
A Primer on Metadata: Separating Fact from Fiction, by Ann Cavoukian
Privacy in Europe: Initial Data on Governance Choices and Corporate Practice, by Kenneth Bamberger and Deirdre Mulligan
Reconciling Personal Information in the U.S. and EU, by Paul M. Schwartz and Daniel J. Solove
Why Data Privacy Law Is (Mostly) Constitutional, by Neil M. Richards
*Recipients of the IAPP award for best papers at the 2013 Privacy Law Scholars Conference
Out of respect for copyright law and for ease of reference, this compilation is a digest of the papers selected by the
Future of Privacy Forum Advisory Board and does not contain full text. The selected papers in full text are available
through the referenced links.
Digital Market Manipulation
M. Ryan Calo
Forthcoming in the George Washington Law Review.
Full paper available at: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2309703
Executive Summary:
Jon Hanson and Douglas Kysar coined the term "market manipulation" in 1999 to describe how companies exploit the cognitive limitations of consumers. Everything costs $9.99 because consumers see the price as closer to $9 than $10. Although widely cited by academics, the concept of market manipulation has had only a modest impact on consumer protection law.

This Article demonstrates that the concept of market manipulation is descriptively and theoretically incomplete, and updates the framework for the realities of a marketplace that is mediated by technology. Today's firms fastidiously study consumers and, increasingly, personalize every aspect of their experience. They can also reach consumers anytime and anywhere, rather than waiting for the consumer to approach the marketplace. These and related trends mean that firms can not only take advantage of a general understanding of cognitive limitations, but can uncover and even trigger consumer frailty at an individual level.

A new theory of digital market manipulation reveals the limits of consumer protection law and exposes concrete economic and privacy harms that regulators will be hard-pressed to ignore. This Article thus both meaningfully advances the behavioral law and economics literature and harnesses that literature to explore and address an impending sea change in the way firms use data to persuade.
Author:
M. Ryan Calo is an assistant professor at the University of Washington School of Law and an affiliate scholar at the Stanford Law School Center for Internet and Society. He is a co-director of the University of Washington's Tech Policy Lab. Calo researches the intersection of law and emerging technology, with an emphasis on privacy and robotics. His work on these and other topics has appeared in law reviews and major news outlets, including the New York Times, the Wall Street Journal, and NPR. In 2013, Professor Calo testified before the full Judiciary Committee of the United States Senate regarding the domestic use of drones. Professor Calo serves on numerous advisory boards, including the Electronic Privacy Information Center (EPIC), the Electronic Frontier Foundation (EFF), the Future of Privacy Forum, and National Robotics Week. Professor Calo co-chairs the Robotics and Artificial Intelligence committee of the American Bar Association and is a member of the Executive Committee of the American Association of Law Schools (AALS) Section on Internet and Computer Law.
Facing Real-Time Identification in Mobile Apps & Wearable Computers
Yana Welinder
Forthcoming in the Santa Clara Computer and High Technology Law Journal.
Full paper available at: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2280968
Executive Summary:
This article explores the privacy implications of face recognition technology in mobile applications and wearable computers and provides recommendations for developing policy with respect to these uses. Face recognition apps in portable devices challenge individuals' ability to remain anonymous in public places. They can also link individuals' offline activities to their online profiles, generating a digital paper trail of their every move. The apps can therefore interfere with the ability to go off the radar, which is often considered essential for quiet reflection and daring experimentation - processes that are critical for a productive and democratic society. So given what we stand to lose, we ought to be cautious with groundbreaking technological progress. It does not mean that we have to move any slower, but we should think about potential consequences of the steps that we take.

This article maps out the recently launched face recognition apps and still developing wearable computing technologies, as well as some emerging regulatory responses. Based on these developments, it offers initial considerations for better policy responses to these uses. The article recommends solutions that focus on how the relevant individuals could be put on notice, given that the apps will not only be using information about their users, but also about the persons being identified. It further recommends minimization of data collection and retention and discusses how biometric data can be kept secure. Today's face recognition apps mostly use photos from social networks. They therefore call for regulatory responses that consider the context in which users originally shared the photos. Most importantly, the article highlights that the Federal Trade Commission's first policy response to consumer applications that use face recognition did not follow the well-established principle of technology neutrality. The article argues that any regulation with respect to identification in real time should be technology neutral and narrowly address harmful uses of computer vision without hampering the development of useful applications.
Author:
Yana Welinder is a Legal Counsel at the Wikimedia Foundation and a Junior Affiliate Scholar at the Stanford Center for Internet & Society. Before joining Wikimedia, she was a Visiting Assistant Professor at California Western School of Law, where she taught Information Privacy Law and E-Commerce Law. Her research focuses on internet law, privacy, and intellectual property. She also works on net neutrality policy as a member of the UN Internet Governance Forum Dynamic Coalition on Net Neutrality. Yana has previously served as a Google Policy Fellow at the Electronic Frontier Foundation
and conducted research on the representation of privacy through user interface design as a fellow at Harvard Law School. She holds an LL.M. from Harvard Law School, a J.D. from the University of Southern California, and an LL.B. from the London School of Economics and Political Science.
A Framework for Benefit-Cost Analysis in Digital Privacy Debates
Adam Thierer
Published in The George Mason Law Review. Full paper available at:
http://www.georgemasonlawreview.org/doc/Thierer_Website.pdf
Executive Summary:
Policy debates surrounding online child safety and digital privacy share much in common. Both are complicated by thorny definitional disputes and highly subjective valuations of harm. Both issues can be subject to intense cultural overreactions, or "technopanics." It is common to hear demands for technical quick fixes or silver-bullet solutions that are simple yet sophisticated. In both cases, the purpose of regulation is some form of information control. Preventing exposure to objectionable content or communications is the primary goal of online safety regulation, whereas preventing the release of personal information is typically the goal of online privacy regulation. The common response is regulation of business practices or default service settings.

Once we recognize that online child safety and digital privacy concerns are linked by many similar factors, we can consider whether common solutions exist. Many of the solutions proposed to enhance online safety and privacy are regulatory in character. But information regulation is not a costless exercise. It entails both economic and social costs. Measuring those costs is an extraordinarily complicated and contentious matter, since both online child safety and digital privacy are riddled with emotional appeals and highly subjective assertions of harm.

This Article will make a seemingly contradictory argument: benefit-cost analysis (BCA) is extremely challenging in online child safety and digital privacy debates, yet it remains essential that analysts and policy-makers attempt to conduct such reviews. While we will never be able to perfectly determine either the benefits or costs of online safety or privacy controls, the very act of conducting a regulatory impact analysis (RIA) will help us to better understand the trade-offs associated with various regulatory proposals. However, precisely because those benefits and costs remain so remarkably subjective and contentious, this Article will argue that we should look to employ less restrictive solutions (education and awareness efforts, empowerment tools, alternative enforcement mechanisms, etc.) before resorting to potentially costly and cumbersome legal and regulatory regimes that could disrupt the digital economy and the efficient provision of services that consumers desire. This model has worked fairly effectively in the online safety context and can be applied to digital privacy concerns as well.

This Article focuses primarily on digital privacy policy and sketches out a framework for applying BCA to proposals aimed at limiting commercial online data collection, aggregation, and use. Information about online users is regularly collected by online operators to tailor advertising to them (so-called targeted or behavioral advertising), to offer them expanded functionality, or to provide them with additional service options. Such operators include social networking services, online search and e-mail providers, online advertisers, and other digital content
providers. While this produces many benefits for consumers, namely a broad and growing diversity of online content and services for little or no charge, it also raises privacy concerns and results in calls for regulatory limitations on commercial data collection or reuse of personal information.

This Article does not focus on assertions of privacy rights against government, however. The benefit-cost calculus is clearly different when state actors, as opposed to private actors, are the focus of regulation. Governments have unique powers and responsibilities that qualify them for a different type of scrutiny.

To offer a more concrete example of how privacy-related BCA should work in practice, the recent actions of the Obama administration and the Federal Trade Commission (FTC) are considered throughout the Article. The Obama administration has been remarkably active on commercial privacy issues over the past three years yet has largely failed to adequately consider the full range of costs associated with increased government activity on this front. It has also failed to conclusively show that any sort of market failure exists as it relates to commercial data collection or targeted online advertising or services.

At a minimum, this Article will make it clear why independent agencies should be required to carry out BCA of any privacy-related policies they are considering. Currently, many agencies, including the FTC and the Federal Communications Commission (FCC), are not required to conduct BCA or have their rulemaking activities approved by the White House Office of Information and Regulatory Affairs (OIRA), which oversees federal regulations issued by executive agencies. Regulatory impact analysis is important even if there are problems in defining, quantifying, and monetizing benefits, as is certainly the case for commercial online privacy concerns.

In Part I, this Article examines the use of BCA by federal agencies to assess the utility of government regulations. Part II considers how BCA can be applied to online privacy regulation and the challenges federal officials face when determining the potential benefits of regulation. Part III then elaborates on the cost considerations and other trade-offs that regulators face when evaluating the impact of privacy-related regulations. In Part IV, this Article will discuss alternative measures that can be taken by government regulators when attempting to address online safety and privacy concerns. This Article concludes that policymakers must consider BCA when proposing new rules but also recognize the utility of alternative remedies, such as education and awareness campaigns, to address consumer concerns about online safety and privacy.
Author:
Adam Thierer is a senior research fellow with the Technology Policy Program at the Mercatus Center at George Mason University. He specializes in technology, media, Internet, and free-speech policies, with a particular focus on online child safety and digital privacy. His writings have appeared in the Wall Street Journal, the Economist, the Washington Post, the Atlantic, and Forbes, and he has appeared
on national television and radio. Thierer is a frequent guest lecturer and has testified numerous times on Capitol Hill.

Thierer has authored or edited seven books on topics ranging from media regulation and child safety to the role of federalism in high-technology markets. He contributes to the Technology Liberation Front, a leading technology-policy blog. Thierer has served on several distinguished online-safety task forces, including Harvard University Law School's Internet Safety Technical Task Force. Previously, Thierer was president of the Progress and Freedom Foundation, director of telecommunications studies at the Cato Institute, and a senior fellow at the Heritage Foundation.

Thierer received his MA in international business management and trade theory at the University of Maryland and his BA in journalism and political philosophy from Indiana University.
The FTC and the New Common Law of Privacy
Daniel J. Solove & Woodrow Hartzog
Forthcoming in the Columbia Law Review.
Full paper available at: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2312913
Executive Summary:
One of the great ironies about information privacy law is that the primary regulation of privacy in the United States has barely been studied in a scholarly way. Since the late 1990s, the Federal Trade Commission (FTC) has been enforcing companies' privacy policies through, among other things, its authority to police unfair and deceptive trade practices.

Despite over fifteen years of FTC enforcement, there are hardly any judicial decisions to show for it. The cases have nearly all resulted in settlement agreements. Nevertheless, companies look to these agreements to guide their decisions regarding privacy practices. In practice, FTC privacy jurisprudence has become the broadest and most influential regulating force on information privacy in the United States, more so than nearly any privacy statute and any common law tort. It is therefore quite surprising that so little scholarly attention has been devoted to the FTC's privacy jurisprudence.

In this Article, we endeavor to map this uncharted terrain. We explore how and why the FTC, and not contract law, came to dominate the enforcement of privacy policies. In the late 1990s, it was far from clear that the body of law regulating privacy policies would come from the FTC and not from traditional contract and promissory estoppel. We seek to understand why the FTC jurisprudence developed the way that it did and how it might develop in the future. We contend that the FTC's privacy jurisprudence is the functional equivalent to a body of common law, and we examine it as such.

Our primary thesis is that through a common law-like process, the FTC's actions have developed into a rich jurisprudence that is effectively the law of the land for businesses that deal in personal information. This jurisprudence has the foundations to grow even more robust. By clarifying its standards and looking beyond a company's privacy promises, the FTC is poised to enforce a holistic and robust privacy regulatory regime that draws upon industry standards and consumer expectations of privacy to remain potent, feasible, and adaptable in the face of technological change.
Authors:
Daniel J. Solove is the John Marshall Harlan Research Professor of Law at the George Washington University Law School. He is also Senior Policy Advisor at Hogan Lovells. Additionally, he is the founder of TeachPrivacy, a company that provides privacy and security training.
One of the world's leading experts in privacy law, Solove is the author of numerous books, including Nothing to Hide: The False Tradeoff Between Privacy and Security (Yale 2011), Privacy Law Fundamentals (IAPP 2011), Understanding Privacy (Harvard 2008), and The Future of Reputation: Gossip and Rumor in the Information Age (Yale 2007). He is also the author of a textbook, Information Privacy Law, as well as more than 40 articles. Solove has testified before Congress and has consulted in a number of high-profile privacy cases.
Woodrow Hartzog is an Assistant Professor at the Cumberland School of Law at Samford University. He is also an Affiliate Scholar at the Center for Internet and Society at Stanford Law School. His research focuses on privacy, human-computer interaction, contracts, and robotics. His work has been or is scheduled to be published in numerous scholarly publications such as the Columbia Law Review, California Law Review, and Michigan Law Review and popular publications such as The Atlantic and The Nation. He has been quoted or referenced in numerous articles and broadcasts, including NPR, the New York Times, the Los Angeles Times, USA Today, and Bloomberg. He previously worked as a trademark attorney at the United States Patent and Trademark Office and in private practice. He has also served as a clerk for the Electronic Privacy Information Center.
Information Privacy in the Cloud
Paul M. Schwartz
Published in the University of Pennsylvania Law Review. Full paper available at:
http://www.pennlawreview.com/print/?id=402
Executive Summary:
Cloud computing is the locating of computing resources on the Internet in a fashion that makes them highly dynamic and scalable. Moreover, cloud computing permits dramatic flexibility in processing decisions, and on a global basis. The rise of the cloud has also significantly challenged established legal paradigms. This Article analyzes current shortcomings of information privacy law in the context of the cloud. It develops normative proposals to allow the cloud to become a central part of the evolving Internet. These proposals rest on strong and effective protections for information privacy that are sensitive to technological changes.

This Article examines three areas of change in personal data processing due to the cloud. The first area of change concerns the nature of information processing at companies. For many organizations, data transmissions are no longer point-to-point transactions within one country; they are now increasingly international in nature. As a result of this development, the legal distinction between national and international data processing is less meaningful than in the past. The jurisdictional concepts of EU law do not fit well with these changes in the scale and nature of international data processing. This Article proposes modifications to the applicable EU jurisdictional law and, in particular, the sweeping rules of the Proposed Draft Regulation.

A second legal issue concerns the multi-directional nature of modern data flows, which occur today as a networked series of processes made to deliver a business result. Due to this development, established concepts of privacy law, such as the definition of personal information and the meaning of automated processing, have become problematic. There is also no international harmonization of these concepts. As a result, European Union and U.S. officials may differ on whether certain cloud-based activities implicate the restrictions and regulations of privacy law. This Article applies the author's tiered conception of personally identifiable information (PII 2.0) to create incentives for cloud companies to maintain information in an identifiable or even nonidentifiable form and thus begin harmonizing the U.S. and EU approaches to PII.

A final change relates to a shift to a process-oriented management approach. Users no longer need to own technology, whether software or hardware, that is placed in the cloud. Rather, different parties in the cloud can contribute inputs and outputs and execute other kinds of actions. In short, technology has provided new answers to a question that Ronald Coase first posed in "The Nature of the Firm." New technologies and accompanying business models now allow firms to approach Coasian "make or buy" decisions in innovative ways. Yet, privacy law's approach to liability for privacy violations and data losses in the new "make or buy" world of the cloud may not create adequate incentives for the multiple parties who handle personal data. This Article explores the need for a model contract privacy law that would provide a core baseline of protections in business-to-consumer arrangements.
Author:
Paul Schwartz is a leading international expert on information privacy law. He is a professor at the University of California, Berkeley Law School and a director of the Berkeley Center for Law and Technology. He has testified before Congress and served as an advisor to international organizations, including Directorate Generals of the European Union. He assists numerous corporations and organizations with regulatory, policy, and governance issues relating to information privacy. Schwartz is a frequent speaker at technology conferences and corporate events in the United States and abroad. He is a Special Advisor to the privacy and data security practice of Paul Hastings LLP.

Professor Schwartz is the author of many books, including the leading casebook, Information Privacy Law, and the distilled guide, Privacy Law Fundamentals, each with Daniel Solove. Information Privacy Law, now in its fourth edition, is used in courses at more than 20 law schools. Schwartz's more than fifty articles have appeared in journals such as the Harvard Law Review, Yale Law Journal, Stanford Law Review, University of Chicago Law Review, and California Law Review. He publishes on a wide array of privacy and technology topics including data analytics, cloud computing, telecommunications surveillance, data security breaches, health care privacy, privacy governance, data mining, financial privacy, European data privacy law, and comparative privacy law.
Obscurity by Design
Woodrow Hartzog & Frederic D. Stutzman
Published in the Washington Law Review. Full paper available at:
http://digital.law.washington.edu/dspace-law/bitstream/handle/1773.1/1247/88WLR385.pdf?sequence=1
Executive Summary:
Design-based solutions to confront technological privacy threats are becoming popular with regulators. One popular design solution, Privacy by Design, has been described as "the philosophy and approach of embedding privacy into the design specifications of various technologies." However, these promising solutions have left the full potential of design untapped. With respect to online communication technologies, design-based solutions for privacy remain incomplete because they have yet to successfully address the trickiest aspect of the Internet: social interaction. This Article posits that privacy-protection strategies such as Privacy by Design face unique challenges with regard to social software and social technology due to their interactional nature.

This Article proposes that design-based solutions for social technologies benefit from increased attention to user interaction, with a focus on the principles of obscurity rather than the expansive and vague concept of privacy. The main thesis of this Article is that obscurity is the optimal protection for most online social interactions and, as such, is a natural locus for design-based privacy solutions for social technologies. To that end, this Article develops a model of "obscurity by design" as a means to address the privacy problems inherent in social technologies and the Internet.

Where the pursuit of privacy in design often seems like a quest for near-perfect protection, the goal of designing for obscurity is that it be "good enough" for most contexts or to accommodate a user's specific needs. As the natural state for many online social communications, obscurity is the logical locus for the front-end design of social technologies. Obscurity by design utilizes the full potential of design-based solutions to protect privacy and serves as a roadmap for organizations and regulators who seek to confront the vexing problems and contradictions inherent in social technologies.
Authors:
Woodrow Hartzog is an Assistant Professor at the Cumberland School of Law at Samford University. He is also an Affiliate Scholar at the Center for Internet and Society at Stanford Law School. His research focuses on privacy, human-computer interaction, contracts, and robotics. His work has been or is scheduled to be published in numerous scholarly publications such as the Columbia Law Review, California Law Review, and Michigan Law Review, and popular publications such as The Atlantic and The Nation. He has been quoted or referenced in numerous articles and broadcasts, including NPR, the New York Times, the Los Angeles Times, USA Today, and Bloomberg. He previously worked as a trademark attorney at the United States Patent and Trademark Office and in private practice.
He has also served as a clerk for the Electronic
Privacy Information Center.
Fred Stutzman is founder of Eighty Percent Solutions, a LAUNCH Incubator company which builds the innovative productivity software Freedom and Anti-Social. Previously, he was co-founder of ClaimID.com and a technology researcher at UNC-Chapel Hill and Carnegie Mellon University. He holds a Ph.D. in Information Science, a graduate certificate in quantitative research, and a B.A. in Economics. Currently, he is an adjunct professor at UNC's School of Information and Library Science, where he teaches courses about privacy and social media.
A Primer on Metadata: Separating Fact from Fiction
Ann Cavoukian
Full paper available at:
http://www.realprivacy.ca/index.php/paper/primer-metadata-separating-fact-fiction/
Executive Summary:
Since the June 2013 revelations of the NSA's sweeping surveillance of the public's metadata, the term "metadata" has been regularly used in the media, frequently without any explanation of its meaning. In an effort to educate the public and draw importance to this issue, Ontario's Information and Privacy Commissioner, Dr. Ann Cavoukian, set out in this paper to provide a clear understanding of metadata and how revealing its content can be.

Metadata's reach can be extensive, including information that reveals the time and duration of a communication, the particular devices used, email addresses, numbers contacted, which kinds of communications services were used, and at what geolocations. And since virtually every device we use has a unique identifying number, our communications and Internet activities may be linked and traced with relative ease, ultimately back to the individuals involved.

All this metadata is collected and retained by communications service providers for varying periods of time, and for legitimate business purposes. Key questions arise, however, including who else may have access to all this information, and for what purposes? Senior U.S. government officials have been defending their sweeping and systemic seizure of the public's communications data on the basis that it is "only metadata." They say it is neither sensitive nor privacy-invasive since it does not access the actual content contained in the associated communications.

A Primer on Metadata: Separating Fact from Fiction explains that metadata can be far more revealing than accessing the content of our communications. The paper disputes popular claims that the information being captured is neither sensitive nor privacy-invasive since it does not access any content. Given the implications for privacy and freedom, it is critical that we all question the dated but ever-so-prevalent either/or, zero-sum mindset of privacy vs. security. Instead, what is needed are proactive measures designed to provide for both security and privacy, in an accountable and transparent manner.

In this globally networked age, privacy knows no bounds; it is no longer simply a local issue. It transcends borders, demanding global attention. Accordingly, we urge governments to adopt a proactive approach to securing the rights affected by intrusive surveillance programs. To protect privacy and liberty, any power to seize communications metadata must come with strong safeguards, directly embedded into programs and technologies, that are clearly expressed in the governing legal framework. More robust judicial oversight, parliamentary or congressional controls, and systems capable of providing for effective public accountability should be brought to bear. The need for operational secrecy must not stand in the way of public accountability. Our essential need for privacy and the preservation of our freedom and liberty are at stake.
Author:
Ann Cavoukian, Ph.D., Information and Privacy Commissioner, Ontario, Canada.

Dr. Ann Cavoukian is recognized as one of the leading privacy experts in the world. Noted for her seminal work on Privacy Enhancing Technologies (PETs) in 1995, her concept of Privacy by Design seeks to proactively embed privacy into the design specifications of information technology and accountable business practices, thereby achieving the strongest protection possible. In October 2010, regulators from around the world gathered at the annual assembly of International Data Protection and Privacy Commissioners in Jerusalem, Israel, and unanimously passed a landmark Resolution recognizing Privacy by Design as an essential component of fundamental privacy protection. This was followed by the U.S. Federal Trade Commission's inclusion of Privacy by Design as one of its three recommended practices for protecting online privacy, a major validation of its significance. This was later followed by the inclusion of Privacy by Design in the draft EU Data Protection Regulation.

An avowed believer in the role that technology can play in the protection of privacy, Dr. Cavoukian's leadership has seen her office develop a number of tools and procedures to ensure that privacy is strongly protected globally. She has been involved in numerous international committees focused on privacy, security, technology, and business, and endeavours to focus on strengthening consumer confidence and trust in emerging technology applications.

Dr. Cavoukian serves as the Chair of the Identity, Privacy and Security Institute at the University of Toronto, Canada. She is also a member of several Boards, including the European Biometrics Forum, Future of Privacy Forum, and RIM Council, and has been conferred as a Distinguished Fellow of the Ponemon Institute. Dr. Cavoukian was honoured with the prestigious Kristian Beckman Award in 2011 for her pioneering work on Privacy by Design and privacy protection in modern international environments. In the same year, Dr. Cavoukian was also named by Intelligent Utility Magazine as one of the Top 11 Movers and Shakers for the Global Smart Grid industry, received the SC Canada Privacy Professional of the Year Award, and was honoured by the University of Alberta for her positive contribution to the field of privacy.
Privacy in Europe: Initial Data on Governance Choicesand Corporate Practice
Kenneth Bamberger & Deirdre Mulligan
Paper published in the George Washington Law Review. Full paper available at:
http://www.gwlr.org/2013/09/14/bamberger_mulligan/
Executive Summary:
As this Article goes to press, the European Union is embroiled in debates over the contours of a proposed new privacy regulation. These efforts, however, have lacked critical information necessary for reform. For, like privacy debates generally, they focus almost entirely on "law on the books": legal texts enacted by legislatures or promulgated by agencies.

By contrast, they largely ignore privacy "on the ground": the ways in which corporations in different countries have operationalized privacy protection in the light of divergent formal laws, different approaches taken by local administrative agencies, and other jurisdiction-specific social, cultural, and legal forces. Indeed, despite the new regulation's central goal of harmonizing privacy across Europe by preempting today's enormous variation in national approaches, policymakers have been hobbled by an absence of evidence as to which national choices about privacy governance have proven more or less resilient in the face of radical technological and social change. Information about the relative strengths and benefits of the alternate regulatory approaches that have flourished in the living laboratories of the European member states is largely undeveloped.

This Article begins to fill this gap, and at a critical juncture. Our "on the ground" project uses qualitative empirical inquiry, including interviews with, and questionnaires completed by, corporate privacy officers, regulators, and other actors within the privacy field in three European countries (France, Germany, and Spain) to identify the ways in which privacy protection is implemented in different jurisdictions, and the combination of social, market, and regulatory forces that drive these choices. It thus offers a comparative "in-the-wild" assessment of the effects of the different regulatory approaches adopted by these three countries, as well as a comparison with similar research previously completed about privacy on the ground in the United States.

Our comparative analysis indicates fundamental flaws in the dominant narratives regarding the regulation of privacy in the United States and Europe, accounts that have dominated privacy scholarship and advocacy for over a decade. Those narratives have portrayed the U.S. regulatory regime as a weak one that fails to provide across-the-board procedures that empower individuals to control the use and dissemination of their personal information. By contrast, those accounts promote a European model of privacy governance, typified by rigorous privacy principles embodied in law or binding codes, mandating processes to protect individual choice about the use of personal data, and interpreted and monitored by an independent and dedicated privacy agency, as the sort of privacy regime to which the United States must aspire.

Our research offers evidence to the contrary. First, it demonstrates that there is not one single European approach, but rather that the implementation of privacy
varies widely among European jurisdictions, reflecting different governance choices and regulatory approaches. Second, it suggests that a variety of "new governance" approaches to privacy, resonant in both German and U.S. privacy governance, contribute to regulatory frameworks that can both foster adaptability in the face of rapid technological change and encourage development of the type of privacy expertise within corporations that can respond to new privacy threat models raised by new products, services, and business models. Third, it indicates the shortcomings of a traditional European rights-based model of privacy protection focused on the protection of individual choice, and the strengths of a model intended to promote the operationalization of privacy within corporate structures, such that privacy expertise informs business decisions about technology, products, and services from the start of the development process to its completion.

In the face of novel challenges to privacy, leveraging the adaptability of distinct regulatory approaches and institutions has never been more important. As technological and social change has altered the generation and use of data, the definition of privacy that has prevailed in the political sphere (individual control over the disclosure and use of personal information) has increasingly lost its salience. In particular, the common instruments of protection generated by this definition, procedural mechanisms to protect individual choice, have offered an inapt paradigm for privacy protection in the face of data ubiquity and computing capacity. In developing new metrics for protecting privacy, policymakers must take into account a far more granular and bottom-up analysis of both differences in national practice and the forces on the ground that result in the diffusion, or lack thereof, of corporate structures and institutions that research suggests are most adaptive in protecting privacy in the face of change.
Authors:
Kenneth A. Bamberger is Professor of Law at the University of California, Berkeley, and Faculty Director of the Berkeley Center for Law and Technology. His research focuses on institutional design and decisionmaking, the governance of technology, and corporate regulation. In particular, his recent work explores the regulation of data protection and information privacy, the use of technology by administrative agencies, and the reliance on technology in corporate compliance. At Berkeley, Bamberger teaches Administrative Law, The First Amendment, and Technology and Governance.

Deirdre K. Mulligan is an Assistant Professor in the School of Information at UC Berkeley, and a Director of the Berkeley Center for Law & Technology. Prior to joining the School of Information in 2008, she was a Clinical Professor of Law, founding Director of the Samuelson Law, Technology & Public Policy Clinic, and
Director of Clinical Programs at the UC Berkeley School of Law (Boalt Hall). Mulligan is the Policy lead for the NSF-funded TRUST Science and Technology Center, which brings together researchers at U.C. Berkeley, Carnegie-Mellon University, Cornell University, Stanford University, and Vanderbilt University. Mulligan's current research agenda focuses on information privacy and security. Current projects include comparative, qualitative research to explore the conceptualization and management of privacy within corporations based in different jurisdictions, and policy approaches to improving cybersecurity. She is Chair of the Board of Directors of the Center for Democracy and Technology, and a Fellow at the Electronic Frontier Foundation. She is co-chair of Microsoft's Trustworthy Computing Academic Advisory Board, which comprises technology and policy experts who meet periodically to advise Microsoft about products and strategy. Prior to Berkeley, she served as staff counsel at the Center for Democracy & Technology in Washington, D.C.
Reconciling Personal Information in the U.S. and EU
Paul M. Schwartz & Daniel J. Solove
Forthcoming in the California Law Review. Full paper available at:
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2271442
Executive Summary:
U.S. and EU privacy law can greatly
diverge. Even at the threshold level of
determining what information is covered by
the regulation, the United States and
European Union differ significantly. The
existence of "personal information,"
commonly referred to as personally
identifiable information (PII), is the
trigger for when privacy laws apply.
PII is defined quite differently in U.S. and
EU privacy law. The U.S. approach
involves multiple and inconsistent
definitions of PII that are often quite
narrow. The EU approach defines PII to
encompass all information identifiable to a
person, which is a definition that can be
quite broad and vague. This divergence is
so basic that it threatens the current status
quo built around second-order mechanisms
for allowing international data transfers,
including the presently contentious Safe
Harbor. It also raises compliance costs for
companies who do business in both areas of
the world. But since both the United States
and the European Union are deeply
committed to their respective approaches,
attempts to harmonize U.S. and EU privacy
law by turning EU privacy law into a U.S.-
style approach, or vice versa, are unlikely to
succeed.
In this Essay, we argue that there is a way to
bridge these differences regarding PII. We
contend that a tiered approach to the
concept of PII (which we call "PII 2.0")
represents a superior way of defining PII
than the current approaches in the United
States and European Union. We also argue
that PII 2.0 is consistent with the different
underlying philosophies of the U.S. and EU
privacy law regimes. Under PII 2.0, all of
the Fair Information Practices (FIPs) should
apply when data refers to an identified
person or where there is a significant risk of
the data being identified. Only some of the
FIPs should apply when data is merely
identifiable, and no FIPs should apply when
there is a minimal risk that the data is
identifiable. We demonstrate how PII 2.0
advances the goals of both U.S. and EU
privacy law and how PII 2.0 is consistent
with their different underlying
philosophies. PII 2.0 thus advances the
process of bridging the current gap between
U.S. and EU privacy law.
Authors:
Paul Schwartz is a leading international expert on information privacy law. He is a professor at the University of California, Berkeley Law School and a director of the Berkeley Center for Law and Technology. He has testified before Congress and served as an advisor to international organizations, including Directorate Generals of the European
Union. He assists numerous corporations and organizations with regulatory, policy, and governance issues relating to information privacy. Schwartz is a frequent speaker at technology conferences and corporate events in the United States and abroad. He is a Special Advisor to the privacy and data security practice of Paul Hastings LLP.

Professor Schwartz is the author of many books, including the leading casebook, Information Privacy Law, and the distilled guide, Privacy Law Fundamentals, each with Daniel Solove. Information Privacy Law, now in its fourth edition, is used in courses at more than 20 law schools. Schwartz's more than fifty articles have appeared in journals such as the Harvard Law Review, Yale Law Journal, Stanford Law Review, University of Chicago Law Review, and California Law Review. He publishes on a wide array of privacy and technology topics including data analytics, cloud computing, telecommunications surveillance, data security breaches, health care privacy, privacy governance, data mining, financial privacy, European data privacy law, and comparative privacy law.
Daniel J. Solove is
the John Marshall
Harlan Research
Professor of Law
at the George
Washington
University Law
School. He is also
Senior Policy
Advisor at Hogan Lovells. Additionally, he
is the founder of TeachPrivacy, a company
that provides privacy and security training.
One of the world's leading experts in
privacy law, Solove is the author of
numerous books, including Nothing to Hide:
The False Tradeoff Between Privacy and
Security (Yale 2011), Privacy Law
Fundamentals (IAPP 2011), Understanding
Privacy (Harvard 2008), and The Future of
Reputation: Gossip and Rumor in the
Information Age (Yale 2007). Additionally,
he is the author of a textbook,
Information Privacy Law, as well as more
than 40 articles. Solove has testified before
Congress and has consulted in a number of
high-profile privacy cases.
Why Data Privacy Law Is (Mostly) Constitutional
Neil M. Richards
Excerpt from Intellectual Privacy (forthcoming). Full paper available at:
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2335196
Executive Summary:
A few kinds of privacy rights run into conflict with the First Amendment, most notably the old Warren and Brandeis argument for a tort by which the rich and famous could keep unflattering and embarrassing truths about them out of the newspapers. But privacy can mean many things, and most of these things are fully consistent with the American commitments to broad rights of free speech and free press. Specifically, we use the term "privacy" to refer to the many laws regulating personal data, including consumer credit and video rental information, and information given to doctors and lawyers. Despite claims from industry groups and a few isolated academics that these laws somehow menace free public debate, the vast majority of information privacy law is constitutional under ordinary settled understandings of the First Amendment. Policymakers can thus make information policy on the merits rather than being distracted by spurious free speech claims.

Throughout the world, democratic societies regulate personal data using laws that embody the Fair Information Practices, or FIPs. The FIPs are a set of principles that regulate the relationships between business and government entities that collect, use, and disclose personal information about data subjects, and which were developed by the United States Government in the 1970s. Over the past decade, some (but not all) industry groups and a handful of scholars have argued that the FIPs somehow offend the First Amendment, an argument seemingly strengthened by the Supreme Court's 2011 decision in Sorrell v. IMS Health, which struck down a Vermont law preventing drug reps (but no one else) from using data-based marketing to speak to physicians.

Before Sorrell, there was a settled understanding that general commercial regulation of the huge data trade wasn't censorship. It was seen, on the contrary, as part of the ordinary business of commercial regulation that fills thousands of pages of the United States Code and the Code of Federal Regulations. Nothing in the Sorrell opinion should lead policymakers to conclude that this settled understanding has changed. The poorly-drafted Vermont law in Sorrell discriminated against particular kinds of protected speech (in-person advertising) and particular kinds of protected speakers (advertisers but not their opponents). Such content and viewpoint discrimination would doom even unprotected speech under well-settled First Amendment law. As the Court made clear, the real problem with the Vermont law at issue was that it didn't regulate enough, unlike the more coherent policy of the undoubtedly constitutional federal Health Insurance Portability and Accountability Act of 1996.

Notwithstanding the Court's clarity on this point, a few observers have suggested that data flows are somehow "speech" protected by the First Amendment. But the "data is speech" argument makes no sense from a First Amendment perspective. People do things every day that are more clearly speech than a data flow, from blogging and singing in the shower to insider trading, sexually harassing co-workers, verbally abusing children, and even hiring
assassins. Well-settled First Amendment law allows us to separate out which of these activities cannot be regulated (the first two) from those which can (the rest). First Amendment lawyers don't ask whether something is "speech," because almost everything is expressive in some way. Instead, they ask which kinds of government regulation are particularly threatening to long-standing First Amendment values. And commercial regulation of sexual harassment, unfair trade practices, and commercial data flows based on the FIPs is rarely threatening to First Amendment values, properly understood by their settled meaning.

The ordinary understandings of First Amendment lawyers are supported by a more fundamental reason. During the New Deal, American society decided that, by and large, commercial regulation should be made on the basis of economic and social policy rather than blunt constitutional rules. This has become one of the basic principles of American Constitutional law. As we move into the digital age, in which more and more of our society is affected or constituted by data flows, we face a similar threat. If data were somehow "speech," virtually every economic law would become clouded by constitutional doubt. Economic or commercial policy affecting data flows (which is to say all economic or social policy) would become almost impossible. This might be a valid policy choice, but it is not one that the First Amendment commands. Any radical suggestions to the contrary are unsupported by our Constitutional law.

Privacy law is thus (mostly) constitutional. And when we're talking about the regulation of commercial data flows, it's entirely constitutional, except for a few poorly-drafted outliers like the law struck down in Sorrell. In a democratic society, the basic contours of information policy must ultimately be up to the people and their policymaking representatives, and not to unelected judges. We should decide policy on that basis, rather than on odd readings of the First Amendment.
Author:
Neil Richards is an internationally recognized expert in privacy law, information law, and freedom of expression. He is a professor of law at Washington University School of Law, a member of the Advisory Board of the Future of Privacy Forum, and a consultant and expert in privacy cases. He graduated in 1997 from the University of Virginia School of Law, and served as a law clerk to Chief Justice William H. Rehnquist. His first book, Intellectual Privacy, will be published by Oxford University Press in 2014.

Professor Richards' many writings on privacy and civil liberties have appeared in prominent legal journals such as the Harvard Law Review, the Columbia Law Review, the Virginia Law Review, and the California Law Review. He has written for a more general audience in Wired Magazine UK, CNN.com, and the Chronicle of Higher Education.

Professor Richards appears frequently in the media, and he is a past winner of the Washington University School of Law's Professor of the Year award. At Washington University, he teaches courses on privacy, free speech, and constitutional law. He was born in England, educated in the United States, and lives with his family in St. Louis. He is an avid cyclist and a lifelong supporter of Liverpool Football Club.
About the Future of Privacy Forum
Future of Privacy Forum (FPF) is a Washington, DC based think tank that seeks to advance
responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and
Christopher Wolf and includes an advisory board comprised of leading figures from industry,
academia, law and advocacy groups.
To learn more about FPF, please visit www.futureofprivacy.org