
Source: iapp.org/media/pdf/publications/advisor_03_12_print.pdf

Greetings,

With much fanfare, the Obama administration released its proposed blueprint for consumer

privacy last week. It was a profound moment; I was in the Eisenhower Room to witness it.

Since then, reaction has rung far and wide, with some lauding it and others predicting its

failure.

Now begins the work to put the blueprint in play. More people—“stakeholders”—will be involved—industry,

advocacy, academia, technologists. There will be debate, lobbying, conciliation.

It was the second major policy paper to have been released so far this year. The European Commission in January

proposed a new data protection framework and has embarked on the lengthy process of getting it before the

European Parliament for a vote.

The U.S. Federal Trade Commission is expected to release its final staff privacy report in the coming weeks—

another big paper at a big time for privacy and data protection.

Much of this work, in the U.S., the EU and in many other locales around the world, is being conducted on behalf of

consumers, yet consumers are largely unaware of the issues over which those working on their behalf split hairs.

Outside of the bubble of our profession, most people are much less informed.

However, there are signs that this is changing. The Pew Internet & American Life Project just released a report

showing that more Americans are employing the privacy settings on social networks, and data privacy makes the

headlines of mainstream newspapers across the globe on a near-daily basis now, which will inevitably lead to

greater awareness among the general population.

The IAPP is about to pass the 10,000-member mark. It is remarkable that the profession has grown to this extent

given low citizen involvement in the issues. What will happen when civil society takes note?

I believe there will be great truth in that moment.

Sincerely,

J. Trevor Hughes, CIPP

President & CEO, IAPP

Five considerations before publicizing privacy policy updates

By Mehmet Munur, CIPP/US, Sarah Branam and Matt Mrkobrad, CIPP/US/G

Changes in the law, in practices of your industry or to your business’s or vendor’s data collection or use practices

may trigger a need to update your privacy policy. We recommend that you think about the following five

considerations when making changes to your privacy policy. These considerations should help you educate your

users, be transparent and accurate in disclosing your practices, and steer clear of regulatory scrutiny.

Abide by your own privacy policy terms

It is crucial that any entity revising its privacy policy abide by the provisions in its own privacy policy for revisions.

For example, if the current privacy policy states that users will be e-mailed about the revisions to the privacy

policy, then users should be e-mailed about the revisions to the privacy policy.

Some courts have concluded that privacy policies are, in fact, contracts and must, therefore, be revised according

to their terms. If your privacy policy is incorporated into your website’s terms of use, there may be an even greater

likelihood that it will be considered a contract. However, other courts have disagreed with this conclusion.

Nevertheless, this distinction may be irrelevant, as the Federal Trade Commission (FTC) believes that privacy

policies represent “privacy promises,” and you must abide by them or face enforcement actions from the FTC for

deceptive or unfair trade practices under Section 5 of the FTC Act. Therefore, failure to abide by the terms of the

privacy policy in revising the policy could result in arguments for breach of contract or enforcement actions for

deceptive or unfair trade practices. In particular, the FTC has focused extensively on retroactive applicability of

privacy policy changes, as will be discussed in greater detail below.

Note that you can significantly mitigate the challenges presented by revising a privacy policy through proper and

thoughtful initial drafting of a privacy policy. For example, if you currently do not share data with affiliates for

marketing purposes, but anticipate that you may in the future, it would be shortsighted to state in the initial

policy that you will not share with affiliates for marketing purposes. It is best to consider all potential data uses

and transfers and accommodate those rather than continuously revise your policy.

Privacy policy effective date

All privacy policies should have an “effective date” or “last revised date” legend that is easily identifiable. Users

may easily identify revisions to a privacy policy if the effective date of the privacy policy has been changed. So, be

sure to revise the effective date to reflect the effective date of the new policy.

Moreover, providing advance notice of upcoming revisions to the privacy policy before they become effective for

already existing users may increase the enforceability of those revisions. During this period, users who are

dissatisfied with the revisions will have the opportunity to cease the use of the website or service without being

bound by the new revisions. If users continue using the website or service at the end of such a grace period, that

use may increase the likelihood that the revisions will be upheld in a court of law.
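The interplay among an effective date, an advance-notice window and a user's last acceptance can be sketched as simple date logic. This is a minimal illustration only; the dates, names and 30-day notice period are hypothetical assumptions, not requirements drawn from any law or regulation:

```python
from datetime import date, timedelta

# Hypothetical policy metadata; the dates and notice period are illustrative.
POLICY_EFFECTIVE_DATE = date(2012, 4, 1)
ADVANCE_NOTICE = timedelta(days=30)

def revision_status(today: date, user_last_accepted: date) -> str:
    """Classify where a user stands relative to a policy revision."""
    notice_start = POLICY_EFFECTIVE_DATE - ADVANCE_NOTICE
    if user_last_accepted >= POLICY_EFFECTIVE_DATE:
        return "accepted-current"  # user already accepted the new policy
    if today < notice_start:
        return "no-action"         # revision not yet announced
    if today < POLICY_EFFECTIVE_DATE:
        return "grace-period"      # advance notice shown; old terms still apply
    return "needs-notice"          # new policy in force; user has not accepted
```

During the "grace-period" state, a dissatisfied user can still walk away before the revision binds them, which is the point of the advance-notice window described above.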

Notice of changes to users

If there are material changes to the policy, there may be additional considerations with respect to notice. As

discussed below, based on FTC investigations and orders, you should not use data previously collected for new

purposes unless you obtain express consent to do so. With respect to ways companies have provided notice of

revised privacy policies, some companies have used the privacy policy link on their website to draw the user’s

attention to the revised privacy policy. For example, Yahoo and Google websites often indicate that the privacy

policy has been updated by using a “Privacy Policy (Updated)” label instead of the “Privacy Policy” link to draw

attention to the updates. While the use of this technique is rather new, it is a very effective method of alerting

users that the privacy policy has been updated. Such a method uses only a small amount of resources, but the

impact on users is significant.
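That labeling technique amounts to a one-line check, assuming the site tracks which policy version each user last viewed; the version counters here are a hypothetical sketch, not how Yahoo or Google actually implement it:

```python
def privacy_link_label(current_version: int, last_seen_version: int) -> str:
    """Return the footer link text: flag the policy as updated until the
    user has viewed the current revision."""
    if last_seen_version < current_version:
        return "Privacy Policy (Updated)"
    return "Privacy Policy"
```

Once the user opens the policy page, the site would record the current version as seen and the label reverts to the plain "Privacy Policy" link.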

Further, sending e-mails to registered users alerting them to the revisions is also an option. Doing so should

increase the perceived adequacy of notice. However, this method may not be applicable to services that do not

require user registration. In those circumstances, posting on a blog regarding the revisions may prove more

helpful. This may also provide notice to individuals who have not received an e-mail update. For

example, Dropbox, Google, LinkedIn and Yahoo often include these updates on their blogs.

If feasible, you could also publish previous versions of your privacy policy. Doing so may increase transparency to

consumers by providing an easily identifiable method for an individual to ascertain what was previously covered in

a privacy policy and what is currently covered. Google, IBM and eBay provide the previous versions of their privacy

policies on their websites. This provides the user another opportunity to review and understand the differences

between the policies. Even if you are unable to publish previous versions of the policy, you should always keep the

previous versions archived as business records for consumer, business, regulatory or governmental inquiries.

A final possibility for notice is publishing a summary of the revisions to the privacy policy. This allows users to

identify the revisions to the privacy policy without doing a line-by-line comparison of the previous version with the

new version. For example, Google, Dell and LinkedIn provide a summary or a comparison version of their privacy

policies to enable users to understand the differences between the previous and the new privacy policy. However,

you should ensure that any summary is accurate; an inaccurate summary may itself constitute a deceptive or unfair trade practice. In fact, in its recent enforcement action against Facebook, the FTC cited Facebook’s Privacy Wizard for misrepresenting

the summary of revisions to its privacy policy.

Express consent for retroactive applicability

If you would like the privacy policy to apply retroactively to data previously collected, you must obtain express

consent from users. The FTC’s guidance on the issue of retroactive revisions to privacy policies is clear that

“companies may not unilaterally alter their policies and use previously collected data in a manner that materially

differs from the terms under which the data was originally collected.” Therefore, material revisions to a privacy

policy that are also retroactive in application require that you obtain the explicit consent of the user in order to

avoid enforcement actions by the FTC. You may do this through click-through boxes with provisions stating, “I

have read and agree to the revisions to the Privacy Policy” at login screens.
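A click-through gate of that kind can be sketched as follows. The function and version numbers are hypothetical; a real consent record would also capture who agreed, to which policy text, and when:

```python
def login_allowed(agree_box_checked: bool,
                  consented_version: int,
                  current_version: int) -> bool:
    """Gate login on express consent to the current policy revision.
    Users who already consented to the current version pass through;
    everyone else must tick the 'I have read and agree' box."""
    if consented_version >= current_version:
        return True
    return agree_box_checked
```

The key design point is that consent is affirmative and version-specific: silence or continued use alone does not upgrade a user's consent to the revised terms for previously collected data.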

Consider implications of controversial changes to your privacy policy

Finally, you should reconsider any controversial revisions to the privacy policy. GM’s reversal on the revisions to its

privacy policy regarding the tracking of users for its OnStar service is one example of the kind of revisions that

may require reconsideration. GM revised its privacy policy so that it would continue to track users who no longer

used its service and possibly sell anonymized data relating to those users. However, the public uproar and the

congressional attention resulted in a course change regarding these privacy policy revisions. Therefore, you

should reconsider any revisions that may be too controversial or result in negative publicity before making the

revisions to how you collect and use personal information. You should also be prepared to justify your revisions

where necessary.

In conclusion, taking these five considerations into account when updating your privacy policy should ease the

concerns that users may have when you publish the updates to your privacy policy and allow your organization to

transition smoothly. The time and resources spent will be well worth it.

Mehmet Munur is an attorney at Tsibouris & Associates, LLC; Sarah Branam is the privacy manager for Epsilon, and

Matt Mrkobrad, CIPP/US/G, is an associate at Vorys, Sater, Seymour and Pease LLP.

Data privacy in the cloud—A dozen myths and facts

By Lothar Determann

Corporate and consumer users are increasingly embracing hosted information technology

solutions. Business models and terminologies vary and include service, rental and advertising-financed offerings, described with labels such as Software as a Service (SaaS), hosted solution and cloud computing. In line with current nomenclature, this article will use “cloud computing”

collectively for all hosted solutions that allow users to obtain additional functionality, storage or

processing capacity without having to buy additional devices or software copies. Instead, users access enhanced

software, computing power and data storage space on remote servers via existing computers and Internet

browsers. This typically means less upfront investment to users and opportunities for leverage, specialization and

economies of scale for providers.

While users increasingly embrace cloud computing, data privacy advocates, regulators and lawyers have been slower to do so. Critics raise concerns about perceived risks to the privacy and security of personal data. To them, cloud

computing means primarily that users transfer data to faraway systems they do not understand, own or control.

As is often the case with respect to legally and technologically complex topics, oversimplifications,

overgeneralizations, buzz words and slogans are quickly established and (ab)used to pursue various policy and

competitive agendas, including keeping jobs in-country, protecting local industries and shielding established

business models from disruptive alternatives.

In this article, I am taking aim at a dozen myths that tend to cloud the decision-making process regarding data

privacy compliance challenges related to hosted solutions. In conclusion, it is a myth that cloud computing is

somehow bad or risky for privacy or that it raises insurmountable compliance hurdles. It is a fact that data is far

more secure and protected in some clouds than on traditional systems and devices and that compliance

requirements are often very manageable—if approached reasonably from the vendor and customer side. For

illustration purposes, a global human resources information system (HRIS) shall serve as an example of a common application

that a multinational enterprise can patch together from decentralized legacy systems, host centrally itself or hire

a cloud computing vendor to host; e.g., Workday.

Myth 1: Cloud computing presents fundamentally new and unique challenges for data privacy and security

compliance

Fact is that consumers and companies have been entrusting specialized service providers with personal data for a

long time, including telecommunications companies, payment processors, accountants and various outsourcing

service providers; e.g., payroll, call centers and IT support. For nearly two decades, we have made the Internet an

integral part of our information society and economy—and the Internet is founded on the principle of

decentralized transfer of data across geographies, devices and connections. Even the very services and models

that are currently hyped as “cloud computing” have been promoted for 15 years or so by an up-and-coming

industry that initially referred to itself as “application service providers.” Transferring data to service providers and

remotely hosted solutions certainly creates challenges—but they are neither new nor unique.

Myth 2: Cloud computing involves more data sharing, and that is inherently bad for privacy

Fact is that transferring data to data processing agents—who process data only on behalf and in the interest of

the customer—is very different from transferring data to data controllers, who use data in their own interest and

possibly impair the data subject’s privacy. In fact, transferring data to processing agents is so different from

transferring to data controllers that German data protection laws define “transfer of personal data” specifically to

exclude sharing with data processing agents.

Data sharing with data processing agents is not inherently bad or good for privacy—it is neutral. Companies always need people to handle data, because a company is a legal fiction: when a company uses and processes personal data, it has to act through natural persons, and those persons can be statutory employees, individual independent contractors, or employees or contractors of corporate contractors. In each of these scenarios, it

is important that the individual person who processes the data acts on behalf and in the interest of the data

controller. And it is important that the data controller in turn complies with applicable data protection laws. It is

less relevant for privacy compliance purposes how the data controller company engages and compensates the

person who conducts the processing—as employee or independent contractor. It is important that the person

follows the law and applicable instructions.

For most companies, the switch from internally maintained human resources databases to an external cloud-

based HRIS solution does not result in more sharing or transmission to additional geographies. Most data these

days is transferred across geographical borders because it is needed in various locations. Traditionally, data has

been shared over the Internet, via myriad devices, connections and persons with access. Switching from paper

files, spreadsheets and e-mailing across legacy systems with varying degrees of data protection to a centralized

cloud computing solution with access controls will not add to the prevalence of data sharing. Usually, it just

means more orderly, organized and secure sharing.

Myth 3: Cloud computing is bad for data security

Fact is that employee malice and negligence (e.g., a lost laptop or smartphone) cause many data security breaches, and hacks by cybercriminals are also on the rise. Whether personal data is safer on a system secured by

the data controller in house or an external vendor depends on security measures deployed by each particular

organization. Moving data to the cloud can be a bad thing for data security if the vendor is weak on security and

careless. It can be a good thing if the vendor brings better technologies to the table and helps the data controller

manage access, data retention and data integrity. And it can be neutral if the vendor’s cloud system is more

secure but the way the customer uses the system keeps exposing the data; e.g., because the customer does not

configure security to properly restrict data to the appropriate users, the customer uses unsecured connections or

the customer downloads data from the cloud to unsecured local devices.

Returning to the example of a global HRIS, each multinational employer needs to ask itself whether its own IT

capabilities and security policies are superior to the measures deployed by a specialized vendor that can leverage

economies of scale and is motivated by the risk of reputational harm to keep data of its customers secure. If the

vendor has better security measures, systems and processes, then cloud computing should be good for data

security. If the employer instructs its worldwide employees to stop using spreadsheets on local devices and paper

printouts, and instead limits access to data in the HRIS on a need-to-know basis and subject to differentiated

controls, then ultimately the risk of unauthorized access, inadvertent disclosures and outdated or inaccurate data

should diminish.
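Need-to-know access of the kind described can be sketched as a simple role-to-field mapping. The roles, field groups and the rule that employees may always see their own record are illustrative assumptions, not features of Workday or any particular HRIS product:

```python
# Hypothetical need-to-know access check for HRIS records.
ROLE_FIELDS = {
    "payroll":    {"salary", "bank_details"},
    "hr_manager": {"salary", "performance", "contact"},
    "employee":   {"contact"},
}

def can_view(role: str, field: str, viewer_id: str, record_owner_id: str) -> bool:
    """Allow access to a field of another employee's record only if the
    viewer's role needs it; employees may always see their own record."""
    if viewer_id == record_owner_id:
        return True
    return field in ROLE_FIELDS.get(role, set())
```

Centralizing checks like this in one system is what replaces the scattered spreadsheets and printouts the text describes, so that every access decision is made, and can be audited, in one place.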

Myth 4: Cloud computing causes additional issues under privacy law because data is transmitted

internationally

Fact is that most companies are already transmitting data internationally because they use the Internet—for

example, to e-mail spreadsheets to various office locations— or because they have subsidiaries, customers,

suppliers or channel partners in other jurisdictions. In most cases, data transfers occur because data is needed in

different jurisdictions, not because of where the data is stored. With respect to employee data in a global HRIS,

companies need to cause personal data to be transferred across borders to populate the system—whether they

host it themselves or engage a cloud computing service provider.

Myth 5: Data in the U.S. is endangered by the USA PATRIOT Act

Fact is that the United States enacted the Uniting and Strengthening America by Providing Appropriate Tools

Required to Intercept and Obstruct Terrorism Act (USA PATRIOT Act) in October 2001, following the September

11 terrorist attacks, to help fight terrorism and money laundering activities and to provide certain additional

investigative powers to U.S. law enforcement officials. But:

- These powers are not relevant for most types of data in cloud computing arrangements;
- The government is much more likely to obtain the data directly from the data controllers; i.e., users of cloud computing services;
- If the data controller is not based in the United States and has no strong nexus to the United States, chances are the U.S. government is not interested in the data regardless of whether the data is hosted in the United States;
- If the U.S. government is interested, it can usually obtain access to the data through foreign governments via judicial assistance and cooperation treaties, regardless of where the data is hosted;
- Similar powers exist in most other countries, and data is typically much more at risk of being accessed by governments at the place where the data controller is based, and
- The USA PATRIOT Act issue is raised to support unrelated agendas, such as protecting local companies or unionized jobs from global competition.

The information that the U.S. government seeks to fight terrorism and money laundering is not what most

companies store or process in the cloud. Returning to our global HRIS example, it seems quite unlikely that the

U.S. government would be interested in the kind of data that resides in an HRIS. Even if the U.S.

government were interested, it would more likely turn to the employer first to obtain the data, because the

employer is more closely connected to the data subjects and may have additional information, and the

government would not know initially what cloud services the employer uses. If the employer is located in another

country, the U.S. government can typically exercise pressure through some kind of nexus—even Swiss and other

foreign banks, for example, had to turn over banking details to the U.S. government due to pressure on their

market presence and activities in the United States.

Also, the U.S. government can obtain information through cooperation with foreign governments. The U.S. has

entered into mutual legal assistance treaties with over 50 countries, including Canada, Mexico, Australia, Hong

Kong and a majority of the EU Member States, as well as a mutual legal assistance agreement with the EU. The

cooperation under mutual legal assistance arrangements can include substantial sharing of electronic information

between law enforcement authorities in the two countries. Similarly, the G8 nations (Germany, France, the

United Kingdom, Italy, Japan, the U.S., Canada and Russia) recently reaffirmed their commitment to enhance the

exchange of information and judicial cooperation with respect to individuals involved in acts of terrorism.

Accordingly, in most cases, even if information is stored outside of the U.S., it is still possible for U.S. law

enforcement authorities to obtain such information through existing treaties or agreements with the jurisdiction

in which the information is stored.

In some cases, the additional hurdles established by jurisdictional complications will make a difference, but this

difference works both ways. For example, a U.S. bankruptcy court recently refused to hand over e-mails to and

from a German resident to the German government. In this case, the German resident’s privacy was better

protected because his e-mails were stored in the United States. The German government would have

easily gotten access to his e-mail if he had used a German Internet service provider. Most countries around the

world allow law enforcement access to private information to a similar extent as in the United States, and many

countries have updated their privacy laws to provide for minimum data retention requirements—European

telecommunications laws go beyond what the U.S. requires—and otherwise lowered privacy protection

standards.

In one of the first cases that raised concerns regarding the USA PATRIOT Act in the context of international

outsourcing, a trade union in Canada, the British Columbia Government and Service Employees’ Union, sued the

British Columbia Ministry of Health Services. The union tried to prevent the British Columbia government from

contracting out the administration of the province’s public health insurance program to a U.S.-based service

provider. The union argued that the proposed outsourcing would contravene British Columbia’s public-sector

privacy law (FOIPPA) by making the personal health information of British Columbians accessible to U.S.

authorities pursuant to the provisions of the USA PATRIOT Act. The court dismissed the case, and the trade union

initiative has since been cited as an example of privacy concerns being raised as an excuse for other agendas, such

as local job or industry protectionism.

Myth 6: Record keeping laws require data to stay local

Fact is that some tax, bookkeeping and corporate laws in some jurisdictions historically required certain records to

stay in country, but such requirements apply only to certain kinds of records and they do not prohibit the transfer

of data into the cloud so long as originals or backup copies are also kept local. If and to the extent such laws were

to apply to employment records, which is not typically the case in most major jurisdictions, the global employer

and HRIS user could still upload copies of the records into the global HRIS—whether self-hosted or hosted by a

cloud computing service provider.

Myth 7: U.S.-EU Safe Harbor doesn’t apply to service provider arrangements

Fact is that the Safe Harbor Principles expressly state that data processors may participate and achieve adequacy

through certification, and that the EU Commission ordered European Economic Area (EEA) Member States to

consider companies ‘adequate’ if they certify under the U.S.-EU Safe Harbor program, whether they act as data

controllers or processors. The U.S.-EU Safe Harbor program has been heavily criticized over the years, and the

head of a data protection authority in one German state has even called for a rescission of the program. However,

attention-grabbing calls by local politicians do not seem much more relevant or noteworthy than the occasional buzz around plans for Bavaria to secede from Germany or Texas to secede from the United States. Until the European

Union or the United States cancels the Safe Harbor arrangement, all data protection authorities in the 30 EEA

Member States are legally obligated to accept that Safe Harbor-certified cloud computing service providers in the

United States are as “adequate” as EEA-based service providers.

Of course, data controllers need to verify that each particular service provider can be trusted with personal data,

but this requirement applies equally if such provider is based in the United States and Safe Harbor-certified or not

certified and based in a Member State of the EEA. When comparing the relative strengths and weaknesses of data

protection technologies and laws across geographies, it is worth recalling that the EEA does not only consist of

France and Germany, which historically pride themselves on relatively high data protection law and technology

standards, but also 28 other countries, some of which have joined the EEA only recently and do not have much of

an information technology industry or history of data protection laws.

As a data protection law practitioner and scholar in both Germany and California, I am troubled by calls for an end

to cloud computing and data transfers to the United States by German politicians, singling out highly respected

global technology leaders headquartered in California with unsubstantiated attacks, while accepting, without any

additional scrutiny, the free flow of data in the EEA. Why should a multinational employer assume that its global

HRIS is more secure in Bulgaria and Romania—the youngest EEA Member States—compared to California?

Nothing from a data protection law, industry and policy perspective supports such double standards.

Unsubstantiated attacks do not help advance the discussion but risk provoking concerns regarding local

protectionism and national arrogance.

Myth 8: Contractual clauses are unnecessary if the service provider is Safe Harbor-certified

Fact is that the Safe Harbor certification only qualifies a service provider outside the EEA as being as adequate as a service

provider within the EEA is presumed to be. But, European laws require particular contractual clauses for data

transfers to any service provider—whether the provider is in the EEA or outside.

A company has to clear three hurdles before it may transfer European personal data internationally—into the

cloud or otherwise. First, the collection and use has to be permitted, based on consent, contractual duties,

statute, etc. Second, the transfer has to be justified, and third, the recipient has to achieve adequacy if it is not

based in the EEA or a country that has been declared adequate by the EU Commission. By certifying under the

U.S.-EU Safe Harbor, a U.S.-based service provider helps its customers over the third hurdle—adequacy—but the

Safe Harbor certification does not justify the transfer as such. Whether a recipient is in the U.S. or in the EEA, the

customer has to justify any transfer. In the context of data transfers to other data controllers, i.e., recipients that

want to use the data in their own interest, data controllers typically have to obtain consent from the data subject

or rely on statutory data transfer obligations. But, when a company transmits data to a service provider, it does

not have to obtain consent so long as it retains control over the data via an adequate contractual arrangement.

This is where the need for certain particular contractual clauses comes into play.
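The three-hurdle analysis above can be sketched, purely for illustration, as a small decision function; the function and parameter names are my own shorthand, not legal terms of art:

```python
# Illustrative sketch only: a simplified model of the three hurdles described
# above for international transfers of European personal data. The names and
# the jurisdiction set below are assumptions, not legal terminology.

ADEQUATE_JURISDICTIONS = {"EEA", "EU-adequacy decision"}  # deliberately simplified

def transfer_permitted(lawful_basis: bool,
                       transfer_justified: bool,
                       recipient_jurisdiction: str,
                       safe_harbor_certified: bool = False) -> bool:
    """Return True only if all three hurdles are cleared.

    1. Collection and use rest on a lawful basis (consent, contract, statute).
    2. The transfer itself is justified (e.g., an adequate processor agreement).
    3. The recipient achieves adequacy, by location or via Safe Harbor.
    """
    if not lawful_basis:        # hurdle 1
        return False
    if not transfer_justified:  # hurdle 2
        return False
    return (recipient_jurisdiction in ADEQUATE_JURISDICTIONS
            or safe_harbor_certified)  # hurdle 3

# Safe Harbor clears hurdle 3 but never hurdle 2 on its own:
print(transfer_permitted(True, False, "US", safe_harbor_certified=True))  # False
print(transfer_permitted(True, True, "US", safe_harbor_certified=True))   # True
```

The point of the sketch is the article's own: certification substitutes only for the adequacy test, so a transfer without its own justification still fails.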

One option is to sign the Standard Contractual Clauses promulgated by the EU Commission for data transfers to

data processors. These are not absolutely required if the service provider is certified under the U.S.-EU Safe

Harbor program, because the Safe Harbor certification already provides adequacy. But, the alternative would be

to implement agreements that satisfy the national law requirements for agreements with service providers. The

templates promulgated, blessed or prescribed by the various data protection authorities in the 30 EEA Member

States vary quite a bit, and it is less clear how binding such clauses are and whether they will be accepted without

questions and scrutiny in data protection authority approval processes. Consequently, any customer or service

provider opting for “self-made” clauses or national government templates should be prepared for time- and cost-consuming legal analysis, lengthy negotiations and, ultimately, a multitude of agreements. By comparison, the European Commission’s SCC may appear more attractive: all EEA Member States are supposed to accept this form as assuring “adequacy”; it is in both parties’ best interest not to negotiate or modify the SCC, which preserves the commission’s preemptive blessing regarding adequacy, and one form agreement can be used for the entire EEA.

Myth 9: Data privacy and security law compliance is the provider’s responsibility


Fact is that data privacy and security laws primarily hold the data controller responsible for compliance, i.e., the customer in a service provider relationship. The customer has to ensure that the data made available to the service provider has been legally collected, that data subjects have consented or received notice, that filings have been made, etc. The service provider typically has only two duties under data privacy laws: It has to follow its customer’s instructions, and it has to keep the data secure against unauthorized access. The second duty, data security, also rests first and foremost on the data controller. Therefore, it is important for customer and vendor to reach

a reasonable agreement about what level of security is appropriate for particular types of data and who should be

doing what. For example, if a customer hires a service provider to store archival statistical data, music files or

strongly encrypted information in the cloud, it may not be necessary for the vendor to invest heavily in security

features because unauthorized disclosure of such data would not typically harm the data subjects. On the other

hand, backup copies of credit card transaction information should be very carefully guarded because such

information is actively pursued by hackers to steal identities and commit fraud. Data in an HRIS can also include highly sensitive information, e.g., U.S. Social Security numbers, a primary target for identity thieves, even though most of the data typically stored in an HRIS is of little or no interest to hackers.

If the service provider offers a specialized application for certain types of data that are typically sensitive and

worth protecting, the service provider should take the initiative and either implement certain security measures

itself, encryption for example, or recommend that its customers do so. Customers should also take the initiative to

identify the various compliance obligations that apply to a particular type of data processing activity and then

consider which of these obligations can or should efficiently be handled by service providers and which are better

discharged by the data controller itself. In the end, the various compliance obligations on the data controller

depend on where the data controller is located, and if the data processor is based in another jurisdiction, the data

controller will need to educate and instruct the data processor about the compliance requirements that need to

be satisfied.
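As a toy illustration of the encryption recommendation above, here is one way a customer could pseudonymize a highly sensitive field, such as a U.S. Social Security number, with a keyed hash before uploading it to a hosted HRIS; the field and key names are assumptions for the example:

```python
# Illustrative sketch only: the data controller replaces the raw SSN with a
# keyed-hash token before upload, so the provider never stores the raw value.
import hmac
import hashlib

# The key stays with the data controller (customer), never with the provider.
SECRET_KEY = b"customer-held-key"  # hypothetical; in practice, a managed secret

def pseudonymize_ssn(ssn: str) -> str:
    """Return a stable HMAC-SHA256 token standing in for the raw SSN."""
    return hmac.new(SECRET_KEY, ssn.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"employee_id": 1001, "ssn": "078-05-1120"}
record["ssn"] = pseudonymize_ssn(record["ssn"])
# The uploaded record now carries only the token. Because the hash is keyed
# and deterministic, the controller can recompute the token from a raw SSN
# to match records, while the provider cannot reverse it.
```

This reflects the article's allocation of duties: the controller decides which fields need protection and applies it before the data ever reaches the processor.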

Myth 10: Cloud service providers cannot cede control to their customers

Fact is that many organizations find it difficult to stay in control over modern IT systems, whether they hire

service providers to provide IT infrastructure or whether they host, operate and maintain systems themselves.

Even with respect to self-operated systems, most companies usually have to work with support service providers

who have to be granted access to the systems and data to analyze performance problems, troubleshoot errors

and provide support and maintenance. Most companies find it prohibitively expensive to customize systems—

whether self-hosted or hosted by a service provider—beyond the configuration options provided by the vendor as

part of the standard offering. Consequently, there are significant limits to the degree of control that users can and

want to exercise over their systems, whether self-hosted or hosted by a provider.

Yet, from a legal perspective, it is imperative that the service provider remains in the role of a data processor and

the customer in the role of the data controller. If the service provider obtains or retains too much discretion about

aspects or details of the processing, the service provider could become a co-controller, which is not acceptable for

either party. The service provider would suddenly assume all kinds of compliance obligations, including duties to issue

notices to data subjects, assure data integrity, grant access and correction requests, submit filings to data

protection authorities, ensure compliance with data retention and deletion requirements, etc. A cloud computing

service provider cannot discharge these data controller obligations because it does not know the data subjects or

what data is uploaded into its systems. And, if the service provider did in fact qualify as data controller, then the

customer would typically violate statutory prohibitions and privacy policy promises regarding data sharing.

Therefore, both provider and customer have to work toward an arrangement that keeps the provider limited to

the role of a data processor.


In the context of self-hosted systems, it tends to be easier to prove that the provider retains little or no control

over the system after delivery. In the cloud computing scenario, on the other hand, the data resides on servers at locations that the provider controls. But, it is important to note that “control” from a data protection law

perspective refers to “control” of the data—not “control” of the premises where data resides. Landlords, for

example, are not viewed as data controllers merely because they own a building where data is stored and have

access and repossession rights under contract and statutory law regarding the building and tenant property.

The focus of control regarding cloud computing is to ensure that the customer decides what data to upload,

download, access, transfer, delete and otherwise process. This is the case with respect to most cloud computing

offerings because the service providers tend to offer a platform or software functionality as a service, without any

interest, knowledge or influence regarding data types and processing purposes. For example, with respect to a

hosted HRIS, the service provider does not have any interest in viewing or using the data that the customer uploads and processes, but the provider may need to access data in order to provide support for technical issues. If the parties contractually and technologically assure that the vendor’s personnel access data on the system only to provide the service and resolve support issues, then the customer is nearly as much in control as the

customer can be with respect to self-hosted systems.
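The technological side of that assurance can be pictured with a hypothetical access gate: provider personnel reach customer data only in connection with an open support ticket, and every attempt is logged for the customer's review. All names here (SupportTicket, access_customer_data) are invented for the sketch:

```python
# Illustrative sketch only: a support-ticket gate with an audit trail, one way
# a provider could technologically limit its personnel's access to service needs.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class SupportTicket:
    ticket_id: str
    is_open: bool

access_log = []  # made available to the customer for review

def access_customer_data(engineer: str, record_id: str,
                         ticket: Optional[SupportTicket]) -> bool:
    """Grant access only when tied to an open support ticket; log every attempt."""
    granted = ticket is not None and ticket.is_open
    access_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "engineer": engineer,
        "record": record_id,
        "ticket": ticket.ticket_id if ticket else None,
        "granted": granted,
    })
    return granted

print(access_customer_data("eng-7", "emp-1001", None))  # denied: no ticket
print(access_customer_data("eng-7", "emp-1001", SupportTicket("T-42", True)))  # granted
```

The log, not the denial, is what keeps the customer in the controller role: it lets the customer verify after the fact that access occurred only for service reasons.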

Nearly, because with respect to self-hosted systems, the customer can monitor and safeguard physical security and access limits itself, although in practice many companies outsource premises security to security service

providers. To keep the customer in control, the cloud computing service provider has to provide key information

about storage locations, processing practices and subcontractors. Some service providers withhold such information on trade secret grounds, and this can create friction with the objective of keeping the customer in control for purposes of data protection law compliance.

To resolve such conflicts and keep the service provider in the “processor” role, vendors could disclose to their

customers key aspects of their data processing practices, equipment locations and significant subcontractors with

access to the customer’s personal data. Also, any contractual clauses to safeguard control and data protection

need to be passed on to the subcontractors. In connection with cost-efficiently designed, standardized cloud

computing solutions, it can be very difficult and expensive for providers to accommodate customization requests

by individual customers. But, a customer can also remain in control over data processing if the customer retains a

right to receive prior notice regarding all relevant details of the data processing and changes thereto, so that the

customer can withdraw data or change the use of a cloud solution in case changes are not acceptable for certain

kinds of data. Or, the customer could agree or instruct the provider to update service and technology from time to

time, as the provider deems appropriate, on the condition that the provider will not lessen the security and

privacy measures set forth in the agreement. Whether customers and providers also agree on contract

termination rights and early termination fees is a commercial point and not prescribed from a data protection

compliance perspective.

Myth 11: Vendor has and should accept unlimited liability for data security breaches

Fact is that service providers may not always be able to limit their liability vis-à-vis the data subjects in scenarios

where they contract with corporate customers and not the data subjects themselves. If hackers gain unlawful

access to employee information residing in a global HRIS, the service provider may be liable directly vis-à-vis the

employees under negligence theories—if and to the extent economic harm resulting from data access is covered

by tort liability under a particular jurisdiction’s laws.

But, data protection laws do not prescribe the allocation of commercial liabilities between the parties.

Sophisticated companies usually slice and dice exposure in various ways in indemnification, limitation of liability


and warranty clauses. It is quite common to differentiate in risk allocation clauses based on whether customer

and/or service provider contributed primarily to a breach or resulting harm; whether the service provider was in

compliance with its contractual obligations, its information security policies and applicable law, and whether a risk

materialized that could have affected any other company, including the customer.

Also, cloud service providers are increasingly mindful that they can be held liable for violations of laws by their

customers or their customers’ customers, for example in the context of uploaded viruses, illegally copied files and

pornographic materials. Such risks are then shifted contractually from the provider to the customer.

Myth 12: Customer needs to have the right to access the provider’s data centers and systems for audit

purposes

Fact is that customers need to reserve a right to audit the cloud service provider’s compliance measures. But, it is

also a fact that the service provider may not let customers into its data centers or systems because that would

impair the security of other customers’ data. Also, individual audits would be unnecessarily disruptive and costly.

As a compromise, cloud service providers can arrange for routine, comprehensive audits of their systems by a

generally accepted audit firm and make the results available to all customers. If customers demand additional

topics on the audit list, providers can expand the scope of the next scheduled audit, if the customers are willing to

pay for the additional effort and the controls are within the scope of the service.

Beyond generally applicable compliance and audit requirements, there can occasionally be additional needs with

respect to particular types of data processing arrangements or industries. For example, European laws

implementing the Markets in Financial Instruments Directive (2004/39/EC, MiFID) require that regulated entities

perform due diligence on their vendors "when relying on a third party for the performance of operational

functions which are critical for the performance of regulated activities, listed activities or ancillary services." But,

in the context of such special regulatory requirements, it is important to differentiate what types of data and

processing services are within and outside scope. If a regulated financial services firm chooses to use a hosted

solution for human resources data relating to its own employees, this can hardly be considered a “critical

operational function” under financial services regulations. If the same firm were to outsource core functions such

as calculations of its funds values, the additional audit requirements may come into play.

Prof. Lothar Determann practices data privacy and technology law as a partner in Baker & McKenzie’s Palo Alto

office and teaches data privacy, e-commerce and computer law at the University of California, Berkeley School of Law

(Boalt Hall), Hastings College of Law, Stanford Law School and Freie Universität Berlin.


Obama administration and Congress step up efforts to protect against cyber threats

By Heidi Salow, CIPP/US

After years of discussion and several false starts, 2012 is shaping up to be the year that national

cybersecurity legislation may become a reality in the U.S. Several recent proposals from the

White House and both houses of Congress have revealed a sense of urgency and strong

bipartisan support for strengthening the nation’s private and public infrastructure from cyber

attack. The parameters of any final legislation, however, remain very much in debate. Proposals

have ranged from bills designed simply to facilitate the sharing of cyber-threat information between the private

and public sectors to comprehensive schemes granting the executive branch broad powers to declare a national

cyber emergency and to compel private companies to implement a response plan. To further complicate matters,

in both the Senate and House, numerous committees and subcommittees claim some jurisdiction over

cybersecurity.

When the president released his Cyberspace Policy Review almost two years ago, he declared that the “cyber

threat is one of the most serious economic and national security challenges we face as a nation.” Members of both

parties have also recognized this challenge—approximately 50 cyber-related bills were introduced in the last

session of Congress.

Given the rapid pace of recent developments and the potential impact of federal legislation in this area, it is

important for companies in critical sectors—such as telecommunications, defense, energy, transportation and

information technology—to monitor these developments closely and consider getting involved in the policy

discussions.

Bipartisan momentum for cybersecurity legislation is building

In the White House, President Barack Obama has identified cybersecurity as one of the most serious economic

and security challenges facing the nation today. In his recent State of the Union Address, the president

highlighted the growing dangers of cyber threats and called on Congress to act on proposed legislation submitted

by the White House last May. On February 1, the administration held a classified briefing with Senate leaders to

stress the urgent need for legislative action.

In the Senate, several cybersecurity bills have been introduced. Senate Majority Leader Harry Reid (D-NV) has

said that he considers cybersecurity a top priority. Last year, Reid and Minority Leader Mitch McConnell (R-KY)

took the unusual step of exchanging public letters on the subject and formed a bipartisan cybersecurity working

group in an attempt to overcome the committee turf battles that have plagued earlier efforts.

Meanwhile, the Cybersecurity and Internet Freedom Act (CIFA-S 413), sponsored by Sens. Joe Lieberman (I-CT),

Susan Collins (R-ME) and Thomas Carper (D-DE), gained a good deal of attention in 2011. It has been replaced by

a new bill, S 2105, resulting from the joint efforts of several committees—Homeland Security and Government

Affairs, Commerce and the Select Committee on Intelligence.

Just this month, a group of bipartisan senators introduced a pre-publication version of S 2105, the Cybersecurity

Act of 2012. This bill is an attempt to bring together CIFA (S 413) and other Senate bills, such as one introduced by


Chairman Jay Rockefeller (D-WV). S 2105 calls for the Department of Homeland Security (DHS) to assess risks and

vulnerabilities of computer systems running at critical infrastructure sites and to work with the operators of such

systems to develop security standards.

Under S 2105, the DHS would determine which systems fit the definition of “critical infrastructure,” namely those

“whose disruption from a cyber attack would cause mass death, evacuation or major damage to the economy,

national security or daily life.” Companies would have the right to appeal the designation. Owners or operators of

critical infrastructure systems would be able to determine how to best meet performance requirements and

would either "self-certify" compliance or use a third-party assessor for certification.

The bill also contains provisions for information sharing between the government and the private sector, and it

would reform the Federal Information Security Management Act (FISMA). FISMA would “focus on continuous

monitoring of agency information systems and streamlined reporting requirements rather than overly

prescriptive manual reporting.” The DHS would consolidate its cybersecurity programs into a National Center for

Cybersecurity and Communications office.

Sen. Dianne Feinstein (D-CA) introduced separate legislation this month. The Cybersecurity Information Sharing

Act of 2012 (S 2102) would require the federal government to designate a single focal point for cybersecurity information sharing. This bill has since been incorporated into S 2105 as Title VII of that bill, so S 2102 is

unlikely to be acted upon separately unless S 2105 is defeated or substantially amended.

S 2102—and now Title VII—would create “cyber exchanges” (CEs). CEs are organizations established “to

efficiently receive and distribute cybersecurity threat indicators.” The DHS Secretary would be required to

establish, by regulation, at least one governmental CE as the lead CE. It would act as “the focal point within the

federal government for cybersecurity information sharing among federal entities and with non-federal entities.”

Title VII also affirmatively provides private-sector companies the authority to monitor and protect the information

on their own computer networks and encourages information sharing about cyber threats within the private

sector by providing a good faith defense against lawsuits. It establishes procedures for the government to share

classified cybersecurity threat information with companies that can effectively use and protect that information.

The DHS secretary, in consultation with privacy and civil liberties experts, would have to develop policies for the

receipt, retention, use and disclosure of cybersecurity threat information by federal entities to minimize the

impact on privacy and civil liberties and to safeguard personal information.

In the House of Representatives, Republican leadership formed a Cybersecurity Taskforce in June of 2011. The

taskforce was asked to make recommendations to House Republican leadership on four issues: critical

infrastructure and incentives; information sharing and public-private partnerships; updating existing cybersecurity

laws, and legal authorities. In October, the taskforce unveiled its recommendations in a report to House

leadership.

Meanwhile, two cybersecurity bills are rapidly making their way through the House of Representatives. The Cyber

Intelligence Sharing and Protection Act (CISPA-HR 3523) was introduced on November 30 and garnered over 50

cosponsors and 20 letters of support from industry leaders. CISPA was marked up and approved by the House

Intelligence Committee by a commanding vote of 17-1. The next step for the bill is the House floor.

On December 15, Rep. Dan Lungren (R-CA) introduced the Promoting and Enhancing Cybersecurity and

Information Sharing Effectiveness Act (PrECISE-HR 3674). It was marked up and unanimously passed on February


1 by the House Homeland Security Subcommittee on Cybersecurity, Infrastructure Protection and Security

Technologies. The bill, discussed in more detail below, will now be sent to the full committee.

The shape of final legislation is uncertain

Although there is broad consensus that cybersecurity legislation is needed, there are obviously many emerging

proposals, and much has yet to be decided.

The House Intelligence Committee’s CISPA bill and Sen. Feinstein’s Cybersecurity Information Sharing Act appear

to be the least controversial and least ambitious of the legislative proposals. The primary purpose of these bills is

to facilitate the confidential sharing of cyber-threat information between the federal government and the private

sector by various means. They authorize and expedite security clearances for qualified private-sector entities,

protect private disclosures from Freedom of Information Act requests, restrict the government’s use of

information received from the private sector and limit liability for entities’ sharing and use of cyber-threat

information.

Like the CISPA, the House Homeland Security Committee’s PrECISE Act would facilitate the confidential sharing

of cyber-threat information between the public and private sectors. It calls for the creation of a nonprofit

organization called the National Information Sharing Organization (NISO)—with a majority private-sector

board—to serve as a secure, confidential clearinghouse for the exchange of cyber-threat information between

public and private entities. The PrECISE Act goes further than the CISPA, however, by empowering the DHS to

conduct risk assessments and collect existing security standards to evaluate the best methods to mitigate risks.

Notably, it is intended to create as little new regulation as possible, instead requiring regulators to assess current

critical infrastructure protection regulations against DHS-identified risks. Gaps would be identified and

redundancies eliminated.

In the Senate, last year’s CIFA legislation weighed in at more than 200 pages. Its successor, S 2105, is equally

comprehensive. Much like the PrECISE Act, it would give the DHS power to conduct risk assessments and

determine cybersecurity performance requirements but only for critical systems that are not already

appropriately secured. Owners of “covered critical infrastructure” would have flexibility in meeting performance

requirements as they deem fit.

Although Majority Leader Reid wants quick floor action on S 2105, a group of top Republicans, including Minority

Leader Mitch McConnell and Sen. John McCain (R-AZ), have expressed concern that the measure is being rushed.

They have stated that the bill “does not satisfy our substantive concerns, nor does it satisfy our process concerns.”

Given the numerous legislative proposals under consideration—with more on the way—and the continuing

jurisdictional debates, it is unlikely that any cybersecurity legislation will make headway in the near future.

Nevertheless, there is clear bipartisan recognition that something needs to be done quickly to protect this

country’s critical infrastructure.

Heidi Salow, CIPP/US, is an expert in privacy and data security, intellectual property, e-commerce and global data

protection laws, having focused on these areas for well over a decade. Her experience as a negotiator includes

legislative advocacy, negotiating complex transactions and compliance counseling. She has represented numerous

clients on regulatory and public policy matters before Congress, the administration and federal and state agencies.


Facial recognition technology: Should faceprints be considered personally identifiable

information?

By Jedidiah Bracy, CIPP/US

A little more than two decades ago, the idea that technology could be capable of

recognizing an individual’s face was merely the stuff of science fiction. Yet, at the dawn of a

new decade, facial recognition technology has not only become a reality, it is becoming

commonplace—from security surveillance to social media photo tagging.

Like the tips of our fingers, our faces, when coupled with the appropriate algorithm and database, become unique biometric identifiers. Yet, unlike fingerprints and other biometric identifiers, a faceprint can be captured from a distance without an individual’s knowledge.
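To illustrate how a faceprint functions as an identifier, here is a toy sketch of matching an embedding vector against a database by similarity search; real systems use learned embeddings from face images, and the vectors and threshold below are made up:

```python
# Illustrative sketch only: identifying a "faceprint" (a toy numeric vector)
# by finding its nearest entry in a database, the core mechanic that turns a
# face into a biometric identifier once an algorithm and database exist.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Hypothetical enrolled faceprints; real embeddings have hundreds of dimensions.
DATABASE = {
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.1, 0.8, 0.5],
}

def identify(faceprint, threshold=0.95):
    """Return the best-matching identity, or None if no match clears the threshold."""
    name, score = max(
        ((n, cosine_similarity(faceprint, v)) for n, v in DATABASE.items()),
        key=lambda t: t[1],
    )
    return name if score >= threshold else None

print(identify([0.88, 0.12, 0.31]))  # prints "alice"
```

The privacy concern discussed in the article follows directly from this mechanic: anyone holding the database and a captured face can run the lookup, with or without the subject's knowledge.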

Experts discussed the implications of this technology last December at a roundtable hosted by the Federal Trade

Commission (FTC). The forum made it clear that the new technology creates a vast array of opportunities for

business and law enforcement but also raises concerns about privacy and anonymity.

One expert suggests faceprints be considered personally identifiable information.

This seemingly simple answer to what has been considered a complex problem was posed by one of the

inventors of facial recognition technology. As vice chairman of the International Biometric & Identification

Association (IBIA)—a biometrics trade association—Joseph Atick has spent the last 20-plus years leading several

companies in the identity management industry.

Atick says it’s critical for individuals to have control over their faceprints. A complex system of assigning fair

information practices to varying facial recognition applications “can be resolved and addressed if you give the

consumer the control over the faceprint and say no application can exploit the faceprint without (the consumer’s)

explicit consent.”

In other words, our faces should be copyrighted.

Atick says despite the vast deployment of the technology, a generic approach such as legislating that a faceprint

is PII gives weight to responsible use and makes companies liable if they do not appropriately protect faceprints—

similar to the practices for health or financial records.

“If we limit regulations to a faceprint equals PII, then industry self-regulates to police that. It creates liability and

allows the legal system to do the enforcement,” says Atick.

“Privacy advocates on the (FTC) panel talked about a careful tiered approach to privacy,” says Pam Dixon, but

Atick pointed out that it is difficult to police facial recognition. Dixon is an author, researcher, and founder of the

World Privacy Forum. She wrote a report on digital signage and facial recognition technology, The One-Way

Mirror Society, and has led an effort to create the first set of collaborative consumer privacy principles for digital

signage.


Where is the technology being used? Is it collecting and storing faceprints? This can be solved by making faceprint

processing a liability when misused.

“What Dr. Atick was saying was a wake-up call.”

“He’s not a privacy advocate,” she says. “He was there to advocate for companies using it. So for him to say that

was stunning.”

In its comments to the FTC, the World Privacy Forum wrote, “Dr. Atick’s approach provides an important avenue

of thinking that we urge the FTC to explore further. We believe it holds significant promise and has the most

potential for a positive and fair outcome. The policy dialogue around facial recognition and detection

technologies has been overlaid by approaches with roots in past technologies from past eras. Much of the policy

discussions to date have not been informed by Dr. Atick’s level of knowledge and, as such, have not taken into

sufficient account the uniqueness of the faceprint, the nature of the technology and the manner in which it is

being deployed.”

Dan Solove agrees that faceprints should be considered PII. “Faceprints are specifically designed to be identifiable

to a person and to be a means of identifying a person, and thus I believe they should be considered PII.”

As John Marshall Harlan Research Professor of Law at George Washington University Law School, Solove is

considered to be one of the world’s leading experts in privacy law. He says a “combination of self-regulation as

well as legislation and agency supervision” will be the best solution. “Self-regulation alone lacks the teeth and

uniformity to be effective, but when combined with reasonable and flexible legislation, self-regulation can be

effective in many ways.”

The Software & Information Industry Association (SIIA) has a different take on privacy legislation. In its comments

to the FTC, the SIIA asserts that “there is no need to develop specialized privacy principles for facial recognition or

facial detection technologies.”

“Privacy is context-dependent, not technology-specific,” the SIIA contends.

SIIA Public Policy Vice President Mark MacCarthy says facial recognition and detection technologies “raise

different privacy issues depending on the context in which they are used.” Digital signage that does not collect

faceprints “might call for notice,” while “more advanced use of facial recognition technology such as tagging

pictures on a social network might call for some kind of consent. There doesn’t need to be legislation specific to

faceprints,” he contends, “because the technology can be used in so many different contexts it would be

impossible to write meaningful privacy protections.”

MacCarthy also says, “In the contexts where the technology is being used now, such as digital signage or tagging

photos on social networks, the industry has struck a pretty good balance.”

Speaking at the FTC roundtable, Facebook Director of Privacy Erin Egan raised the notion of context as well. The

social networking site came under fire from privacy advocates when it introduced facial recognition software to

streamline photo tagging.

“We’re not automatically identifying you,” Egan said. “We’re just suggesting people who might be in your photos

so we can make tagging easier for you. I mean, that’s the context…even in that context, there are important

principles around notice, around control, around security…I still think these framework principles apply, but again,

I think that when we look at how they should be applied, it should depend on the context and users expectations.”

Source: iapp.org/media/pdf/publications/advisor_03_12_print.pdf · By Mehmet Munur, CIPP/US, Sarah Branam and Matt Mrkobrad, CIPP/US/G

Last October, Atick published Face Recognition in the Era of the Cloud and Social Media: Is it Time to Hit the Panic

Button? In it, Atick describes the “perfect storm” that gives him reason for concern. The combination of

“enthusiastic participation in social media,” increased use of digital cameras and improved facial recognition

algorithms “opens the door for potentially achieving the unthinkable: the linking of online and offline identities.”

Dixon questions whether Pandora’s box is already open when it comes to facial recognition technology. “We

proceeded down a path unconsciously because the market is already capitalizing.” Dixon has spent time in India

for the government’s biometric identification system and in Japan during the implementation of its smart grid.

“One thing I’ve learned about biometrics: once it’s out there, the game is up. You can’t take it back.”

She points out that the Fair Credit Reporting Act and the Equal Credit Opportunity Act were both pieces of U.S.

legislation that rolled back discriminatory market practices to protect consumer rights. Dixon thinks something

similar could help roll back current market practices to help protect individual identities.

Dixon thinks these new technologies call for a new way of thinking about our privacy rights. When in public, our

expectation of privacy is naturally diminished, but in the past, we’ve often had some form of anonymity. As facial

recognition becomes more ubiquitous in the public space, our expectation of privacy and anonymity could be

eradicated, she warns.

The recent U.S. Supreme Court ruling on GPS tracking, United States v. Jones, sheds light on the new privacy

paradigm for Dixon. In particular, Dixon cites Justice Sonia Sotomayor’s concurring opinion. “More

fundamentally,” wrote Sotomayor, “it may be necessary to reconsider the premise that an individual has no

reasonable expectation of privacy in information voluntarily disclosed to third parties…This approach is ill-suited

to the digital age, in which people reveal a great deal of information about themselves to third parties in the

course of carrying out mundane tasks.”

Dixon says Sotomayor’s concurring opinion “forms the threshold of what we need to be looking at.”

In comments to the FTC, the Electronic Privacy Information Center (EPIC) recently called for a moratorium on

commercial deployment of facial recognition technology. “While the use of facial recognition technology has

increased over the past several years, few legal safeguards currently protect consumer privacy and security.”

EPIC notes that two states, Illinois and Texas, have biometric statutes in effect, but adds that the

U.S. Constitution “only protects individuals from privacy intrusion by the state, not private companies or other

individuals.” Internationally, the new EU data protection framework mandates that any organization processing biometric data conduct a “data protection impact assessment” as well as meet personal data obligations.

As such, EPIC is asking the FTC to require companies “collecting, handling, storing and transmitting” biometric

data “to adhere to a framework of Fair Information Practices.”

When asked if current EU and UK legislation appropriately protects individuals from facial recognition’s misuse, a

spokesman from the UK Information Commissioner’s Office said, “Images of individuals are likely to be regarded

as personal data and therefore UK laws require organisations to have a legitimate purpose for processing this

information. In some cases this may be in the legitimate business interest of the organisation but in others this

will require the consent of the individual. UK and EU law is clear that consent must be both freely given and

informed. Other aspects of the Data Protection Act such as subject access and the right to object to processing

will also apply to facial recognition technologies.”

For Atick, the solution resides in legislation requiring faceprints to be considered PII. Once this is done, and liability becomes a driving factor for companies to ensure faceprint data is protected, then industry can self-regulate. He says

facial recognition technology is helpful for fighting crime and terrorism and for other security needs. It has proven

helpful in Ontario, Canada, for problem gamblers who opt in to the Ontario Lottery Gaming Corporation’s

voluntary self-exclusion program.

Atick says the industry has been “fully behind” the IBIA but adds that it is concerned about “rogue applications” that

use the technology “to make a name for themselves.” He says many of the newer social media sites and Internet-

based companies are exploiting facial recognition technology. “We’ve reached out and asked them to use the

technology responsibly.”

The biggest concern, according to Atick, is the construction of large databases of faceprints. “We need to address

the root cause of this threat to privacy by focusing on the ease with which identification databases can be built

through automated harvesting of identity-tagged images over the web,” he writes. Changing the attitudes of data

handlers—like social media and image sharing sites—and elevating faceprints to the level of PII can help.

Legal analysis of the new proposed EU regulation on data protection

By Fabio Di Resta and Nicola Fabiano

The new proposed regulation on EU data protection law contains many important provisions, most of them necessary to address the future challenges of data protection in the Internet environment. The principles of effectiveness (i.e., stronger powers for DPAs, PIAs and the mandatory appointment of DPOs), privacy by design and by default, accountability and transparency are the foundation stones on which the new proposed regulation was built.

The main objective of the draft regulation is to achieve the ambitious harmonisation of the data protection laws of the EU Member States and to enhance consumers’ trust in the Internet through stronger data protection rules at the EU level.

In this article, different legal aspects of the proposed framework will be analysed.

Extra-territorial criterion: More specific exemptions

With respect to external scope, it should be considered that the main reason for the broad scope of the existing 95/46/EC Directive is to ensure that individuals are not deprived of the protection of EU data protection law and to prevent actions to circumvent EU law.

The European Commission’s choice to raise the threshold (as recently amended in the published draft) that triggers the application of EU law outside the EU/EEA seems appropriate to address the future challenges of the Internet, but it could still use some amendment, such as more structured exemptions to prevent discrimination against complex organisations. On this point, several situations are exempted from the application of EU law (Article 3, Paragraph 2 and Article 25): any controller established in a third country which ensures an adequate level of data protection; any public body; any controller only occasionally offering goods and services to data subjects residing in the EU, and all enterprises employing fewer than 250 persons.

This last exemption, which refers particularly to SMEs, also could use some amending. Consideration should be given to the complexity of organisations that operate through the Internet, where single departments or business units, with limited staff and an independent budget, sometimes operate as controllers offering specific products or services. Thus, the quantitative criterion of 250 persons, measured against the overall activity of big organisations, should probably be rethought, and the relevance, ancillary or otherwise, of the products or services offered in the EU within the specific organisation (recitals 20, 63 and 64 of the EC regulation draft) should be taken into account.

The mandatory appointment of a representative established in the EU/EEA could have a negative impact on the activity of these departments and business units if they are considered data processors rather than controllers, and this provision could be considered too dissuasive by big organisations that have only ancillary activity in Europe, especially owing to the fact that these rules already apply to SMEs.

Consequently, without an enlargement of the exemption, there could be several negative effects; for example, the representative appointment could act as an economic barrier that restricts the choice of EU/EEA consumers, who will not be able to purchase online products and services from organisations located outside the EU.

Cloud computing scenario: Comparative analysis under the existing 95/46/EC Directive and under the regulation draft (one-stop shop and the main establishment criteria)

In the following paragraphs, one scenario will be analysed—both under the existing EU/EEA directive and the new

regulation draft.

In this IT model, personal data are usually processed and stored on servers in several places around the world. The

exact place where the data are stored is not always known, and it can change over time. In order to trigger the

applicability of EU law, the relevant information is the context of activity of the establishment within the EU

(principle of establishment) and the location of the equipment.

In order to fully understand the applicable legal issues, the first step is to identify the data controller and its activities. For example, the buyer of a cloud service could be a data controller. Say a company uses an online agenda service: if the company uses the agenda service in the context of the activity of its establishment in the EU, EU law will be applicable. However, the cloud provider could also be, under some circumstances, a data controller. Such is the case when it provides an online agenda and document sharing service, where private parties can upload all of their personal appointments and contacts, synchronize them and upload documents to store or

share with selected persons. In this context, different key factors should be taken into account: the context of the

activity of the establishment, its degree of involvement and the nature of its activity. Let’s say the cloud provider is a data controller located in the UK, Germany and Italy (all of them establishments), but the server and technical staff for the online agenda are located in the UK, while the servers, software and technical staff for the document sharing activities are located in Germany. The establishment in Italy is not involved in this activity. According to Article 4 of the existing directive, English law is applicable to the establishment located in the UK and, likewise, German law applies to the establishment located in Germany, with the further obligation to deal with the German and UK DPAs. Italian law does not apply, as the Italian establishment is not involved in this data processing.

One of the implications of the approach mentioned above is the risk of overlapping national laws applicable to the same data processing, with the further consequence of having to deal with several jurisdictions and data protection authorities. In order to overcome this problem and to provide more legal certainty, the main establishment principle in the EU regulation draft (also called the one-stop shop criterion) was worded in such a way that it will apply when a data controller or processor is established in more than one Member State (Recitals 13 and 98; Article 51, Paragraph 2).

This is a good principle because it gives legal certainty to companies that do business in Europe, with the possibility of complying with one law for the whole of the EU territory and of dealing with a single data protection authority (the lead authority). Analysing the above-mentioned cloud computing scenario in light of the EU regulation draft, once the context of the establishments’ activity has been identified, the purpose, means and conditions should be taken into account. Thus, it could be considered that all the services offered by the cloud provider to users, who upload data for the agenda and documents for storing and/or sharing, have the same purpose, so they should be regarded as a single processing operation. Furthermore, considering that the English establishment manages the main activity, being the place of central administration that makes the decisions on the processing of users’ data, only English law will be applicable and the UK Information Commissioner will be the lead authority to address all complaints from data subjects residing in the EU territory.

The principles of privacy by design and default and the commission’s controls

The new data protection legal framework proposed by the European Commission introduces, with respect to Directive 95/46/EC, the reference to “data protection by design and by default” (Article 23 of the Proposal for a Regulation and Article 19 of the Proposal for a Directive). Even though these articles do not define data protection by design and by default, they compel the controller to “implement appropriate technical and organisational measures and procedures” and to “implement mechanisms for ensuring that, by default, only those personal data are processed which are necessary for each specific purpose of the processing…” The commission preferred to describe the controller’s duties instead of setting out the legal status of data protection by design and by default. It is very important to clarify the meaning of “data protection by design and by default,” focusing on the true sense of these terms. It is equally interesting to distinguish the expression “data protection by design” from “data protection by default” and to find the actual meaning of each term, because the phrase used by the commission seems to highlight a difference between the two. According to the text of the article, it is clear that the commission considers “by design” and “by default” to be different concepts, even if they are used in the same sentence.

This approach seems quite different from the one officially used by the International Conference of Data

Protection and Privacy Commissioners, which last year adopted a resolution on Privacy by Design proposed by

Ontario, Canada, Information and Privacy Commissioner Ann Cavoukian.

In this context, the expression “Privacy by Design” is used to describe a method of dealing with privacy issues in this new era, where a correct approach to privacy is most valuable. In this respect, it should be noted that in the “by design” or “by default” approach of the EU legal framework, the term “data protection” is used instead of “privacy.”

Furthermore, the commission’s proposal seems to pay a lot of attention to technical and security aspects rather than legal concerns. The specific reference to “measures and procedures” seems oriented towards PETs (privacy-enhancing technologies), which are certainly important, but the future of privacy is Privacy by Design. In conclusion, the hope is that the expression “by design and by default” will represent not a cutting-edge movement or a system founded on technological and security support, but a real, methodological approach to the future handling of our privacy, in accordance with the international commissioners’ statement, and one that becomes a worldwide privacy standard in the near future.

The right to judicial remedy against data controller

Article 75, Paragraph 2 provides that in case of infringements of data protection rights, “proceedings may be brought before the courts of the Member States where the data subject has its habitual residence.” This article also entails that jurisdictional and international issues will be brought before the national courts, which will decide on the compensation of damages to data subjects, requiring judges highly specialised both in EU data protection law and in international issues, and even called on to decide appeals against the decisions of other countries’ DPAs.

On the other hand, the new consistency mechanism considerably increases the power of the European Commission. It becomes the ultimate supervisory authority on the protection of data subjects, with the power to suspend draft measures adopted by national DPAs, by means of a reasoned decision, where there are serious doubts about their consistency with the EU regulation. Furthermore, with regard to the cloud computing scenario analysed above, more and more EU-based data subjects will be able both to access one national lead authority and to bring actions before the national courts of the country in which the main establishment is located. Lastly, as stated in Paragraph 4 of Article 75, “all the Member States shall enforce final decisions by the courts.” This paragraph underlines further consequences of the adoption of the regulation draft: both a stronger free movement of judgments on data protection and the need to ensure the recognition of data protection judgments from other countries’ courts.

Data protection officer requirement and its impact on the national laws

The EU legal framework introduces the data protection officer and, according to Article 35 of the proposal for a regulation, the appointment is mandatory if:

processing is carried out by a public authority or body;

processing is carried out by an enterprise employing 250 persons or more, or

the core activities of the controller or the processor consist of processing operations which, by virtue of their nature, their scope and/or their purposes, require regular and systematic monitoring of data subjects.

There is no doubt about the relevance of the choice to establish the data protection officer. This solution, strongly hoped for by some Italian privacy professionals, shows that the European Commission has taken the data protection officer into account, demonstrating a great need to pay attention to privacy matters. It is necessary for people dealing with privacy to have specific expertise and proficiency. This will obviously have consequences for the national laws that will need to be implemented to establish the data protection officer. According to the EU legal framework, public bodies and enterprises will have a specific department with competence in privacy matters. Although not mandatory, the data protection officer role should be regarded as very important for organisations with fewer than 250 employees too, because privacy is a fundamental right that is not related to the size of a company. Furthermore, this measure would go in the direction of making EU data protection law more user-centred.

Data protection impact assessment requirement

A valuable concept introduced by the EU proposal is the data protection impact assessment. The main reference is Article 33 of the proposal for a regulation: “where processing operations present specific risks to the rights and freedoms of data subjects,” the controller “shall carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.” Certainly the PIA (privacy impact assessment) is well-known in the international context. Recently, EU public bodies began talking about impact assessments, particularly the PIA. The privacy legal framework in force (Directive 95/46/EC) contains no reference to the impact assessment, and there are only a few recent official European documents on this topic. Therefore, the choice of the European Commission to include the data protection impact assessment in the proposal for a regulation and directive is key. The aforementioned Article 33 describes when and how it is necessary to carry out a data protection impact assessment (DPIA).

Explicit consent requirement and navigation over the Internet

According to Article 7 of the aforementioned EU legal framework, the controller shall acquire the data subject’s consent for specified purposes, and the “data subject shall have the right to withdraw his or her consent at any time.” In this respect, Article 4 states that consent “means any freely given specific, informed and explicit indication of his or her wishes by which the data subject, either by a statement or by a clear affirmative action, signifies agreement to personal data relating to them being processed.” In the case of minors, the processing of the personal data of a “child below the age of 13 years shall only be lawful if and to the extent that consent is given or authorised by the child's parent or custodian.” Last but not least, the provision on a right to be forgotten is very relevant in the Internet environment, and it also guarantees users the right to withdraw their consent when “there are no other legitimate grounds for retaining the data.”

The European Data Protection Board

The EU regulation would establish a European Data Protection Board that, according to Article 66 of the proposal for a regulation, “shall ensure the consistent application of this regulation...on its own initiative or at the request of the commission.” The article describes the different actions the EDPB can take and requires it to inform the commission regularly about the outcome of its activities. Finally, it should be pointed out that this supervisory body will supersede the current Article 29 Working Party and will play a relevant role in the consistency mechanism that guarantees the uniform application of EU law.

Conclusion

The European Commission’s proposal deserves to be appreciated, especially because it addresses the crucial challenges for data protection law in a globalised world. However, a detailed analysis shows that the proposal could use some amendments. In particular, more attention should be paid to the widespread involvement of all stakeholders, especially multinationals and overseeing authorities such as the U.S. Federal Trade Commission. Furthermore, with respect to the extra-territorial jurisdiction of EU law, there is concern that the provisions will be considered mere theoretical principles by non-EU/EEA countries in the absence of further international legal agreements and worldwide cooperation, which will probably require an amended text in favour of the enforceability of EU law.

Additionally, the implications for DPA jurisdiction among the Member States when the processor has its registered office in one country while the data subject lives in another should also be considered. Likewise, the role of the EDPB (European Data Protection Board) should be explored in more depth, in order to avoid a European privacy body depending only on the choices of “ivory tower” politicians who are far from the real privacy issues. Lastly, the recent updating of the privacy policy by Google, Inc., seems to underline a relevant weakness in the worldwide context of the new proposed regulation.

Fabio Di Resta is an attorney at the Di Resta law firm, where he specialises in data protection and ICT law. Nicola Fabiano is an attorney at Studio Legale Fabiano, counsel at Panetta & Associati and a Privacy by Design Ambassador, and specialises in privacy and ICT.

Read more by Fabio Di Resta:

The European Commission’s new proposed regulation: From “equipment/means” to “directed to” criterion

Getting to know a privacy pro The Assistant General Counsel and Director of Data Privacy at Xcel Energy talks privacy, smart meters and New Year’s resolutions.

By Angelique Carson, CIPP/US

Megan Hertzler’s path to privacy was sort of an accident. Starting off at the Minnesota Attorney

General’s Office as counsel to the Minnesota Public Utilities Commission (PUC), she advised the

commission on matters relating to PUC regulation in the state, but, she says, there wasn’t a

great emphasis on customer privacy back in 1997. After working in private practice for a time,

Hertzler went to work for Xcel Energy in 2009 and, in response to significant data breaches being reported in the

media, got down to work on privacy pretty quickly.

“My chief information officer at the time was walking the hallways of the law department at about 7 p.m., and I

was the only attorney still working,” Hertzler remembers. “He came into my office and sat down and said, ‘We will

not be the next TJX,’ (a retailer whose 2007 data breach resulted in the theft of 45.6 million credit and debit card

numbers), and I had no idea what he meant. I thought, ‘I’ll Google it later.’ He told me that he was becoming more

and more concerned about what we needed to do to be proactive around data security and privacy, and he

wanted someone else to be up at night worrying about it, as he put it.”

When Hertzler focused in on the matter, she found that there was more to do than she originally imagined when it

came to data protection.

“It wasn’t that we were noncompliant, but we certainly were not being proactive in identifying emerging privacy issues,” she said.

In 2010, Hertzler pitched to management the creation of a position dedicated exclusively to privacy, one she has held ever since. Writing her own job description was somewhat difficult, she recalls, because there really wasn’t a model.

“It’s not usual for this type of stand-alone position to be a part of utilities’ standard operations,” she said. “I think

that will change because, more and more, utilities have to be thinking about privacy and data security proactively

in order to stay ahead of emerging data risks. And also, with the growing awareness of privacy issues for customer

energy use information, utilities will have to respond to a growing number of questions from regulators and

customers on their privacy practices. It is best if you have someone that is accountable for all of these issues.”

The Privacy Advisor caught up with Hertzler to ask about the key privacy challenges utilities are facing today—

namely as they increasingly deploy smart meters capable of capturing granular data on consumer energy usage—

to get her predictions on what 2012 will bring and to learn a bit more about the life of a privacy professional.

The Privacy Advisor: When it comes to privacy issues involving customer data, how should utilities get

proactive?

Hertzler: Privacy discussions need to occur at all levels of the organization so that business need and customer

expectations are both considered when developing internal policy. A good example of this for Xcel Energy was the

effort we made in 2010 around customer information, including their energy usage information. We formed an

internal task force made up of representatives from all the areas of the company where customer information was

collected, maintained or used. It was a very broad and diverse group of individuals.

The task force was charged with developing Xcel Energy’s privacy principles for customer information. We spent

10 months identifying the privacy issues Xcel Energy was facing, including, for example, how the company was

using customer information in providing service, how we would handle a request for the customer’s data and

whether our response would be different if the request was directly from a customer or from an unrelated third

party. By identifying these privacy issues and looking to the existing body of work around privacy, we developed principles that accommodated Xcel Energy’s use of the information to provide service, allowed us to process the data in a fair and transparent way and maintained the trust our customers placed in us when they gave us the information. We then translated these privacy principles into our company policies and procedures. For example,

we developed a data classification standard specific to customer information and a process for authorizing release

of customer information to third parties, including identifying necessary informed consent requirements.

Our task force also considered the big picture issues, such as the role Xcel Energy should play within the utility

industry in the area of customer privacy. When, prompted by the development of the Smart Grid, the Colorado

PUC later issued proposed customer information privacy rules in December of 2010, our internal privacy work

ensured that we were ready to provide the PUC with thoughtful, practical feedback, using our privacy principles as

the basis for our comments in that rulemaking.

The Privacy Advisor: Should we really be concerned with the privacy implications of smart meter data? Or is it

all hype?

Hertzler: It’s not hype. More granular energy data can reveal information on how energy is used in the home,

which in turn could identify routines or practices by the individual user.

Historically, utilities have typically afforded some level of privacy to the customer’s energy usage information.

What the implementation of smart meters and other advanced meter technology has changed is that the data has

more uses and is perceived to be more valuable to a broader group of interests. Once upon a time, no one would

have asked for energy usage information except to understand their own energy bill. Now, because the data is

much more granular, it has many more potential uses. We get quite a few requests from a variety of non-

customers for both individual and aggregated usage data to understand things like carbon footprints, the success

of energy efficiency programs or even possible criminal activity. Before releasing the data, we have to consider

who is making the request, what their relationship is to the customer, whether they have a legal authority to

compel the data and whether the possibility of that request is even transparent to our customers. Five years ago,

we weren’t even thinking about these issues because we were not getting these types of requests.

We have seen a lot of interest from our customers in our privacy practices based on the ongoing dialog around

these issues. YouTube has hundreds of videos on privacy and health issues for smart meters, including discussions

on whether these meters act as illegal surveillance devices on what people do in their home. What a scary idea. I

respond to many of these customer inquiries by providing assurances that we will only use the data to provide service, and

that we will not release this information to others except in limited circumstances, such as when we are legally

required or with the customer’s knowledge and consent.

One thing to keep in mind is that utilities “get” the importance of maintaining trust with their customers.

Electricity is an essential service. It is also a highly regulated service, with considerable oversight by state and

federal agencies. This puts us in a very different posture from some other industries. We collect and use customer

information to provide electric service. We are not collecting data so that we can sell it to others for their business

Page 27: iapp.org/media/pdf/publications/advisor_03_12_print.pdf · By Mehmet Munur, CIPP/US, Sarah Branam and Matt Mrkobrad, CIPP/US/G

purposes. Instead, our focus is to implement privacy controls (such as transparency and consent) that we believe

provide appropriate privacy protections and make the release of data for non-utility purposes subject to law or the

customer’s choice.

The Privacy Advisor: Will 2012 be the year that utilities really get it right when it comes to smart meter

privacy?

Hertzler: We are going to hear a lot more about this issue in 2012. For example, the National Association of

Regulatory Utility Commissioners (NARUC) issued a statement last summer recommending that all state

regulatory commissions consider privacy in the context of information collected from smart meters and advanced

metering technology. I believe that this recommendation has set off a domino effect that we will begin to see

in 2012. Prior to NARUC’s announcement, a handful of states, such as California and Colorado, were already

proactively looking at the privacy implications of smart meter deployment. But the NARUC announcement really

put this topic on the map. I would expect that in 2012 you will see even more dialogue around smart meter

deployment and privacy among federal agencies, state regulatory commissions, regulated utilities and other

stakeholders. In fact, the recent IAPP web conference on smart grid privacy in which I participated was part of the

dialogue. Each state will make a determination as to what the outcome of this dialogue will be, but the hope is for

a fairly uniform approach to privacy and smart meter deployment issues across state lines.

The Privacy Advisor: Okay, enough about smart meters. If you weren’t working in privacy, what would you do

for work?

Hertzler: I would try and talk myself onto Anthony Bourdain’s “The Layover” so I could go around the world, eat,

drink and talk about how great—or not—the particular local cuisine was. I think his honest, unfiltered assessment

of the food he tries on the show is refreshing. I also like his wicked sense of humor.

The Privacy Advisor: Are you big on privacy in real life?

Hertzler: While I deal with social media issues at work, in my personal life I am not on Facebook or Twitter, and I

still mail my bills. My family and friends have offered to set up a Facebook or a LinkedIn account for me, thinking

that my absence from social media is a time issue rather than a deliberate avoidance in my personal life. Their

conclusion that I don’t have time probably has a lot of merit. My life is full. In other words, this choice of mine may

not be so much a principle as a survival mechanism.

The Privacy Advisor: Have you had good mentors within your career?

Hertzler: I’ve had fantastic mentors in my career, including with my present employer. I don’t think you can

advance in a career without the benefit of having people invest in you and share with you their wisdom, and so I’ve

been really fortunate. I would say early on in my career, when I was a law student, I had a woman attorney who

mentored me at a time when I really needed someone to provide me with perspective on my career, and that

experience was incredibly valuable to me. Once I graduated, we moved on to a less formal mentor relationship, to

more of a friendship role. She said, “You need to mentor others. That is all I am asking you to do in recognition of

what I’ve done for you.” I’ve mentored law students and non-lawyers throughout my career to give back, and

found the process extremely rewarding.

The Privacy Advisor: Any New Year’s resolutions?


Hertzler: It’s sort of a developmental goal for me this year: I am signed up for CIPP certification, even though I had

promised myself after taking the bar exam that I would never take another test again. My goal is to pass the exam

with a solid score. Wish me luck!


Elevating data privacy within governments

The Privacy Advisor asks the OECD’s Andrew Wyckoff to expound

During remarks at an event in Mexico City in November, the Organisation for Economic Co-operation and

Development’s (OECD) director of science, technology and industry, Andrew Wyckoff, said the matter of data

privacy needs to be elevated within governments.

The OECD event, “Current Developments in Privacy Frameworks: Towards Global Interoperability,” was held in

conjunction with the 33rd International Conference of Data Protection and Privacy Commissioners.

The Privacy Advisor caught up with Mr. Wyckoff to ask some follow-up questions.

The Privacy Advisor: Has data privacy moved into the upper tier of any governments? In which jurisdictions

does it seem to have the most elevated status?

Wyckoff: With the growing prominence of the Internet economy, social networking and data breaches that affect

millions, the importance of data privacy has grown and has begun to extend beyond privacy specialists to leaders.

As Secretary-General of the OECD Angel Gurría said at the ministerial meeting on the Future of the Internet

Economy, “The currency of the Internet economy is personal information.”

The growing visibility of this issue is reflected in an OECD report released in 2011 that documents the dramatic

change of scale in terms of the role of personal data in our economies, societies and, of course, our lives over the

last 30 years. This report notes that OECD members agree on the need to elevate the importance of privacy to

the highest levels in governments through national privacy strategies. The OECD signaled a high-level

commitment to this idea when the OECD secretary-general delivered remarks for a conference on Privacy

Frameworks in November 2011.

Privacy frameworks, however, are in flux around the world. A number of governments are conducting reviews of

existing legislation, others are agreeing to new legislation and still others are working on whole-of-government

national strategies. The reviews at the international level, at the EU and Council of Europe as well as OECD, signal

a greater sense of urgency to better adapt privacy to the realities of modern data protection.

The Privacy Advisor: Whose job is it to bring data privacy matters up on governmental agendas?

Wyckoff: Governments face a wide and difficult array of policy challenges today. Pushing privacy up the priority

list is not easy—in part because its cross-cutting nature implicates so many aspects of government and the private

sector. All stakeholders must raise awareness about the pressing need for greater attention to this issue. The

OECD’s recent Recommendation on Internet Policy Making contains a principle on the importance of engaging all

stakeholders in the policy-making process. This engagement is key to ensuring effective policies. Each

stakeholder group can raise the visibility and attention within its own community, the combined effect of which

will raise visibility of privacy on governmental agendas.

The Privacy Advisor: In a separate session in Mexico City, New Zealand Privacy Commissioner Marie Shroff

suggested that in the age of big data, data protection authorities (DPAs) should move into more of a leadership

role and be “motivators of governments and businesses.” Do you agree with this? If so, how can DPAs motivate

their governments to focus on privacy and data protection when so many seemingly more important concerns

persist?


Wyckoff: Data protection authorities—we call them privacy law enforcement authorities—play a vital role in

making privacy regimes effective. In 2007, the OECD produced a Council Recommendation to address the role of

these authorities and, more particularly, to improve cross-border co-operation in the enforcement of privacy laws.

But, as Commissioner Shroff suggests, these authorities can also play an important role in raising the visibility of

privacy among governments and businesses. One method for enhancing their visibility is to work together to

share best practices and collectively call attention to pressing issues. In terms of their international role, we have

sought to involve these authorities directly in the policy making work at the OECD by inviting the International

Conference of Data Protection and Privacy Commissioners as an observer to our Working Party on Information

Security and Privacy. This gives privacy authorities an independent voice at the OECD table alongside

government authorities—as well as business, civil society and the technical community—and an opportunity to

impress on policy makers their perspective on the issues.

—IAPP Staff


CANADA—Bill would require TSPs to facilitate lawful interception

By John Jager, CIPP/US, CIPP/C

On February 14, the government of Canada introduced a bill in the House of Commons that, if

passed, will require telecommunications service providers (TSPs) to implement capabilities to

facilitate lawful interception of information transmitted by telecommunications and to provide

basic information about their subscribers. As stated in Section 3, Bill C-30, entitled The

Protecting Children from Internet Predators Act, has as its purpose to “ensure that telecommunications service

providers have the capability to enable national security and law enforcement agencies to exercise their authority

to intercept communications and to require telecommunications service providers to provide subscriber and other

information, without unreasonably impairing the privacy of individuals…” The bill requires TSPs to “provide

intercepted communications to authorized persons” and “provide authorized persons with the prescribed

information that is in the possession or control of the service provider respecting the location of equipment used

in the transmission of communications.” This requirement extends to communication where the intercepted

communication is “encoded, compressed, encrypted or otherwise treated by the telecommunications service

provider.” In such a case, the TSP must provide the communications in the form they were in before being treated, but

the bill provides an exemption if the TSP would be required to develop or acquire decryption techniques or tools.

All current and future software for any transmission apparatus must have the capability to meet the requirements

of the bill. The minister, at the request of the Royal Canadian Mounted Police (RCMP) or the Canadian Security

Intelligence Service (CSIS), may order a TSP to comply with a number of obligations under the bill, including to

meet an operational requirement in respect of transmission apparatus operated by the service provider that it

would not otherwise be required to meet. In such a case, the bill provides that the RCMP or CSIS must pay the TSP

an amount—that the minister considers reasonable—towards the expenses considered necessary—by the

minister—that the TSP will incur to comply.

On written request by a designated person, which includes the RCMP, CSIS and police departments, the TSP must

provide the identifying information in respect of the name, address, telephone number and electronic mail

address of any subscriber to any of the service provider’s telecommunications services and the Internet protocol

address and local service provider identifier that are associated with the subscriber’s service and equipment. In

certain circumstances, such request may be made orally. No warrant or court order is required for a designated

person to make such a request to the TSP.

Wilful violations of the obligation to implement the capability to intercept communications may result in fines up

to a maximum of $500,000.

The bill has created, not surprisingly, a fair amount of negative reaction. The opposition parties strongly

expressed their concern about this bill, and media coverage has generally focused on the warrantless intrusion

into the online private lives of individuals. As the Conservative government of Prime Minister Stephen Harper has

a majority in both Houses, it is likely the bill will pass in much the same way it was tabled.


John Jager, CIPP/US, CIPP/C, is vice president of research services at Nymity, Inc., which offers Web-based privacy

support to help organizations control their privacy risk. He can be reached at [email protected].


FRANCE—CCTV systems diverted by employers

By Pascale Gelly, CIPP/E, and Caroline Doulcet

In two recent decisions, employers have been reminded of the limits of the use of CCTV systems in the workplace.

Following a complaint made by employees, the CNIL conducted an onsite investigation of a small company of

eight employees specializing in the supply of IT equipment for healthcare professionals. This company had

implemented no less than eight cameras with microphones and speakers on its premises; seven of them were

placed in rooms not available to the public—the workshop; the employees’ offices, where the cameras focused on

the computer screens and the employees themselves; the meeting room; the kitchen, and the corridor leading to

the offices.

These devices had actually been notified to the CNIL, as required by French data protection law. However, the

CNIL found in the course of its investigation that, contrary to what had been specified in the notification form, the

company did not use the CCTV to ensure the security of people and goods but to constantly monitor the activity

of employees by watching and listening to them all day long.

The CNIL, in its findings, pointed not only to this diversion of purpose of the CCTV devices but also to their

particularly disproportionate and intrusive use. It also noted, among other things, the lack of a retention policy for the

recordings and the lack of notice to employees. This resulted in a formal notice from the CNIL to the employer to

modify its devices within a given timeframe—here, six weeks. The CNIL decided to use its new power to publicize

its decision because of the seriousness of the breach, although no sanction has been decided yet, since the so-

called “litigation phase” before the CNIL will begin only if the company fails to comply within the allocated

timeframe.

The Supreme Court also had to deal with the issue of diversion of purpose of a CCTV device. In order to control

the working hours of its employees—cleaning staff working at a client’s site—a company had obtained from that

client the recordings of its CCTV cameras and used them to verify at which time the employees came in and out

and to make consistency checks with the entry logbook kept by the team manager.

On January 10, the Supreme Court (Social Chamber) considered such recordings as unlawful evidence. Indeed, the

purpose of the clients’ CCTV system was to ensure the security of people and goods, not to monitor employees.

Moreover, the concerned employees had not been informed of this CCTV system for monitoring purposes.

These two decisions show that, in France, the use of CCTV in the workplace is subject to stringent restrictions and

must be handled with care to remain within acceptable limits.


Pascale Gelly, CIPP/E, of the French law firm Cabinet Gelly, can be reached at [email protected].

Caroline Doulcet, of the French law firm Cabinet Gelly, can be reached at [email protected].


FRANCE—European regulation proposal under review at Parliament

By Pascale Gelly, CIPP/E, and Caroline Doulcet

On February 7, the Commission of European Affairs of the National Assembly—House of the Parliament elected

by the French people—adopted a draft resolution in reaction to the proposal for a European regulation on the

protection of personal data. It welcomes the objectives of modernization, harmonization and simplification of the

draft European regulation and the stress put on greater accountability of data controllers.

It also acknowledges a number of measures, such as the right to be forgotten; the right to data portability;

consent based on positive action as opposed to silence or inaction, and the mandatory appointment of a DPO in the public

sector and for companies with more than 250 employees.

However, the resolution points to provisions raising economic, political and legal concerns.

Some concerns are shared with the CNIL, the French Data Protection Authority; the resolution is in line

with concerns previously raised by the CNIL, in particular regarding the competence given to the supervisory

authority (DPA) of the country of the main establishment of the data controller in the EU. The Commission of

European Affairs considers that DPAs must remain close to citizens, who must be provided with easy and rapid

means to defend their rights in their own country. It supports the alternative that a member state’s DPA must be

competent over any processing targeting specifically the population of that member state, wherever the data

controller is established.

The process of cooperation between DPAs, as described in the draft European regulation, is not sufficient to

provide effective and rapid solutions to data protection issues. For example, it is noted that in the case of a security

breach impacting several member states but notified only to the DPA having jurisdiction, that DPA is not required

to inform the others. Concerns are also expressed with respect to sensitive data processing, which would

require “strengthened” cooperation for more rigorous control.

It is also noted that with the centralization of powers in the hands of the European Commission, exclusively

competent for specifying the guidelines and implementation rules of the draft European regulation, the powers of

the DPAs will be necessarily limited. According to the Commission of European Affairs, a better balance should be

struck because of DPAs’ technical expertise on such matters.

The resolution is also in favor of stronger controls around international data transfers by preserving the possibility

for DPAs to provide prior authorizations.


Finally, Members of Parliament wish for the adoption of international instruments for the protection of personal

data beyond the geographic sphere of the European Union and call upon the French government to make sure

that the future European framework better protects French citizens.

Similar review work is now ongoing at the Senate, the other house of the French Parliament, where a resolution

will be drafted and discussed on March 6.

Pascale Gelly, CIPP/E, of the French law firm Cabinet Gelly, can be reached at [email protected].

Caroline Doulcet, of the French law firm Cabinet Gelly, can be reached at [email protected].


FRANCE—Unlawful e-marketing campaign

By Pascale Gelly, CIPP/E, and Caroline Doulcet

The real-estate sector has once again caught the attention of the CNIL for questionable privacy practices. A

company had been sending text messages to numerous private property owners, without their consent, in order

to offer its real-estate analysis services. Several owners filed a complaint with the CNIL after several

unsuccessful attempts to unsubscribe.

After an onsite investigation, the CNIL found that the property owners’ contact details had been purchased from

providers of online real estate services who had merely harvested them from advertisements. These were not opt-in lists of

contact details.

The CNIL pointed out that the company concerned, as a data controller, had to verify whether the contact details

lists were opt-in. The real estate company could not hide behind the seller of the lists.

Moreover, the privacy notice provided to individuals was not compliant. The company unsuccessfully argued that

the legally required information was too long to be included in text messages. The CNIL was able to point

to other companies that managed to make the notice fit in one or two messages.

The CNIL also noted that the only means provided to data subjects to unsubscribe were neither free of charge

(subscribers had to send a text message or call a number) nor effective. Indeed, the unsubscribe requests were not

taken into account by the company.

The CNIL imposed a pecuniary sanction of €20,000 on the company and published its decision on the Internet.

Pascale Gelly, CIPP/E, of the French law firm Cabinet Gelly, can be reached at [email protected].

Caroline Doulcet, of the French law firm Cabinet Gelly, can be reached at [email protected].


GERMANY—Model consent and release from professional secrecy for the insurance

industry

By Flemming Moos

The Duesseldorfer Kreis, an informal association of the German data protection supervisory

authorities, and the German Insurance Association (GDV) have published an official model

consent and release from professional secrecy declaration for insurance companies. Such a

declaration is required whenever personal health data relating to an insured person or an

applicant is to be collected from third parties such as hospitals and physicians, which is normally done for purposes of

risk assessment or verification of liability. The Duesseldorfer Kreis requests that all insurance companies replace

their currently used templates with declarations based on the model declaration.

A consent and release from professional secrecy is necessary because, under German law, the collection,

processing and use of personal health data relating to an insured person is subject to the conditions enshrined in

Sec. 213 Insurance Contract Act (Versicherungsvertragsgesetz), which requires opt-in consent.

The model declaration by the Duesseldorfer Kreis and the GDV provides a degree of legal certainty in

this respect—especially as the German data protection supervisory authorities have agreed to them. This is

positive because, in practice, many of the consent wordings used by insurance companies in the past have not

been sufficient. Therefore, it is recommended to replace the currently used declarations. Yet, the model

declaration—even though it is eight pages long—still needs to be adapted to the particular case. Adaptations are

necessary in order to correctly reflect the data processing steps that actually take place at the individual insurance

company using the declaration; e.g., data collections from certain sources, data transfers to other group

companies and service providers, etc. Also, it must be indicated which health data are collected, processed and

used for which purpose. If the model declaration is not adapted carefully, considering all requirements under Sec.

213 Insurance Contract Act and the additional data protection law provisions, the consent and release from

professional secrecy declaration might be invalid and the processing of the health data would be unlawful.

Flemming Moos is a partner at Norton Rose in Germany and a certified specialist for information technology law. He

chairs the IAPP KnowledgeNet in Hamburg and can be reached at [email protected].


ITALY—Simplification interim law decree impact on DPS

By Rocco Panetta

On January 27, the interim law decree on urgent measures on simplification and development,

known as the “Simplification package,” was adopted by the Italian government. It provides for

further amendments to the Legislative Decree of June 30, 2003, n. 196—the Personal Data

Protection Code. This is the third change to data protection legislation passed in the last 12

months in Italy. As an interim rule, the Simplification package must be confirmed by the

Parliament within 60 days; otherwise, it will expire with no further effect. At the time of this writing, publication

of the Simplification package in the Italian Official Gazette is also still pending. As a consequence, the

interim rule is formally not yet in force, and we have been provided with a provisional draft only.

However, if confirmed by the Parliament, the Simplification package would introduce into the system significant

changes with respect to one important data protection obligation and requirement: the security policy document

(DPS). In fact, according to the draft version of Section 47 of the Simplification package, “Paragraph 1, Letter G

and even Paragraph 1-bis of Section 34 are deleted,” and “within the technical specifications concerning minimum

security measures, referred to in Annex B, Paragraphs from 19 to 19.8 and 26 are cancelled.” In turn, Section 34 of

the code currently in force sets forth that personal data processing carried out through electronic means is only

allowed if all required minimum security measures are adopted, including the “keeping an up-to-date security

policy document.”

According to the Simplification package’s amendments, the data controller will no longer have to comply with the

duty of keeping and updating the DPS, since such a document would no longer represent a minimum security

measure that the data controller is required to comply with and adopt “in order to ensure a minimum level of

personal data protection.”

What such amendments imply

From a practical point of view, the immediate effect of such a change—we recall again that the law decree will

come into force only after its publication in the Italian Official Gazette and for an interim period of 60 days until

the Parliament confirms it by approving an ad-hoc law—would be the cancellation of

the mandatory obligation to draw up, keep and update the DPS annually by March 31.

Does such a change represent a substantial simplification in terms of security measures?

From a general point of view, yes. Unfortunately, the DPS is just one of the various material activities required,

and its deletion does not reduce the level or number of the mandatory requirements, obligations

and recommendations provided for by the code with respect to personal data processing and the relevant security measures

to be adopted. In other words, we are all aware of the function of the DPS; it has been and still is a sort of

summary or snapshot of the privacy policies adopted within and by a company.


Once prepared and rolled out, the DPS is a helpful tool that provides data protection officers, data

processors and persons in charge of data processing with guidance that lists and collects all mandatory policies

and measures to be adopted.

If the Simplification package is confirmed, the DPS will no longer be required, but the relevant policies,

measures, requirements and obligations underlying it, and duly described in and attached to it, will still be. More

specifically, we want to point out that, even though the DPS and the related reference section of Annex B of the code

would be definitively cancelled, the other legal requirements that the data controller must comply with remain

mandatory for all companies.

Can the company set definitively aside the DPS and its content?

The answer is yes in principle but no in practice. In fact, our suggestion is not to permanently set aside a useful and

virtuous document like the DPS, especially for those companies that have invested time and money in it for years.

Certainly, companies will benefit from the abolition of the annual update, freed in the future from

the requirement to update the DPS by March 31—and from racing to meet the deadline. But we want to recall once

again the purpose of such a document.

The DPS describes the organizational and management structure of the company, provides a clear

description of all kinds of personal data processed (common, sensitive or judicial data) and accurately describes

all kinds of security measures (organizational, physical, logical and informatic) that must be adopted by each

company. Therefore, even though the DPS may no longer represent a mandatory requirement for data

controllers in the near future, its drafting and updating remain the optimal vehicle for

summarizing and considering all the existing security measures within each company, which otherwise would be

scattered across the company’s different functions; in the absence of the DPS, reviewing and

updating those measures would become time- and cost-consuming activities.

At the same time, the DPS is also the best way to enable the specialized corps of the financial and fiscal

police—which is in charge of checking compliance with the provisions of the Data Protection Code—to carry out

inspections easily, sharply reducing the company's risk of noncompliance findings.

In conclusion, our opinion is that Section 47, Letters B and C of the Simplification package, if confirmed, would

likely generate visible inconsistency and confusion in the system rather than simplifying personal data protection

requirements. In other words, the Italian legislature could have simplified the drafting of the DPS rather than

removing it from the minimum security measures.

Our suggestion, in any case, is to preserve the best practices already in use within the company in order to avoid

the risk of severe administrative and criminal sanctions and/or claims for compensation arising from data breaches

and noncompliance with the legislation in force at the national and EU level.

Rocco Panetta is an Italian lawyer and partner of Panetta & Associati Studio Legale in Rome. He is the former head

of legal at the Italian Data Protection Authority and a member of the IAPP Europe Advisory Board.

Source: iapp.org/media/pdf/publications/advisor_03_12_print.pdf

POLAND—Reform of Polish data protection law

By Joanna Tomaszewska

The amended provisions of the Polish Data Protection Act of 29th August 1997 entered into

force on 1 January. These reforms were necessitated by the enactment of new rules governing

the exchange of information between the law enforcement authorities of EU member states.

Transfers of personal data to third countries

The most significant changes to the act concern the provisions governing the transfer of personal data to third

countries, i.e., outside the European Economic Area. Pursuant to the newly worded provisions, transfers of personal

data to a third country may only occur if the country of destination ensures an adequate level of personal data

protection in its territory. The amendment removes previous references to guarantees of the protection of

personal data at least the same as that in force in the territory of Poland. The criterion of an “adequate level of

protection” has been clearly defined—implementing the wording of Article 25 Paragraph 2 of the directive

95/46/EC.

Accordingly, the act clearly states that the adequacy of the level of personal data protection should be evaluated

taking into account all the circumstances concerning a data transfer operation, in particular the nature of the

data, the purpose and duration of the proposed data processing operations, the country of origin and the country

of final destination of the data, as well as the legal provisions in force in the third country in question and the

security measures and professional rules applicable there. Despite this amendment, it seems that these

changes will not have a considerable impact on the practice of the Polish Data Protection Authority (DPA) when

evaluating an application seeking permission to transfer personal data to a third country which does not ensure an

adequate level of protection—a so-called DPA permit. This is due to the fact that the legislative provisions

governing DPA permits still refer to the transfer of personal data to a third country which fails to ensure at least

the same level of personal data protection as that in force in the territory of Poland.

Exception from the duty to register with the DPA

Other changes to the act include the introduction of an exception from the duty to register a data filing system

with the DPA where such data is processed by relevant organs on the basis of provisions governing the exchange

of information between the law enforcement authorities of EU member states.

Personal data of persons conducting economic activity

1 January also saw the repeal of earlier provisions of the Law on Economic Activity stating that

personal data of entrepreneurs who are natural persons, contained in the business activity

register—now replaced by the central record on economic activity—fell outside the scope of protection of the act.

Currently, therefore, information identifying persons conducting economic activity is subject to the provisions of

the act whenever, in the factual circumstances, it constitutes personal data within the meaning of the act.


Consequently, controllers of personal data of persons conducting economic activity will need to ensure that such

processing is done in compliance with the act. These would appear to be the most important practical changes

having entered into force in 2012.

Joanna Tomaszewska is an associate in the Intellectual Property, New Technologies and Protection of Information

Department of Spaczyński, Szczepaniak & Wspólnicy, Warsaw office. She has experience in data protection and

privacy law, information technology law, media and advertising and intellectual property matters. She can be

reached at [email protected].




Grant helps professors create data-masking technology to ease tension between research

needs and privacy laws

By Angelique Carson, CIPP/US

Medical researchers often contend with a competing requirement when it comes to progress:

protecting privacy. Under the Health Insurance Portability and Accountability Act (HIPAA),

healthcare providers, health insurers and healthcare clearinghouses often have to obtain

additional documentation before disclosing health information to outside parties, making

researchers’ data collection practices cumbersome.

Thanks to a National Institutes of Health (NIH) grant and two researchers at the University of Massachusetts

Lowell’s Manning School of Business, some of that difficulty may soon be resolved.

The $700,000 NIH grant will help Profs. Xiaobai Li and Luvai Motiwalla and two assistants create technology that

will resolve the tensions between adequate healthcare privacy provisions and research needs.

Called data-masking technology, it will allow researchers to access more meaningful data without having to

comply with rigid consent requirements under HIPAA. Healthcare providers must remove 18 points of sensitive

data—such as names, Social Security numbers and dates of birth—before releasing information, unless an

established legal agreement precludes such requirements. But with data-masking technology, only five or six data

points must be removed.

“So it improves data quality, because more variables are released,” said Motiwalla. “It’s a win-win for both the

people sharing (the data) as well as people who are using the data for analysis, because the people releasing it

don’t have to be afraid of data compliance, and identity cannot be revealed.”

While data that’s been de-identified according to HIPAA’s standards can lawfully be shared without specific

authorizations, such data often proves less useful in conducting meaningful, comprehensive research, the

professors said.

“To analyze the data, you need to provide information,” said Li. “But at the same time, you want to protect the

individuals—like patients or even doctors. That’s the kind of tension that we try to address.”

The entire process starts with the data controller.

“So if a researcher wants data from a hospital, the hospital will run through the query and remove data from the

database,” Motiwalla explains. “Then, our software would be applied to the data, which would then mask it, and

the masked data would be released to the third party.”

Data masking is different from encryption, however, and addresses a problem thus far unresolved.

“If you encrypt the data, you cannot do any statistical analysis,” said Li.

Data masking can work in a couple of different ways. One is called statistical perturbation: by applying statistical

methods to the data—for example, adding the same number to each data subject's age—the released values are no

longer the true values, but their statistical properties are preserved. Alternatively, data swapping can be used, in

which one person's age, for example, is swapped with another's in the database.

“So what is released of the data, in terms of age or date of birth, will not be a true value but in terms of a statistical

property data, it’s still the same,” Motiwalla said. Data can also be generalized so that instead of a street address,

a zip code with the first four digits truncated is used.
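The three masking approaches described above—additive perturbation, data swapping and generalization—can be sketched in a few lines of Python. This is an illustrative sketch, not the UMass Lowell software; the function names and parameters are assumptions made for the example.

```python
import random


def perturb(values, shift):
    # Additive perturbation: shift every value by the same constant.
    # Individual values are no longer true, but differences and
    # variance across records are unchanged.
    return [v + shift for v in values]


def swap(values, seed=0):
    # Data swapping: shuffle one attribute across records, so every
    # true value still appears in the release but is detached from
    # the record it belonged to.
    rng = random.Random(seed)
    swapped = list(values)
    rng.shuffle(swapped)
    return swapped


def generalize_zip(zip_code, keep=3):
    # Generalization: keep only the leading digits of a ZIP code and
    # mask the rest, trading precision for privacy.
    return zip_code[:keep] + "*" * (len(zip_code) - keep)


if __name__ == "__main__":
    ages = [34, 58, 41]
    print(perturb(ages, 7))      # [41, 65, 48]
    print(swap(["01854", "02138", "10001"]))
    print(generalize_zip("01854"))  # 018**
```

Note the trade-off each function embodies: perturbed ages keep their variance, swapped values keep the column's exact distribution, and generalized ZIP codes keep only coarse geography—which is why such releases stay useful for statistical analysis while individual records are harder to re-identify.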

Motiwalla and Li are now in the process of collecting patient data available under HIPAA laws in order to identify

vulnerabilities. They hope to collect data from organizations in sectors other than healthcare as well, as data

masking could potentially be employed for myriad applications in a number of sectors, including for non-research

purposes, such as simply to keep databases safe from hackers or data loss.

Motiwalla and Li encourage organizations interested in getting involved to contact them.

In the end, say Motiwalla and Li, their purpose is to find a way for research to thrive while still maintaining

appropriate privacy protections for individuals.

“With data masking, that balance is possible,” said Li.


Marketing and healthcare sectors developing mobile device guidance

By Jedidiah Bracy, CIPP/US

With the increased use of mobile devices, parts of both the healthcare and marketing sectors are

offering ways to increase user awareness of the privacy and security of personal information by

offering best practice guidance and recommendations for user-friendly language in policy

statements.

As part of its Privacy and Security Mobile Device project, the National Coordinator for Health Information

Technology’s Office of the Chief Privacy Officer, together with the U.S. Department of Health and Human

Services’ (HHS) Office for Civil Rights, is developing “good practices” to increase health information security on

mobile devices.

The project builds on the existing HIPAA Security Rule Remote Use Guidance. “There have been a number of

security incidents,” states the Security Rule, “related to the use of laptops, other portable and/or mobile devices

and external hardware that store, contain or are used to access Electronic Protected Health Information under the

responsibility of a HIPAA-covered entity.”

In addition to providing privacy and security “good practices,” the project aims to communicate its findings to

healthcare professionals in streamlined and readable text.

HHS officials are also looking for input and will be hosting a roundtable on the subject in the coming months.

Similarly, after receiving industry input during a public comment period that ended last November, the Mobile

Marketing Association (MMA), a mobile industry trade association, has released finalized guidelines that provide

developers with a framework to promote more secure user experiences.

The MMA states that more than 58 percent of U.S. mobile device users are concerned about unauthorized access

to their personal information.

MMA Global CEO Greg Stuart said, “Mobile app developers asked for clear, transparent policy language that

consumers can quickly and fully understand.” The release, he says, is “the first in a series of privacy policy

guidelines that the MMA is creating with input from industry leaders” to give “the app development community

the meaningful support they need.”

The MMA Mobile Application Privacy Policy guidelines offer streamlined recommendations revolving around

“core privacy principles” and easy-to-read language for consumers. Among their suggestions, the guidelines offer

ways to educate consumers on how their data is collected and used as well as ways to appropriately secure the

collected data.

In a statement, MMA Privacy & Advocacy Committee Co-Chair Alan Chapell, CIPP/US, said, “Our guidelines offer

developers the foundation from which to craft a document that reflects the privacy practices of each of their apps

and helps them stay in compliance with applicable law and industry standards,” adding, “We urge app developers

to consult with their legal counsel when adapting these guidelines for their purposes.”
