TRANSCRIPT
INF529: Security and Privacy
In Informatics
Civil Law and Privacy
Prof. Clifford Neuman
Lecture 10, 29 March 2019, OHE 100C
Course Outline
• What data is out there and how is it used
• Technical means of protection
• Identification, Authentication, Audit
• The right of or expectation of privacy
• Government and Policing access to data – February 15th
• Mid-term, Then more on Government, Politics, and Privacy
• Social Networks and the social contract – March 1st
• Big data – Privacy Considerations – March 8th
• Criminal law, National Security, and Privacy – March 22nd
• Civil law and privacy – March 29th
• International law and conflict across jurisdictions – April 5th
• The Internet of Things – April 12th
• Technology – April 19th
• The future – What can we do – April 26th
This Week – Civil Law and Privacy
• Arjun Raman – CCPA and GDPR
• Sevanti Nag – Measurement of Privacy in Social Media
• Ahmed Qureshi - Monetization of PII
April 5th – International Aspects of Security and Privacy
• Mindy Huang
• Abdulla Alshabanan
• Anupama Sakhalkar
• Brianna Tu – Internet of Things
April 12 - Internet of Things
• Lance Aaron - Smart Assistants
• Yulie Felice - Amazon Alexa Security
• Sophia Choi – RFID, USN, M2M
• Jairo Hernandez – Security & Privacy of NFC
• Ann Bailleul – Privacy implication for IoT
April 19th Medical IoT and Technology
Security, Privacy and Safety of Medical Devices and
technology.
• Fumiko Uehara
• Joseph Mehltretter
• Abdullah Altokhais
Facial Recognition and related technologies
• Louis Uuh – Facial Recognition
Security and Privacy in Messaging Technologies
• Aaron Howland
April 26th – The Future of Privacy
Guest lecture on Differential Privacy – Prof. Aleksandra Korolova
• Technology, Training, Legislation – Charlene Chen
• Right to be Forgotten and the future of privacy – Kate Glazko
What is Civil Law
• Civil law is concerned with private relations
between parties rather than criminal complaints
by a government against an individual.
– This is in contrast to criminal law.
– Includes contract law.
– Includes tort law.
• If a tort (wrong) is committed we may be able to
settle or litigate over actual, punitive, or
stipulated damages, for “specific performance”,
or injunctive relief.
Civil Law and Privacy
• Contracts and privacy and security
– Privacy policy statement
• Discovery and Privacy
• Laws protecting privacy of consumers
– HIPAA
– FERPA (Buckley Amendment)
– Fair Credit Reporting Act
– Others
– Regulations by FTC (and at one point FCC)
– Data Breach Notification Laws
Contracts and Privacy
We enter into contracts all the time
– Signing contracts for services or goods
– Consenting to terms of use on websites
– Installing software (EULAs)
Such agreements set the terms of our activity
– We can give away some rights to privacy
– They may spell out what our “expectations” are
– They can limit the damages we can collect
– They can determine how and where to litigate
Certain terms can still be found unenforceable for a
variety of reasons.
Enforcement of Contracts
Probably easier against the writer of such agreements, if
acceptance was “implied”.
– But usually the terms with respect to privacy tend to disclaim
expectations of privacy, so there are no damages to demonstrate, and
other damages are usually limited by the terms of the
agreements.
– Litigation can be initiated by injured parties, class actions, or
by government agencies in some cases (e.g. FTC).
– Terms of such agreements can’t allow either party to “break
the law” or violate other regulations, but they can change how
certain breaches are to be treated (e.g. opt-in)
– Deceptive trade practices…(can provide alternative remedy)
Discovery
When bringing suit (litigating) civil matters, all parties
have the right to compel disclosure of facts that may
benefit their case.
– The process of forcing disclosure of such information is
called Discovery.
– If you are a party to the suit then you may be required to
produce “discoverable” information.
• A good reason not to keep some things to begin with.
• A good reason to have a data retention/destruction policy
– It is illegal to destroy the data after you have reason to believe that
it will become subject to discovery.
• Third party doctrine applies
– Data about you may be obtained from third parties
– You may have an opportunity to object to such disclosure, but not
always.
Legislation/Regulations
• Laws protecting privacy of consumers
– HIPAA
– FERPA (Buckley Amendment)
– Fair Credit Reporting Act
– Others
– Regulations by FTC (and at one point FCC)
– Data Breach Notification Laws
Health Insurance Portability and Accountability Act
• Health Insurance Portability & Accountability Act
of 1996 (45 C.F.R. parts 160 & 164).
Provides a framework for:
• Nationwide protection of patient confidentiality
• Security of electronic systems
• Standards for electronic transmission of health
information.
HIPAA Privacy Rule
• The HIPAA Privacy Rule establishes national
standards to protect individuals' medical
records and other personal health information
and applies to health plans, health care
clearinghouses, and those health care
providers that conduct certain health care
transactions electronically.
– Defines Protected Health Information (PHI)
– Provides outline for protecting PHI
HIPAA Security Rule
• The HIPAA Security Rule establishes national
standards to protect individuals’ electronic
personal health information that is created,
received, used, or maintained by a covered
entity. The Security Rule requires appropriate
administrative, physical and technical
safeguards to ensure the confidentiality,
integrity, and security of electronic protected
health information.
– Relationship to NIST Framework
Protected Health Information
Protected Health Information (PHI) is individually identifiable health information that is:
– Created or received by a health care provider, health plan, employer, or clearinghouse, and that:
• Relates to the past, present, or future physical or mental health or condition of an individual;
• Relates to the provision of health care to an individual;
• Or relates to payment for the provision of health care.
Includes information in the health record such as:
– Encounter/visit documentation
– Lab results
– Appointment dates/times
– Invoices
– Radiology films and reports
– History and physicals (H&Ps)
– Patient identifiers
Patient Identifiers (examples)
• Names
• Medical record numbers
• Social Security numbers
• Account numbers
• License/certification numbers
• Vehicle identifiers/serial numbers/license plate numbers
• Internet protocol addresses
• Health plan numbers
• Full face photographic images and any comparable images
• Web universal resource locators (URLs)
• Any dates related to any individual (date of birth)
• Telephone numbers
• Fax numbers
• Email addresses
• Biometric identifiers including finger and voice prints
• Any other unique identifying number, characteristic or code
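The identifier list above is exactly what de-identification tooling has to strip. As a hedged illustration (the regex patterns and placeholder format below are invented, and real HIPAA Safe Harbor de-identification covers all eighteen identifier categories, not just the three shown), a pattern-based scrub might look like:

```python
import re

# Hypothetical patterns for three of the identifier categories above.
# Real de-identification is far more involved than regex matching.
PATTERNS = {
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scrub(text):
    """Replace each matched identifier with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

note = "Reach patient at 555-867-5309 or jane@example.com, SSN 123-45-6789."
print(scrub(note))
```

Note that patterns like these catch only well-formatted identifiers; free-text names and dates need far heavier machinery.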
Minimum Necessary Disclosure
– Use or disclose/release only the minimum necessary to accomplish the intended purpose of the use, disclosure, or request.
– Requests from employees at the organization:
• Identify each workforce member who needs to access PHI.
• Limit the PHI provided on a “need-to-know” basis.
– Requests from individuals not employed at the organization:
• Limit the PHI provided to what is needed to accomplish the purpose for which the request was made.
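The “minimum necessary” rule above can be sketched in code: map each role to the fields it needs, and release only those fields. The roles, record fields, and values below are hypothetical, chosen only to illustrate need-to-know filtering:

```python
# Illustrative patient record and role-to-field mapping (all invented).
RECORD = {
    "name": "J. Doe", "mrn": "00123", "diagnosis": "asthma",
    "lab_results": ["spirometry: normal"], "invoice_total": 240.00,
}

NEED_TO_KNOW = {
    "billing_clerk": {"name", "mrn", "invoice_total"},
    "treating_physician": {"name", "mrn", "diagnosis", "lab_results"},
}

def disclose(record, role):
    """Return only the fields the requesting role needs; unknown roles get nothing."""
    allowed = NEED_TO_KNOW.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

print(disclose(RECORD, "billing_clerk"))  # no diagnosis or lab results
```

The same idea scales to per-request purposes: the key design choice is that the filter defaults to releasing nothing when the requester is unrecognized.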
Costs of HIPAA Breaches
HHS announces first HIPAA breach settlement involving less than 500 patients
• Hospice of North Idaho settles HIPAA security case for $50,000 - The Hospice of
North Idaho (HONI) has agreed to pay the U.S. Department of Health and
Human Services (HHS) $50,000 to settle potential violations of the Health
Insurance Portability and Accountability Act of 1996 (HIPAA) Security Rule. This
is the first settlement involving a breach of unsecured electronic protected health
information (ePHI) affecting fewer than 500 individuals.
• The HHS Office for Civil Rights (OCR) began its investigation after HONI
reported to HHS that an unencrypted laptop computer containing the electronic
protected health information (ePHI) of 441 patients had been stolen in June
2010. Laptops containing ePHI are regularly used by the organization as part of
their field work. Over the course of the investigation, OCR discovered that HONI
had not conducted a risk analysis to safeguard ePHI. Further, HONI did not have
in place policies or procedures to address mobile device security as required by
the HIPAA Security Rule. Since the June 2010 theft, HONI has taken extensive
additional steps to improve their HIPAA Privacy and Security compliance
program.
FERPA
• The Family Educational Rights and Privacy
Act (FERPA) (20 U.S.C. § 1232g; 34 CFR
Part 99) is a Federal law that protects the
privacy of student education records. The law
applies to all schools that receive funds under
an applicable program of the U.S. Department
of Education. The law gives parents or
"eligible students" (those who are over 18
years old) certain rights with respect to a
student's educational records.
FERPA
• The Family Educational Rights and Privacy Act
(FERPA), a Federal law, requires that
Schools, with certain exceptions, obtain your
written consent prior to the disclosure of
personally identifiable information from your
education records.
FERPA – Directory Information
• Directory information is information
contained in a student's education record that
would not generally be considered harmful or
an invasion of privacy if disclosed. FERPA
requires each institution to define its directory
items, and such information may be quite
broad. Students can request non-disclosure
of such directory information.
• USC’s FERPA Notices
Universities have lots of Data
• Personally Identifiable Information (PII)
– SSN, driver’s license, credit/debit card, passport number, banking records, DOB, parents’ information
• Protected Health Information (PHI)
– Medical record, health status, provision and payment of health care
• FERPA controlled information
• Financial information
Slide by Punith Shetty
Inf529 Spring 2016
What does USC follow?
• Education records: Under the Family Educational Rights and Privacy Act of 1974, or “FERPA,” USC may not disclose records relating to a student without the student’s written permission. “Directory” information is an exception.
• Health information: Under the Health Insurance Portability and Accountability Act of 1996, or “HIPAA,” USC may not use or release identifiable health information without the patient’s written authorization. USC must notify patients and the federal government if there is a breach of patient information.
• Personal information: Under California law, USC must protect personal information including name and any of the following data:
– Social Security number
– Driver’s license number
– Credit card and PIN number
– Medical information
• Customer information: Under a federal law known as the Gramm-Leach-Bliley Act, USC must protect personally identifiable financial information that it collects about an individual in connection with providing a financial product or service such as financial aid or faculty housing loans.
• Research records: These records may be protected by copyright, trademark, trade secret, patent, or other intellectual property laws.
Slide by Punith Shetty
Inf529 Spring 2016
Security Measures
• Password & log-in monitoring
– Use a combination of upper case, lower case, numbers, and special characters
– Use misspelled words and numeric substitutions
– Configure your system to log off automatically (a relatively simple task)
– DUO, two-factor authentication
• Malicious software and security reminders
– Viruses, worms, spyware, Trojan horses, social engineering, phishing
• Physical security
– Door locks, paper shredders, safes, laptop security, mobile security
• Encryption (laptop, USB, external hard drive)
Slide by Punith Shetty
Inf529 Spring 2016
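The password guidance above pairs naturally with how systems should store credentials: never the password itself, only a salted, deliberately slow hash. A minimal sketch using Python's standard library (the iteration count and salt size are illustrative defaults, not a compliance recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest) using salted PBKDF2-HMAC-SHA256."""
    salt = salt or os.urandom(16)  # fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the hash and compare in constant time."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

The per-password salt defeats precomputed rainbow tables, and the constant-time comparison avoids leaking information through timing.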
Fair Credit Reporting Act
• The Fair Credit Reporting Act, 15 U.S.C. §
1681 (“FCRA”) is U.S. Federal Government
legislation enacted to promote the accuracy,
fairness, and privacy of consumer information
contained in the files of consumer reporting
agencies. The FCRA regulates the collection,
dissemination, and use of consumer
information, including consumer credit
information.
Fair Credit Reporting Act
• Defines purposes for which such reports may
be requested.
• Imposes requirements on the reporting
agencies, users of reports, and furnishers of
information.
– Including notice requirements.
– Permissible purposes.
– Ability to correct information.
Federal Trade Commission
• The FTC has been the chief federal agency on privacy policy and
enforcement since the 1970s, when it began enforcing one of the first
federal privacy laws – the Fair Credit Reporting Act. Since then, rapid
changes in technology have raised new privacy challenges, but the
FTC’s overall approach has been consistent: The agency uses law
enforcement, policy initiatives, and consumer and business education to
protect consumers’ personal information and ensure that they have the
confidence to take advantage of the many benefits of the ever-changing
marketplace.
– FTC's Privacy Report: Balancing Privacy and Innovation
– The Do Not Track Option: Giving Consumers a Choice
– Making Sure Companies Keep Their Privacy Promises to Consumers
– Protecting Consumers’ Financial Privacy
– The Children’s Online Privacy Protection Act (COPPA)
FCC Broadband Privacy (rescinded)
• In 2016 the FCC released rules to protect
consumer broadband privacy. As discussed,
Congress rescinded these rules, leaving the
issue to be covered by state law, privacy
policies and contract provisions, and the FTC.
Data Breach Notification Laws
• “Forty-seven states, the District of Columbia, Guam, Puerto Rico and the Virgin
Islands have enacted legislation requiring private or governmental entities to
notify individuals of security breaches of information involving personally
identifiable information.” (according to the National Conference of State Legislatures)
• Security breach laws typically apply to particular classes of business and define
personally identifiable information such as name combined with SSN, driver’s
license or state ID, or account numbers. They also define what constitutes a breach
(e.g., unauthorized acquisition of data); requirements for notice; and exemptions
based on whether the information was encrypted, if disclosure would impede law
enforcement investigations, etc.
• Federal law is by sector (e.g. FERPA, HIPAA, etc). See comparison of various
laws.
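The statutory pattern described above (notice triggered by unauthorized acquisition of unencrypted PII, with exemptions) can be sketched as a decision function. This is a deliberate simplification for illustration, not the rule of any particular state:

```python
# Hypothetical, simplified breach-notification decision logic.
def notification_required(pii_exposed, data_encrypted, law_enforcement_hold):
    if not pii_exposed:
        return False  # no covered personal information involved
    if data_encrypted:
        return False  # common statutory "encryption safe harbor" exemption
    if law_enforcement_hold:
        return False  # notice typically delayed while an investigation is pending
    return True

print(notification_required(True, False, False))  # True: notify individuals
print(notification_required(True, True, False))   # False: encryption exemption
```

Real statutes differ on every branch (what counts as PII, whether encryption is an absolute exemption, how long a law-enforcement delay may last), which is exactly why multi-state compliance is hard.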
California Consumer Privacy Act (CCPA) and related modular emerging technology
INF529 - SECURITY AND PRIVACY IN INFORMATICS – SPRING 2019
PRESENTED BY: ARJUN G. RAMAN
CONTACT: [email protected]
Objectives & Assumptions
Presentation Assumptions:
❖Use class notes / lectures as a common basis of knowledge
❖Leverage trustworthy public information sources
❖Conduct project interview(s) as applicable to tap into subject matter expertise within industry
❖Audience is familiar with information privacy history and comprises master’s students
❖20 mins of presentation / 10 mins Q&A
The main objectives of presentation are:
❖Provide background context of information privacy and leading to California Consumer Privacy Act (CCPA) overview
❖Compare CCPA vs. GDPR and highlight challenges organizations will face
❖Highlight three relevant, successful technology areas and alternatives being built / implemented to meet privacy requirements
❖Considerations for the future of privacy legislation in USA (time permitting)
Information Privacy Landscape
HISTORICAL VIEW SINCE THE 19TH CENTURY
[Figure: Information Privacy Path – the level of information privacy (high/low) plotted against time alongside the level of technology integration and regulation, with threats/triggers from technology driving the curves.]
Legislation Landscape – 19th Century to 1960

Time Period: 19th Century
Events / Policy: 4th and 5th Amendments; privacy of the body – Union Pacific Railway Co. v. Botsford; Warren and Brandeis’s article “The Right to Privacy”; Roberson v. Rochester Folding Box Co.
Threats / Triggers: Census and government records; the mail; telegraph communications

Time Period: 1900 – 1960
Events / Policy: Warren and Brandeis’s privacy torts (early recognition; William Prosser and the Restatement: intrusion upon seclusion, public disclosure of private facts, false light, appropriation); the emergence of the breach of confidentiality tort; the Fourth Amendment – Olmstead v. United States; Federal Communications Act Section 605; the FBI and increasing domestic surveillance; freedom of association and the McCarthy era
Threats / Triggers: Social Security System (1935); growth of government record systems; telephone and wiretapping

Source: GW School of Law – History of Information Privacy
Legislation Landscape – 1960s & 1970s

Time Period: 1960s – 1970s
Events / Policy: New limits on government surveillance (Fourth Amendment resurgence – Katz v. United States, 1967; Title III of the Omnibus Crime Control and Safe Streets Act of 1968); the constitutional right to privacy (decisional privacy – Griswold v. Connecticut; information privacy – Whalen v. Roe); Freedom of Information Act of 1966 and Fair Information Practices (FIPs); Privacy Act of 1974; Family Educational Rights and Privacy Act of 1974; Foreign Intelligence Surveillance Act of 1978; financial privacy (Fair Credit Reporting Act of 1970; Bank Secrecy Act of 1970; United States v. Miller; Right to Financial Privacy Act of 1978)
Threats / Triggers: Rise of the computer; information privacy at the personal device level; wiretapping advancement; the retreat from Boyd; the narrowing of the Fourth Amendment

Source: GW School of Law – History of Information Privacy
Legislation Landscape – 1980s & 1990s

Time Period: 1980s
Events / Policy: Privacy Protection Act of 1980; Cable Communications Policy Act of 1984; Computer Matching and Privacy Protection Act of 1988; Employee Polygraph Protection Act of 1988; Video Privacy Protection Act of 1988; Electronic Communications Privacy Act of 1986; OECD guidelines and international privacy
Threats / Triggers: Receding Fourth Amendment protection; the growth of federal privacy statutory protection

Time Period: 1990s
Events / Policy: Telephone Consumer Protection Act of 1991; Driver’s Privacy Protection Act of 1994; Health Insurance Portability and Accountability Act of 1996; Children’s Online Privacy Protection Act of 1998; Gramm-Leach-Bliley Act of 1999; the FTC and privacy policies; the EU Data Protection Directive
Threats / Triggers: Public internet and computer databases; federal statutory protection; rise of “search” and the information age; email as a service (e.g., AOL, Netscape); Google and the monetization of data

Source: GW School of Law – History of Information Privacy
Legislation Landscape – 2000s

Time Period: 2000 – 2010
Events / Policy: September 11 triggered the USA PATRIOT Act of 2001, the FISA “Wall,” the Homeland Security Act of 2002, the Intelligence Reform and Terrorism Prevention Act of 2004, the REAL ID Act of 2005, and NSA warrantless surveillance. Consumer privacy: the Fair and Accurate Credit Transactions Act of 2003; the National Do-Not-Call Registry; the CAN-SPAM Act of 2003; Remsburg v. Docusearch; privacy policies and contract law
Threats / Triggers: The dramatic rise of national security concerns; rise of data breaches; social media technology; global integration of money; rise of IoT; exponential increase in data monetization; cloud and cloud computing; rise of the smartphone; rise of “Big Data”

Source: GW School of Law – History of Information Privacy
Legislation Landscape – 2010 Forward

Time Period: 2010 – Forward (prior to CCPA)
Events / Policy: FBI vs. Apple; United States v. Graham (2012); United States v. Jones (2012); Cyber Intelligence Sharing and Protection Act (CISPA) – Rogers and Ruppersberger; reform of the National Security Agency’s bulk metadata collection program under Section 215 of the USA PATRIOT Act; revisiting Fair Information Practices (FIPs); Consumer Privacy Bill of Rights in 2012; the White House Big Data Privacy Report; the President’s Council of Advisors on Science and Technology (“PCAST”) report on Big Data; FTC Commissioner Julie Brill’s initiative “Reclaim Your Name”; Cambridge Analytica; “A Review of the Data Broker Industry: Collection, Use, and Sale of Consumer Data for Marketing Purposes”; Carpenter v. United States (2018)
Threats / Triggers: Single sign-on; mobile application ecosystem maturity; rise of AI/ML for processing Big Data; exponential jump in device-oriented environments; longer reviews being performed on privacy; exposure of data brokerage practices; data breach growth, including the OPM breach; social physics and personalization over the web

Source: GW School of Law – History of Information Privacy
California Consumer Privacy Act (CCPA)
CCPA – 5 Key Rights
“Therefore, it is the intent of the Legislature to further Californians’ right to privacy by giving consumers an effective way to control their personal information, by ensuring the following rights:
(1) The right of Californians to know what personal information is being collected about them.
(2) The right of Californians to know whether their personal information is sold or disclosed and to whom.
(3) The right of Californians to say no to the sale of personal information.
(4) The right of Californians to access their personal information.
(5) The right of Californians to equal service and price, even if they exercise their privacy rights.”
Source: CCPA Text
CCPA Overview
Primary intent is the expansion of the rights of CA residents, related to their personal information
❖Right to know and right to access - consumers may request disclosure of the specific data elements of personal data a business collects about them
❖Data portability – personal information must be provided to a consumer in a readily transferable electronic format
❖Deletion - individuals may request to have their personal information deleted with some exceptions
❖Consumers can request disclosures about the collection, sharing, and sale of personal information – an expansion of California’s existing 2003 “Shine the Light” law
❖Does not apply to protected health information (PHI) collected by covered entities or business associates under HIPAA
Source: TokenEx Webinar
CCPA Overview & Industry Disruption
Opt-out - consumers can object to the sale of their personal information
❖Must wait 12 months before asking consumer to opt back in.
Opt-in - minors under the age of 16 or their guardian must affirmatively authorize sale
Non-discrimination and financial incentives:
❖Can’t deny goods or services to consumers who exercise their privacy rights
❖However, businesses may offer financial incentives to consumers as compensation for their personal information
Transparency - online privacy policies and other web-based notices must disclose the categories of data collected, the sources, recipients, and all data collected, sold, or disclosed within the previous 12 months
❖A link explicitly titled "Do Not Sell My Personal Information" must be on the business’s website homepage
Industry Disruption:
Ad-Brokers and data brokers in general will face the most impact to their business models with this act.
Google does $5B a quarter in ad revenue.
While the primary intent is an expansion of rights, this act very much focuses on reining in the sale of personal data
Impacts 12 months prior data history of users
Equal service and price
Source: TokenEx Webinar
CCPA vs. GDPR
HOW THE TWO LAWS COMPARE AND CONTRAST
CCPA vs. GDPR – Deep Dive
Protected?
CCPA Consumers, defined as natural persons who are California residents and their personal information
GDPR Natural persons (data subjects) who are in the EU and the processing of their personal data
Scope?
CCPA “For profit” businesses collecting personal information of CA consumers that also meet one of the following: annual gross revenue > $25M; collects the personal information of > 50k CA consumers, households, or devices; or derives at least 50% of annual revenue from selling consumers’ personal information
GDPR Controllers and processors established in the EU or who offer goods or services to, or monitor the behavior of, EU data subjects
Processing of Personal Information? – Opt out vs. Opt In
CCPA Businesses may process personal information, but need “opt out” consent to sell it
GDPR Organizations must have one of six lawful reasons to process personal information. If the reason for processing is the individual’s consent, it must be “opt in”
Source: Webinar Resource and External Research
CCPA vs. GDPR – Deep Dive
Personal Information?
CCPA “…information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” The categories of personal information within the law are expansive…
GDPR “…information relating to an identified or identifiable natural person…” Article 9 describes special categories of personal data that are prohibited from processing unless certain requirements are met
Technical Controls and Compliance
CCPA § 150 – Business’ “…duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information to protect the personal information…”
GDPR • Article 25 – Data protection by design and by default: “…the controller shall...implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimization…“• Article 32 – Security of processing: “…the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate: the pseudonymisation and encryption of personal data…”
Source: Webinar Resource and External Research
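Since GDPR Articles 25 and 32 (quoted above) both name pseudonymisation as a safeguard, here is a minimal sketch of keyed pseudonymisation: direct identifiers are replaced by keyed hashes so records remain linkable for analysis, while re-identification requires the separately held key. The key handling and field names below are assumptions for illustration:

```python
import hashlib
import hmac

# Illustrative key; in practice it would be held separately from the data
# (GDPR Article 4(5) requires the "additional information" to be kept apart).
SECRET_KEY = b"keep-this-separate-from-the-data"

def pseudonymise(identifier):
    """Map an identifier to a stable keyed pseudonym (HMAC-SHA256, truncated)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

record = {"email": "subject@example.eu", "purchase": "book"}
record["email"] = pseudonymise(record["email"])
print(record)  # the same email always maps to the same pseudonym
```

Using an HMAC rather than a plain hash matters: without the key, an attacker could hash candidate emails and match them against the pseudonyms.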
CCPA vs. GDPR – Deep Dive
Private Right to Action?
CCPA Yes, but limited to “unauthorized access and exfiltration, theft, or disclosure…” of personal information. The business has 30 days to “cure” the violation. The CA Attorney General must be notified that an action has been filed and may block the action
GDPR Yes – “Any person who has suffered material or non-material damage… receive compensation… for the damage suffered.”
Fines
CCPA CA Attorney General may impose a penalty of up to $7,500 per violation. Civil action damages of $100 – $750 per consumer per incident or actual damages, whichever is greater
GDPR Administrative fines of:
€10M or 2% of annual worldwide turnover
€20M or 4% of annual worldwide turnover for willful negligence
Source: Webinar Resource and External Research
CCPA vs. GDPR – Individual Rights Summary
• Right of access to personal data: CCPA – 45 days (§ 100, 110, 130); GDPR – 30 days (Articles 12, 15)
• Right to data portability: CCPA – implicitly provided under the right to access (§ 100); GDPR – Article 20
• Right to rectification: CCPA – no; GDPR – Article 16
• Right to erasure: CCPA – yes (§ 105); GDPR – Article 17
• Right to object to processing: CCPA – limited (§ 120); GDPR – Articles 21, 22
• Right to object to sale of personal information: CCPA – yes (§ 120); GDPR – not fully called out (Article 21)
• Right to request categories of personal information from a business that sells or discloses the information for business purposes: CCPA – yes (§ 115); GDPR – not fully called out (Article 15)
• Right to equal service and price: CCPA – yes; GDPR – not specifically stated
Source: Webinar Resource and External Research including project interviews
Challenges
MAJOR CCPA CHALLENGES FOR COMPANIES AND ORGANIZATIONS
Key Challenge 1: Right to Know
The right to know whether one's personal information is sold and where it was consumed is extremely difficult to satisfy.
❖Data-driven market economy (DDME) – the marketplace is vast, and your data may be sold through multiple avenues
Simple transactions are easier; larger data types are harder.
❖Eyeballs, proxy panels, etc. But what about the varying levels of PII?
❖Multi-device? IoT devices? Data manageability becomes critical from a user perspective
❖Question – how does one actually do this correctly and repeatedly?
Retroactive transactions occur thousands of times per day
◦ Think of the ad-tech stack environment – how to retroactively determine this correctly?
◦ Value in lobbying
Upfront conclusion –
“This does not work with the current internet architecture very well” – M. Hogan, Datacoup
Key Challenge 2: Traceable Consent
❖ Consent is relatively easy when the corporate structure and services delivered to a customer are
straightforward and, from an architecture perspective, are under a single traceable account.
❖ However, this becomes much more difficult with modern-day multi-national companies (MNCs).
Multi-National Corporation (MNC): Pharmaceutical Company Client Challenge Example
“Let’s take a pharma company where our system is deployed. They could have 16 brands, each with a
variety of supporting systems – patient support, marketing, e-commerce, etc. Patient support
systems are required to be HIPAA compliant. Marketing and e-commerce systems need to be CCPA
compliant. HIPAA is only good for 5 years. Pharma companies are finding that each of their 16 brands were all
different systems, and when users give them consent under HIPAA or CCPA or both – implied consent –
they would have differing views of consent by user across the systems without also knowing the
integrity of that consent. What pharmaceutical companies need are solutions that can harmonize consent across their business lines to a single view of truth.” – John Chaisson, Principal – Business Development at Health Verity
Key Challenge 3: Equal Service & Price
What is the rule?
❖Consumers have the right to receive equal service and price, even if they exercise their privacy rights.
Why is it a challenge?
❖In the pre-CCPA era, data typically subsidized delivery on an extreme scale. This resulted in a variety of robust, open-source, free applications and adjoining ecosystems
❖By default on platforms under CCPA, users share data at varying levels. However, it’s difficult to deliver the same subsidy if users opt out.
❖A macro technical challenge is looming with the rise of 5G-based services. If organizations are fully on 5G, how would they deliver services at lower speeds / levels?
❖Whether you share or not, users are expected to receive the same service and be offered more financial incentives to share more
❖The interpretation, however, is extremely vague and implementation guidance has been very limited. In addition, enforcing this rule would be extremely difficult (e.g., potentially more litigation upfront than enforcement)
Multi-National Corporation (MNC):
Insurance Client Challenge Example
“Let’s take for example an insurance
company that needs to perform credit and
insurance underwriting to actually
determine price and services (upon
execution of the policy). Much of this
entire industry is largely based upon user
data and its continuous use to access the
service. If people suddenly won’t
contribute their data, should I be paying the
same price as someone else who is
incredibly healthy? This will be a larger
debate on individualism vs. collectivism” – Matt Hogan, CEO, Datacoup
Key Challenge 4: Data Deletion
Observations – Most large corporations and providers are, from a system perspective, dispersed across various business lines, and their architectures are not unified
For example, take an internet provider – phone, internet, and television all run in different divisions with varying levels of data collection. When you buy a subscription to all services, the delivery is still dispersed across business lines and those technical architectures
❖A user has an account in each stack of those systems
To be 100% compliant with the rule is impossible. Instead, companies are striving for an 80/20 rule
❖In addition, that data may have been shared with third parties. Companies do not take responsibility for those third parties
Technology Landscape
MODULAR SOLUTIONS FOR COMPANIES AND ORGANIZATIONS
CCPA vs. GDPR – Individual Rights Summary
Individual Rights (Letter Identifier / Right / CCPA / GDPR):
A – Right of access to personal data: CCPA – 45 days (§§ 100, 110, 130); GDPR – 30 days (Articles 12, 15)
B – Right to data portability: CCPA – implicitly provided under the right to access (§ 100); GDPR – Article 20
C – Right to rectification: CCPA – no; GDPR – Article 16
D – Right to erasure: CCPA – yes (§ 105); GDPR – Article 17
E – Right to object to processing: CCPA – limited (§ 120); GDPR – Articles 21, 22
F – Right to object to the sale of personal information: CCPA – yes (§ 120); GDPR – not fully called out (Article 21)
G – Right to request the categories of personal information from a business that sells or discloses the information for business purposes: CCPA – yes (§ 115); GDPR – not fully called out (Article 15)
H – Right to equal service and price: CCPA – yes; GDPR – not specifically stated
Emerging Technology Landscape for Regulation: Three Highlighted Success Technologies
1. Consent / Data Subject Rights Management – Example Company: Health Verity
2. Big Data Encryption – Notional Example: MongoDB
3. Tokenization – Data Marketplace Example: Datacoup; Notional Example: TokenEx
Blockchain Data Integrity Solutions – Note: in this evaluation, these solutions were found to be supplementary to the three successful technologies above
Source: Forbes article on Top 10 Emerging Trends
CONSENT / DATA SUBJECT RIGHTS MANAGEMENT – Example: Health Verity
Future Example Sectors:
• Cable / Phone Companies
• Debt Collection in Mortgages
Background & Challenge: Pharmaceutical Company (see prior example)
How the Health Verity solution works: The Health Verity (HV) solution allows pharma to harmonize consent across a single view of truth while managing outreach.
❖ Any system that records a consent change sends that delta to the HV cloud, which profiles consent across legacy systems via data mapping and updates all the unique systems. On any update, the HV cloud harmonizes the systems with the updated consent. This forms a validation schema for legitimacy: if one of the client's systems fires off a delta, that delta is checked – everything is hashed and then compared via checksum.
❖ For CCPA, HV provides a dashboard for in-bound tickets. Companies have the option of selling consented data in a marketplace, but such data is marked; data cannot be traced once it changes hands more than once.
❖ HV establishes a recording and auditing log tied to an immutable log in Hyperledger (blockchain). Timestamps are recorded in and out; when consent changes, it is harmonized via the cloud and then written to a ledger, which serves as the compliance ledger.
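The hash-and-checksum validation of consent deltas described above can be sketched in a few lines. The field names and the choice of SHA-256 over canonical JSON are illustrative assumptions, not details of the actual HV implementation:

```python
import hashlib
import json

def checksum(delta: dict) -> str:
    """SHA-256 over a canonical JSON encoding of a consent delta."""
    canonical = json.dumps(delta, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def validate_delta(delta: dict, reported_checksum: str) -> bool:
    """A legacy system sends the delta plus its own checksum; the cloud hub
    recomputes the hash and compares before harmonizing consent downstream."""
    return checksum(delta) == reported_checksum

# A legacy system fires off a consent change:
delta = {"user_id": "u-123", "channel": "email", "consent": False}
sent = checksum(delta)

# The cloud hub verifies the delta is legitimate before applying it:
assert validate_delta(delta, sent)
```

Any tampering with the delta in transit changes the recomputed hash, so the comparison fails and the update is rejected.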
Company: Health Verity – www.healthverity.com
Potential Delivery
1) System analysis
2) Metadata with proposed recommendations
3) AWS instance of the Hyperledger HV solution
   a. What is stored: any delta-based data; original data is stored in the client's legacy system
   b. Network pointers on the blockchain
   c. The current consent truth is placed immutably
   d. Identification is managed in other systems
4) API-based calls with updates
5) CCPA harmonization reports
6) Compliance corrective actions
Source: Interview with John Chaisson, Principal – Business Development at Health Verity
CCPA Targeted Areas: Consent Related (E, F)
BIG DATA ENCRYPTION
Background & Challenge: With legacy systems, companies have problems visualizing data on the back end, with data governance (tied to document validation), with compliance validation (including the period of time data is stored), with access control in a serverless environment, etc. Much of this worsens with the rise of "big data" and "IoT", where protecting data at rest while still delivering compliance functionality is critical. Companies are therefore turning to hosted serverless environments or infrastructure-as-a-service solutions. Under GDPR (and likely CCPA), encryption at rest is a key method of meeting compliance.
Company: MongoDB – www.mongodb.com
Technology Profiled: MongoDB Atlas – a hosted serverless solution for enterprises
Big Data Encryption features:
• End-to-end data encryption
• Data in motion: TLS encryption
• Data at rest in persistent storage and backups
• Document DB structure
• Encrypt and tie directly to users easily at a key level
CCPA Targeted Areas: Big Data Encryption (A, B, D), (E, F indirectly)
Potential Delivery
• Analyze the metadata schema for PII and other types of data collected
• Conduct mapping to structure a .JSON file per user
• Set up zones to manage compliance across regions (CCPA vs. GDPR)
• Establish a PKI asymmetric key structure for users and devices
• Write a .JSON encrypted file for lightweight data
• Migrate to MongoDB Atlas
• Monitor and control via MongoDB Compass
• Trace encrypted data to locations
• Provide a connection for AI / ML activities to data lake solutions
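One way to realize the per-user encrypted record idea above is "crypto-shredding": encrypt each user's record under its own key, so that deleting the key is one way to honor an erasure request (right D). The sketch below is standard-library only, with a SHA-256-derived keystream standing in for real authenticated encryption (a production system would use AES-GCM via a library such as `cryptography`); all names are hypothetical:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream by hashing key+nonce+counter (stand-in for AES-CTR)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))

# One key per user: deleting the key renders that user's stored record
# unreadable, even in backups that still hold the ciphertext.
user_keys = {"u-123": secrets.token_bytes(32)}
record = b'{"name": "Jane Doe", "plan": "gold"}'
stored = encrypt(user_keys["u-123"], record)

assert decrypt(user_keys["u-123"], stored) == record
del user_keys["u-123"]  # "crypto-shredding": the record is now irrecoverable
```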
Source: MongoDB Compliance
ENCRYPTION VS TOKENIZATION
Source: Tokenization vs encryption vs masking, Ulf Mattsson, TokenEx Head of Innovation, earlier Chief Technology Officer of Security Solutions at Atlantic BT
Emerging Data Marketplace Tech: Datacoup
What Datacoup built within a centralized architecture solution:
◦ Meets most CCPA rules, with the exception of access and the value of price. Datacoup's mission is to monetize data but deliver that value back to the consumer
Where are we going?
◦ Alleviate issues of CCPA through decentralization
◦ If we are able to build out decentralized architectures,
❖ PKI on user device – genesis of access points.
❖ Data storage looks very different in the frontier of the internet
❖ Monetization looks different in the future – “Personal Data Token” and using a blockchain based solution
Implementation – Org View
User / Org Goal - You are trying to purchase data or purchase access to data
❖ Determine access points and determine aggregated data product (ADP) structures
❖ ADP – trend analysis, pattern analysis, etc.
Implementation – User View
❖ “I want to participate in this Datacoup” – providing key signature
❖ Datacoup becomes a key manager and fiduciary – aggregating data, facilitating participation, and providing remuneration
Datacoup Dashboard -http://datacoup.com/docs#how-it-works
Source: Interview with Matt Hogan, CEO of Datacoup
TOKENIZATION
Potential Delivery: Tokenization of PII with a Provider (e.g., TokenEx)
• Map personal data within the organization
• Identify direct and indirect identifiers
• Risk-assess the data types
• Tokenize the identifiers
• Interact with a vault (if necessary); PII is replaced with a token
• Share pseudonymised data as applicable with shared providers and for analytics purposes
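The vault step above is what distinguishes tokenization from encryption: a token has no mathematical relationship to the original value, so it cannot be reversed without access to the vault. A minimal sketch of this idea (hypothetical class and token format, not TokenEx's API):

```python
import secrets

class TokenVault:
    """Minimal vault: random tokens stand in for PII; the mapping lives
    only here, so downstream systems see pseudonymous data."""
    def __init__(self):
        self._token_to_pii = {}
        self._pii_to_token = {}

    def tokenize(self, pii: str) -> str:
        if pii in self._pii_to_token:          # deterministic per value
            return self._pii_to_token[pii]
        token = "tok_" + secrets.token_hex(8)  # no mathematical link to PII
        self._token_to_pii[token] = pii
        self._pii_to_token[pii] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_pii[token]

vault = TokenVault()
record = {"name": vault.tokenize("Jane Doe"),
          "ssn": vault.tokenize("123-45-6789")}
# Analytics and shared providers see only tokens; only the vault can reverse them.
assert vault.detokenize(record["name"]) == "Jane Doe"
```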
CCPA Targeted Areas: Tokenization (Technical Need)
Some Future Thoughts on Information Privacy
THEORIES AND IDEAS FOR THE FUTURE
Theories / View on National Privacy Policy Shape
Custodian / Consortium Theory: A consortium-driven model in which trusted authoring and managing sources facilitate governance and stewardship on behalf of users
Self-Sovereign Identity Theory: Placing power back in the hands of the consumer, either through a user-controlled model or a digital-attestation model
Adjacent Industries to Consider
❖Financial Services Custodians / Consortiums
❖Treating Personal Data like Investments or Assets (tokenization)
Restructuring theory:
❖Three Internets View – American Internet, Authoritarian Internet, and EU internet – and the services in each
Strategic Thoughts:
Theories / Views - Implementation
Implementation View: In the short run, a critical focus on modular technologies that are cost-effective and minimally disruptive to existing architectures. In the long run, the proposed data-sharing consortium / custodian models are technically feasible but first require agreement among established and new trusted custodians. At the frontier of technology rests the aspirational goal of a fully decentralized world and internet.
Short Run – Modular Technologies | Long Run – MIT Consortium Model | Frontier – Ex. Enigma Protocol
MIT Open Algorithms (OPAL) Consortium Model – Owner Centric Access Management of IoT Data
Source List – Part 1
Center, Electronic Privacy Information. “EPIC - Big Data and the Future of Privacy.” Electronic Privacy Information Center, epic.org/privacy/big-data/.
“California Passes Landmark Consumer Privacy Act-What It Means for Businesses.” Akin Gump Strauss Hauer & Feld LLP, www.akingump.com/en/news-insights/california-passes-landmark-consumer-privacy-act-what-it-means.html.
Chaisson, John, Personal communication with Principal - Business Development at Health Verity, February 2019
Hogan, Matt, Personal communication with CEO of Datacoup, March 2019
Kassel, Matthew. “As 5G Technology Expands, So Do Concerns Over Privacy.” The Wall Street Journal, Dow Jones & Company, 27 Feb. 2019, www.wsj.com/articles/as-5g-technology-expands-so-do-concerns-over-privacy-11551236460.
Kleinrock, Leonard. “Fifty Years of the Internet.” TechCrunch, TechCrunch, 18 Mar. 2019, techcrunch.com/2019/03/18/fifty-years-of-the-internet/.
MongoDB, MongoDB Compliance, “Impact to your data management landscape” - https://www.mongodb.com/collateral/gdpr-impact-to-your-data-management-landscape
Mattsson, Ulf. “Tokenization vs Encryption vs Masking.” LinkedIn SlideShare, 16 Feb. 2018, www.slideshare.net/ulfmattsson/tokenization-vs-encryption-vs-masking
Source List – Part 2
Newton, Casey. “Europe Is Splitting the Internet into Three.” The Verge, The Verge, 27 Mar. 2019, www.theverge.com/2019/3/27/18283541/european-union-copyright-directive-internet-article-13.
Pentland, Alex, et al. New Solutions for Cybersecurity. MIT Connection Science and Engineering, 2018, Chapters 13 – 15
Press, G. (2019). Top 10 Hot Data Security And Privacy Technologies. [online] Forbes.com. Available at: https://www.forbes.com/sites/gilpress/2017/10/17/top-10-hot-data-security-and-privacy-technologies/#5216002a6b3f [Accessed 01 Mar. 2019].
Solove, Daniel J. “A Brief History of Information Privacy Law.” George Washington University Law School, 2006, https://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=2076&context=faculty_publications
“Text.” Bill Text - AB-375 Privacy: Personal Information: Businesses., leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201720180AB375&search_keywords=Consumer%2BPrivacy%2BAct%2Bof%2B2018.
“Tokenization vs Encryption: Things You Need to Know to Choose.” ELEKS, 23 Aug. 2018, eleks.com/blog/tokenization-vs-encryption-things-you-need-to-know/.
“Tokenizing_PII.” TokenEx, tokenex.com/tokenex-platform/data/pii/tokenizing_pii/.
Webinar - Becoming Cyber Aware: An Issue of National Security- A Fireside Chat with Gen. Petraeus & Dr. Neuman – March 18, 2019
Webinar – InfoSec - California Consumer Privacy Act: Are You Prepared for 2020? https://infosecinstitute.wistia.com/medias/xytbxo3tfl, January 25, 2019
Webinar – TokenEX – Understanding CCPA - https://tokenex.com/understanding-compliance-california-consumer-privacy-act/ccpa-cover-image/
Thank You – Q&A
INF529 - SECURITY AND PRIVACY IN INFORMATICS – SPRING 2019
PRESENTED BY: ARJUN G. RAMAN
CONTACT: [email protected]
MEASUREMENT OF PRIVACY IN SOCIAL MEDIA
SEVANTI NAG
INF529 SECURITY
AND PRIVACY
MARCH 29, 2019
Outline
■ Statistics
■ Attacks on Social Media
■ Motivation of Attacks
■ Social Network Privacy Measurement
Statistics
According to the global platform Statista.com, the total number of social media users worldwide will be around 2.77 billion in 2019.

This statistic presents data on the social media platforms used by marketers worldwide as of January 2018. Facebook is still the preferred social networking site, despite the social media scare.
Attacks on Social Media
■ Identity theft
■ Spam attack
■ Malware attacks
■ Sybil attacks
■ Social phishing
■ Impersonation
■ Hijacking
■ Fake requests
■ Image retrieval and analysis
Motivation of Attacks
■ Revenge/Emotions
■ Financial gains
■ Entertainment
■ Expertise for job
■ Hacktivism
■ Cyber espionage
■ Cyber warfare
Social Network Privacy Measurement
■ Privacy Quotient
■ Privacy Index
■ SONET Model
■ PrivAware
■ Privometer
Privacy Quotient
Privacy quotient (PQ) is a metric to measure the privacy of a user's profile. It is computed from the sensitivity and the visibility of each piece of information.

Sensitivity – The property of information that makes it private. As sensitivity increases, the privacy risk involved in sharing the item also increases. Hiding such information makes the user more private.

Visibility – The property of information that captures the popularity of an item in the network.

PQ = Sensitivity x Visibility
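The PQ formula above can be applied per profile item. A minimal sketch, where the 0-1 sensitivity and visibility scales and the example profile are illustrative assumptions:

```python
def privacy_quotient(items):
    """PQ per profile item: sensitivity x visibility, as defined above."""
    return {name: s * v for name, (s, v) in items.items()}

# Hypothetical profile: (sensitivity, visibility), each on a 0-1 scale.
profile = {
    "hometown": (0.2, 0.9),  # low sensitivity, widely visible
    "phone":    (0.7, 0.3),
    "ssn":      (1.0, 0.1),  # highly sensitive, barely visible
}
pq = privacy_quotient(profile)
# Higher PQ indicates greater privacy risk from sharing that item.
```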
Privacy Index
Privacy Index (PIDX) is a metric to measure user’s privacy exposure. High PIDX value
indicates high exposure of privacy.
Each user has certain characteristics that describe its features known as attribute like
name, address, social security number (SSN), phone number, etc. Each attribute has a
different impact on privacy like, SSN has higher impact on privacy than phone number.
This impact is referred as Attribute Privacy Impact Factor. Privacy impact factor is a
numerical value .
Among all the attributes of a user, there is a subset of attributes, if known, privacy might be
disclosed.
PIDX =Sum of the privacy impact factors of the known attributes
Sum of the privacy impact factor of all attributes
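A small sketch of the PIDX ratio above, with hypothetical impact factors (note the SSN weighted far above the phone number):

```python
def pidx(impact, known):
    """Privacy Index: summed impact factors of the attributes an observer
    knows, divided by the summed impact factors of all attributes."""
    exposed = sum(impact[a] for a in known)
    return exposed / sum(impact.values())

# Hypothetical attribute privacy impact factors.
impact = {"name": 1, "address": 2, "phone": 3, "ssn": 10}
print(pidx(impact, {"name", "phone"}))  # 4/16 = 0.25
```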
SONET Model
The SONET model can be utilized to simulate privacy changes in a social network and to assess the privacy impact in case user accounts are compromised.
Every user can be described by user profile, privacy settings, and a friend list.
■ A profile comprises the personal information of a user.
■ Privacy settings describe how users wish to share their own information.
■ A friend list includes a group of individuals who are connected together.
■ The friend list can be further classified as different groups, for example, friends,
friends of friends, or public, etc.
■ Privacy settings can thus be characterized according to the groups.
PrivAware
PrivAware is a tool to detect and report unintended information loss in social networks.
PrivAware calculates a privacy score as the ratio of the number of attributes visible to
third-party applications to the total number of attributes of a user.
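The PrivAware score described above is a simple ratio; a sketch with hypothetical attributes:

```python
def privaware_score(all_attributes, visible_to_third_parties):
    """PrivAware-style score: the share of a user's attributes exposed to
    third-party applications (0 = none visible, 1 = all visible)."""
    return len(visible_to_third_parties) / len(all_attributes)

attrs = {"name", "email", "birthday", "photos", "friends", "location",
         "employer", "hometown"}
visible = {"name", "photos", "friends", "location"}
print(privaware_score(attrs, visible))  # 4/8 = 0.5
```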
Privometer
Privometer is used to measure the amount of sensitive information leakage in a user’s
profile. The leakage is indicated by a numerical value. The model of Privometer also
considers substantially more information that a potentially malicious application
installed in the user’s friend realm can access.
References
■ https://www.statista.com
■ Zhang, Zhiyong, and Brij B. Gupta. "Social media security and trustworthiness:
overview and new direction." Future Generation Computer Systems 86 (2018): 914-
925.
■ Ananthula, Swathi, et al. "Measuring privacy in online social networks." International
Journal of Security, Privacy and Trust Management 4.2 (2015): 1-9.
■ R. K. Nepali and Y. Wang, "SONET: A SOcial NETwork Model for Privacy Monitoring
and Ranking," 2013 IEEE 33rd International Conference on Distributed Computing
Systems Workshops, Philadelphia, PA, 2013, pp. 162-166.
■ Wang, Yong & Nepali, Raj & Nikolai, Jason. (2014). Social network privacy
measurement and simulation. 802-806. 10.1109/ICCNC.2014.6785440.
THANK YOU
The Monetization of (the lack of) privacy
By: Ahmed Qureshi
Quick Overview
- Data and Trust
- Apple vs Google
- Do people value data?
- Google Adsense
- Banking Industry
- Health Insurance Industry
Data is the foundation of a digital society
- Customer data allows companies to personalize the customer
experience
- However, legislation has been passed, or is in the process of being passed, that limits companies' access to data and requires more transparency into how the data is used
- It all comes down to trust
How much do you trust?
- Apple
- Amazon
Apple vs Google
- iOS anonymizes data sent back to their servers, data is meant
to help them better support their users
- Android phones send a lot of data back to Google, including
location, which is used to power ad targeting
Apple vs Google
- A study done back in 2013 showed that people making over $75,000 a year are more likely to own an iPhone than an Android device
- In 2016 the National Bureau of Economic Research found that owning an iPhone or iPad gives a 69% chance of correctly classifying the person as a high-income earner
Apple vs Google
- Paying for privacy
- Getting *paid* for your
data
Spending per transaction
What this means for advertisers
- Depending on your target
demographics you can focus solely
on advertising for specific mobile
devices
- Volume versus value
- 2 billion+ android devices
- 1 billion+ iOS devices
Is Apple really privacy-focused?
- Apple doesn't routinely take punitive action against privacy violators on its platforms, as we saw in the Facebook research scandal
- Facebook maintains that it did nothing wrong and that users were simply being paid for their data
Apple's monetization of privacy
- While Apple does not sell the data it gets from your phone, it definitely sells others access to your data
- In 2018 Google paid Apple $9 billion to be the primary search engine in Safari
- In 2019 that amount is supposedly $12 billion
- For reference, in Q4 2018 Google's ad sales were $32.6 billion
Apple's monetization of privacy
- Google has the majority of its services, such as Drive, Gmail, and Maps, available on the App Store
- Google receives tons of data on iPhone users' usage
- Anything with an in-app purchase (such as more space on Drive) gives Apple a 30% cut
Privacy inside apps
- Lots of free apps on the App Store are really just for data aggregation
- An experiment that attempted to block IP addresses owned by Google found that apps such as Lyft and Uber became unusable, as they leverage the Google Maps API
Customers vote with their data
- Customers want a tangible return for
their data
- 2 out of 3 consumers are willing to
share their personal information in
exchange for some perceived value
What could Apple do?
- Adopt more rigorous policies against data collection in free/paid apps that better align with the privacy stance Apple is trying to sell
- Force companies to use Apple-equivalent APIs where they exist
Consumers on Privacy
- As we’ve seen, consumers are more than willing to give up their
privacy for ease of use or for monetary compensation
- While Apple charges more for its privacy stance, at the end of the day the user still has a significant amount of data collected on them by other services
Google AdSense
Google AdSense
- An auction is used to select the ads that appear on webpages; how much money the website makes from ads is based on the maximum price advertisers are willing to pay
- The auction ranks advertisers based on price and quality score to ensure that ads provide a positive user experience
Google AdSense
- Google filters all the available ads into a pool eligible to be shown on a website for the available placements
- It only considers ads that are relevant to the content or users of the website
- Ads that advertisers have specifically targeted, believing they match the site's users, are also considered
Definitions
- Click through rate (CTR): Number of clicks advertisers receive
on their ads relative to the number of views
- 20 clicks/1000 views = 2% CTR
- Cost per Click (CPC): Actual cost of each click from the
marketing campaign
- Quality Score (QS): estimates the relevance of ads to the person who sees them
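The two metrics above in code, reproducing the slide's 20 clicks / 1000 views example (the $50 campaign spend is a hypothetical figure):

```python
def ctr(clicks, views):
    """Click-through rate: clicks received relative to ad views."""
    return clicks / views

def cpc(total_cost, clicks):
    """Cost per click: campaign spend divided by clicks received."""
    return total_cost / clicks

assert ctr(20, 1000) == 0.02   # the 2% CTR example above
print(cpc(50.0, 20))           # $50 spend over 20 clicks = $2.50 per click
```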
4 types of targeting
- Contextual Targeting
- Website context
- Placement Targeting
- Choose specific ad placements or subsections of websites
- Personalized Targeting
- Based on user data
- Run of Network Targeting
- Target all sites in the AdSense network
Personalized Ad Factors
- Types of websites visited
- Mobile apps on phone
- Websites/apps visited that belong to businesses that leverage
google for advertising
- Activity on other devices
- Previous interactions with ads
- Google account activity/information
Example with 3 advertisers for 2 spots
- Bob pays: $1.01
- Alice pays: $2.34
- Alice pays the same rate as Bob for ⅓ of her clicks
- She pays the higher amount for the remainder
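One common way to model this kind of auction is a generalized second-price (GSP) rule: rank ads by bid x quality score, and charge each winner the minimum needed to hold its position. The sketch below uses hypothetical bids with equal quality scores chosen so the prices land on the figures above; it is an illustration, not Google's actual mechanism:

```python
def gsp_auction(bidders, slots):
    """Generalized second-price sketch: rank by bid * quality score; each
    winner pays the next ad's rank score divided by its own quality score,
    plus a $0.01 increment (a reserve price when no ad ranks below)."""
    ranked = sorted(bidders, key=lambda b: b["bid"] * b["qs"], reverse=True)
    results = []
    for i in range(min(slots, len(ranked))):
        if i + 1 < len(ranked):
            nxt = ranked[i + 1]
            price = (nxt["bid"] * nxt["qs"]) / ranked[i]["qs"] + 0.01
        else:
            price = 0.01
        results.append((ranked[i]["name"], round(price, 2)))
    return results

# Hypothetical bids; quality scores held equal for simplicity.
bidders = [
    {"name": "Alice", "bid": 4.00, "qs": 1.0},
    {"name": "Bob",   "bid": 2.33, "qs": 1.0},
    {"name": "Carol", "bid": 1.00, "qs": 1.0},
]
print(gsp_auction(bidders, slots=2))  # [('Alice', 2.34), ('Bob', 1.01)]
```

With unequal quality scores, a high-quality ad can win a slot while paying less per click than a lower-quality competitor bid.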
Factors that affect the bidding process
- Seasonal Campaigns: Advertisers won’t advertise the same way
throughout the year and thus will spend more or less money at
different times
- Ad Targeting: Advertisers can choose their ads to be only
context or placement targeted
- Geotargeting: Advertisers can target based on region and time
of day
- Conversion Optimizer: Bids are automatically adjusted to
maximize clicks or conversions
Conversion Optimizer/Target CPA (cost-per-acquisition) Bidding
- For getting conversions (sales, signups), Target CPA bidding will automatically get more conversions for your budget
- It can help get more sales while paying less for clicks
- Uses conversion tracking data to avoid unprofitable clicks and
thus get more conversions for a lower cost
- Can pay for only conversions specifically
Turning off ad personalization
Banking Industry
- Consumers can’t opt out of all information sharing
- Can share information to market bank’s own financial
services
- Can share information to market bank’s own financial
services joint with another bank
- Can share information to process/service transactions
- Can share information to protect against
fraud/unauthorized transactions
- Can share information to respond to judicial processes
- Can share information to comply with any law
Banking Industry
- By running every transaction on a credit card the bank has a
record of what you have bought
- Buying clothes from a thrift store or used tires could indicate a
lack of creditworthiness due to higher risk of default
- Banks could raise interest rates and reduce credit limits if they
want to reduce their risk
Health Insurance Industry
- Attempting to get as much information on people as possible
- Education
- Net Worth
- Amazon Purchases
- Etc
- Low income is treated as a health risk
- Errors in data/models leads to discrimination
Health Insurance Industry
- Boost profits by signing up healthy people and attempt to
avoid sick people
- E.g., placing the enrollment office on the third floor of a building with no elevator
- Affordable Care Act prohibits insurance from denying coverage
on pre-existing conditions
- Short term health plans allow the denial of coverage to the
sick
Health Insurance Industry
- People’s health is being inferred from a larger group of people
with similar traits/data
- Inherent bias towards the poor/unhealthy
- There is no accountability for poor models as there is no
standard health insurance pricing
Summary
- Consumers want value for their data
- Companies see huge profit in mining, processing, and
modeling our data for their benefit
- Google AdSense will leverage its incredibly large dataset on
individuals interacting with their services to optimize revenue
- In both banking and in the health insurance industry you can
be discriminated against for being deemed “poor”
Extra: Paper on Privacy Risks with Facebook’s PII-based Targeting
- The paper shows how an adversary can infer a user's full phone number knowing just their email address
- Also, it can de-anonymize all the visitors to a website by
inferring their phone numbers
- These attacks could be conducted without interacting with the
victim and could not be detected by the victim
References
- https://www.accenture.com/us-en/insight-new-slice-pii-side-digital-trust
- https://www.axios.com/data-privacy-apple-google-cost-18bed43e-5a0f-4568-
b8d0-fbc7978ddde8.html
- https://www.pewinternet.org/2013/06/05/smartphone-ownership-2013/
- https://www.businessinsider.com/apple-iphone-or-ipad-is-the-top-way-of-
knowing-if-youre-rich-or-not-2018-
7?utm_content=bufferb8c8a&utm_medium=social&utm_source=facebook.com&
utm_campaign=buffer-ti
- https://moz.com/blog/apple-vs-android-aov
- https://www.statista.com/statistics/612937/smartphone-average-selling-price-
iphone-and-android/
- https://www.theverge.com/2017/5/17/15654454/android-reaches-2-billion-
monthly-active-users
- https://www.propublica.org/article/health-insurers-are-vacuuming-up-details-
about-you-and-it-could-raise-your-rates
- https://www.creditcards.com/credit-card-news/credit-card-purchase-privacy-
1282.php
References
- https://www.theatlantic.com/technology/archive/2019/01/apples-hypocritical-
defense-data-privacy/581680/
- https://www.ftc.gov/system/files/documents/public_events/1223263/p155407priv
acyconmislove_1.pdf
- https://support.google.com/adsense/answer/6242051?hl=en
- https://www.howtogeek.com/285835/how-to-opt-out-of-personalized-ads-from-
google/
- https://www.wordstream.com/click-through-rate
- https://www.fdic.gov/regulations/examinations/financialprivacy/handbook/index.
html
Current Events - Google
Thousands of Reddit users are trying to delete Google from their lives, but they're finding it impossible because Google is
everywhere - Business Insider 3/23/19
The article talks about how a large group of users on Reddit are trying to break away from Google services, but it is difficult because a majority of those services are ingrained in everyday usage and alternatives are lacking. Services such as Spotify, Pokémon Go, and Uber utilize the company's servers/APIs, and thus are rendered broken if you block access to Google servers. The amount of data being collected harms privacy; as a simple example, the Google Translate app doesn't work on Android without access to your contacts. No matter how hard someone tries, with the amount of reach that Google has it is impossible to stay out of its grasp. -Ahmed Qureshi
https://www.cpomagazine.com/data-privacy/the-new-privacy-threat-airplane-cameras-and-google-nest-microphones/
Passengers aboard a Singapore Airlines flight discovered cameras in the in-flight entertainment (IFE) systems, which raised concerns regarding
the privacy of travelers. The airline claims that the cameras were merely installed by the manufacturer, but were not
actually activated, and that there are no plans for activation in the future. A similar situation occurred with the Google
Nest device, in which a hidden microphone was discovered. The company claimed to have installed it for future use, but
that they had forgotten to mention it in the product spec. -- Ann Bailleul
Android users' security and privacy at risk from shadowy ecosystem of pre-installed software, study warns - TechCrunch 3/25/19
According to a recent study, researchers have found security and privacy concerns in pre-loaded software (e.g.
Facebook app) on Android devices. There is a lack of transparency of what the software is doing and unless you are an
expert Android user, you are completely unaware of personal information tracking and are unable to delete this software.
These could potentially be creating backdoors as information is spread to third party software companies without user
consent or awareness. - Brianna Tu
Current Events - Facebook
Why the Debate Over Privacy Can't Rely on Tech Giants Electronic Frontier Foundation 3/15/2019
Even though tech giants such as Google and Facebook were targeted by users and privacy advocates for privacy
violations and for being negligent, no steps seem to be taken by those companies in terms of privacy enhancements.
There have been many cases such as the Cambridge Analytica case along with many continuous violations of privacy by
Facebook that seem to be intentional. Users had hope when Congress questioned Facebook's CEO and thought that this
might change something; however, despite all these efforts, users should not rely on tech giants when it comes to
privacy. - Faris Almathami
Former Facebook exec: 'Zuckerberg is sitting on more data about what people want to do online than anyone else in the world’
– CNBC 3/27/19 Christina Farr | Salvador Rodriguez
Alex Stamos, who left Facebook in 2018, spoke on stage at Washington Post's technology and policy conference. He
had an explanation of why his former boss, Mark Zuckerberg's, decisions can seem insane at the time, but make sense
with the benefit of hindsight. He cited the acquisitions of private messaging WhatsApp in 2014 for $19 billion, and photo-
sharing service Instagram in 2012 for $1 billion, as examples of bets "that people think are insane but turn out to be
prophetic because he knows the direction the world is going," Stamos said. Both deals have turned out to be highly
valuable for the company. Instagram now boasts more than 1 billion active users per month and is popular among the
younger audiences who are tuning out the core app. WhatsApp has more than 1.5 billion users and will probably form
the basis of the company's new focus on private messaging, which Zuckerberg announced earlier this month. -- Gene
Zakrzewski
Current Events - Apple
Apple wants to be the only tech company you trust - The Verge
Apple is releasing a new credit card and is making privacy a huge focus. It is partnered with Goldman Sachs and clearly
stated that Goldman Sachs will never sell purchase data to third parties. The card is linked with Apple Pay along with
various other Apple applications and all have the same privacy pitch of not selling data to third parties. - Chloe Choe
Apple embracing Privacy as a Selling Point - GeekWire 3/26/2019
Apple had an event on Monday where they revealed multiple services, including a new news subscription service, a
credit card, and more. However, heavy emphasis was made by Tim Cook and each of the presenters about how these
new services were designed “with privacy in mind”. Article explores how companies such as Apple are now utilizing
privacy as a selling point. - Lance Aaron See
Here's The Real Reason Apple Claims To Care About Your Privacy - 03/26/2019 Forbes
An article that discusses why Apple has been emphasizing a lot on privacy since the Cambridge Analytica scandal. Apple
in its recent launch of a couple of products clearly stated that privacy is at the heart of its products and included a
statement in one of the iPhone Ads saying: "what happens on your iPhone, stays on your iPhone". For Apple, this is only
a move to fill a gap in the market that was created after the US increasing concerns about privacy and the
announcement of GDPR. - Abdulla Alshabanah
The most original thing about Apple’s credit card isn’t its app, fees, or laser-etched titanium-Quartz-March 25th 2019
Apple has revealed its partnership with Goldman Sachs to release a credit card within the Apple Wallet app. Less than
50 million people in the US currently use Apple Pay, and the company is seeking to expedite its adoption by selling its
most crucial feature: privacy. It will not track shopping locations, how much was paid, or what was purchased, and it
won't sell data to third parties, creating a unique selling point amidst the tech giants. -Jacqueline Dobbas
Current Events - Europe
EU Council Adopts Protocol for Responding to Major Cyberattacks - Europol 03/18/2019
The Council of the European Union has adopted an EU Law Enforcement Emergency Response Protocol to help EU
member countries better respond to large scale cyberattacks, like NotPetya and WannaCry. The protocol is a tool to
support the EU law enforcement authorities in providing immediate response to major cross-border cyber-attacks
through rapid assessment, the secure and timely sharing of critical information and effective coordination of the
international aspects of their investigations. - Sevanti Nag
EU Parliament Approves Controversial Copyright Law – 3.27.19 infosecurity
This article explains that the EU Parliament just approved a copyright law. The article points to 3 glaring issues:
- Article 13 requires sites to filter uploaded content to make sure it doesn't contain copyright infringement.
- Article 11 requires search engines to pay to feature news on their sites.
- The law is to be interpreted by member states, potentially leading to inconsistencies across the EU.
The biggest concern is that this law (in particular Article 13) can lead us towards a path of internet censorship. -Jairo Hernandez
Current Events – US Government
FEMA Leaked the Data of 2.3 Million Disaster Survivors - Wired
FEMA publicly acknowledged a Homeland Security Department Office of the Inspector General report that the
emergency response agency wrongly shared personal data from 2.3 million disaster survivors with a temporary-housing-
related contractor. In doing so, the agency violated the Privacy Act of 1974 and Department of Homeland Security policy,
and exposed survivors to identity theft. There was no 'hack' here, but the data FEMA should have sent to the contractor
to verify survivors’ eligibility for lodging includes full names, dates of birth, eligibility start and end date, a FEMA
registration number, and the last four digits of survivors’ Social Security numbers. But the report also found that FEMA
additionally shared 20 unnecessary data fields with the contractor, including six that contain particularly sensitive
information, such as survivors' full home addresses, bank names, electronic funds transfer numbers, and bank transit
numbers. - Kavya Sethuraman
US disaster agency exposed private data of 2.3M hurricane and wildfire survivors - The Guardian 03/22/2019
The US disaster relief agency unnecessarily released sensitive identifiable data, including banking information, of 2.3m
disaster survivors to an outside contractor. - Nitya Harve
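The core failure in the FEMA incident was a lack of data minimization: the agency shared 20 fields the contractor never needed. A minimal sketch of the allowlist pattern that prevents this is below; the field names are hypothetical, not FEMA's actual schema.

```python
# Data minimization sketch: before sending a record to an outside party,
# keep only an explicit allowlist of required fields and drop the rest.
# All field names here are invented for illustration.

REQUIRED_FIELDS = {
    "full_name",
    "date_of_birth",
    "eligibility_start",
    "eligibility_end",
    "registration_number",
    "ssn_last4",
}

def minimize(record: dict) -> dict:
    """Return a copy of `record` containing only allowlisted fields."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

survivor = {
    "full_name": "Jane Doe",
    "date_of_birth": "1980-01-01",
    "eligibility_start": "2018-09-01",
    "eligibility_end": "2019-03-01",
    "registration_number": "123456",
    "ssn_last4": "6789",
    # Sensitive extras that should never leave the agency:
    "home_address": "1 Main St",
    "bank_name": "Example Bank",
    "eft_number": "000111222",
}

shared = minimize(survivor)
assert "home_address" not in shared
```

The point of the allowlist (rather than a blocklist of sensitive fields) is that newly added fields are excluded by default, so a schema change cannot silently widen what gets shared.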
Border agency warns of privacy risks in web initiative - KCRA 3/27/19
A Privacy Impact Assessment obtained by Hearst Television National Investigative Unit reveals that the U.S. Customs
and Border Protection (CBP) agency is requesting to expand its sources of intel to include social media — whether the
information is factual or not. This document addresses a CBP Social Media Situational Awareness initiative where the
CBP may collect any publicly made information on social media like an individual's name, username, phone number,
email address, etc., but warns that it may not be able to protect a person's privacy. The assessment also states that CBP
personnel can mask their identities when viewing social media data for OPSEC purposes, a practice that has drawn
backlash from Facebook, which requires real identities on its platform. There is also no opt-out option for the
initiative and no defined process for challenging false information gathered by the CBP. -- Aaron Howland
Current Events – Government
District of Columbia Introduces Legislation on Data Privacy - March 26, 2019
The Security Breach Protection Amendment Act of 2019 was introduced in the District of Columbia. The new legislation would
expand legal protections to cover additional types of personal information, require companies that deal with personal
information to implement safeguards, include additional reporting requirements for companies that suffer a data breach,
and require companies that expose consumers' Social Security numbers to offer two years of free identity theft protection.
– Mindy Huang
FTC Demands Broadband Providers Reveal Data Handling Practices ThreatPost 3/27/19
The FTC is requesting information on how seven broadband companies collect and use consumer data and what they
disclose in their privacy policies. The seven companies are AT&T, AT&T Mobility, Comcast Cable/Xfinity, Google Fiber,
T-Mobile, Verizon Communications, and Cellco Partnership. - Charlene Chen
FTC announces inquiry into the privacy practices of broadband providers 03/26/2019 TheVerge
The FTC has asked internet service providers such as AT&T, Verizon, T-Mobile, Xfinity, and Google Fiber to hand over nonpublic
information describing how they handle consumer data. This includes what kind of data is collected, why it is collected,
whether the data is shared with third parties or de-identified, and the procedures allowing consumers to change or
delete their personal information. -- Anupama
Current Events
Genetic testing firms share your DNA data more than you think - Axios
Genetic testing companies that trace customers' ancestry are amassing huge databases of DNA information, and some
are sharing access with law enforcement, drug makers, and app developers. This matters because at-home DNA testing kits
are soaring in popularity, yet many consumers who take the tests to learn more about their family trees may not realize
how that data is being shared for other purposes. – Haleh Salimi
Family Tree DNA offers to trade privacy to catch criminals – engadget 3/28/19
Family Tree DNA, a company that offers at-home DNA testing, is asking its customers to share their genetic data with law
enforcement agencies such as the FBI to help solve crimes. In a 1979 San Diego murder case, investigators identified the
source of blood at the crime scene by tracing a distant relative of the killer through the GEDmatch DNA database.
Family Tree DNA has regularly allowed the FBI to search its database to solve crimes, which privacy critics and
bioethicists argue violates civil liberties. -- Sophia Choi
5G is speedy, but does it also raise the stakes on privacy, security, potential abuse? | USA Today | March 27, 2019
5G technology is a whole set of interrelated technologies delivered to consumers all at once. These systems need to be
tested for privacy risks prior to deployment: a cyber attacker could intrude at a personal level in ways consumers
have not seen before. The real risk comes not from the network itself but from applications that would not
have been possible before 5G, especially those tied to IoT devices. - Arjun G. Raman
Current Events
HTTPS Isn't Always as Secure as it Seems Wired
An analysis of 10,000 HTTPS sites showed that about 550 of them are vulnerable to known TLS attacks, yet each of
those 550 sites still displays the green lock indicating it is "safe". While exploiting the vulnerabilities may be
difficult and may not yield much, the finding matters because the impression of safety given by the general use of
HTTPS can lead to larger issues, especially given the push toward massive web inter-connectivity. --
Joseph Mehltretter
Hackers Hijacked ASUS Software Updates to Install Backdoors on Thousands of Computers - Motherboard 03/25/19
Researchers at Kaspersky Lab reported that around half a million Windows machines sold by ASUS received a
malicious backdoor through the company's live update servers. The update was signed with legitimate ASUS digital
certificates and was pushed to customers for at least five months before it was discovered. The researchers called
this a supply-chain attack and noted that trust based on a known vendor name or digital signature cannot prevent
such attacks. - Abdullah Altokhais
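Because the malicious ASUS updates carried valid vendor signatures, a signature check alone accepted them. One defense-in-depth idea is to additionally pin the expected hash of each update, published out of band. The sketch below is purely illustrative; the function names and update contents are invented, and this is not how ASUS Live Update actually works.

```python
# Hash-pinning sketch: accept an update only if its SHA-256 digest matches
# a value obtained through a separate channel (e.g., the vendor's website),
# so a compromised update server alone cannot push altered payloads.
import hashlib

def sha256_of(data: bytes) -> str:
    """SHA-256 digest of the payload, as a hex string."""
    return hashlib.sha256(data).hexdigest()

def verify_update(payload: bytes, pinned_hash: str) -> bool:
    """Accept the update only if its digest matches the pinned value."""
    return sha256_of(payload) == pinned_hash

update = b"firmware v1.2 contents"        # hypothetical update payload
pinned = sha256_of(update)                # published out of band by the vendor

assert verify_update(update, pinned)              # genuine update accepted
assert not verify_update(update + b"!", pinned)   # tampered update rejected
```

The limitation, of course, is that an attacker who controls the vendor's build pipeline can also publish a matching pinned hash; pinning only helps when the distribution channel and the hash-publication channel fail independently.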
US Says Chinese Ownership of Grindr is a National Security Risk - Independent 3/27/19
The Committee on Foreign Investment in the United States (CFIUS) was concerned about Kunlun, one of China's largest
mobile gaming companies, owning the dating app Grindr. When Kunlun acquired Grindr, they did not submit the
acquisition for CFIUS review, which might be part of the reason why this deal was blocked. Kunlun is now preparing an
auction process to sell Grindr. - Charlene Chen
Current Events
Toronto’s 'Smart' Neighborhood Sparks Debate - US News 03/15/2019
Sidewalk Labs promised to build Quayside, a smart neighborhood "from the internet up" – incorporating data-collecting
sensors into the neighborhood's infrastructure to gather information, say, on travel patterns to coordinate traffic lights.
But the proposal met opposition from locals who demanded to know who would own the information that is collected,
whether it would be private and who would benefit from its use. - Nitya Harve
Aluminium firm cyber-attack cost at least £25.6m – BBC
A cyber attack on a Norwegian aluminium company has cost it at least £25.6m after the company was hit with malware
last week. The company posted signs on its doors telling employees not to "connect any devices to the Hydro
network and to disconnect any devices from the Hydro network." -- Louis Uuh
Employers Beware: Judge Greenlights Employee’s Privacy Lawsuit Over Dropbox Access - natlawreview.com - 3/28/19
A Western District of Pennsylvania judge partially denied a public employer's motion to dismiss the case. The plaintiff was
forced to resign from her position after nude images were found in her personal Dropbox account, which she also used for
work. She did not access the images from her work computer, but she did keep her Dropbox password in an Excel
spreadsheet co-located with other work passwords on her work computer. -- Dewaine Reddish
Current Events
Can you stop your parents sharing photos of you online? - BBC 3/28/2019
Sharenting, the practice of parents sharing news and pictures of their children online, can make many children feel
uncomfortable. One child notes that we live in a society where all of our pictures need to be flattering, so parents
posting embarrassing pictures can make the child feel betrayed. A study by a professor of
media studies at the University of Tartu in Estonia found discrepancies between what children and parents consider
"nice" photos. Sharenting poses risks: contributing more photos online means tech companies learn
more about the child without the child's participation in the data collection, and it can enable "digital kidnapping," where
strangers take photos of the child and use them for fraudulent or sexual purposes. - Yulie Felice
The Landlord Wants Facial Recognition in Its Rent-Stabilized Buildings. Why? - NYTimes 03/26/2019
This story is about a rent-stabilized apartment complex in Brooklyn. The landlords want to replace the key-fob system with
a facial recognition system for access to the buildings. The fact that the Atlantic complex already has 24-hour
security in its lobbies as well as a functioning camera system has only caused tenants to further question the
necessity of facial recognition technology. The initiative is particularly dubious given the population of the
buildings. Ultimately, a state housing agency will decide whether Nelson Management can install the software.
- Deepti
Putting data privacy in the hands of users – Science Daily – 2/20/19 MIT
Researchers have developed Riverbed, a platform that ensures web and mobile apps using distributed computing in
data centers adhere to users' preferences on how their data are shared and stored in the cloud. In Riverbed, a user's
web browser or smartphone app does not communicate with the cloud directly. Instead, a Riverbed proxy runs on a
user's device to mediate communication. When the service tries to upload user data to a remote service, the proxy tags
the data with a set of permissible uses, called a "policy." -- Gene Zakrzewski
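The policy-tagging idea described above can be sketched in a few lines. Riverbed's real API and policy format are not given in the article; every class, field, and purpose name below is invented for illustration, under the assumption that a compliant service checks the attached policy before each use of the data.

```python
# Sketch of an on-device proxy that tags outgoing uploads with a "policy"
# (a set of permissible uses), in the spirit of the Riverbed design.
# All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class TaggedUpload:
    payload: bytes
    policy: set = field(default_factory=set)  # permissible uses of the data

class Proxy:
    """Sits between the app and the cloud; attaches the user's policy
    to every upload before it leaves the device."""
    def __init__(self, user_policy: set):
        self.user_policy = user_policy

    def upload(self, payload: bytes) -> TaggedUpload:
        return TaggedUpload(payload, set(self.user_policy))

def service_may_use(upload: TaggedUpload, purpose: str) -> bool:
    """A compliant service consults the tag before each use of the data."""
    return purpose in upload.policy

proxy = Proxy(user_policy={"storage", "personalization"})
msg = proxy.upload(b"location history")

assert service_may_use(msg, "storage")
assert not service_may_use(msg, "advertising")
```

Note that tagging alone only expresses the user's preference; enforcement still depends on the data-center side honoring (or being made to honor) the attached policy, which is the harder part of the Riverbed design.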