On Monday, the FTC hosted a public workshop on the topic of big data and discrimination entitled, “Big Data: A Tool for Inclusion or Exclusion?” The first panel, which explored today’s big-data landscape, featured the following speakers from government, industry, and academia: Kristin Amerling, Chief Investigative Counsel and Director of Oversight at the U.S. Senate Commerce Committee; danah boyd, Principal Researcher at Microsoft Research and Research Assistant Professor at New York University; Mallory Duncan, Senior Vice President and General Counsel of the National Retail Federation; Gene Gsell, Senior Vice President for U.S. Retail & CPG at SAS; David Robinson, Principal at Robinson + Yu; and Joseph Turow, Professor at the University of Pennsylvania Annenberg School for Communication.
By Randall Friedland
According to a GAO report published September 16th, Healthcare.gov, the health insurance exchange rolled out last October, still has significant privacy weaknesses. Specifically, the report found that despite the Centers for Medicare & Medicaid Services’ (CMS) efforts to increase the security and privacy of the data it processes, maintains, and shares with federal and commercial partners in support of Healthcare.gov, “weaknesses remain both in the processes used for managing information security and privacy as well as the technical implementation of IT security controls.”
Yesterday, the Article 29 Working Party, the group of European privacy regulators, released a short press release describing the results of its most recent plenary meeting, at which the right to be forgotten was discussed.
The “right to be forgotten” refers to a “new” right that the Court of Justice of the European Union (CJEU) read into the Data Protection Directive (95/46/EC) in the May 2014 case, Google Spain v AEPD and Mario Costeja González (C-131/12). At its heart, the right to be forgotten (RTBF) enables European Union residents to request that search engines take down certain types of search results returned for searches of the requestor’s name. For example, the right enables requests to take down “irrelevant” or out-of-date search results.
Making good on its warnings that mobile apps will be an enforcement priority under the revised Children’s Online Privacy Protection Act (“COPPA”) Rule, the FTC has announced two settlements with mobile app developers:
- TinyCo., the developer of several child-directed mobile apps, will pay $300,000 to settle charges that it violated COPPA by collecting children’s email addresses through its mobile app without sufficient notice and parental consent.
- Yelp, the developer of a general-audience mobile app for user-generated reviews of restaurants and other businesses, will pay $450,000 to settle charges that a technical glitch allowed children under 13 to register without parental notice and consent.
While the settlements are getting a lot of attention in the press, the complaints are perhaps most interesting in that they continue the general trend of FTC enforcement actions addressing clear-cut cases of a company collecting children’s personal information (such as e-mail addresses) without providing parents notice or obtaining parental consent. The FTC’s settlements to date have not included allegations related to online behavioral advertising, social plugins, or similar issues involving the integration of third-party services.
The National Institute of Standards and Technology publishes security risk management standards and guidance that are mandatory for federal agencies but have been influential throughout the private sector. Now, NIST is looking to provide similar guidance on privacy risk management, holding its Second Privacy Engineering Workshop earlier this week to consider draft privacy engineering definitions and concepts.
NIST has said that its work is “focused on providing guidance to developers and designers of information systems that handle personal information,” with the expectation that such guidance “may be used to decrease risks related to privacy harms, and to make purposeful decisions about resource allocation and the effective implementation of controls.” According to the IAPP’s Privacy Advisor, this week’s workshop focused on defining terms, including “privacy engineering” and “problematic data actions,” and a theme that emerged was the difficulty in creating a “black-and-white standards framework” for privacy.
NIST’s security standards focus on the objectives of Confidentiality, Integrity and Availability, and NIST has proposed that its privacy engineering standards similarly build on design objectives, proposing the following three:
- Predictability: enabling reliable assumptions about the rationale for collecting personal information and the data actions to be taken with personal information.
- Manageability: providing the capability for authorized modification of personal information, including alteration, deletion, or selective disclosure of personal information.
- Confidentiality: preserving authorized restrictions on information access and disclosure. (NIST has said it will use the same definition of Confidentiality as NIST Special Publication 800-53 Revision 4.)
The public comment period for the NIST Privacy Engineering Objectives and Risk Model Discussion Draft has been extended until October 10.
In May 2014, the Global Privacy Enforcement Network (“GPEN”) performed its second Global Privacy Sweep, in which 26 privacy enforcement authorities from 19 countries downloaded 1,211 mobile apps and assessed their privacy practices. On September 10, 2014, the Office of the Privacy Commissioner of Canada (“OPC”) published the results of the Sweep (the “OPC Report”). The main findings can be summarized as follows:
- While most apps provided some privacy information, only 15% clearly explained the app’s privacy practices.
- Nearly 60% of the apps tested raised privacy concerns before the app was downloaded—meaning that there was not enough information available prior to download for potential users to adequately assess or review the app’s privacy policies.
- 43% of the apps reviewed did not tailor privacy communications to small screens such as those present on smartphones and tablets.
- 31% of the apps requested access to more information than necessary, based on GPEN’s understanding of the app’s functionality. Of the types of data requested, location was the most popular, followed by device IDs.
“Data is everywhere. The amount of data on the global level is growing by 50 percent annually. 90 [percent] of the world’s data has been generated within the past two years alone,” explains the International Working Group on Data Protection in Telecommunications in its Opinion of May 6, 2014, titled “Working Paper on Big Data and Privacy: Privacy principles under pressure in the age of Big Data analytics.” The Working Group, founded in 1983, has adopted numerous recommendations and, since the beginning of the 1990s, has focused on the protection of privacy on the Internet. Its members include representatives from data protection authorities and other national public administration bodies, international organizations, and scientists from all over the world.
It’s shaping up to be a big data weekend, for those of us who try to find some interesting weekend reading away from the crush of the day-to-day schedule. If you’re thinking about Monday’s FTC workshop on the impact of big-data analytics on vulnerable communities, a bit of weekend reading about the intersection between technology and justice might be just what you need:
- The brilliant Peter Swire, of the Georgia Institute of Technology and the Future of Privacy Forum, has just released a white paper in advance of his participation in the workshop entitled Lessons from Fair Lending Law for Fair Marketing and Big Data. Peter’s analysis, as always, is right on target — as we think about fairness in the big-data sphere, what better place to start than the fair-lending laws that have tried to meet this challenge for decades?
- If you’re looking for a more global approach, you might like Commissioner Julie Brill’s paper from a talk she gave yesterday in Vienna: Privacy in the Age of Omniscience. In addition to quoting the always interesting Dave Eggers (“all that happens will be known”), the short paper covers a surprising amount of territory — including rebranding the “right to be forgotten” as a right of relevancy or obscurity. Commissioner Brill, too, will be at the FTC on Monday to discuss big data.
- If you’re shopping for some new wearable technology this weekend, you might be thinking about the Internet of Things. (Well, it could happen.) Adam Thierer of George Mason University has just released a comprehensive framework for thinking about how to handle privacy on interconnected devices, called The Internet of Things & Wearable Technology: Addressing Privacy & Security Concerns Without Derailing Innovation. No one-size-fits-all solutions here, to Adam’s credit: common sense and a recognition that a variety of options ought to be considered.
If there are interesting and compelling pieces you’re reading or watching about privacy these days, we’d love to know about them. Please tell us on Facebook or Twitter, and we’ll share them through the blog.
Fast-fashion retailer Forever 21 Retail Inc. faces a putative class action lawsuit alleging that the retailer violated California law by requesting and recording shoppers’ credit card numbers and personal identification information at the point of sale.
Forever 21 shopper Tamar Estanboulian filed the lawsuit on September 7 in U.S. District Court for the Central District of California. Estanboulian alleges that Forever 21 has a policy requiring its cashiers to request and record credit card numbers and personal identification information from customers paying by credit card at the point of sale in Forever 21’s retail stores, in violation of the Song-Beverly Credit Card Act of 1971, California Civil Code § 1747.08. The complaint further alleges that the retailer pairs this personal identification information with the shopper’s name, obtained from the credit card used to make the purchase, to acquire additional personal information.
According to the complaint, Estanboulian purchased merchandise with a credit card at a Forever 21 store in Los Angeles, CA this summer. The cashier asked Estanboulian for her email address without informing her of the consequences of not providing the information. Estanboulian alleges that she provided her email address because she believed that it was required to complete the transaction and receive a receipt. She also claims that she witnessed cashiers asking other shoppers for their email addresses. Shortly after completing her purchase and leaving the store, Estanboulian received a promotional email from Forever 21.
On September 10, 2014, European Commission President-elect Jean-Claude Juncker nominated his team of Commissioners. However, there is still a lack of clarity around responsibility for the data protection portfolio, including the General Data Protection Regulation (“GDPR”). It now appears that the portfolio will be coordinated among at least three Commissioners.