On Tuesday, September 30th, California Governor Jerry Brown signed into law 8 bills his office says were designed to “strengthen privacy protections.”
Among the bills is AB 2306, which prohibits attempting to capture an image or sound recording of a person in an offensive manner through the use of any technological device. Among other things, AB 2306 builds on existing California law against privacy invasion to rein in aggressive paparazzi tactics, such as the use of drones to collect images and videos of celebrities in a way that violates their privacy rights.
AB 2306 was authored by Assemblyman Ed Chau (D-Monterey Park), who was concerned that “[a]s technology continues to advance and new robotic-like devices become more affordable for the general public, the possibility of an individual’s privacy being invaded substantially increases.” Critics of AB 2306 argued that laws that provide the necessary privacy protections already exist, such as those dealing with trespassing and harassment. In addition, the photojournalism industry has expressed opposition to AB 2306 because of the bill’s potential application to newsgathering and attendant First Amendment concerns.
Also on Tuesday, Gov. Brown signed into law an expansion of California’s “revenge porn” law. The initial law made it illegal to post nude or sexually explicit pictures of an individual online without that individual’s consent. The law provides for a jail term of up to six months, if convicted. Tuesday’s expansion allows victims themselves to seek a restraining order to remove the posts, and to seek damages in civil court. In addition, while the original law only covered images taken by someone else, the new law now covers “selfies.”
A full list of the bills is available here. And, the full text of the bills is available here.
The second annual study on data breach preparedness was released by the Ponemon Institute on September 24, and the study indicates that the number of companies that have had a data breach is on the rise.
Ponemon Institute conducts independent research on privacy, data protection, and information security policy. For the September 2014 study, Is Your Company Ready for a Big Data Breach?, Ponemon Institute surveyed 567 U.S. executives from organizations ranging in size from less than 500 to more than 75,000 employees about how prepared they think their companies are to respond to a data breach.
It appears that for an overwhelming number of the study’s participants, the answer to “Is your company ready for a big data breach?” is, unfortunately, “No.”
Meena Harris, a member of Covington’s Global Privacy and Data Security Practice Group, spoke with LXBN TV about the National Labor Relations Board’s recent ruling that two employees of a sports bar and restaurant were unlawfully discharged for their participation in a Facebook discussion criticizing their employer. You can view the interview here.
The Article 29 Data Protection Working Party (“Working Party”), the independent European advisory body on data protection and privacy, comprised of representatives of the data protection authorities of each of the EU member states, the European Data Protection Supervisor (the “EDPS”) and the European Commission, has identified a number of significant data protection challenges related to the Internet of Things. Its recent Opinion 08/2014 on the Recent Developments on the Internet of Things (the “Opinion”), adopted on September 16, 2014 provides guidance on how the EU legal framework should be applied in this context. The Opinion complements earlier guidance on apps on smart devices (see InsidePrivacy, EU Data Protection Working Party Sets Out App Privacy Recommendations, March 15, 2013).
After a particularly long work week, curling up with a law-review article can seem a little daunting for weekend reading. So for this weekend, I’ve been saving up some really promising magazine articles — short, concise, entertaining, and full of terrific information about privacy. Here are a few ideas that might make for bite-size reading on a nice autumn afternoon:
This week, the Government Accountability Office (“GAO”) released a report recommending eleven actions the Consumer Financial Protection Bureau (“CFPB”) should take to enhance the privacy and security of its ongoing data collections. The report also provides a detailed look at the increasingly large volume of information that CFPB collects, and how the agency’s data collection practices compare to those of other regulators.
To carry out its role in overseeing financial institutions and issuing reports on consumer financial issues, the CFPB began large-scale collection of financial data in January 2012. The agency uses that data to inform rulemaking, create statutorily-required studies, determine where to allocate supervisory resources, and understand the markets it oversees, according to the GAO report. Between January 2012 and July 2014, CFPB undertook 12 large-scale data collection efforts, spanning products including mortgages, student loans, and credit cards.
The large-scale ongoing collections include:
- Automobile sales records on 700,000 vehicles, obtained on a monthly basis to monitor car sales volume and financing.
- Consumer credit report information on 10.7 million consumers, co-signers, and co-borrowers, obtained monthly to analyze changes in consumer behavior relating to debt.
- Credit card information on 25 to 75 million accounts, obtained monthly to identify risks in the credit card market.
- Mortgage information on 29 million active loans and 173 million total loans, obtained monthly to monitor emerging trends in the mortgage market.
- Private-label mortgage information on 4 million active loans and 21.9 million total loans, obtained monthly to monitor emerging trends in the mortgage market.
A recent statement from the Article 29 Working Party, the independent European advisory body on data protection and privacy, comprised of representatives of the national data protection authorities of the EU Member States, the European Data Protection Supervisor and the European Commission, finds that the EU data protection principles, outlined in the EU Data Protection Directive 95/46/EC, are still valid and appropriate for the development and use of big data analysis.
The statement responded to recent calls by stakeholders that certain data protection principles under EU law should be “substantially reviewed” to enable promising developments in big data operations. The Article 29 Working Party Statement, adopted on September 16, 2014, acknowledged that the challenges presented by big data may require “innovative thinking” about how to apply key data protection principles, but maintained that the protection of personal data remains fundamental to building trust between companies and consumers.
This summer, the International Organization for Standardization (ISO) adopted a new voluntary standard governing the processing of personal data in the cloud — ISO 27018. Although this recent development has gone mostly unnoticed by the technology and media press to date, the new cloud standard provides a useful privacy compliance framework for cloud services providers that addresses key processor (and some controller) obligations under EU data protection laws.
ISO 27018 builds on existing information security standards, such as ISO 27001 and ISO 27002, which set out general information security principles (e.g., securing offices and facilities, media handling, human resources security, etc.). By contrast, ISO 27018 is tailored to cloud services specifically and is the first privacy-specific international standard for the cloud. ISO 27018 seeks to address such issues as keeping customer information confidential and secure and preventing personal information from being processed for secondary purposes (e.g., advertising or data analytics) without the customer’s approval. ISO 27018 also responds directly to EU regulators’ calls for the introduction of an auditable compliance framework for cloud processors to increase trust in the online environment (see the European Commission’s 2012 Cloud Strategy here).
On Monday, the FTC hosted a public workshop on the topic of big data and discrimination entitled, “Big Data: A Tool for Inclusion or Exclusion?” The first panel, which explored today’s big-data landscape, featured the following speakers from government, industry, and academia: Kristin Amerling, Chief Investigative Counsel and Director of Oversight at the U.S. Senate Commerce Committee; danah boyd, Principal Researcher at Microsoft Research and Research Assistant Professor at New York University; Mallory Duncan, Senior Vice President and General Counsel of the National Retail Federation; Gene Gsell, Senior Vice President for U.S. Retail & CPG at SAS; David Robinson, Principal at Robinson + Yu; and Joseph Turow, Professor at the University of Pennsylvania Annenberg School for Communication.
By Randall Friedland
According to a GAO report published September 16th, Healthcare.gov, the health insurance exchange rolled out last October, still has significant privacy weaknesses. Specifically, the report found that despite the Centers for Medicare & Medicaid Services’ (CMS) efforts to increase the security and privacy of the data it processes, maintains, and shares with both federal and commercial partners in support of Healthcare.gov, “weaknesses remain both in the processes used for managing information security and privacy as well as the technical implementation of IT security controls.”