By Monika Kuschewsky and Sebastian Martin
The German Federal Ministry of the Interior recently published its revamped proposal for an “IT Security” Law. A similar proposal had already been adopted by the previous German Government in March 2013 (see InsidePrivacy, German Government Proposes Cybersecurity Law, March 22, 2013). However, that proposal ultimately failed to clear the legislative process before the federal elections in autumn 2013. The new proposal is based on the 2013 proposal, but certain changes have been made to address some of the concerns voiced regarding the previous proposal.
The proposed IT Security Law pursues an objective similar to that of the proposed EU Directive on Network and Information Security (“NIS”) (see InsidePrivacy, European Parliament Votes to Ensure that the Proposed Network and Information Security Directive Focuses on Protecting Critical Infrastructure, March 15, 2014). In particular, the rules on security requirements and security-incident notifications in the two proposals are generally aligned. Nonetheless, the reporting obligations under the proposed IT Security Law go further than those in the NIS Directive, and the German proposal contains additional obligations, in particular for telecommunications providers and providers of commercial information society services (Telemedien). The German Government has made clear that the proposed IT Security Law will serve as a guideline for its position in the Council (which represents the EU Member States’ governments) in the pending negotiations on the NIS Directive at EU level.
A federal court opinion released this week is a reminder that Michigan’s Video Rental Privacy Act (VRPA) may apply to far more than just videos.
The Michigan VRPA restricts the disclosure of customers’ personal information by companies “engaged in the business of selling at retail, renting, or lending books or other written materials, sound recordings, or video recordings.”
Plaintiff Deborah Kinder filed a putative class action lawsuit against Meredith Corp., alleging that the magazine publisher violated the VRPA by disclosing her personal information, along with information about the magazines to which she subscribed, without her consent.
Last Friday, the National Labor Relations Board (“NLRB”) ruled that two employees of a sports bar and restaurant were unlawfully discharged for their participation in a Facebook discussion criticizing their employer. In the Facebook discussion that prompted the firings, a former employee complained in a status update that she owed more taxes than expected because of withholding mistakes by the employer. One current employee commented on the status, “I owe too. Such an asshole,” and was discharged. A second employee, who “liked” the former employee’s status, was discharged as well.
Section 7 of the National Labor Relations Act provides, in relevant part, “Employees shall have the right to self-organization, to form, join, or assist labor organizations, to bargain collectively through representatives of their own choosing, and to engage in other concerted activities for the purpose of collective bargaining or other mutual aid or protection . . . .” At issue in this case was not whether the employees’ Facebook activity was “concerted” or whether the employees had a statutorily protected right to engage in a Facebook discussion about the employer’s tax-withholding practices. Rather, the case centered on whether, as a result of their actions on Facebook, the two employees adopted the allegedly defamatory and disparaging statements contained in the former employee’s Facebook status and therefore lost the protection of the Act.
Last week, the Online Interest-Based Advertising Accountability Program released a compliance warning to clarify that its Self-Regulatory Principles for Online Behavioral Advertising (OBA Principles) apply not just to traditional HTTP cookies, but also to other types of tracking technologies that enable the tracking of consumers across different platforms and devices.
The compliance warning admonished companies developing and implementing new cross-platform and cross-device tracking technologies for OBA to provide effective enhanced notice and an OBA opt-out mechanism. As the compliance warning explained, “[a]s new ‘cookie-less’ technologies increasingly replace the more familiar ‘cookies’ in the delivery of personalized advertising across multiple screens, consumers must continue to receive real-time ‘enhanced’ notice and an easy-to-use and effective opt-out mechanism.”
The compliance warning also cautioned web publishers to be aware of the types of technologies employed by the third parties that collect data for OBA on their websites.
“As with collection via HTTP cookies, when website publishers permit third parties to collect data for OBA using alternative identification technologies, they bear responsibility to provide enhanced notice on every page where that collection takes place and a disclosure of OBA practices that includes a compliant opt-out link that will work effectively with these cookie-less technologies.”
The compliance warning also notes that the Federal Trade Commission and the Network Advertising Initiative have taken recent enforcement actions in connection with the use of alternative tracking technologies. It cited the FTC’s 2011 settlement with ScanScout, Inc. relating to the use of Flash cookies, and recent NAI and FTC actions involving Epic Marketplace, Inc. in connection with its use of a “history-sniffing” script to collect data on users’ web-browsing habits.
The Federal Trade Commission (“FTC”) has approved final orders settling charges against Fandango and Credit Karma that the companies misrepresented the security of their mobile apps and failed to protect the transmission of consumers’ sensitive personal information. The FTC specifically alleged that, although the companies promised consumers that their information was adequately stored and transmitted, both failed to reasonably secure their mobile apps, leaving personal data such as credit-card information and Social Security numbers at risk of interception by third parties. In particular, among other claims, the FTC charged the companies with disabling Secure Sockets Layer (“SSL”) certificate validation, a default security process that verifies an app is communicating with the genuine server, thereby preventing an attacker from intercepting data sent or received.
The FTC alleged that these vulnerabilities easily could have been tested for and prevented; however, each company failed to perform basic security reviews, including establishing an auditing process to oversee and examine security practices and vulnerability reports. The settlements therefore require that Fandango and Credit Karma establish comprehensive security programs that address risks during the design and development stages of their apps. Fandango and Credit Karma must also agree to independent security evaluations every other year for the next 20 years.
More information on Fandango and Credit Karma’s respective settlements with the FTC can be found here and here.
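The failure at the heart of these complaints, switching off certificate validation, can be illustrated with a minimal Python sketch (not the apps’ actual code, which used other platforms): the standard library’s ssl module shows what a properly validating TLS client context checks by default, and what an app gives up when validation is disabled.

```python
import ssl

# By default, a client-side TLS context verifies the server's certificate
# chain and its hostname -- the protections the FTC alleged were disabled.
secure_ctx = ssl.create_default_context()
print(secure_ctx.verify_mode == ssl.CERT_REQUIRED)  # certificate is checked
print(secure_ctx.check_hostname)                    # hostname is checked

# Disabling validation means the client accepts any certificate, so a
# man-in-the-middle can present its own and read or alter the "encrypted"
# traffic undetected. (check_hostname must be cleared first, or Python
# refuses to set CERT_NONE.)
insecure_ctx = ssl.create_default_context()
insecure_ctx.check_hostname = False
insecure_ctx.verify_mode = ssl.CERT_NONE
```

The point is that the encryption itself may still be on in both cases; without validation, however, the client cannot know who it is encrypting to, which is why the FTC treated the practice as leaving transmitted data exposed.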
On Tuesday, August 12, 2014, Judge Lucy Koh of the Northern District of California issued an order granting in part and denying in part Yahoo’s motion to dismiss claims that it violated federal and California anti-wiretapping laws.
The putative class action, In re Yahoo Mail Litig., alleges that Yahoo’s practice of intercepting, scanning, analyzing, collecting, and storing information contained in emails between Yahoo Mail and non-Yahoo Mail users violated the Federal Wiretap Act, the Stored Communications Act, California’s Invasion of Privacy Act, and the California Constitution. For these violations, the putative class of non-Yahoo Mail users seeks injunctive relief, declaratory relief, statutory damages, and disgorgement of Yahoo’s revenues related to the alleged practices.
By Eric Carlson and Scott Livingston
On Friday, August 8, 2014, a Chinese court convicted British fraud investigator Peter Humphrey and his wife, Yu Yingzeng, a naturalized US citizen, of illegally obtaining personal information. Mr. Humphrey was sentenced to two and a half years in prison and fined RMB 200,000 (about US $32,000); Ms. Yu was sentenced to two years in prison and fined RMB 150,000 (about US $24,000). For more information on the original arrests and Mr. Humphrey’s subsequent confession on state-owned TV, please see our earlier blog post here.
The husband-and-wife team ran a China-based consulting firm, ChinaWhys Co., that specialized in providing risk-advisory services to multinational companies doing business in China. Under China’s Criminal Law, companies and individuals are subject to criminal penalties for illegally selling or obtaining the personal information of others where the violation is “serious.” Prosecutors alleged that the couple violated this prohibition by illegally obtaining 256 personal-information records, including hukou (city residential permit) information, family information, and travel and phone records. According to prosecutors, ChinaWhys purchased this information for RMB 800 to RMB 2,000 (about US $130 to $325) per record and used it in background-investigation reports prepared for ChinaWhys’ clients.
Last Friday, the FTC announced an agenda for its upcoming workshop, “Big Data: A Tool for Inclusion or Exclusion?,” which will take place on Monday, Sept. 15, starting at 8:00 a.m. As we’ve previously reported, the workshop will build on recent efforts by the FTC and other government agencies to understand how new technologies affect the economy, government, and society, and their implications for individual privacy. In particular, while there has been much recognition of the value of big data in revolutionizing consumer services and generally enabling “non‐obvious, unexpectedly powerful uses” of information, there has been a parallel focus on the extent to which practices and outcomes facilitated by big-data analytics could have discriminatory effects on protected communities.
The workshop will explore the use of big data and its impact on consumers, including low-income and underserved consumers, and will host the following panel discussions:
- Assessing the Current Environment. Examine current uses of big data in various contexts and how these uses impact consumers.
- What’s on the Horizon with Big Data? Explore potential uses of big data and possible benefits and harms for particular populations of consumers.
- Surveying the Legal Landscape. Review anti-discrimination and consumer-protection laws and discuss how they may apply to the use of big data, and whether there may be gaps in the law.
- Mapping the Path Forward. Consider best practices for the use of big data to protect consumers.
The FTC hopes that the workshop will build on the dialogue from its Spring Privacy Seminar Series, held from February through May, which addressed mobile-device tracking, data brokers and predictive scoring, and consumer-generated and consumer-controlled health data. The workshop will convene academic experts, business representatives, industry leaders, and consumer advocates, and will be open to the general public. In advance of the workshop, the FTC has invited the public to file comments, reports, and original research on the proposed topics. The deadline to submit pre-workshop comments is August 15. Following the workshop on September 15, the comment period will remain open until October 15.
The workshop comes on the heels of the White House’s much-anticipated report on big data, released in May, which outlined the administration’s priorities in protecting privacy and data security in an era of big data. With an entire section dedicated to “Big Data and Discrimination,” the report warned that big data “could enable new forms of discrimination and predatory practices.” Chiefly focusing on the use of information, the report expressed concern about using data to discriminate against vulnerable groups. Specifically, the report stated that “the ability to segment the population and to stratify consumer experiences so seamlessly as to be almost undetectable demands greater review, especially when it comes to the practice of differential pricing and other potentially discriminatory practices.”
By Jacqueline Clover and Monika Kuschewsky
The Court of Justice of the European Union (‘CJEU’) has ruled that an analysis produced by an administrative agency to inform and support the agency’s formal decisions (‘legal analysis’) is not of itself “personal data” as defined under Directive 95/46/EC (the ‘EU Data Protection Directive’). This is the case even where the legal analysis contains information that is clearly “personal data”, such as an individual’s name, date of birth, nationality and gender. The ruling of 17 July 2014 in Joined Cases C-141/12 and C-372/12 YS v. Minister voor Immigratie, Integratie en Asiel, and Minister voor Immigratie, Integratie en Asiel v. M, S, is available here.
It is an important decision for two reasons. First, it clarifies the boundaries of what constitutes “personal data” under EU law. And, second, it clarifies that a data subject’s right of access under the EU Data Protection Directive does not necessarily require access to the actual records containing personal data. In some cases, a full summary of the personal data in an intelligible form suffices.
Today, the Federal Trade Commission (“FTC”) issued a staff report examining the consumer-protection implications of popular shopping apps. These services are intended to ease and enhance the shopping experience by allowing consumers to, for example, compare in-store prices across retailers, collect and redeem deals, or pay for purchases while shopping in brick-and-mortar stores. The FTC was specifically interested in what information these services make available to consumers before an app is downloaded onto a mobile device, such as how the apps handle payment-related disputes like unauthorized transactions and billing errors, or how they treat consumers’ personal and purchase data. The report, which surveyed a total of 121 shopping apps across the Google Play and Apple App stores, concluded that apps frequently failed to provide vital pre-download information. For instance, of the 45 in-store purchase apps studied, which enable consumers to use their phones to pay for goods purchased in physical stores, few provided information explaining consumer liability or the process for handling payment-related disputes. Regarding the collection and use of consumer data, the privacy policies for most apps used vague and confusing language that gave companies expansive authority to collect, use, and share personal data. On the basis of these findings, the report offered the following recommendations to app developers and consumers.
- Disclose consumers’ rights and liability limits for unauthorized, fraudulent, or erroneous transactions. Before committing to use one of these services, consumers should be able to know what their potential liability is for unauthorized transactions, what protections are available based on method of payment, and whether procedures are available for resolving disputes.
- Clearly describe how consumer data is collected, used, and shared. Detailed explanations in plain language help consumers evaluate and compare the data practices of different services in order to make informed decisions about the apps they choose to install.
- Ensure that strong data-security promises mean strong data-security practices. Especially in light of technological advances for smartphones that offer increased security, consumers should receive enhanced protections for the data collected. Any commitments made regarding data security must be honored.
- Look for apps’ dispute-resolution procedures and liability limits, and consider the payment methods used to make purchases. Because federal law limits consumer liability for unauthorized transactions made with credit or debit cards, but imposes no such limits for prepaid cards or accounts with a pre-loaded balance, consumers may not have recourse for unauthorized charges made with prepaid funds.
- Seek information before downloading apps about how data is collected, used, and shared. Consumers should make informed decisions about installing apps and evaluate an app’s data practices before using a particular service.