The last two weeks have brought two important decisions in the ongoing litigation over behavioral advertising firm NebuAd’s alleged use of a device to intercept data from ISP networks. Several ISPs allegedly permitted NebuAd to install an “appliance” on their networks in order to collect and analyze subscriber data for ad targeting purposes. In lawsuits filed beginning in 2008, plaintiffs have alleged that NebuAd--and the ISPs with which it allegedly partnered--violated Title I of the Electronic Communications Privacy Act (i.e., the Wiretap Act) as well as other federal and state laws. Plaintiffs have sued the ISPs in separate suits around the country. Two of these suits--against ISPs Embarq and WideOpen West (“WOW”)--yielded decisions in favor of the ISPs last week.
On Thursday, the Federal Trade Commission (“FTC”) hosted a workshop to explore the practices and privacy implications of comprehensive data collection. The event gathered consumer protection groups, academics, privacy professionals, and business and industry representatives to examine the current state of comprehensive data collection, its risks and potential benefits, and what the future holds for consumers and their choices.
In her opening remarks, FTC Commissioner Julie Brill indicated the agency was open to revising its consumer privacy framework if comprehensive data collection warranted heightened restrictions or enhanced consent to protect and inform users: “We know that comprehensive data collection allows for greater personalization and other benefits, but there may be other contexts in which it does not lead to desirable results.”
The workshop was one of five main action items adopted by the FTC as part of its March 2012 report, Protecting Consumer Privacy in an Era of Rapid Change. In the report, the commission told companies that consent was not required for the collection and use of information that was consistent with a particular transaction or the company's relationship with the consumer. But the agency said it needed more information to determine how this principle applied to technologies that could capture large amounts of consumer information, such as deep packet inspection (DPI).
On Thursday, November 15, 2012, Judge Robert S. Lasnik of the Western District of Washington dismissed Del Vecchio v. Amazon, stating that the parties had reached a settlement, the details of which were not disclosed. The suit had alleged (among other things) that Amazon used Flash cookies to back up and “respawn” browser cookies that plaintiffs had deleted, and thereby “circumvented” plaintiffs’ browser privacy controls. The complaint (which was amended several times) included claims under the federal Computer Fraud and Abuse Act and the Washington Consumer Protection Act, as well as several common law claims.
Prior to the settlement, Amazon had filed three separate motions to dismiss, and succeeded twice in getting major claims tossed out. Amazon filed its initial motion to dismiss in May 2011, but eventually withdrew it after the court granted a request by plaintiffs to amend their complaint for the first time. The court later granted Amazon’s motion to dismiss the first amended complaint in its entirety, citing plaintiffs’ failure to “establish any plausible harm.” In June 2012, the key claims in the second amended complaint also were dismissed based on similar reasoning.
The fact that this settlement was limited to the individual plaintiffs suggests that Amazon’s strategy of vigorously defending the litigation brought it more success than some defendants in other “Flash cookie” lawsuits (such as Quantcast and Clearspring), who agreed to more sizeable class settlements early in their litigations.
A U.S. district court has approved the Federal Trade Commission's $22.5 million settlement with Google. The FTC had charged that Google misrepresented to users of Apple's Safari browser that it would not place tracking cookies or serve targeted ads to those users, violating an earlier privacy settlement between the company and the FTC.
The settlement is the largest FTC penalty ever for violation of a Commission order.
A Web analytics company recently settled FTC charges that it deceptively collected consumers’ personal information.
The Federal Trade Commission has announced that it will host a public workshop on December 6 to discuss the privacy issues raised by the collection of data about consumers’ online activities by so-called large platform providers. According to the scheduling notice, the FTC seeks to explore the potential privacy issues raised by the collection of data by Internet Service Providers, operating systems, browsers, and social media. Specifically, the FTC seeks to examine such questions as:
- What methods are used to collect data about consumers’ activities across the Internet?
- What are the benefits of comprehensive data collection and what are the possible privacy challenges?
- Which entities are capable of comprehensive data collection, and which of them are doing so?
- How aware are consumers of comprehensive data collection, and what are their attitudes toward it?
- If companies implement comprehensive data collection, how can they effectively inform consumers about the collection and provide meaningful choice to consumers?
- What privacy risks are created by serving as a host for third-party applications?
- Are there sufficient choices among online products and services to give consumers meaningful options should they wish to avoid products or services that use comprehensive data collection?
- What legal protections currently exist in this area?
- What legal protections should be provided?
The scheduled workshop is one of five main action items adopted by the FTC as part of its March 2012 report, Protecting Consumer Privacy in an Era of Rapid Change. The other action items pertain to the implementation of Do Not Track, mobile privacy issues, data broker privacy issues, and promoting enforceable self-regulatory codes.
On Monday, the Online Interest-Based Advertising Accountability Program, which monitors compliance with the Self-Regulatory Principles for Online Behavioral Advertising, issued a decision finding that the auto company Kia had failed to adhere to the Principles. The Accountability Program also issued decisions stating that Kia’s ad agency--and the ad network the agency had used to place Kia’s ads--also had failed to comply with the Principles.
The decisions state that representatives of the Accountability Program visited Kia’s website and later were shown Kia ads on non-affiliated sites, suggesting that the ads had been targeted based on the representatives having visited kia.com. (In other words, the representatives appear to have been “retargeted.”) The targeted ads, however, were not accompanied by “enhanced notice,” which the decisions describe as “a clear, meaningful, and prominent link” (such as the widely used AdChoices Icon) that “directs the consumer to information about the [advertiser’s or other third party’s] OBA data collection and use practices and an opportunity to exercise choice” with respect to those practices. This, the decisions assert, violated the Principles’ Transparency requirement.
Although not the first decision issued by the Accountability Program, the Kia decision may be the most significant. It is the first decision issued against a major advertiser; previous cases had focused on ad networks, DSPs, and data management companies. The case also marked the first time the Accountability Program has taken action against multiple companies involved in the same ad campaign. The Program’s decision to do so underscores the Digital Advertising Alliance’s position that the Self-Regulatory Principles apply to actors across the advertising ecosystem. Finally, it is noteworthy that Kia apparently had not represented (on its website or elsewhere) that it would comply with the Principles. That the Accountability Program nonetheless chose to take action against the company shows that any company that engages in OBA (as that term is defined in the Principles) may be monitored for compliance with the Principles.
FTC Approves $22.5 Million Consent Decree to Settle Charges that Google Bypassed Safari Users' Privacy Settings
Today the Federal Trade Commission announced its approval of a consent decree to settle charges that Google misrepresented to users of Apple's Safari browser that it would not place tracking “cookies” or serve targeted ads to those users, violating an earlier privacy settlement between the company and the FTC. The decree requires Google to pay a civil penalty of $22.5 million and to disable all of the tracking cookies it had said it would not place on Safari users' computers. The FTC states that this settlement "is the largest FTC penalty ever for violation of a Commission order."
Google's practices related to Safari first became subject to public scrutiny after a February 2012 Wall Street Journal article alleged that Google and other advertising companies installed special code onto users' computers that tricked Safari into allowing the companies to track users' web-browsing habits. Safari is designed to block tracking by default. In its complaint, the FTC charged that for several months in 2011 and 2012, Google placed cookies on the computers of Safari users who visited sites within Google's DoubleClick advertising network, despite previous language on Google's Help Center website telling those users they would automatically be opted out of such tracking. Under the settlement, through February 15, 2014, Google must maintain systems configured to instruct Safari browsers to expire any DoubleClick cookie placed by Google on or before February 15, 2012.
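As general background on the mechanism the settlement invokes: a server "expires" a cookie it previously set by re-sending a Set-Cookie header for the same name with a past expiration date, which instructs the browser to discard it. The sketch below, using Python's standard `http.cookies` module, illustrates that technique; the cookie name "id" and the domain are hypothetical stand-ins, not the actual DoubleClick cookie attributes.

```python
from http.cookies import SimpleCookie

def expire_cookie_header(name: str, domain: str) -> str:
    # Re-send the cookie with an empty value and an expiration date in the
    # past; a compliant browser deletes its stored copy on receipt.
    cookie = SimpleCookie()
    cookie[name] = ""
    cookie[name]["domain"] = domain
    cookie[name]["path"] = "/"
    cookie[name]["expires"] = "Thu, 01 Jan 1970 00:00:00 GMT"  # already past
    return cookie[name].OutputString()

# The server would emit this as: "Set-Cookie: " + header
header = expire_cookie_header("id", ".example.com")
print("Set-Cookie:", header)
```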
Yesterday, Microsoft announced that users of Windows 8 and Internet Explorer 10 will have a “first run” option to disable the default “Do Not Track” privacy setting. A first run option occurs during the software set-up process. If users take no action, the DNT setting will be enabled by default.
Since the Federal Trade Commission first began calling for the creation of DNT mechanisms in a December 2010 preliminary staff report, Protecting Consumer Privacy in an Era of Rapid Change, Microsoft and other browser providers have announced a number of different DNT solutions. The FTC’s March 2012 report on consumer privacy praised the efforts of browser vendors and other industry groups to develop DNT mechanisms. The FTC has said that it will continue to work with industry groups to complete implementation of a DNT system that is universal, easy to use, persistent, enforceable, and that allows consumers to opt out of the collection of behavioral data for all purposes (other than expected contextual uses).
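For context on how the signal works at the protocol level: a browser with DNT enabled sends the request header "DNT: 1", and a participating server branches on its presence. The handler below is a minimal, hypothetical sketch of honoring that signal, not any browser's or regulator's actual implementation.

```python
def tracking_allowed(headers: dict) -> bool:
    """Return False when the client has sent the Do Not Track signal.

    A browser with DNT enabled adds the request header "DNT: 1"; absence of
    the header means the user has expressed no preference.
    """
    return headers.get("DNT") != "1"

print(tracking_allowed({"DNT": "1"}))  # → False: honor the opt-out
print(tracking_allowed({}))            # → True: no preference expressed
```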
The House Judiciary Subcommittee on Intellectual Property, Competition, and the Internet recently held a hearing entitled “New Technologies and Innovations in the Mobile and Online Space, and the Implications for Public Policy." Much of the discussion focused on the relative merits of self-regulation versus the enactment of comprehensive federal privacy legislation. (Separately, the Senate Commerce Committee has announced that it will hold a hearing on the adequacy of self-regulation in protecting consumer privacy on June 28.)
In his opening remarks, Rep. Melvin Watt (D-NC) discussed the need for “baseline privacy legislation that will provide certainty to both consumers and companies, and promote a healthy online economy.” Rep. Watt appeared to support the White House framework of enacting comprehensive federal privacy legislation that would be complemented by industry codes of conduct. Emphasizing the importance of legislation, Watt surmised that, “without a baseline set of principles with the force of law, privacy policies may be used by larger players in an anti-competitive manner to drive smaller players from the market.”
Yesterday, the FTC held a public workshop titled “In Short: Advertising & Privacy Disclosures in a Digital World.” The workshop explored whether and how the FTC should revise its 2000 guidance concerning advertising and privacy disclosures in the new era of online and mobile technology.
This post will highlight the morning workshop sessions on usability research, cross-platform advertising disclosures, and social media advertising disclosures. A second post will recap the afternoon’s discussions on mobile advertising and privacy disclosures.
The FTC has decided not to pursue an enforcement action against Clearwater Aquarium for alleged violations of the Children's Online Privacy Protection ("COPPA") Rule.
After reviewing the website, the FTC concluded "that the information collection practices that had triggered CARU's inquiry had been remedied." The FTC declined to take any further action, instead referring the matter back to CARU.
CARU, a division of the Council of Better Business Bureaus, is a self-regulatory body that monitors websites for compliance with COPPA. Although CARU's self-regulatory program is completely voluntary, CARU may refer cases to the FTC if companies refuse to respond to inquiry letters. The FTC reviews CARU's case referrals to determine whether enforcement action is appropriate. Although the FTC has initiated enforcement actions in response to CARU referrals in the past, the Clearwater Aquarium case is a reminder that the FTC may decide no further action is necessary.
The Digital Advertising Alliance’s Self-Regulatory Program for Online Behavioral Advertising continues to gather steam. Last month, after the Program garnered favorable mention in the FTC’s final privacy report, a representative of the Interactive Advertising Bureau (one of the DAA’s participating organizations) announced that the Program’s Advertising Option Icon is now being served in more than one trillion online ads per month.
An announcement yesterday by the IAB suggests another milestone for the Program may be on the horizon: expansion into online streaming video. The IAB revealed that its new suite of technical specifications and protocols for the serving of in-stream ads will enable the Icon to be served in or around such ads, allowing entities that collect behavioral data from video viewers to meet any obligations they may have under the DAA’s transparency and consumer control principles.
Over the last few weeks, a number of cosponsors have been added to the Do Not Track Kids Act of 2011 (H.R. 1895), bringing the total number of cosponsors to 29. The bill was introduced by Rep. Markey and Rep. Barton on May 13, 2011. Earlier this month, the two members also hosted a Congressional briefing to discuss how to protect children and teens online.
As we blogged about here, the bill would expand the Children’s Online Privacy Protection Act ("COPPA"). In addition, the bill would introduce new privacy protections for minors under the age of 18, including a prohibition on the use of personal information for targeted marketing to minors and a requirement that operators of websites and online services provide "eraser buttons" that enable the deletion of personal information shared publicly by minors.
We will continue to monitor this legislation as these two senior, bipartisan members of the Committee press for a mark-up of their bill.
Last week, Judge Ware of the Northern District of California denied a motion to amend his November 2011 dismissal, with prejudice, in In re Facebook Privacy Litigation, a case in which plaintiffs had argued that Facebook improperly transmitted users’ personal information, including User ID numbers or usernames, to third party advertisers.
In his most recent Order, Judge Ware reaffirmed his prior holding that plaintiffs had not stated a claim under the Stored Communications Act (“SCA”) based on an exception to the statute that allows a service provider to divulge the contents of a communication to, or with the lawful consent of, “an addressee or intended recipient” of the communication.
The Network Advertising Initiative ("NAI"), a coalition of more than 80 online advertising companies committed to self-regulation, released a report this week finding that there is a high degree of compliance with the NAI's Self-Regulatory Code of Conduct, which governs the use of consumer data for purposes of online behavioral advertising. In particular, the report concludes that NAI's member companies are complying with the Code's restrictions on using sensitive data for purposes of online behavioral advertising and prohibitions on the use of data for secondary purposes, including to make insurance or employment decisions. In addition, member companies are not specifically targeting children under the age of 13.
FTC Report Calls For More Notice Involving Mobile Apps Directed To Kids, Warns Enforcement Could Come Over Next Six Months
The FTC staff released a report today calling for participants in the mobile app ecosystem -- including app developers, app stores, and third parties who collect data through mobile apps -- to provide better privacy notices to parents about mobile apps directed to children, and warning that over the next six months, staff will be conducting additional reviews "to determine whether there are COPPA violations and whether enforcement is appropriate."
The report is based on the staff's survey of apps offered in the Android Market and the Apple App store. Staff focused on "the types of apps offered to children; the age range of the intended audience; the disclosures provided to users about the apps’ data collection and sharing practices; the availability of interactive features, such as connecting with social media; and the app store ratings and parental controls offered for these systems."
Notably, the report stated that the FTC expects the whole app ecosystem to "play an active role in providing key information to parents who download apps." Specifically, the report outlined the following:
- App developers should provide parents information about (1) what information an app collects, (2) how the information will be used, and (3) with whom the information will be shared, using short disclosures or icons that are easy to find and understand on the small screen of a mobile device. App developers also should alert parents if the app connects with social media, or allows targeted advertising to occur through the app.
- Third parties that collect information through apps should disclose their privacy practices, whether through a link on the app promotion page or another easily accessible method.
- App stores should provide a more consistent way for developers to display information regarding their app’s data collection practices and interactive features. The FTC stated, for example, that app stores could provide a designated space for developers to disclose this information and standardized icons to signal specific features, such as connections with social media services. In addition, the FTC emphasized that app stores should be enforcing developer agreements that require developers to disclose the information their apps collect.
The report expressed a preference for disclosures that are provided prior to the parent's purchase of the app, noting that "[i]nformation provided to parents after downloading an app is, in staff’s view, less useful in the parent’s decision-making since, by then, the child may already be using the app and the parent already could have been charged a fee."
In addition, the report focused on disclosures involving in-app purchases, interactive features, and targeted advertising. The report states that the FTC is considering whether additional protections are needed with respect to in-app purchase capabilities in apps for children. It emphasized that "confusing and hard-to-find disclosures do not give parents the control that they need in this area." Staff believe that the presence of social features within an app is highly relevant to parents selecting apps for their children, and that such functionality should be disclosed prior to download. And the report states that "parents need clear, easy-to-read, and consistent disclosures regarding the advertising that their children may view on apps, especially when that advertising is personalized based on the child’s in-app activities."
As we have blogged about here and here, the FTC currently is reviewing its rules implementing the Children’s Online Privacy Protection Act, which governs the online collection, use, and disclosure of personal information from children under the age of 13.
The United States District Court for the Western District of Washington, sitting in Seattle, recently dismissed an online privacy case involving the alleged improper use of browser and Flash cookies, Del Vecchio v. Amazon, finding that the plaintiffs "simply [did] not plead adequate facts to establish any plausible harm." The opinion follows closely on the heels of several other recent decisions that dismissed cases because of an inability to demonstrate adequate injury or harm or to allege sufficient injury-in-fact to satisfy Article III standing, including In re Facebook Privacy Litigation, In re Zynga Privacy Litigation and Low v. LinkedIn (in which Covington represents LinkedIn).
In reaching this finding, the Amazon court rejected plaintiffs’ two categories of alleged injury; namely, (1) that Amazon’s alleged misappropriation of plaintiffs’ economic and property interests led to “economic harms,” including “lack of proper value-for-value exchanges, undisclosed opportunity costs, devaluation of personal information [and] loss of the economic value of the information as an asset”; and (2) that Amazon’s alleged transfer of cookies caused damage by diminishing the performance and value of plaintiffs’ computer resources. Plaintiffs were granted leave to file an amended complaint.
Earlier today, the House of Representatives approved an amendment to the Video Privacy Protection Act (VPPA) (H.R. 2471) that would clarify certain ambiguities in the 1988 law in light of technological changes in the marketplace. In his remarks on the House floor, Rep. Bob Goodlatte (R-VA) – the primary author of H.R. 2471 – explained that the amendment will facilitate the sharing of video usage information on social media networks.
During a debate on the legislation, Rep. Melvin Watt (D-NC) opposed the bill as he did in the committee markup, expressing concern about the adequacy of one-time consent to the sharing of information on dynamic social media sites. He emphasized the sensitivity of video usage information and expressed concerns about whether Congress has given sufficient thought to the impact of H.R. 2471 on state video privacy laws. Rep. Watt also questioned the propriety of Congress acting in light of a number of pending private law suits under the VPPA. Rep. John Conyers, Jr. (D-MI) lent his support to H.R. 2471, but stated that he would have preferred the bill require consumers to renew their consent periodically.
Under the VPPA, which was passed long before the Internet was widely available, “video tape service providers” generally are not permitted to share a consumer’s video usage information without “the informed, written consent of the consumer given at the time the disclosure is sought.” If enacted into law, H.R. 2471 would clarify this limitation in the context of online distribution in the following ways:
The Ninth Circuit reversed the district court’s approval of a class action settlement last Monday in Nachshin v. AOL, remanding the two-year-old case to the district court for a new round of settlement negotiation and approval. No. 10-55129 (9th Cir. Nov. 21, 2011). The class action was brought in 2009, alleging that the Internet company violated the Electronic Communications Privacy Act (ECPA) when it inserted footers containing promotional messages into e-mails sent by its users. The complaint also alleged unjust enrichment, breach of contract, and violations of state law.
The problem with the settlement was not that the class representatives failed to adequately represent class members, as in the Second Circuit’s recent decision in the latest iteration of the Tasini v. New York Times case, or that the interests of the members of the proposed class (all 66 million of them) were too factually and legally different to proceed in a class action, as in the Ninth Circuit’s recent decision in Ellis v. Costco Wholesale Corp. Instead, the Ninth Circuit reversed the settlement on the less common ground that it provided for distributions from the settlement fund to charities that were unrelated to the claims underlying the lawsuit.
- Web-standards group releases draft "Do-Not-Track" mechanism
- FTC Settles Flash Cookie and COPPA Claims
- Self-Regulatory Council Releases Enforcement Decisions
- DAA Releases "Self-Regulatory Principles for Multi-Site Data"
- Bono Mack Holds Hearing About Consumer Privacy Expectations
- House Subcommittee Discusses COPPA Updates, Teen Privacy
- Article 29 Working Party Meets the European Advertising Industry over Self-Regulatory Code
- Preliminary Results Reported From Stanford "Tracking the Trackers" Study
- Supreme Court Reaffirms Application of First Amendment to Children
- FTC Launches Online Advertising Review
- California Privacy Claims Survive Motion to Dismiss In NebuAd Lawsuit
- California DNT Hearing Scheduled For May 3
- FTC Reaches Settlement with Online Advertiser Chitika on Opt-Outs
- UK Information Commissioner Issues (Vague) Warning on Cookies
- Growing Diversity in Advertising Opt Outs
- Privacy Lawsuit Against Cable One Dismissed
- Roundtable, Commissioner Brill Discuss Preliminary FTC Staff Report
- Ringleader Agrees to Settle Privacy Suits
- Banks Explore Advertising On Customer Bank Statements
- Come Clean on Paid-For Tweets, says UK Authority
- New Canadian Law Regulates Spam
- The FTC Seeks To Recover Millions Of Dollars In Unauthorized Charges
- Court Holds Subscribers Consented to "Deep Packet Inspection"
- FTC's Chief Technologist Explains "Do Not Track"
- Commerce Privacy Report Comments Due January 28
- European Parliament Says Targeted Online Advertising Threatens Privacy
- Quantcast, Clearspring Agree to Settle "Flash Cookies" Suits