FTC Issues Guidance on Best Practices for Facial Recognition Technology
This week, the FTC released a staff report urging companies to adopt best practices for commercial uses of facial recognition technology. The report, entitled Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies, follows a workshop held last December and more than 80 public comments addressing issues raised at the workshop. Facing Facts largely discusses how the core privacy principles from the Commission’s March 2012 privacy report — privacy by design, simplified choice, and transparency — should inform the use of facial recognition technologies, such as digital signs that can assess the age and gender of consumers standing before them as well as online photo tagging tools.
In this post, we provide an overview of the staff report’s guidance on how each of the principles should be applied by companies that employ facial recognition in their products and services.
Privacy by Design
The principle of privacy by design holds that companies should consider privacy throughout their organizations, at every stage of product development, and during the entire lifecycle of the data that they collect (from initial acquisition to disposal). In the facial recognition context, this broad principle means three things:
- Provide reasonable security not only for biometric data — i.e., the unique mathematical data extracted from an image to capture the individual identity — but also for the images themselves. The report endorses the use of encryption for stored biometric data, stating that encryption would diminish the likelihood that the data could be used by a wrongdoer who acquires the data without authorization. The report also suggests that companies that make identified photos available for public viewing use mechanisms to prevent the “unauthorized scraping” of these photos. Staff’s concern with scraping is that the scraped photo could be used as a reference against which an unidentified photo could be compared, potentially resulting in the identification of the person in the unidentified photo.
- Establish and maintain appropriate retention and disposal practices. The report’s guidance on retention and disposal largely echoes guidance the FTC provided in its March 2012 privacy report: consumer data should be retained only for as long as necessary to carry out the purpose for which it was collected. So, for example, when an eyeglass company allows consumers to upload their images to the company’s website in order to allow consumers to virtually “try on” different pairs of glasses, the company should only retain those images for as long as is necessary to provide that service. Although the report suggests it would be permissible to allow the consumer to save his or her image for use across different sessions, the report also states that the consumer should be given the opportunity to have the image deleted. Similarly, in the photo tagging context, where companies store biometric data derived from previously uploaded photos in order to suggest tags for new photos, the report says that those companies should delete that biometric data if the user decides to turn off the photo-tagging feature. Finally, and perhaps most notably, the FTC staff stated that “in all cases, the company should . . . inform consumers of (1) the length of time images are stored, (2) who will have access to the stored images, and (3) consumers’ rights regarding deletion of the stored images.”
- Consider the sensitivity of information when using facial recognition technologies. The March 2012 report suggested that companies consider the sensitivity of the data they maintain in developing their data management practices, noting, for example, that sensitive data (e.g., geolocation data) generally should be retained for less time than other data. The facial recognition report’s discussion of relative sensitivity appears to focus less on the kind of facial recognition data a company collects (e.g., data that can be used to uniquely identify an individual vs. data that can be used to assess age or gender) than on the place where facial recognition data is collected. For example, the report suggests that digital signage companies be “vigilant regarding the location of signs” that employ facial recognition technology and should not place them in “bathrooms, locker rooms, health care facilities, and areas where children may congregate.” The report also highlights as a key concern the prospect that facial recognition technology could be used to identify otherwise anonymous people in public places.
Simplified Choice
The FTC’s March 2012 report advised that companies should provide choice before collecting and using consumer data for practices that are not “consistent with the context of the transaction or the company’s relationship with the consumer.” Although that report suggested that the ability to opt out of a particular practice generally was sufficient, it also discussed several practices that required the “affirmative express consent” of the consumer. The Facing Facts report invokes the same general standard for when choice is required and likewise identifies certain practices for which affirmative express consent should be obtained.
As a general matter, the staff report states that companies must obtain affirmative express (i.e., opt-in) consent before using previously collected images or biometric data in a materially different way than was promised when the images or data were collected. This principle will be familiar from previous FTC pronouncements about companies’ obligations in the event of a material change to their data practices. The report also gives guidance with respect to specific practices that require an opt-in: situations in which a digital sign operator individually identifies consumers through its signs, and situations in which a social networking service identifies users of its service to other users who are not their “friends.”
Transparency
Throughout Facing Facts, the staff emphasizes the importance not only of providing clear notice to consumers who may interact with a technology using facial recognition, but also of general consumer education about these technologies — and about sharing photos online. The report notes that education about the implications of sharing photos “has particular relevance for teens, who often impulsively post photos and other content without an understanding that the photos could be used for unintended, secondary purposes.”
* * *
As we’ve recently noted, biometric privacy, including the privacy implications of facial recognition, has very much been under the microscope in the U.S. and abroad. With the publication of this report, we expect this scrutiny to increase. We will continue to monitor developments.