Earlier this week, U.S. Federal Trade Commission (FTC) Chairwoman Edith Ramirez gave the keynote address at a technology conference, in which she focused on the privacy challenges of so-called “big data.” Her remarks provide some guidance about what the FTC considers “best practices” in terms of deploying big data analytics without raising privacy concerns.
- Data minimization and sound retention limits. The Chairwoman urged companies to “[a]void the indiscriminate collection of personal information” and suggested that it is not appropriate for companies to, “[k]eep data on the off-chance that it might prove useful.” She also suggested that retention limits are appropriate, noting that “old data is of little value.”
- De-identification. She noted that stripping out unique identifiers to render data anonymous can be an effective risk-mitigation technique. She cited the FTC’s 2012 Privacy Report as describing “an approach to de-identification that seeks to balance the benefits of de-identification with the risks that anonymous data will be re-identified.”
- Choice. She called on companies to “focus on consumer choice at the time of collection.” She noted that when consumers decide to share personal data with a business, that consent “is generally limited to the transaction at hand.” “Rarely, if ever, are consumers given a say about the aggregation of their personal data or secondary uses that are not even contemplated when their data is first collected.” Chairwoman Ramirez did not expand on what she believes companies should do to give consumers more of a “say” with respect to the aggregation and secondary uses of their data.
- Use limitations. She noted that use restrictions “have their place too,” although they do not replace the need for companies to implement limitations on collection and to provide users choices where appropriate.
- Profiling. The Chairwoman expressed concern about uses of personal data that “could also be harmful to [consumers’] interests.” She stated that this problem can be seen “most acutely” with data brokers — companies that collect and aggregate consumer information from a wide array of sources to create detailed profiles of individuals. She noted that the FTC is investigating the practices of data brokers and will produce a report on them later this year.
- Data Determinism. She also expressed particular concern about the possibility that big data analytics might unfairly identify certain consumers as “poor credit or insurance risks, unsuitable candidates for employment or admission to schools or other institutions, or unlikely to carry out certain functions,” based solely on “unwarranted inferences and correlations.” She further stated that, “[a]t the very least, companies must ensure that by using big data algorithms they are not accidentally classifying people based on categories that society has decided — by law or ethics — not to use, such as race, ethnic background, gender, and sexual orientation.”
- Data security. The Chairwoman praised the Commission’s record of bringing enforcement actions against companies that fail to employ reasonable security safeguards. She also encouraged Congress to provide the Commission civil penalty authority against companies that fail to meet this standard to provide “stronger incentives to push firms to safeguard big data.”
The Chairwoman also noted that the Commission has scheduled a November workshop on the Internet of Things, suggesting that the big data ecosystem may become even more complex as “more parts of our daily lives . . . generate data.” The Chairwoman said that the FTC’s existing jurisdiction provides it several tools to help ensure that consumer privacy is respected, although she also called on Congress to enact baseline privacy legislation informed by the core principles of privacy by design, simplified choice, and greater transparency.