In 2020, Illinois residents whose photos were included in the Diversity in Faces dataset brought a series of lawsuits against multiple technology companies, including IBM, FaceFirst, Microsoft, Amazon, and Google, alleging violations of Illinois’ Biometric Information Privacy Act (“BIPA”).[1] In the years since, the cases against IBM and FaceFirst were dismissed by agreement of the parties, while the cases against Microsoft, Amazon, and, most recently, Google were dismissed at summary judgment.

These cases are unique in the landscape of BIPA litigation because, in all instances, defendants were not alleged to have had direct contact with the plaintiffs. Instead, plaintiffs alleged that defendants used a dataset of photos created by IBM (the Diversity in Faces, or DiF, dataset) that allegedly included images made publicly available on the photo-sharing website Flickr. The DiF dataset allegedly implemented facial coding schemes to measure various aspects of the facial features of the individuals pictured and was made available to researchers with the goal of mitigating dataset bias. The nature of these allegations sets these cases apart from cases like Monroy v. Shutterfly, Inc., 2017 WL 4099846, and In re Facebook Biometric Info. Priv. Litig., 326 F.R.D. 535, in which plaintiffs alleged that defendants had collected biometric data from them directly. Here, there was no allegation that plaintiffs used a product created by defendants, gave data to defendants, or interacted with defendants in any way. Thus, these cases demonstrate the importance of considering BIPA when developing biometric technologies or performing research, even when direct interaction with Illinois residents is limited.

Extraterritoriality

It is well established that BIPA does not apply extraterritorially to conduct occurring outside of Illinois. The DiF cases considered whether BIPA’s territorial limits barred plaintiffs’ claims. The courts uniformly declined to grant defendants’ motions to dismiss on those grounds but eventually granted motions for summary judgment. At the motion to dismiss stage, both the Amazon and Microsoft courts acknowledged that plaintiffs did not upload any data to defendant companies, did not directly use defendants’ products, and did not allege that defendants had obtained the DiF dataset from Illinois. However, the courts allowed discovery in order to assess not only typical factors such as plaintiffs’ residency and the location of the harm, but also “[i]nternet-specific factors, such as where the site or information was accessed, or where the corporation operates the online practice.”

Ultimately, all of the courts to rule on the question found that BIPA did not apply because the events in question did not occur primarily and substantially in Illinois. To support this finding, the Amazon and Microsoft courts noted that entities other than defendants were responsible for collecting and generating facial scans from the photographs. Additionally, the Amazon court found no evidence that Amazon employees had downloaded, reviewed, or evaluated the DiF dataset in Illinois. Similarly, the Google court stated that plaintiffs had not alleged any “direct interaction” that would give rise to the alleged BIPA violations. The Microsoft court went further, stating that even if Microsoft’s systems “‘chunked,’ encrypted, and stored the DiF Dataset on a server in Illinois,” any connection between Microsoft’s conduct and Illinois would still have been too attenuated for BIPA to apply.

Unjust Enrichment

Plaintiffs also brought unjust enrichment claims, alleging that defendants unlawfully acquired plaintiffs’ biometric information and profited from its dissemination. On summary judgment, the Microsoft and Amazon courts found that there was no unjust enrichment because defendants’ employees did not use the facial annotations in the dataset and did not use the dataset to train or improve defendants’ facial recognition technologies. It is worth noting that these decisions relied on highly fact-specific analyses citing multiple relevant depositions.

In conclusion, a key observation emerging from this line of cases is that the cases that did not settle were dismissed at summary judgment once discovery showed that defendants’ conduct was not sufficiently connected to Illinois and that defendants had not used the DiF dataset to improve their own technologies. Though this trend may slow the rate at which new BIPA litigation is filed against companies that use biometric data to improve their technologies, companies can still mitigate risk and improve their chances of prevailing on motions to dismiss by closely examining the source of any biometric data they use and evaluating whether consumer consent was obtained.


[1] Vance v. Int’l Bus. Machines Corp., 2020 WL 5530134; Vance v. Facefirst, Inc., 2021 WL 5044010; Vance v. Amazon.com, Inc., 2022 WL 12306231; Vance v. Google LLC, 2024 WL 1141007; Vance v. Microsoft Corp., 2022 WL 9983979.

Libbie Canter

Libbie Canter represents a wide variety of multinational companies on privacy, cyber security, and technology transaction issues, including helping clients with their most complex privacy challenges and the development of governance frameworks and processes to comply with global privacy laws. She routinely supports clients on their efforts to launch new products and services involving emerging technologies, and she has assisted dozens of clients with their efforts to prepare for and comply with federal and state privacy laws, including the California Consumer Privacy Act and California Privacy Rights Act.

Libbie represents clients across industries, but she also has deep expertise in advising clients in highly regulated sectors, including financial services and digital health companies. She counsels these companies, and their technology and advertising partners, on how to address legacy regulatory issues and the cutting-edge issues that have emerged with industry innovations and data collaborations.

As part of her practice, she also regularly represents clients in strategic transactions involving personal data and cybersecurity risk. She advises companies from all sectors on compliance with laws governing the handling of health-related data. Libbie is recognized as an Up and Coming lawyer in Chambers USA, Privacy & Data Security: Healthcare, which notes that Libbie is “incredibly sharp and really thorough. She can do the nitty-gritty, in-the-weeds legal work incredibly well but she also can think of a bigger-picture business context and help to think through practical solutions.”

Lindsey Tonsager

Lindsey Tonsager co-chairs the firm’s global Data Privacy and Cybersecurity practice. She advises clients in their strategic and proactive engagement with the Federal Trade Commission, the U.S. Congress, the California Privacy Protection Agency, and state attorneys general on proposed changes to data protection laws, and regularly represents clients in responding to investigations and enforcement actions involving their privacy and information security practices.

Lindsey’s practice focuses on helping clients launch new products and services that implicate the laws governing the use of artificial intelligence, data processing for connected devices, biometrics, online advertising, endorsements and testimonials in advertising and social media, the collection of personal information from children and students online, e-mail marketing, disclosures of video viewing information, and new technologies.

Lindsey also assesses privacy and data security risks in complex corporate transactions where personal data is a critical asset or data processing risks are otherwise material. In light of a dynamic regulatory environment where new state, federal, and international data protection laws are always on the horizon and enforcement priorities are shifting, she focuses on designing risk-based, global privacy programs for clients that can keep pace with evolving legal requirements and efficiently leverage the clients’ existing privacy policies and practices. She conducts data protection assessments to benchmark against legal requirements and industry trends and proposes practical risk mitigation measures.

Priya Leeds

Priya Sundaresan Leeds is an associate in the firm’s San Francisco office. She is a member of the Privacy and Cybersecurity Practice Group. She also maintains an active pro bono practice with a focus on gun control and criminal justice.