Biometric Information

On Thursday, the Illinois Supreme Court unanimously ruled in McDonald v. Symphony Bronzeville Park LLC that the exclusivity provisions of the state’s workers’ compensation statute do not preclude liquidated damages claims under the Biometric Information Privacy Act.  The decision narrows the defenses available to employers facing employment-related BIPA claims.

Illinois’s Workers’ Compensation Act generally provides the exclusive means by which an employee can recover against an employer for a work-related injury and requires such claims to be adjudicated before the Illinois Workers’ Compensation Commission, subject to several exceptions.  One of those exceptions is for injuries that are not compensable under the Workers’ Compensation Act.  At issue in McDonald was whether an alleged employment-based BIPA violation—here, the alleged use of a fingerprint-based timekeeping system without the required disclosures or consent—was the type of injury covered by the Workers’ Compensation Act.
Continue Reading Illinois Supreme Court Rules Workers’ Compensation Act Does Not Bar BIPA Liquidated Damages Claims

On May 5, 2020, the Seventh Circuit held that violations of the section 15(b) disclosure and informed consent provisions of the Illinois Biometric Information Privacy Act, 740 ILCS 14/1 et seq. (“BIPA”), constitute “an invasion of personal rights that is both concrete and particularized” for purposes of establishing Article III standing to sue in federal court.  However, the Seventh Circuit also held that the alleged harms associated with violations of section 15(a) of BIPA were insufficient to establish Article III standing.  Section 15(a) mandates public disclosure of a retention schedule and guidelines for permanent destruction of collected biometric information.

Covington has previously discussed developments in BIPA litigation, which has proliferated in recent years with the advancement of relevant technologies.  The increase in BIPA litigation has been accompanied by a rise in disputes over the nature of the harm required to sustain an action, in both state and federal courts.  Although this issue was seemingly resolved at the state level by the Illinois Supreme Court’s 2019 Rosenbach decision, federal courts have continued to grapple with it for the purposes of Article III standing.
Continue Reading Seventh Circuit Rules on Article III Standing Issues in Illinois BIPA Lawsuit, Allowing Case to Proceed in Federal Court

On February 12, 2020, Senator Kirsten Gillibrand (D-NY) announced a plan to create a new Data Protection Agency through her proposed legislation, the Data Protection Act of 2020 (S.3300).

Under the proposal, the new agency would replace the Federal Trade Commission (FTC) as the “privacy cop on the beat.”  As such, the FTC’s current authority in the privacy space, including its ability to draft guidelines, conduct studies, and issue implementing regulations for certain federal privacy laws, would be transferred to the new agency.

Unlike the Online Privacy Act, a bill introduced by Representatives Anna Eshoo (D-CA-18) and Zoe Lofgren (D-CA-19) that would also create a new privacy agency, Sen. Gillibrand’s bill would not create a new omnibus federal privacy law.  Instead, it focuses on the creation of the Data Protection Agency and its rulemaking authority.  However, various aspects of the new agency’s authority provide valuable insights into what privacy regulation at the federal level might look like under the bill.
Continue Reading Sen. Kirsten Gillibrand Proposes New Digital Privacy Agency

On March 29, 2019, the ICO opened the beta phase of the “regulatory sandbox” scheme (the “Sandbox”), a new service designed to support organizations that are developing innovative and beneficial projects that use personal data.  The application process for participating in the Sandbox is now open, and applications must be submitted to the ICO by noon on Friday, May 24, 2019.  The ICO has published on its website a Guide to the Sandbox, which explains the scheme in detail.

The purpose of the Sandbox is to support organizations that are developing innovative products and services using personal data and to develop a shared understanding of what compliance looks like in particular innovative areas.  Organizations participating in the Sandbox are likely to benefit from the opportunity to liaise directly with the regulator on innovative projects that raise complex data protection issues.  The Sandbox will also be an opportunity for market leaders in innovative technologies to influence the ICO’s approach to certain use cases that present challenging data protection compliance issues or where there is uncertainty about what compliance looks like.

The beta phase of the Sandbox is planned to run from July 2019 to September 2020.  Around 10 organizations from the private, public, and third sectors will be selected to participate.  In the beta phase, the ICO is focusing on data processing that falls within the remit of UK data protection law.
Continue Reading ICO opens beta phase of privacy “regulatory sandbox”

A class-action lawsuit filed last month alleges that Wal-Mart’s video recording technology at its self-service checkout kiosks collects “personal identification information” in violation of the California Song-Beverly Credit Card Act of 1971 (“Song-Beverly Act”).  The Song-Beverly Act, like analogous statutes in several other states, generally prohibits businesses from recording
Continue Reading Lawsuit Alleges That Self-Checkout Videos Violate the Song-Beverly Act

On May 16, 2017, Governor Jay Inslee signed into law H.B. 1493—Washington’s first statute governing how individuals and non-government entities collect, use, and retain “biometric identifiers,” as defined in the statute.  The law prohibits any “person” from “enroll[ing] a biometric identifier in a database for a commercial purpose, without first providing notice, obtaining consent, or providing a mechanism to prevent the subsequent use of a biometric identifier for a commercial purpose.”  It also places restrictions on the sale, lease, and other disclosure of enrolled biometric identifiers.  With the new law, Washington has become only the third state after Illinois and Texas to enact legislation that regulates business activities related to biometric information.  Although the three laws seek to provide similar consumer protections around the collection, use, and retention of biometric data, the Washington law defines the content and activity it regulates in different terms, and, similar to Texas, but unlike Illinois, the Washington law does not provide a private right of action.

The Washington statute, as compared to existing biometrics laws, is notable for its definition of “biometric identifier.”  Under the law, a “biometric identifier” is “data generated by automatic measurements of an individual’s biological characteristics,” including “fingerprints, voiceprints, eye retinas, irises, or other unique biological patterns or characteristics that is used to identify a specific individual.”  Washington’s definition of “biometric identifier” may be broader than that in the Texas statute, but, unlike the Illinois statute, it does not specifically include a “scan of hand or face geometry.”  The definition also specifically excludes a “physical or digital photograph, video or audio recording or data generated therefrom” (in addition to certain health-related data), suggesting the statute will have limited application in the context of facial recognition technology.
Continue Reading Washington Becomes the Third State with a Biometric Law

This week, the FTC released a staff report urging companies to adopt best practices for commercial uses of facial recognition technology.  The report, entitled Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies, follows a workshop held last December and more than 80 public comments addressing issues raised at the workshop.  Facing Facts largely discusses how the core privacy principles from the Commission’s March 2012 privacy report — privacy by design, simplified choice, and transparency — should inform the use of facial recognition technologies, such as digital signs that can assess the age and gender of consumers standing before them as well as online photo tagging tools. 

In this post, we provide an overview of the staff report’s guidance on how each of the principles should be applied by companies that employ facial recognition in their products and services.
Continue Reading FTC Issues Guidance on Best Practices for Facial Recognition Technology

The Electronic Frontier Foundation and the Immigration Policy Center last week released an interesting report on law enforcement’s increasing efforts to gather biometric data, and associated risks of data inaccuracy, racial profiling, erroneous deportations, security breaches, and privacy invasions.  The report calls for greater accountability in the biometrics context, including collection and retention limitations; clear rules for collection, use, and sharing; robust security; notice requirements; and independent oversight. 

In recent months, a number of policymakers have raised concerns about both public and private collection of biometric data.  For example,
Continue Reading Biometric Data Under the Privacy Microscope

An amendment tabled in the House of Lords during discussion of the Protection of Freedoms Bill 2010–2011 has called for the creation of a dedicated Privacy Commissioner.

The proposed establishment of a single Privacy Commissioner is intended to address the current proliferation of UK commissioners with strictly circumscribed powers and to create an organization flexible enough to navigate the ever-changing technology and privacy policy landscapes.

If the Bill receives Royal Assent and becomes law, the new Commissioner will supersede the current UK Information Commissioner and reflect a more holistic approach to protecting individual privacy in all of its aspects, rather than regulating personal data alone.
Continue Reading House of Lords Calls for a Privacy Commissioner