On January 6, 2016, the Federal Trade Commission issued its staff report on big data, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, following up on the FTC’s workshop on big data in September 2014 and seminar on alternative scoring products in March 2014. The report provides an overview of the characteristics and lifecycle of big data, summarizes the benefits and risks of big data, and outlines considerations for companies using big data, including potentially relevant laws such as the Fair Credit Reporting Act, equal opportunity laws, and laws prohibiting unfair and deceptive acts or practices.
The report serves as a helpful resource for a company in evaluating the potential uses, benefits, risks, and compliance requirements for big data. While many companies that already use big data will be familiar with the laws analyzed in the report and how to comply with them, new or emerging companies or companies that do not regularly work with consumer protection or financial services laws should read the report with care to develop an understanding of the legal framework applicable to the use of big data.
Characteristics and Lifecycle
The report does not attempt to provide a comprehensive definition of “big data,” instead describing it as a “confluence of factors” including “the nearly ubiquitous collection of consumer data from a variety of sources, the plummeting cost of data storage, and powerful new capabilities to analyze data to draw connections and make inferences and predictions.” The report also restates the “three Vs” often used to characterize big data:
- Volume – big data represents a vast quantity of data that can be gathered and analyzed.
- Velocity – big data represents data that can be accumulated, analyzed, and used quickly.
- Variety – big data represents a breadth of data that can be analyzed effectively.
The report describes four phases comprising the life cycle of big data: collection (both online and offline), compilation and consolidation, data mining and analytics, and use. Compilation and consolidation of big data may be conducted by data brokers, the subject of a separate FTC report, Data Brokers: A Call for Transparency and Accountability (Sept. 2014), which is referenced throughout the big data report. This direct link between big data and data brokers suggests that the FTC is concerned that the lack of transparency and regulation of data brokers may affect the use of big data.
Benefits and Risks
The report describes numerous opportunities and benefits for the effective use of big data:
- Increased educational attainment for individual students. For example, big data can be used to identify students for advanced classes and to identify students who are at risk of dropping out of school.
- Non-traditional underwriting methods. Big data can be used to score populations who are not scorable using traditional credit data. Use of non-traditional data elements in underwriting, such as educational history, professional licenses, and personal property ownership, may expand access to credit to those borrowers without traditional credit profiles.
- Healthcare tailored to individual patients’ characteristics. Organizations have used big data to predict life expectancy, genetic predisposition to disease, likelihood of hospital readmission, and likelihood of adherence to a treatment plan to tailor treatment to an individual’s characteristics.
- Specialized healthcare to underserved communities. Big data has been used to provide specialized oncology diagnosis and treatment services in rural and low-income areas where there is a shortage of specialty healthcare providers.
- Increased equal access to employment. Hiring practices have been enhanced through the use of big data to reduce interview biases and promote a diversified workforce.
The report also describes several risks of companies’ use of big data:
- Erroneous denial of opportunities. Big data can be used to make decisions about a consumer based on the actions of others (e.g., lowering a consumer’s credit limit based on delinquency information of comparable consumers), and these actions may not actually be predictive of the consumer.
- Creation and reinforcement of existing disparities. Big data may result in a low-income consumer not receiving advertisements for particular products even though the consumer is eligible for them.
- Exposure of sensitive information. Critics fear that big data may be used to predict sensitive characteristics, including ethnic origin, religion, and political affiliation.
- Targeting of vulnerable consumers. Big data may be used by companies to identify and make misleading offers or scams to the most vulnerable people, including senior citizens.
- New justifications for exclusion. Because big data can be used to identify and target opportunities to particular populations, critics fear that big data will be used by companies to justify their exclusion of other populations.
- Higher-priced goods and services for low-income communities. Big data may facilitate differential pricing by online retailers based on a consumer’s geography and the location of brick-and-mortar competitors, potentially leading consumers in poorer neighborhoods to pay more for online products because they have fewer brick-and-mortar competitors nearby.
- Weakening of consumer choice. Big data can be used to infer information about a consumer who decides to opt out of sharing his or her information, lessening the effectiveness of the consumer’s choice.
These lists highlight the report’s primary point regarding big data: it can be used to facilitate either inclusion or exclusion.
Considerations for Companies Using Big Data
The report stresses that the reality of the current marketplace is that companies will use big data. In doing so, companies should take into account potentially applicable laws, as well as policy considerations drawn from field research aimed at identifying potential biases and inaccuracies in big data use.
Potentially Applicable Laws
Fair Credit Reporting Act – Companies that compile big data, including social media information, may be consumer reporting agencies subject to the Fair Credit Reporting Act. The report cites the FTC’s enforcement actions against Spokeo in 2012 and Instant Checkmate in 2014 for the proposition that online data brokers that collect information, create personal profiles, and market these profiles for FCRA-covered purposes will be treated as consumer reporting agencies. Simply posting a disclaimer on the data broker’s website stating that profiles may not be used for FCRA-covered purposes is insufficient to avoid liability under the FCRA.
The report also discusses a growing trend in which companies use predictive analytics products for eligibility determinations. These products compare a known characteristic about a consumer (often a non-traditional data element such as zip code, social media usage, or shopping history) to other consumers with the same characteristic to predict whether the consumer will meet his or her credit obligations, based on the other consumers’ records of meeting their own obligations. An example would be a creditor that requires an applicant to provide zip code, social media information, and shopping history on an application; strips the applicant’s identifying information from the application; and sends the application to a firm to be analyzed against people in the same zip code with similar social media and shopping history information. If the firm provides a score or recommendation to be used by the company in making a credit decision for the applicant, the FTC would consider the score or recommendation to be a consumer report subject to the FCRA’s requirements and protections.
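The cohort-scoring flow described above can be sketched in a few lines of Python. The field names, the cohort statistics, and the scoring rule are all hypothetical illustrations, not anything prescribed by the report or the FCRA; the point is that the score is derived from other consumers’ records, yet under the report’s reasoning it would still be a consumer report.

```python
# Hypothetical sketch of the cohort-based eligibility scoring described above.
# All field names, data values, and the scoring rule are illustrative assumptions.

def strip_identifiers(application: dict) -> dict:
    """Remove directly identifying fields before sending to the analytics firm."""
    return {k: v for k, v in application.items()
            if k not in {"name", "ssn", "address"}}

# Hypothetical repayment rates for cohorts sharing the same zip code, as an
# analytics firm might compile from other consumers' records.
COHORT_REPAYMENT_RATE = {
    "90210": 0.92,
    "10001": 0.78,
}

def cohort_score(application: dict) -> float:
    """Score the applicant by the repayment record of others in the same cohort."""
    anonymized = strip_identifiers(application)
    return COHORT_REPAYMENT_RATE.get(anonymized["zip_code"], 0.5)

score = cohort_score({"name": "Jane Doe", "ssn": "000-00-0000",
                      "zip_code": "10001", "shopping_history": ["groceries"]})
# Under the report's reasoning, this score would be a "consumer report" subject
# to the FCRA even though the applicant's identifiers were stripped first.
```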
Interestingly, a footnote in this section of the report casts doubt on an interpretation in the FTC’s seminal guide on the FCRA, 40 Years of Experience with the Fair Credit Reporting Act (July 2011), which states that “information that does not identify a specific consumer does not constitute a consumer report even if the communications is used in part to determine eligibility.” The FTC no longer believes this provision is accurate, taking the position that a report “crafted for eligibility purposes with reference to a particular consumer or set of particular consumers (e.g., those that have applied for credit)” is “a consumer report even if the identifying information of the consumer has been stripped.”
Equal Opportunity Laws – The report urges companies to assess their use of big data analytics to make sure it complies with discrimination prohibitions in the Equal Credit Opportunity Act, Fair Housing Act, Americans with Disabilities Act, Age Discrimination in Employment Act, and other equal opportunity laws. Even if big data analytics suggest that members of a protected class are more likely to quit their jobs or that members of a protected class are less likely to repay loans, companies may not take action or fail to take action with respect to a member of a protected class based on these analytics.
This concern has particular significance in light of the recognized theory of discrimination referred to as “disparate impact,” in which facially neutral policies have a disproportionate adverse effect on a protected class. Big data analytics may inform a facially neutral policy regarding credit decisioning, but if the policy has a disproportionate adverse effect on a protected class, the policy nevertheless may violate equal opportunity laws if it is not justified by a legitimate business necessity or if a less discriminatory alternative exists.
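One common first-pass screen for disparate impact, not drawn from the report itself, is the “four-fifths rule” from the EEOC’s Uniform Guidelines on Employee Selection Procedures: a selection rate for a protected class below 80% of the highest group’s rate is treated as evidence of adverse impact warranting closer review. A minimal sketch, with hypothetical approval counts:

```python
# Minimal sketch of the EEOC "four-fifths rule" screen for adverse impact.
# The approval counts below are hypothetical; the rule is a screening
# heuristic, not a legal conclusion.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who were selected (e.g., approved)."""
    return selected / applicants

protected_rate = selection_rate(30, 100)   # hypothetical: 30 of 100 approved
reference_rate = selection_rate(60, 100)   # hypothetical: 60 of 100 approved

# Ratio of the protected class's selection rate to the highest group's rate.
impact_ratio = protected_rate / reference_rate

# Below four-fifths (0.8): the policy warrants further scrutiny for
# business necessity and less discriminatory alternatives.
flagged = impact_ratio < 0.8
```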
The report helpfully references discussions during the workshop regarding the extent to which advertising implicates equal opportunity laws. For example, a company’s advertisement to a particular community for a credit offer that is open to all will not by itself violate ECOA, absent disparate treatment or disparate impact. The report nonetheless urges caution in this area, particularly in credit transactions, for which ECOA and its implementing regulations contain relevant prohibitions on certain types of solicitations.
Federal Trade Commission Act – Section 5 of the Federal Trade Commission Act prohibits unfair or deceptive acts or practices and applies to most companies acting in commerce. (In addition, although not noted in the report, the Dodd-Frank Act prohibits unfair, deceptive, and abusive acts or practices with respect to consumer financial services companies and their service providers, as enforced by the Consumer Financial Protection Bureau.) The FTC has used this authority to take enforcement action against a credit card marketing company that, based on big data analytics, failed to disclose that consumers’ credit lines would be reduced if they used funds for cash advances or for certain types of transactions (e.g., bars and nightclubs, marriage counseling). The report provides several other examples of unfair or deceptive acts or practices relating to big data.
Policy Considerations Raised by Big Data Field Research
Researchers have warned that there is a potential for incorporating errors and biases at every stage in the big data lifecycle and that there is latent uncertainty with respect to the way models will predict outcomes affecting low- and moderate-income individuals, protected classes, and other groups. Research in this area has generated four questions to be analyzed by companies in order to maximize benefits and minimize harms from the use of big data:
- How representative is the data set? Data sets that are missing information about particular populations may disadvantage individual members of those populations when the data sets are used for modeling or predictive analytics.
- Does the model account for biases? Hidden biases at the collection and analytics stage may lead to disparate impact by reproducing existing patterns of discrimination, applying the prejudice of prior decision makers, or reflecting widespread biases persisting in society.
- How accurate are predictions based on big data? Even though big data may leverage non-traditional data and analytics, the report reminds companies to give sufficient attention to traditional applied statistics issues in order to make sure any trends or correlations identified in big data are statistically meaningful.
- Does use of big data raise ethical or fairness concerns? The report encourages companies to perform their own assessment of the factors that are part of an analytics model and balance the predictive value with fairness considerations.
The FTC report concludes by acknowledging that big data will continue to grow in importance and that it is already improving the lives of underserved communities in areas such as education, health, local and state services, and employment. The FTC will remain vigilant in monitoring big data practices for compliance with the FCRA, equal opportunity laws, and the FTC Act, and will bring enforcement actions where appropriate. The report urges government, academics, consumer advocates, and industry to work together because “big data analytics can have big consequences.”