"consumer protection"

On December 10th, the Federal Trade Commission (FTC) published a Statement of Regulatory Priorities that announced the agency’s intent to initiate rulemakings on issues such as privacy, security, algorithmic decision-making, and unfair methods of competition.
Continue Reading FTC Announces Regulatory Priorities for Both Privacy and Competition

On December 4, 2018, the Federal Trade Commission (“FTC”) announced that it is accepting public comments regarding its Identity Theft Detection Rules, 16 C.F.R. Part 681 (the “Rules”), as part of a systematic review of the Commission’s regulations and guidelines.  The review of the Rules is particularly noteworthy because identity theft is among the top consumer complaints to the FTC and has been an enforcement priority for the FTC’s Bureau of Consumer Protection.
Continue Reading FTC Solicits Public Comment on Identity Theft Detection Rules

Last week, the European Data Protection Supervisor (the “EDPS”), in collaboration with European consumer organisation BEUC, hosted a joint conference on Big Data: individual rights and smart enforcement in Brussels (for the conference agenda, see here).  The conference brought together leading regulators and experts in the areas of competition, data protection and consumer protection.

Regulators and courts in the EU are increasingly vigilant in relation to the privacy practices and policies of large online companies.  In recent years and months, the pressure has increased not only through privacy-specific regulations and enforcement, but also through the application of consumer legislation.  As the examples below from France and Germany show, some courts and regulators assess privacy practices and policies against the rules on unfair or abusive trade practices, and in some countries legislators are even proposing new laws to that end.  This is a worrying trend, as it could trigger the application of an additional set of rules to privacy policies, and it implies that EU consumer protection authorities may acquire competence in relation to online privacy policies, in addition to the EU data protection regulators.

Continue Reading European Consumer Legislation and Online Privacy Policies: Opening Pandora’s Box?

Yesterday the White House released a report discussing how companies are using big data to charge different prices to different customers, a practice known as price discrimination or differential pricing.  The report describes the benefits of big data for sellers and buyers alike, and concludes that many concerns raised by big data and differential pricing can be addressed by existing antidiscrimination and consumer protection laws.

Big Data and Personalized Pricing 

“Big data” refers to the ability to gather large volumes of data, often from multiple sources, and use it to produce new kinds of observations, measurements, and predictions about individual consumers.  Thus, big data has made it easier for sellers to target different populations with customized marketing and pricing plans.

The White House report identifies two trends driving the increased application of big data to marketing and consumer analytics.  The first trend is the widespread adoption of new information technology platforms, most importantly the Internet and the smartphone.  These platforms give businesses access to a wide variety of applications like search engines, maps, blogs, and music or video streaming services.  In turn, these applications create new ways for businesses to interact with consumers, which produce new sources and types of data, including (1) a user’s location via mapping software; (2) their browser and search history; (3) the songs and videos they have streamed; (4) their retail purchase history; and (5) the contents of their online reviews and blog posts.  Sellers can use these new types of information to make educated guesses about consumer characteristics like location, gender, and income.

The second trend is the growth of the ad-supported business model, and the creation of a secondary market in consumer information.  The ability to place ads that are targeted to a specific audience based on their personal characteristics makes information about consumers’ characteristics particularly valuable to businesses.  This, in turn, has fostered a growing industry of data brokers and information intermediaries who buy and sell customer lists and other data used by marketers to assemble digital profiles of individual consumers.
Continue Reading White House Issues Report on Big Data and Differential Pricing

The International Association of Privacy Professionals hosted its annual Privacy Academy, at which one panel, “Data Brokers Demystified,” specifically focused on regulation of the data-broker industry.  The panelists included Janis Kestenbaum from the Federal Trade Commission, Jennifer Glasgow from Acxiom, and Pam Dixon from the World Privacy Forum.  Emilio Cividanes from Venable also participated.

Major Conclusions of the FTC Report (Janis Kestenbaum)

  • Data brokers operate with a fundamental lack of transparency.  They engage in extensive collection of information about nearly every US consumer, profiles of which are composed of billions of data elements.
  • Much data collection occurs without consumer awareness and uses a wide variety of online and offline sources, such as social networks, blogs, individual purchases and transactions with retailers, state and federal governments, events requiring registration, and magazine subscriptions.
  • The practice of “onboarding” (where offline data is linked to an online cookie and used to market to consumers online) is increasingly common.
  • Some data collected is sensitive, but even non-sensitive data is sometimes used to make “sensitive inferences” about (for example) health status, income, education, ethnicity, religion, and political ideology.  Consumers are often segmented into “clusters” based on these inferred characteristics.
  • For regulators, some of these clusters are concerning.  For example, one cluster is entitled “Urban Scramble” and contains high concentrations of low-income ethnic minorities.
  • Congress should create a centralized portal where consumers can go online and access individual data brokers’ websites to opt out and to access and correct their information.  For consumer-facing entities, like retailers, consumers must be given some kind of choice before data is sold to a data broker, and when that data is sensitive, the choice should be in the form of an opt-in.
Continue Reading IAPP Privacy Academy: “Data Brokers Demystified”

Last week, the governor of Connecticut signed into law a new requirement extending the state’s existing Do-Not-Call registry rules to promotional text messages (SMS).  Specifically, the law amends the definition of a “telephonic sales call” to include a “text or media message sent by or on behalf of a telephone solicitor,” thereby prohibiting

This morning, the FTC announced that it would host a public workshop in September entitled “Big Data: A Tool for Inclusion or Exclusion?” in order to examine the increasing use of big-data analytics and its potential impact on low-income, diverse, and underserved American consumers.  The FTC noted that while predictive-analytic techniques produce tremendous benefits by enabling innovation in medicine, education, and transportation, and in improving product offerings, manufacturing processes, and tailored ads, there is concern that insights from big data also “may be used to categorize consumers in ways that may affect them unfairly, or even unlawfully.” 

The FTC gave examples of such practices that could limit certain consumer populations’ access to higher-quality products or services, including:  (1) rewarding frequent customers with better service or shorter wait times; (2) offering a discounted mortgage rate to a consumer who has a checking, savings, credit card, and retirement account with a competitor; (3) providing offers for “gold level” credit cards to high-income consumers while offering low-income consumers subprime credit cards; and (4) circumventing the requirements of the Fair Credit Reporting Act by assessing credit risk through unregulated “aggregate scoring models,” which are based on aggregate credit or demographic profiles of groups of consumers who shop at certain stores, rather than the credit characteristics of individual consumers.  Although these uses of big data may be thought to bring about convenience, efficiency, and economic opportunity for some, consumer advocates have urged businesses and regulators to ensure that such techniques are implemented in ways that respect the values of equality and opportunity for all.
Continue Reading FTC to Examine Impact of “Big Data” on Low-Income and Underserved Communities

“The evolution of big data has exposed gaps in EU competition, consumer protection and data protection policies,” said Peter Hustinx, the European Data Protection Supervisor (EDPS), when presenting the EDPS’s preliminary opinion on the interplay between these three policy areas.  The Opinion, titled “Privacy and Competitiveness in the Age of Big Data” and issued on 26 March 2014 (the “Opinion”), aims to stimulate a debate between experts and practitioners.  The EDPS’s preliminary opinions are not legally binding but are intended to inform and facilitate discussion.
Continue Reading The New EDPS Opinion “Privacy and Competitiveness in the Age of Big Data”