Privacy Impact Assessments

On January 30, 2024, the U.S. Office of Management and Budget (OMB) published a request for information (RFI) soliciting public input on how agencies can be more effective in their use of privacy impact assessments (PIAs) to mitigate privacy risks, including those “exacerbated by artificial intelligence (AI).”  The RFI notes that federal agencies may develop or procure AI-enabled systems from the private sector that are developed or tested using personally identifiable information (PII), or systems that process or use PII in their operation.  Among other things, the RFI seeks comment on the risks “specific to the training, evaluation, or use of AI and AI-enabled systems” that agencies should consider in conducting PIAs of those systems.
Continue Reading OMB Publishes Request for Information on Agency Privacy Impact Assessments

The UK Information Commissioner’s Office (“ICO”), which enforces data protection legislation in the UK, has ruled that the Royal Free NHS Foundation Trust (“Royal Free”), which manages a London hospital, failed to comply with the UK Data Protection Act 1998 when it provided 1.6 million patient records to Google DeepMind (“DeepMind”).  The ICO has required the Royal Free to sign an undertaking committing to changes that will ensure it acts in line with the UK Data Protection Act.

On September 30, 2015, the Royal Free entered into an agreement with Google UK Limited (an affiliate of DeepMind) under which DeepMind would process approximately 1.6 million partial patient records containing identifiable information on persons who had presented for treatment in the previous five years, together with data from the Royal Free’s existing electronic records system.  On November 18, 2015, DeepMind began processing patient records for clinical safety testing of a newly developed platform to monitor and detect acute kidney injury, formalized into a mobile app called ‘Streams’.
Continue Reading ICO Rules UK Hospital-DeepMind Trial Failed to Comply with UK Data Protection Law

On June 16, 2016, the French data protection authority (“CNIL”) launched a public consultation on the General Data Protection Regulation (“GDPR”).  The consultation focuses on four priority themes set out in the Article 29 Working Party’s 2016 Action Plan:

  • the data protection officer;
  • the right to data portability;
  • data protection impact assessments; and
  • certification.

Continue Reading The CNIL and EDPS Launch Public Consultations

The International Association of Privacy Professionals hosted its annual Privacy Academy, at which one panel, “Data Brokers Demystified,” specifically focused on regulation of the data-broker industry.  The panelists included Janis Kestenbaum from the Federal Trade Commission, Jennifer Glasgow from Acxiom, and Pam Dixon from the World Privacy Forum.  Emilio Cividanes from Venable also participated.

Major Conclusions of the FTC Report (Janis Kestenbaum)

  • Data brokers operate with a fundamental lack of transparency.  They engage in extensive collection of information about nearly every US consumer, profiles of which are composed of billions of data elements.
  • Much data collection occurs without consumer awareness and uses a wide variety of online and offline sources, such as social networks, blogs, individual purchases and transactions with retailers, state and federal governments, events requiring registration, and magazine subscriptions.
  • The practice of “onboarding,” in which offline data is attached to an online cookie and used to market to consumers online, is increasingly common.
  • Some data collected is sensitive, but even non-sensitive data is sometimes used to make “sensitive inferences” about (for example) health status, income, education, ethnicity, religion, and political ideology.  Consumers are often segmented into “clusters” based on these inferred characteristics.
  • For regulators, some of these clusters are concerning.  For example, one cluster is entitled “Urban Scramble” and contains high concentrations of low-income ethnic minorities.
  • Congress should create a centralized portal where consumers can go online, visit individual data brokers’ websites, opt out, and access and correct their information.  For consumer-facing entities, like retailers, consumers must be given some kind of choice before data is sold to a data broker, and when that data is sensitive, the choice should be an opt-in.
Continue Reading IAPP Privacy Academy: “Data Brokers Demystified”

The Article 29 Data Protection Working Party (“Working Party”), the independent European advisory body on data protection and privacy, composed of representatives of the data protection authorities of each of the EU member states, the European Data Protection Supervisor (the “EDPS”), and the European Commission, has identified a number of significant data protection challenges related to the Internet of Things.  Its recent Opinion 08/2014 on the Recent Developments on the Internet of Things (the “Opinion”), adopted on September 16, 2014, provides guidance on how the EU legal framework should be applied in this context.  The Opinion complements earlier guidance on apps on smart devices (see InsidePrivacy, EU Data Protection Working Party Sets Out App Privacy Recommendations, March 15, 2013).
Continue Reading Internet of Things Poses a Number of Significant Data Protection Challenges, Say EU Watchdogs

“Data is everywhere. The amount of data on the global level is growing by 50 percent annually. 90 [percent] of the world’s data has been generated within the past two years alone,” explains the International Working Group on Data Protection in Telecommunications in its Opinion of May 6, 2014, titled “Working Paper on Big Data and Privacy: Privacy principles under pressure in the age of Big Data analytics.”  The Working Group, founded in 1983, has adopted numerous recommendations and, since the beginning of the 1990s, has focused on the protection of privacy on the Internet.  Its members include representatives from data protection authorities and other bodies of national public administrations, international organizations, and scientists from all over the world.


Continue Reading Big Data Analysis is Possible Without Infringing Key Privacy Principles, Says International Working Group

The Office of Information and Regulatory Affairs (OIRA) recently released a model Privacy Impact Assessment (PIA) that federal agencies must use before they employ third-party websites and applications to communicate with the public.  The new rules issued by OIRA, an arm of the White House’s Office of Management and Budget (OMB), build on rules the agency issued in June 2010.
Continue Reading OIRA Releases Privacy Impact Assessment for Agency Use of Third-Party Websites

It is no surprise that the 97 comments filed in response to the Department of Commerce’s Green Paper on “Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework” take a range of positions on issues such as the need for federal privacy legislation, the relevance of

Continue Reading Department of Commerce Proposed Privacy Framework: Context Matters