
Joshua Gray

Joshua Gray is a commercial, privacy and technology lawyer focusing on digital health, technology and data-driven transactions and regulation.

While the EU Directive on Unfair Terms in Consumer Contracts prohibits certain clauses in standard (i.e., unilaterally imposed) contracts between businesses and consumers, some recently enacted EU laws restrict the use of certain clauses in standard contracts between businesses (“B2B”).  The Data Act is the latest example: it prohibits certain “unfair contractual terms” (“Unfair Clauses”) in standard B2B contracts relating to access to and use of data.  The prohibition therefore has a potentially very wide scope.  Businesses entering into such contracts should ensure that they do not include any clause that could be considered “unfair”, because such a clause would not be binding on the other party.  This blog post focuses specifically on the Data Act’s provision on Unfair Clauses.  For more information on the Data Act, see our previous blog post.

On 25 May 2018, the EU General Data Protection Regulation (GDPR) came into effect. The GDPR establishes some of the most robust privacy requirements globally and is likely to serve as a model for other jurisdictions. Airlines are uniquely affected by the GDPR, with passenger data at the heart of their business and international operations. As new technologies allow airlines to pursue new and innovative uses of customer data, it is imperative that they continue to conduct their operations with GDPR compliance in mind, particularly given the financial and reputational consequences that can arise from failing to meet the GDPR’s strict requirements.

Below are 5 key issues for airlines to consider in relation to the GDPR post-implementation.

Designing data-driven products and services in compliance with privacy requirements can be a challenging process.  Technological innovation enables novel uses of personal data, and companies designing new data-driven products must navigate new, untested, and sometimes unclear requirements of privacy laws, including the General Data Protection Regulation (GDPR).  These challenges are often particularly acute for companies providing products and services leveraging artificial intelligence technologies, or operating with sensitive personal data, such as digital health products and services.

Recognising some of the above challenges, the Information Commissioner’s Office (ICO) has commenced a consultation on establishing a “regulatory sandbox”.  The first stage is a survey to gather market views on how such a regulatory sandbox might work (the “Survey”).  Interested organisations have until 12 October to reply.

The key feature of the regulatory sandbox is that it would allow companies to test ideas, services and business models without risk of enforcement, and in a manner that facilitates greater engagement between industry and the ICO as new products and services are developed.

The regulatory sandbox model has been deployed in other areas, particularly in the financial services sector (see here), including by the Financial Conduct Authority in the UK (see here).

Potential benefits of the regulatory sandbox include reducing regulatory uncertainty, enabling more products to be brought to market, and reducing the time to do so, while ensuring appropriate protections are in place (see the FCA’s report on its regulatory sandbox here for the impact it has had on the financial services sector, including lessons learned).

The ICO indicated earlier this year that it intends to launch the regulatory sandbox in 2019 and will focus on AI applications (see here).

Further details on the scope of the Survey are summarised below.

On 13 September, the Information Commissioner’s Office (ICO) published draft guidance on contracts and liabilities between controllers and processors under the GDPR (the “Guidance”).  The ICO is consulting on the Guidance until 10 October.  We summarize the key aspects of the Guidance below.

The UK Information Commissioner’s Office (“ICO”), which enforces data protection legislation in the UK, has ruled that the NHS Royal Free Foundation Trust (“Royal Free”), which manages a London hospital, failed to comply with the UK Data Protection Act 1998 when it provided 1.6 million patient records to Google DeepMind (“DeepMind”).  The ICO has required the Royal Free to sign an undertaking committing to changes to ensure it acts in line with the UK Data Protection Act.

On September 30, 2015, the Royal Free entered into an agreement with Google UK Limited (an affiliate of DeepMind) under which DeepMind would process approximately 1.6 million partial patient records, containing identifiable information on persons who had presented for treatment in the previous five years, together with data from the Royal Free’s existing electronic records system.  On November 18, 2015, DeepMind began processing patient records for clinical safety testing of a newly developed platform to monitor and detect acute kidney injury, formalized as a mobile app called ‘Streams’.