Apps

On January 27, 2020, the French Supervisory Authority (“CNIL”) issued guidance for developers of websites and applications which sets out the main principles of the General Data Protection Regulation (“GDPR”), expounds on their application in the online environment, and gives practical tips to help developers respect users’ privacy when deploying websites and apps.

The guidance consists of 17 recommendations, each covering a key principle supported by additional advice and examples.  Below, we list all 17 of these recommendations and provide a brief summary of the CNIL’s advice related to each.

Germany recently enacted a law that enables state health insurance schemes to reimburse costs related to the use of digital health applications (“health apps”), but the law requires the Federal Ministry of Health to first develop the reimbursement process for such apps.  Accordingly, on January 15, 2020, the German government published a draft regulation setting out that process.

Researchers at Carnegie Mellon University have designed a website that doles out grades to Android apps based on their privacy practices. The website, privacygrade.org, assigns grades based on a model that measures the gap between people’s expectations of an app’s behavior and how the app actually behaves. The grades range from A+, representing no privacy concerns, to D, representing many concerns.

To determine its grades, the Carnegie Mellon model relies on both static analysis and crowdsourcing. In the static analysis component, Carnegie Mellon’s software analyzes what data an app uses, why it uses such data, and how that data is used. For example, the software assessed whether an app used location data, whether that location data was used to provide location features (such as a map app), or whether that location data was used to provide the user with targeted advertising (or for other purposes). In the crowdsourcing component, Carnegie Mellon solicited user privacy expectations for certain apps. For example, researchers asked whether users were comfortable with or expected a certain app to collect geolocation information. Where an app collected certain information and users were surprised by that collection, the surprise was represented in the model as a penalty to the app’s overall privacy grade.
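The researchers’ exact scoring formula is not given here, but the mechanism described above (penalizing an app when its observed data collection diverges from crowdsourced expectations) can be sketched as follows. The data, penalty weight, and grade cutoffs below are hypothetical illustrations, not CMU’s actual model:

```python
# Hypothetical expectation-gap privacy score; data, weight, and cutoffs
# are illustrative only.

# Static-analysis result: what the app collects, and for what purpose
# (None means the data type is not collected at all).
app_uses = {"location": "targeted advertising", "contacts": None}

# Crowdsourced expectations: fraction of surveyed users comfortable with
# this kind of app collecting each data type.
user_comfort = {"location": 0.3, "contacts": 0.9}

def privacy_score(uses, comfort):
    """Start from a perfect score and subtract a penalty proportional to
    user surprise (1 - comfort) for each data type actually collected."""
    score = 1.0
    for data_type, purpose in uses.items():
        if purpose is None:
            continue  # not collected, so no surprise and no penalty
        surprise = 1.0 - comfort.get(data_type, 0.5)
        score -= 0.5 * surprise  # 0.5 is an illustrative penalty weight
    return max(score, 0.0)

def letter_grade(score):
    """Map the numeric score onto the A+ (no concerns) to D (many) scale."""
    if score >= 0.95:
        return "A+"
    if score >= 0.85:
        return "A"
    if score >= 0.70:
        return "B"
    if score >= 0.50:
        return "C"
    return "D"

print(letter_grade(privacy_score(app_uses, user_comfort)))  # prints "C"
```

Here the app collects location for advertising while only 30% of users expect that, so the large surprise term drags the grade down; an app whose collection matched expectations would keep its A+.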

In May 2014, the Global Privacy Enforcement Network (“GPEN”) performed its second Global Privacy Sweep, in which 26 privacy enforcement authorities from 19 countries downloaded 1,211 mobile apps and assessed their privacy practices. On September 10, 2014, the Office of the Privacy Commissioner of Canada (“OPC”) published the results of the Sweep (the “OPC Report”). The main findings can be summarized as follows:

  • While most apps provided some privacy information, only 15% clearly explained the app’s privacy practices.
  • 30% of the apps tested provided no privacy communications to users—such as a link to or information about the app’s privacy policy—other than communications requesting access to information (referred to as “permissions”).
  • Nearly 60% of the apps tested raised privacy concerns before the app was downloaded—meaning that there was not enough information available prior to download for potential users to adequately assess or review the app’s privacy policies.
  • 43% of the apps reviewed did not tailor privacy communications to small screens such as those present on smartphones and tablets.
  • 31% of the apps requested access to more information than necessary, based on GPEN’s understanding of the app’s functionality. Of the types of data requested, location was the most popular, followed by device IDs.


The Federal Trade Commission (“FTC”) announced on Thursday, September 4 that Google has agreed to settle charges and refund no less than $19 million to consumers whose children were allegedly deceived into making mobile purchases through the Android app store.

Google offers thousands of apps for free or a specific dollar amount through its Google Play Store, which is preloaded on Android mobile devices.  In many children’s game apps, after installation, children may purchase virtual items within an app — “in-app charges.”

By: Nora Diamond

The Federal Trade Commission (“FTC”) brought suit last week against Amazon.com for allegedly collecting unauthorized in-app charges in connection with children’s apps. The FTC alleges that, by failing to require the account holder to enter a password before allowing a charge, Amazon unfairly billed parents for millions of dollars in unauthorized purchases.

This week, the Senate Judiciary Subcommittee on Privacy, Technology and the Law held a hearing to discuss the Location Privacy Protection Act of 2014, a bill reintroduced in March by Senator Al Franken (D-MN).  Most concerned with the potential for misuse and abuse of location data for purposes of stalking and perpetrating domestic violence, Senator Franken, who chairs the Subcommittee on Privacy, made clear at the hearing his view that, “Stalking apps must be shut down.”  Franken clarified, however, that his bill is not only intended to protect victims of stalking, but provides basic privacy safeguards for sensitive location information pertaining to all consumers.  Most critically, Senator Franken suggested that because location data lacks sufficient legislative protection, some of the most popular apps used widely by average consumers have been found to disclose users’ precise location to third parties without obtaining user permission.  Further, he noted that in light of stalking apps that are deceptively labeled as something else, such as “parental monitoring,” it is necessary to create a law with basic rules for any service that collects location information.

The witnesses representing law enforcement, federal agencies, and consumer-advocacy and anti-domestic violence groups gave testimony sharing Senator Franken’s concerns, and also suggested that industry self-regulation in this area so far has not been consistent or transparent.  Jessica Rich, Director of the Federal Trade Commission’s Bureau of Consumer Protection, for example, noted that, broadly speaking, while many industry groups and individual companies purport to adopt the opt-in model as a best practice, enforcement experience has shown that companies regularly fail to comply with that standard.

In response, witnesses representing industry largely rejected the notion that legislation like Senator Franken’s is needed at this time.  Expressing particular worry that laws and regulations are inflexible and can quickly become outdated in the face of rapidly evolving technologies, Lou Mastria, Executive Director of the Digital Advertising Alliance (“DAA”), testified that innovation is better served by self-regulation, which can adapt to new business models because it is more “nimble” than government regulation, as subcommittee ranking member Senator Jeff Flake (R-AZ) phrased it.  Mr. Mastria pointed to the DAA’s Self-Regulatory Principles as an effective framework for self-regulation.  Sally Greenberg, Executive Director of the National Consumers League, however, contested the usefulness of DAA’s code, calling it weak, “full of holes,” and “late to the game,” especially in the face of her view that there is “monumental evidence that self-regulation is not working.”

Today, the Federal Trade Commission announced settlements with two mobile app makers that allegedly failed to provide reasonable security for the personal information collected in connection with their apps.  In complaints against Credit Karma, Inc. and Fandango LLC, the FTC alleged that both companies’ apps failed to validate SSL certificates, a security shortcoming that could have allowed an attacker to connect to the app—and collect unencrypted sensitive information—by presenting an invalid certificate.  (This type of attack is sometimes called a “man-in-the-middle attack.”)  Both respondents agreed to 20-year consent orders requiring, among other things, that they establish comprehensive information security programs. 
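The validation failure the FTC describes can be made concrete with Python’s standard `ssl` module. This is an illustration of the general flaw, not the respondents’ actual (mobile) code: a default context verifies the server’s certificate chain and hostname, while the insecure pattern disables both checks, so a man-in-the-middle presenting an invalid certificate would be accepted and could read the supposedly encrypted traffic.

```python
import ssl

# Secure default: verify the certificate chain against trusted CAs and
# check that the certificate matches the hostname being connected to.
secure_ctx = ssl.create_default_context()

# The flaw alleged in the complaints, in miniature: validation disabled,
# so a TLS handshake succeeds even if an attacker presents an invalid
# (e.g., self-signed) certificate for the server being impersonated.
insecure_ctx = ssl.create_default_context()
insecure_ctx.check_hostname = False      # must be disabled first
insecure_ctx.verify_mode = ssl.CERT_NONE # accept any certificate
```

With the insecure context, an attacker on the same network (for example, an open Wi-Fi hotspot) can interpose between app and server, which is exactly the exposure the FTC cited for credit-report and payment-card data.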

These cases are important for a number of reasons: they reinforce past FTC guidance on the importance of performing security reviews and testing, overseeing service providers, and providing channels whereby security researchers can report vulnerabilities.  But what might be most notable is that in neither case does the FTC specifically allege that the respondent’s practices were “unfair” within the meaning of Section 5 of the FTC Act.  Instead, both cases appear predicated upon the FTC’s authority to take action against companies engaged in “deceptive” practices.

The Federal Trade Commission (“FTC”) recently announced a $32.5 million settlement with Apple, Inc. over allegations that the company billed parents and other account holders for children’s in-app activities without obtaining the account holders’ express and informed consent. The FTC’s complaint alleged that Apple’s failure to obtain express and informed consent prior to each in-app purchase constituted an unfair act or practice in violation of Section 5 of the FTC Act.

The FTC’s allegations stemmed from an App Store feature, disclosed in Apple’s Terms and Conditions, that allowed in-app purchases for up to fifteen minutes without requiring password re-entry after the user completed a password-requiring transaction. The FTC complaint alleged that this feature allowed children who were given possession of mobile devices after an initial password entry to incur charges for up to fifteen minutes without parental or account holder knowledge.
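Mechanically, the window described in the complaint behaves like a timed authorization cache. A minimal sketch of that behavior, with hypothetical names rather than Apple’s implementation:

```python
import time

WINDOW_SECONDS = 15 * 60  # the fifteen-minute window described in the complaint

class PurchaseGate:
    """Sketch of the alleged behavior: one successful password entry
    authorizes all further in-app purchases until the window lapses."""

    def __init__(self):
        self._last_auth = None  # time of the last successful password entry

    def password_entered(self, now=None):
        self._last_auth = time.monotonic() if now is None else now

    def purchase_allowed(self, now=None):
        if self._last_auth is None:
            return False  # no password has ever been entered
        now = time.monotonic() if now is None else now
        return now - self._last_auth <= WINDOW_SECONDS
```

Because `purchase_allowed` never re-prompts within the window, any purchases a child makes in those fifteen minutes bill the account holder without further consent, which is the gap the FTC’s unfairness theory targeted.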