Yesterday, California Attorney General Kamala Harris continued her efforts to promote privacy best practices in the mobile app ecosystem by issuing a number of recommendations in her report, “Privacy on the Go.” The report encourages app developers, platform providers, ad networks, OS developers, and even mobile carriers to incorporate privacy by design into their products and services and provides detailed suggestions on how to do so. Importantly, the report notes that its recommendations in many cases go beyond what’s currently required by law; they are, for the most part, best practices.
As the report explains, “[t]he basic approach . . . is to minimize surprises to users from unexpected privacy practices.” A practice is “unexpected” when it’s not “related to an app’s basic functionality” or when it involves “sensitive information.” Minimizing surprises means limiting the collection and retention of data that is unrelated to the app’s core functionality; giving users “enhanced notice” (i.e., notice beyond what is provided in the developer’s general privacy policy) of unexpected practices; and giving users control over those practices. (These concepts, if not the precise terminology, will be familiar to those who have read the FTC’s March 2012 report, which recommended that companies provide consumers with robust notice and meaningful choices for practices that were “inconsistent with the context” of a particular transaction or with the company’s relationship with the consumer.)
The report goes on to make a number of specific recommendations that build on these basic propositions. After the jump, we discuss a few that struck us as particularly noteworthy.
Transparency and Choice
- An app’s privacy policy should be available before the app is downloaded. The report notes that the best way to accomplish this is to make the policy available from the app platform (i.e., on the app’s promotion page). FTC staff also urged developers to take this step in the recent report, “Mobile Apps for Kids: Disclosures Still Not Making the Grade.” A more novel recommendation in this area was directed at ad networks, which were urged to provide links to their privacy policies to app developers so that the developers can make those policies available to users “before they download and/or activate the app.” That practice seems less consistent with current industry norms and user expectations.
- Make the app’s “general” privacy policy “readily accessible from within the app.” The report makes clear that a privacy policy is “readily accessible” if it is linked from the app’s controls/settings page. The report also recommends hosting the privacy policy in the browser, so that it can be updated readily if the developer’s practices change. (A minimal sketch of this approach appears after this list.)
- Include key privacy disclosures in the general privacy statement. The report lists several disclosures that should be made in the privacy policy. Several of these reflect familiar requirements of the California Online Privacy Protection Act (“CalOPPA”), but others are less familiar. For example, the report recommends disclosing the “uses and retention period for each type or category of personally identifiable data collected” as well as “[w]hether your app, or a third party, collects payment information for in-app purchases.” The privacy policy should also describe the third parties with whom personally identifiable data may be shared and link to those parties’ privacy policies.
- Provide “enhanced measures” if the app collects “sensitive information” or “personally identifiable data” that are “not needed for basic functionality.” The report defines “personally identifiable data” and “sensitive information” more broadly than these terms are usually defined. “Personally identifiable data” is “any data linked to a person or persistently linked to a mobile device,” while “sensitive information” is “personally identifiable data about which users are likely to be concerned,” including “precise geo-location data; financial and medical information; passwords; stored information such as contacts, photos and videos; and children’s information.” Where the app collects this kind of information for purposes other than basic functionality, the report recommends either (1) providing a “special notice” (i.e., an alert that appears at the time the data is collected) or (2) providing a combination of a “short privacy statement” (i.e., a statement that highlights the “unexpected practices”) and privacy controls that enable the user to make choices about those practices. (A hypothetical example of such a just-in-time notice also appears after this list.)
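On the “readily accessible” point, the sketch below shows one common approach in Kotlin for Android: a settings-screen control opens a web-hosted policy, which the developer can update without shipping a new version of the app. The URL, constant, and function name are hypothetical.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Hypothetical URL for a web-hosted privacy policy. Hosting the policy on the
// web lets the developer revise it without releasing an updated app.
const val PRIVACY_POLICY_URL = "https://www.example.com/privacy-policy"

// Intended to be wired to a "Privacy Policy" entry on the app's settings or
// controls screen, so the policy is "readily accessible from within the app."
// (Call with an Activity context; a non-Activity context would also need
// Intent.FLAG_ACTIVITY_NEW_TASK.)
fun openPrivacyPolicy(context: Context) {
    context.startActivity(Intent(Intent.ACTION_VIEW, Uri.parse(PRIVACY_POLICY_URL)))
}
```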
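And as a rough illustration of the “special notice” option, the hypothetical sketch below shows an alert presented at the moment precise location would be collected for a purpose beyond basic functionality (here, ad targeting). The wording, function name, and callbacks are ours, not the report’s; the point is only the just-in-time timing the report describes.

```kotlin
import android.app.AlertDialog
import android.content.Context

// Hypothetical just-in-time "special notice," shown at the moment precise
// location is about to be collected for a purpose beyond basic functionality.
fun showLocationNotice(context: Context, onAllow: () -> Unit, onDecline: () -> Unit) {
    AlertDialog.Builder(context)
        .setTitle("Share precise location?")
        .setMessage(
            "This app would like to use your precise location to tailor ads. " +
            "This is not needed for the app's core features."
        )
        .setPositiveButton("Allow") { _, _ -> onAllow() }
        .setNegativeButton("Not now") { _, _ -> onDecline() }
        .setCancelable(false)
        .show()
}
```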
Security and Accountability
- Use encryption for personally identifiable data in transit—and in storage. Encrypting certain types of PII in transit has become a common practice, thanks in part to encryption requirements in Massachusetts and Nevada law; encryption of stored data, however, is significantly less common. Given the breadth of the term “personally identifiable data,” many companies may have difficulty complying with this recommendation as it applies to both transmission and storage. The recommendation that ad networks use encryption for the transmission of permanent unique device identifiers seems particularly unlikely to be adopted. (A rough sketch of what storage encryption might involve appears after this list.)
- Put someone in charge of the general privacy policy. The report recommends making someone in the organization responsible for reviewing the privacy policy when practices change; maintaining archived versions of the policy; and acting as a point of contact for privacy questions and comments. Of all the report’s recommendations, this one may be the most important: having a person commit at least some of his or her time to thinking about privacy issues can improve a company’s practices dramatically. The privacy profession has exploded over the past decade, and this endorsement from General Harris signals the value that such professionals have to offer.
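For a sense of what the storage half of the encryption recommendation might involve, here is a minimal Kotlin sketch using AES-GCM from the standard javax.crypto API. It is a simplified, assumption-laden example: in a real app the key would be generated and held in a platform keystore rather than in memory, and the transit half of the recommendation is typically addressed by using HTTPS/TLS for all network calls.

```kotlin
import java.security.SecureRandom
import javax.crypto.Cipher
import javax.crypto.KeyGenerator
import javax.crypto.SecretKey
import javax.crypto.spec.GCMParameterSpec

// Minimal sketch of encrypting a piece of stored personally identifiable data
// with AES-GCM. Key management and error handling are omitted for brevity.
fun encryptForStorage(plaintext: ByteArray, key: SecretKey): Pair<ByteArray, ByteArray> {
    val iv = ByteArray(12).also { SecureRandom().nextBytes(it) }   // 96-bit nonce
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.ENCRYPT_MODE, key, GCMParameterSpec(128, iv))
    return iv to cipher.doFinal(plaintext)                         // persist both values
}

fun decryptFromStorage(iv: ByteArray, ciphertext: ByteArray, key: SecretKey): ByteArray {
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.DECRYPT_MODE, key, GCMParameterSpec(128, iv))
    return cipher.doFinal(ciphertext)
}

fun main() {
    val key = KeyGenerator.getInstance("AES").apply { init(256) }.generateKey()
    val (iv, ct) = encryptForStorage("user@example.com".toByteArray(), key)
    println(String(decryptFromStorage(iv, ct, key)))   // prints the original value
}
```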
The report has already drawn criticism from ad industry groups, which have faulted it for proposing “unworkable” solutions that could create confusion in the industry.