Researchers at Carnegie Mellon University have designed a website that doles out grades to Android apps based on their privacy practices. The website assigns grades based on a model that measures the gap between people’s expectations of an app’s behavior and how the app actually behaves. The grades range from A+, representing no privacy concerns, to D, representing many concerns.

To determine its grades, the Carnegie Mellon model relies on both static analysis and crowdsourcing. In the static analysis component, Carnegie Mellon’s software analyzes what data an app uses, why it uses that data, and how the data is used. For example, the software assesses whether an app uses location data, and whether that location data serves a location feature (such as a map) or is instead used for targeted advertising or other purposes. In the crowdsourcing component, Carnegie Mellon solicited users’ privacy expectations for certain apps. For example, researchers asked whether users were comfortable with, or expected, a certain app collecting geolocation information. Where an app collected certain information and users were surprised by that collection, the surprise was represented in the model as a penalty to the app’s overall privacy grade.
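The expectation-gap idea described above can be sketched in a few lines of code. This is a hypothetical illustration, not Carnegie Mellon’s actual model: the function name, penalty formula, and grade thresholds are all invented for clarity, assuming only that each permission an app uses is penalized in proportion to how many surveyed users were surprised by it.

```python
# Hypothetical sketch of an expectation-gap privacy grade.
# Penalty weights and letter-grade thresholds are illustrative only,
# not the values used by the Carnegie Mellon researchers.

def privacy_grade(permissions_used, surprise_rates):
    """Map an app's permission use and crowd surprise rates to a letter grade.

    permissions_used: set of permission names found via static analysis
    surprise_rates: dict mapping a permission to the fraction of surveyed
        users who were surprised that the app uses it (crowdsourced)
    """
    # Each used permission is penalized in proportion to user surprise;
    # permissions users fully expect contribute nothing.
    penalty = sum(surprise_rates.get(p, 0.0) for p in permissions_used)

    # Illustrative thresholds: no concerns -> A+, many concerns -> D.
    if penalty == 0.0:
        return "A+"
    elif penalty < 0.5:
        return "A"
    elif penalty < 1.0:
        return "B"
    elif penalty < 2.0:
        return "C"
    return "D"

# A map app using location rarely surprises anyone; a game reading
# contacts and location surprises most users and is graded down.
print(privacy_grade({"LOCATION"}, {"LOCATION": 0.05}))
print(privacy_grade({"CONTACTS", "LOCATION"},
                    {"CONTACTS": 0.9, "LOCATION": 0.7}))
```

The key design point the sketch captures is that the same permission can be harmless or concerning depending on context: collecting location is penalized only to the extent that users of that particular app did not expect it.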

The Carnegie Mellon research team analyzed over 1 million apps in all. Among those receiving top grades were popular apps such as Instagram, Facebook, and Virgin Mobile. For those apps, the model identified few or no permissions whose use conflicted with users’ expectations of the app’s behavior.

In all, nearly 1,000 apps received the worst possible rating of D. Many of those receiving the worst grades were children’s games, like My Talking Tom and Fruit Ninja. For instance, the researchers discovered that if a user connected a device running the My Talking Tom app to a computer, the app could delete or modify files on the user’s USB storage, contrary to users’ expectations.

Carnegie Mellon Associate Professor Jason Hong reported to CNN that many apps are intrusive not because the “developers are [ ]evil,” but because app makers piece together portions of computer code to deliver data to advertisers without actually reviewing the code.

More information on Carnegie Mellon’s research is available in its International Conference on Ubiquitous Computing 2012 and Symposium on Usable Privacy and Security 2014 research papers.