Although Google has tightened Play Store app policies over the years, the company thinks it can do more to protect users. One way is to crack down on malware and bad apps on its marketplace, so it has begun reviewing apps before they become available to download. The new policy, similar to Apple's approach on the App Store, has been in effect for a couple of months and uses a mix of algorithms and human review to weed out rogue apps.
But that's not all it's doing. Google has also launched a new rating system for Android apps that will spell out which apps and games are appropriate for certain age groups. It's teamed up with a number of independent bodies including the ESRB, PEGI, USK, ClassInd and the Australian Classification Board, the same groups that classify video games like GTA V before they go on sale.
The idea is to help developers better target users and educate parents about the apps and games their children want to download. The ratings will cover the usual topics: sexual content, violence, drugs, alcohol and gambling. According to Google, users outside the supported countries will see an "age-based, generic rating" assigned after the developer completes a content rating questionnaire. Developers who don't complete the questionnaire could see their downloads blocked in certain regions.
It's a far cry from the early days of Android, when developers could submit whatever they liked. The Play Store is now a $7 billion-plus business, so Google needs to better protect users as hardware gets more powerful and apps normally reserved for desktops and consoles come to mobile devices.