Google wants to make it easier for other companies to safeguard their users' privacy. In a blog post, the tech giant announced today that it is releasing an open-source version of its differential privacy library. Differential privacy -- still a relatively new form of data science -- has been used by tech companies to mine large amounts of user data without violating individual privacy. In practice, the technique injects random noise into aggregated personal data so that the results of a query cannot be traced back to any individual user.
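Google's library itself is written in C++, but the core idea of noise injection can be illustrated in a few lines. Below is a minimal Python sketch of the Laplace mechanism, the classic way to make a simple count query differentially private; the function name, data, and epsilon value are illustrative, not taken from Google's API. Because counting a user changes the result by at most 1, adding Laplace noise with scale 1/epsilon yields an epsilon-differentially-private answer.

```python
import numpy as np

def dp_count(values, predicate, epsilon):
    """Differentially private count (illustrative sketch, not Google's API).

    A count query has sensitivity 1: adding or removing one user changes
    the true count by at most 1. Adding Laplace noise with scale
    1/epsilon therefore gives epsilon-differential privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical example: how many users visited more than 10 times?
visits = [3, 12, 7, 25, 1, 14]
noisy_answer = dp_count(visits, lambda v: v > 10, epsilon=1.0)
```

The noisy answer is close to the true count of 3 on average, but any single user could plausibly be present or absent, which is what protects individuals.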
But such tools are difficult to build from scratch, as Google acknowledges. "Today, we're rolling out the open-source version of the differential privacy library that helps power some of Google's core products. To make the library easy for developers to use, we're focusing on features that can be particularly difficult to execute from scratch, like automatically calculating bounds on user contributions. It is now freely available to any organization or developer that wants to use it," wrote Google product manager Miguel Guevara.
Google has already announced several open-source privacy projects this year, including Private Join and Compute, which lets companies working together compute over their combined data sets while keeping each party's raw data encrypted. The company also released TensorFlow Privacy, a tool to help developers train machine learning models with privacy protections.
Google already uses differential privacy in many of its own products, like gauging how popular a specific restaurant's dish is on Google Maps or improving Google Fi. Apple uses the technique to obfuscate everything from your browsing history to your HealthKit data. Uber released a differential privacy tool in 2017 to limit how much data its analysts can extract about individual drivers. The idea is that by injecting random noise into everyone's personal data, companies can mine the aggregate information without tracking individuals or otherwise violating their privacy.
Developers can access the open-source differential privacy library on GitHub. Included among the tools in the library is a stochastic tester, which helps developers check that their algorithms actually uphold the privacy guarantees they claim, catching mistakes before they reach production.