
New privacy-enhancing technology for everyone

Google Developers
Published: December 22, 2022

Posted by Miguel Guevara, Product Manager, Privacy and Data Protection Office

As more aspects of our daily lives continue to become digitized, the technology industry’s responsibility to keep personal data safe grows. That’s why, for the past decade, we’ve invested in researching and developing a variety of privacy-enhancing technologies (PETs) like Federated Learning, Differential Privacy and Fully Homomorphic Encryption that have the power to analyze large data sets in a provably private way, underpinned by the highest technical standards.
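To make one of these techniques concrete: Differential Privacy is often implemented by adding calibrated noise to aggregate statistics before releasing them. Below is a minimal sketch of the classic Laplace mechanism in Python; the function name and parameters are illustrative, not Google's API or library.

```python
import math
import random

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Illustrative sketch: release a count with epsilon-differential privacy.

    Noise drawn from Laplace(0, sensitivity / epsilon) masks any single
    individual's contribution to the count. Smaller epsilon = more noise
    = stronger privacy.
    """
    scale = sensitivity / epsilon
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    # Inverse-CDF sampling of the Laplace distribution.
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

random.seed(0)  # seeded here only to make the demo reproducible
noisy = dp_count(true_count=1000, epsilon=0.5)
```

With epsilon = 0.5 and sensitivity 1, the noise scale is 2, so the released count is typically within a few units of the truth while still provably hiding any one person's presence.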

PETs ensure Google can provide great experiences like helping you find a restaurant’s most popular dishes or getting better recommendations as you type a message, all while keeping our users’ information anonymous and protected. But broader industry adoption of these technologies has historically been challenging due to barriers to entry, including the large computational resources they require and the complexity and expense of implementing them.

That’s why three years ago we set out to help create a safer ecosystem for every internet user by democratizing access to these privacy technologies, and have made many of our PETs freely available as open-source projects in the years since.

Today we’re sharing several recent PET developments from our engineering team that will give the broader developer community (researchers, governments, nonprofits, businesses and more) new ways to deploy and enhance privacy features in their own work.

New Machine Learning tool efficiently blurs objects

PETs encompass a wide range of applications that can help developers and researchers analyze data without revealing personal info. They can also be used to help protect the identity and security of people in photos and videos online through object blurring – but doing so in an efficient way that doesn’t disrupt the user experience can be challenging due to the computational intensity of the algorithms.

Today, we are happy to announce an open-source version of an internal project, Magritte, which uses Machine Learning (ML) advances to detect objects with low computational resources, and applies a blur to those objects automatically, as soon as they appear on screen. The tool can blur arbitrary objects, like license plates.

This code is especially useful for video journalists who want to provide increased privacy assurances. By using this open-source code, videographers can save time blurring objects in a video, while knowing that the underlying ML algorithm can perform detection across a video with high accuracy.
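As a rough illustration of the blurring step only (this is not the Magritte codebase), here is a minimal NumPy sketch that box-blurs one rectangular region of an image. In a real pipeline an ML detector would emit the bounding boxes; here the box coordinates are supplied by hand.

```python
import numpy as np

def blur_region(image: np.ndarray, box: tuple, kernel: int = 15) -> np.ndarray:
    """Illustrative sketch: blur one rectangular region of a grayscale image.

    `box` is (top, left, bottom, right) in pixel coordinates -- in practice
    it would come from an object detector (e.g. a license-plate model).
    """
    top, left, bottom, right = box
    out = image.copy()
    region = image[top:bottom, left:right].astype(np.float64)
    # Simple box blur: average each pixel over a kernel x kernel neighborhood,
    # replicating edge pixels so the output keeps the region's shape.
    pad = kernel // 2
    padded = np.pad(region, ((pad, pad), (pad, pad)), mode="edge")
    blurred = np.zeros_like(region)
    for dy in range(kernel):
        for dx in range(kernel):
            blurred += padded[dy:dy + region.shape[0], dx:dx + region.shape[1]]
    blurred /= kernel * kernel
    out[top:bottom, left:right] = blurred.astype(image.dtype)
    return out
```

A box blur is the simplest choice here; production systems typically use a Gaussian blur or pixelation, and run the detector and blur per frame so objects stay obscured as they move.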

Improving Fully Homomorphic Encryption transpiler performance

Last year, we publicly released our Fully Homomorphic Encryption (FHE) Transpiler, a promising technology that allows developers to perform computations on encrypted data without being able to access personally identifiable information.
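To illustrate the core idea of computing on encrypted data, here is a toy example using the Paillier cryptosystem, which is additively homomorphic (a much simpler cousin of FHE, which supports arbitrary computation). The tiny fixed primes are for demonstration only, and this code is unrelated to the transpiler's actual implementation.

```python
import math
import random

def paillier_keygen(p: int = 293, q: int = 433):
    """Toy Paillier key generation with tiny fixed primes (demo only)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1  # standard simplification for the generator
    # mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
    x = pow(g, lam, n * n)
    mu = pow((x - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pk, m: int) -> int:
    n, g = pk
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pk, sk, c: int) -> int:
    n, _ = pk
    lam, mu = sk
    x = pow(c, lam, n * n)
    return ((x - 1) // n * mu) % n

pk, sk = paillier_keygen()
c1, c2 = encrypt(pk, 17), encrypt(pk, 25)
# Multiplying ciphertexts adds the underlying plaintexts --
# the party doing the arithmetic never sees 17 or 25.
c_sum = (c1 * c2) % (pk[0] ** 2)
assert decrypt(pk, sk, c_sum) == 42
```

The key property is visible in the last lines: a server holding only `c1` and `c2` can produce a ciphertext of their sum without ever decrypting. FHE extends this to both addition and multiplication, which is what makes general-purpose computation on encrypted data possible.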

As part of our work to expand FHE use cases, we recently introduced new circuit optimizations for our transpiler that result in lower computational cost and time, two of the biggest challenges. Our estimates indicate that circuit size has decreased by 50%, which improves overall speed, and we believe these optimizations will help in industries such as financial services, healthcare and government, where a robust security guarantee around the processing of sensitive data is of the highest importance.

Additionally, Duality Technologies recently announced the first production application of our transpiler, demonstrating the applicability of FHE to general-purpose problems. Duality also integrated a new backend into the transpiler, which lets developers choose among more cryptographic schemes for their applications.

How privacy innovations can help solve global challenges

Just a decade ago, PETs were largely seen as an academic exercise, with many ideas that were still untested. With our dedicated investment and work from engineering teams, we’re now applying these novel data processing techniques across many of our products. In fact, PETs are a key part of our Protected Computing effort at Google, which is a growing toolkit of technologies that transforms how, when and where data is processed to technically ensure its privacy and safety.

But Google is not alone on this journey. Organizations and governments around the world are exploring PET use to help tackle societal challenges, as evidenced by the U.S. and UK governments hosting a contest this year to develop PET solutions that would address financial crime and public health emergencies.

That’s why we continue investing in democratizing access to the PETs we’ve developed, knowing the power they have in helping developers and researchers securely process and protect user data. As we’ve said before, we believe that every internet user in the world deserves world-class privacy, and we’ll continue partnering with organizations to further that goal. We’re excited for new testing and feedback on our open source PETs and look forward to releasing more updates in 2023 and beyond.

Source: developers.googleblog.com