Google has announced that two of its latest privacy-enhancing technologies (PETs), including one that blurs objects in a video, will be provided to anyone for free via open source. The new tools are part of Google’s Protected Computing initiative designed to transform “how, when and where data is processed to technically ensure its privacy and safety,” the company said.
The first is an internal project called Magritte, now out on GitHub, which uses machine learning to detect objects and apply a blur as soon as they appear on screen. It can disguise arbitrary objects like license plates, tattoos and more. “This code is especially useful for video journalists who want to provide increased privacy assurances,” Google wrote in a blog post. “By using this open-source code, videographers can save time in blurring objects from a video, while knowing that the underlying ML algorithm can perform detection across a video with high-accuracy.”
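The idea behind such a tool is straightforward: an object detector produces bounding boxes per frame, and the tool obscures the pixels inside them. Below is a minimal sketch of the blur step only, with hypothetical boxes supplied by hand rather than by an ML detector, and a hand-rolled box blur instead of whatever Magritte actually uses:

```python
import numpy as np

def blur_regions(frame, boxes, k=15):
    """Blur each (x, y, w, h) box in a frame, e.g. a detected plate or tattoo.

    A real pipeline would get the boxes from an object detector run on every
    frame; here they are hypothetical and supplied by hand.
    """
    out = frame.copy()
    for x, y, w, h in boxes:
        region = out[y:y + h, x:x + w].astype(float)
        pad = k // 2
        # Simple box blur: average each pixel over a k x k neighborhood.
        padded = np.pad(region, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
        blurred = np.zeros_like(region)
        for dy in range(k):
            for dx in range(k):
                blurred += padded[dy:dy + region.shape[0], dx:dx + region.shape[1]]
        out[y:y + h, x:x + w] = (blurred / (k * k)).astype(frame.dtype)
    return out

frame = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
anonymized = blur_regions(frame, [(40, 30, 50, 20)])  # one hypothetical box
```

Running the detector per frame, as the quote describes, is what lets the blur track an object for the whole video rather than requiring manual keyframing.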
The other, with the unwieldy name “Fully Homomorphic Encryption (FHE) Transpiler,” allows developers to perform computations on encrypted data without being able to access personally identifiable information. Google says it can help industries like financial services, healthcare and government, “where a robust security guarantee around the processing of sensitive data is of highest importance.”
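To make the “compute on encrypted data” idea concrete, here is a toy sketch using the classic Paillier cryptosystem, which is only *additively* homomorphic (full FHE, as in Google's transpiler, supports arbitrary computations). The primes are far too small for real use; this is illustration only:

```python
import math
import random

def paillier_keygen(p=10007, q=10009):
    # Toy primes -- hopelessly insecure, for demonstration only.
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    mu = pow(lam, -1, n)  # valid because gcd(lam, n) == 1 here
    return (n,), (lam, mu, n)

def encrypt(pub, m):
    (n,) = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    # With generator g = n + 1, g^m mod n^2 simplifies to 1 + m*n.
    return (1 + m * n) * pow(r, n, n2) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    l = (pow(c, lam, n * n) - 1) // n  # Paillier's "L" function
    return l * mu % n

pub, priv = paillier_keygen()
a, b = encrypt(pub, 20), encrypt(pub, 22)
# Multiplying ciphertexts adds the plaintexts -- no decryption needed.
assert decrypt(priv, a * b % (pub[0] ** 2)) == 42
```

A server holding only `pub` could sum encrypted salaries or medical counts and return the encrypted total, never seeing an individual value. That is the security guarantee the quoted passage is referring to, generalized by FHE to arbitrary functions.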
Google notes that PETs are starting to enter the mainstream after being mostly an academic exercise. The White House recently touted the technology, saying “it will allow researchers, physicians, and others permitted access to gain insights from sensitive data without ever having access to the data itself.” Google noted that the US and UK governments both held contests this year to develop PET solutions around financial crime and public health emergencies.