WhatsApp head says Apple's child safety update is a 'surveillance system'

It's not the first time the two companies have clashed on privacy.

WhatsApp logo displayed on a phone screen (Photo illustration by Jakub Porzycki/NurPhoto via Getty Images)

One day after Apple confirmed plans for new software that will allow it to detect images of child abuse in users’ iCloud Photos, Facebook’s head of WhatsApp, Will Cathcart, says he is “concerned” by the plans.

In a thread on Twitter, Cathcart called it an “Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control.” He also raised questions about how such a system could be exploited in China or other countries, or abused by spyware companies.

A spokesperson for Apple disputed Cathcart's characterization of the software, noting that users can choose to disable iCloud Photos. Apple has also said that the system is trained only on a database of “known” images provided by the National Center for Missing and Exploited Children (NCMEC) and other organizations, and that it wouldn’t be possible to make it work in a region-specific way since it’s baked into iOS.

It’s not surprising that Facebook would take issue with Apple’s plans. Apple has spent years bashing Facebook over its record on privacy, even as the social network has embraced end-to-end encryption. More recently, the companies have clashed over Apple privacy updates that have hindered Facebook’s ability to track its users, changes Facebook has said will hurt its advertising revenue.
