WhatsApp will not scan photos for child abuse the way Apple does
WhatsApp will not adopt Apple’s method of scanning users’ private photos for child abuse imagery. The head of the WhatsApp division within Facebook has spoken out against the approach.
WhatsApp director Will Cathcart calls the system a ‘setback to privacy’. “This is the wrong approach and a setback to the privacy of people around the world. (…) We have had computers for decades and there has never been a mandate to scan the private files on every desktop, laptop and phone looking for content that is illegal. That’s not how software works in the free world.” Cathcart is especially concerned that governments in countries where iPhones are sold will eventually be able to dictate what the system scans for.
Apple unveiled the method last week as part of several measures to curb the distribution of child abuse images. Apple receives a database of hashes of known child abuse photos from child protection organizations, such as the National Center for Missing and Exploited Children.
That database is then distributed, in encrypted form, to all Macs, iPads and iPhones, where each device locally compares those hashes against hashes of the photos that are about to be uploaded to the iCloud Photos backup service. Only when the number of matches exceeds a threshold does the device notify an Apple server. The feature will launch in the US later this year.
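To illustrate the basic idea of threshold-based hash matching, here is a minimal Python sketch. It is a simplification, not Apple’s implementation: Apple’s system uses its NeuralHash perceptual image hash and a private set intersection protocol, so the server learns nothing about individual photos below the threshold. The `MATCH_THRESHOLD` value, the `KNOWN_HASHES` set, and the use of SHA-256 as a stand-in hash are all illustrative assumptions.

```python
import hashlib
from pathlib import Path

# Hypothetical values for illustration only. Apple's real system uses
# NeuralHash (a perceptual hash) and an encrypted database, not a
# cleartext set of file hashes like this.
MATCH_THRESHOLD = 30
KNOWN_HASHES: set[str] = {
    # hashes supplied by child-protection organizations would go here
}


def file_hash(path: Path) -> str:
    """Stand-in for a perceptual image hash: plain SHA-256 of the file bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(upload_queue: Path) -> int:
    """Count photos queued for upload whose hash appears in the database."""
    return sum(
        1
        for photo in upload_queue.glob("*.jpg")
        if file_hash(photo) in KNOWN_HASHES
    )


if __name__ == "__main__":
    matches = count_matches(Path("upload_queue"))
    # Only once the threshold is crossed would the device notify the server.
    if matches > MATCH_THRESHOLD:
        print(f"{matches} matches: threshold exceeded, notify server")
```

The threshold is the key design choice: below it, no single match is ever reported, which is meant to limit false positives from hash collisions.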
The WhatsApp director is not the first critic of the system. Civil rights organizations, including the Electronic Frontier Foundation and the Center for Democracy & Technology, have already spoken out against Apple’s system.