Apple promises to use photo scanning only to detect child abuse material


Apple pledges that the photo-scanning feature for user devices it announced last week will be used only to detect known child abuse material. If governments ask it to scan for anything else, Apple says it will refuse.

In the FAQ about the new techniques posted on Monday, Apple writes that it has for years refused government requests that would compromise user privacy: “Let’s be clear: this technology is limited to detecting child abuse images in iCloud, and we will not accept requests from governments willing to expand it.”

The fear that Apple will at some point expand the feature beyond child abuse images has been one of the biggest criticisms from civil rights groups such as the Electronic Frontier Foundation. The Verge points out that Apple has complied with government demands in the past. For example, FaceTime is not available in several countries because it is end-to-end encrypted and local governments do not allow that. The company has also allowed iCloud data from Chinese users to be stored in a data center managed by the Chinese government.

Apple unveiled the method last week as part of several measures to combat the proliferation of child abuse imagery. Apple obtains a database of hashes of child abuse photos from organizations dedicated to protecting children, such as the National Center for Missing and Exploited Children.

That database is then distributed, in encrypted form, to all Macs, iPads, and iPhones, where each device locally compares the hashes against hashes of photos that are about to be uploaded to the iCloud Photos backup service. If the number of matches exceeds a threshold, the device notifies an Apple server. The feature is set to go live in the US sometime this year.
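To illustrate the general idea of on-device matching against a hash database with a reporting threshold, here is a minimal Python sketch. It is a simplification, not Apple's implementation: it uses plain SHA-256 file hashes where Apple uses its NeuralHash perceptual hashing, and it works with a cleartext hash set and an illustrative threshold value, whereas the real system uses an encrypted database, cryptographic safety vouchers, and private set intersection so that matches below the threshold are never revealed to anyone.

```python
import hashlib
from pathlib import Path

# Illustrative threshold; Apple has not tied the real system to this code.
MATCH_THRESHOLD = 30


def hash_photo(path: Path) -> str:
    """Stand-in for a perceptual hash: a plain SHA-256 of the file bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photo_dir: str, known_hashes: set[str]) -> int:
    """Compare hashes of photos queued for upload against the known-hash set."""
    matches = 0
    for path in Path(photo_dir).glob("*.jpg"):
        if hash_photo(path) in known_hashes:
            matches += 1
    return matches


def should_notify_server(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """The device only alerts the server once matches cross the threshold;
    below that, nothing is reported."""
    return match_count >= threshold


if __name__ == "__main__":
    # Hypothetical database of known hashes distributed to the device.
    known = {"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"}
    n = count_matches("/path/to/upload/queue", known)
    print("notify server:" , should_notify_server(n))
```

The threshold is the key design choice in the description above: a single accidental match never triggers a report, only an accumulation of matches does.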
