Apple pulls the plug on iCloud scanner that was supposed to detect child abuse images


Apple is pulling the plug on the photo scanner that was supposed to detect images of child abuse in iCloud users' photo libraries. The company will continue to work on safety features that alert children when they come into contact with such material.

Apple told Wired that it has consulted extensively with experts and that the company believes it can protect children without having to search through user data. The plan to roll out the iCloud photo scanner, which was designed to detect images of child abuse, will therefore not be implemented for the time being.

The American tech company says it wants to focus more on other functions that help combat child abuse, such as the Communication Safety feature. This feature warns children as soon as they are about to send or receive nude photos via the Messages app, and also points them to resources that teach them how to handle such situations. Apple says it wants to eventually extend the feature to videos and open up the technology for use in third-party apps, but it did not give an exact timeframe for these new capabilities. Communication Safety has been available since iOS 15.2, iPadOS 15.2, and macOS 12.1. The feature is opt-in and Apple does not gain access to the images.

Apple announced in the summer of 2021 that it would add functionality to iOS and iPadOS to scan photos that American users upload to iCloud for child abuse imagery. This would be done with a mechanism that was meant to protect the privacy of innocent customers and carry a very low risk of false positives. The plans were heavily criticized by security researchers, privacy activists, and legal experts, among others, who pointed to the long-term risks to privacy and security. Shortly after announcing the plans, Apple decided to postpone the introduction of the feature to a date to be determined later.

