Apple employees criticize photo scanning for child abuse material

Apple’s photo-scanning functionality for detecting child abuse material has come under fierce criticism from within the company, according to a Reuters report. Employees have been discussing the technology extensively since it was announced.

Sources within the company who wished to remain anonymous told Reuters that “more than 800 messages were exchanged in Apple’s internal Slack regarding the plan announced last week.” Many employees are said to have expressed concerns about how the functionality could be used by authoritarian regimes to censor and arrest opponents.

Apple announced the new functionality for iOS and iPadOS earlier this month. The feature, known as CSAM detection, is intended to identify child sexual abuse material. The photo scan compares hashes of local photos on iPhones and iPads against hashes from a database of known child abuse images. Apple has promised to use the technology only for detecting child abuse, but opponents are not reassured.
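As a rough illustration of the hash-matching idea only, here is a minimal Swift sketch. It is not Apple’s actual implementation: the announced system relies on a perceptual “NeuralHash” and a private set intersection protocol rather than a plain cryptographic lookup, and the database loader and function names below are hypothetical.

```swift
import Foundation
import CryptoKit

// Simplified sketch: Apple's announced design uses a perceptual
// "NeuralHash" plus cryptographic blinding, so the device never sees
// the database in the clear. A plain SHA-256 set lookup, as shown
// here, only conveys the general idea of on-device matching.

/// Hypothetical loader for the set of known hashes. In Apple's design
/// the database ships inside the OS in a blinded form.
func loadKnownHashDatabase() -> Set<String> {
    return [] // placeholder contents
}

/// Returns true if the photo's hash appears in the known-hash set.
/// SHA-256 stands in here for the perceptual hash used in practice.
func matchesKnownDatabase(_ photoData: Data, against knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Conceptual usage: each local photo would be checked on-device
// before upload to iCloud Photos.
let knownHashes = loadKnownHashDatabase()
let photo = Data() // stand-in for real image bytes
if matchesKnownDatabase(photo, against: knownHashes) {
    print("Match: in Apple's design this would produce a safety voucher.")
}
```

One reason the real system avoids a plain cryptographic hash is that SHA-256 only matches byte-identical files; a perceptual hash like NeuralHash is designed to survive resizing and recompression.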

According to sources who spoke to Reuters, the internal discussion is more heated than when Apple has introduced new security measures in the past. Beyond concerns about what happens if the technology falls into the wrong hands, some employees worry that the decision will damage Apple’s reputation for protecting privacy.

Several privacy organizations have already expressed their concerns about the scanning and written a letter calling on Apple not to go ahead with its plans for the photo scan. Apple declined to comment to Reuters.
