Apple employees criticize photo scan for detecting child abuse

Apple’s photo-scanning functionality for detecting child abuse material is drawing fierce criticism from its own ranks, according to a Reuters report. The technology has reportedly been the subject of intense discussion among employees since it was announced.

Sources within the company who wished to remain anonymous told Reuters that “more than 800 messages have been exchanged in Apple’s internal Slack about the plan announced last week.” Many employees reportedly expressed concerns about how the functionality could be used by authoritarian regimes to censor and arrest opponents.

Apple announced the new functionality for iOS and iPadOS earlier this month. It concerns so-called CSAM (child sexual abuse material) detection, with which Apple wants to identify imagery depicting child abuse. The photo scan will compare hashes of local photos on iPhones and iPads against hashes from a database of known child abuse photos. Apple has promised to use the technology only to detect child abuse, but opponents are not convinced.
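To make the hash-comparison idea concrete, here is a minimal sketch in Python. It is purely illustrative: the database contents and function names are hypothetical, and it uses an ordinary SHA-256 file hash, whereas Apple has described its actual system as using a perceptual hash (NeuralHash) combined with an encrypted on-device matching protocol.

```python
import hashlib
from pathlib import Path

# Hypothetical placeholder for a database of known-material hashes.
# Apple's real system matches perceptual hashes, not raw file digests.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large photos don't have to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_photos(photo_dir: Path) -> list[Path]:
    """Return the photos whose hash appears in the known-hash set."""
    return [p for p in photo_dir.glob("*.jpg") if file_hash(p) in KNOWN_HASHES]
```

Note that a cryptographic hash like SHA-256 changes completely if even one pixel changes; that is why the real system relies on a perceptual hash, which is designed to keep matching an image after resizing or re-encoding.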

According to the sources Reuters spoke to, the internal discussion is more intense than when Apple has introduced new security measures in the past. Beyond concerns about what will happen if the technology falls into the wrong hands, some employees also fear that this choice will damage Apple’s reputation for protecting privacy.

Several privacy organizations have already expressed their concerns about the scanning and have written a letter calling on Apple to abandon its plans for the photo scan. Apple declined to comment to Reuters.
