3 ways Apple is working to prevent child abuse


Apple will soon take a series of measures to prevent child abuse. The software behind them will first roll out in the United States and expand to other countries later. What exactly is Apple going to do?

1. Scan photos for sensitive material

Starting this fall, Apple will take various measures against the spread of illegal images. New software compares hashes (digital fingerprints) of photos stored in iCloud with the database of known child abuse images maintained by the US National Center for Missing and Exploited Children (NCMEC). According to Apple, the chance of incorrectly flagging an account is one in a trillion per year.

If the software detects multiple suspicious photos in an iCloud account, Apple can decide to review that user’s photos: only at that point are human reviewers involved. And if something does indeed turn out to be wrong, the authorities can be informed and will handle the case from there.
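To get a feel for how threshold-based hash matching works, here is a minimal sketch in Swift. Every name in it is invented for illustration: Apple’s real system uses a perceptual hash called NeuralHash plus cryptographic machinery (private set intersection and threshold secret sharing) so that nothing is learned about accounts below the threshold, and none of that is reproduced here.

```swift
import Foundation

/// Hypothetical stand-in for a perceptual hash. A real perceptual hash
/// is robust to resizing, cropping and re-encoding; this placeholder is not.
func perceptualHash(of imageData: Data) -> String {
    imageData.base64EncodedString()
}

/// Known-bad hashes as they might be distributed in a database.
/// In Apple's design, this database ships inside the operating system.
func loadKnownHashDatabase() -> Set<String> {
    [] // placeholder
}

let knownHashes = loadKnownHashDatabase()

/// Hypothetical number of matches required before any human review.
let reviewThreshold = 30

/// Returns true once a photo library crosses the review threshold.
func libraryNeedsHumanReview(_ photos: [Data]) -> Bool {
    let matchCount = photos
        .map { perceptualHash(of: $0) }
        .filter { knownHashes.contains($0) }
        .count
    // A single stray match is not enough; only accounts that accumulate
    // several matches are escalated to human reviewers.
    return matchCount >= reviewThreshold
}
```

The threshold is the important design choice: the claimed false-positive rate per account is so low because several independent hash collisions would all have to occur in one library before anything is escalated.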

2. Send or receive sensitive messages? Apple warns

It’s not the only measure Apple is taking to protect children. The Messages app (iMessage) on iPhone, iPad and Mac is getting a feature called ‘Communication Safety’. Apple wants to prevent children from receiving or sending sensitive images: on-device artificial intelligence detects sexually explicit content in Messages. Before a child opens or sends such a photo, the photo is blurred and the child is warned that this is not okay. If the child sends or views the content anyway, the parents can receive a notification.
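This describes a warn-first flow: classify the photo on the device, blur it, warn the child, and only involve the parents if the child proceeds. The sketch below shows that control flow in Swift; the classifier, types and function names are all invented stand-ins, since Apple has not published this API.

```swift
import Foundation

/// Hypothetical on-device classifier; Apple has not published the real
/// model or API, so this always-negative stub stands in for it.
func isSexuallyExplicit(_ imageData: Data) -> Bool {
    false // placeholder
}

enum PhotoPresentation {
    case showNormally
    case blurWithWarning
}

struct ChildAccount {
    /// Whether the parents have opted in to notifications.
    let parentalNotificationsEnabled: Bool
}

/// Decide how an incoming photo is shown to a child.
func present(_ imageData: Data) -> PhotoPresentation {
    isSexuallyExplicit(imageData) ? .blurWithWarning : .showNormally
}

/// Called only if the child taps through the warning anyway.
func childViewedFlaggedPhoto(on account: ChildAccount) {
    if account.parentalNotificationsEnabled {
        // In the real feature this would trigger a parental notification.
        print("Parents notified.")
    }
}
```

Note that nothing in this flow involves anyone outside the device until the child actively taps through the warning, which matches the article’s description of warning first and notifying second.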

3. Siri intervenes

Finally, there is one more adjustment to Siri and search. When someone searches for objectionable material, the virtual assistant can intervene, for example by pointing to sources where people can get help.
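Mechanically, this is a simple intercept on the search path. A minimal sketch, with an invented term list and invented types (Apple’s actual implementation is not public):

```swift
import Foundation

/// Hypothetical list of flagged search terms; the real list is Apple's.
let flaggedTerms: Set<String> = ["example-flagged-term"]

struct SearchOutcome {
    let results: [String]
    /// Set when the query was intercepted.
    let helpMessage: String?
}

/// Either run the ordinary search or redirect to help resources.
func handleSearch(_ query: String) -> SearchOutcome {
    let normalized = query.lowercased()
    if flaggedTerms.contains(where: { normalized.contains($0) }) {
        // Instead of returning results, surface help resources.
        return SearchOutcome(results: [],
                             helpMessage: "Help and support resources are available.")
    }
    return SearchOutcome(results: ordinarySearch(normalized), helpMessage: nil)
}

/// Placeholder for the ordinary search path.
func ordinarySearch(_ query: String) -> [String] {
    []
}
```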

Software for other purposes

Apple therefore wants to scan for images of child abuse, and says privacy remains of paramount importance. And while Apple undoubtedly has the best intentions, not everyone is happy with the developments. What happens, for example, if you take cute pictures of your children in the bath?

Security experts also warn that the software could in principle be used to intercept other prohibited images as well, for example extremist content such as terrorist material. The question then becomes: who decides what counts as prohibited content? The fear is that Apple could be pressured by governments that do not take human rights very seriously.

These features will come in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey later this year.

What do you think? Should Apple be allowed to check photos for child abuse images?
