EC wants power to oblige platforms to scan for child abuse images
If it is up to the European Commission, online platforms will be obliged to take more measures against grooming and child abuse images. If they do not do enough, platforms could be required under the proposal to scan for grooming or such images.
Under the new bill from the European Commission, hosting providers and ‘communication service providers’ such as social media and chat services are obliged to identify the risks that children run on their platforms. This specifically concerns the sharing of child abuse images and the approaching of children online, known as grooming. Platforms are also obliged to propose measures that can reduce these risks.
Under the bill, member states will be required to establish authorities that review these platform risk assessments. If an authority finds that a platform still carries too many risks, it can ask a judge ‘or an independent, national authority’ to issue a detection order.
Those detection orders are subpoenas that oblige a platform to scan for known or new images of child abuse, or for grooming. These orders are limited in time, content type and service, states the EC.
The proposal calls for the creation of a new EU Center for Child Sexual Abuse to support the scanning for known images of child abuse. This EU Center must provide platforms with, among other things, ‘reliable information from identified material’. After receiving a child abuse detection order, platforms must scan using the information from the EU Center. “Platforms should use techniques that least infringe privacy, consistent with the latest in the industry,” the European Commission writes.
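The bill does not specify how this matching would work technically. As a rough illustration only: the ‘reliable information from identified material’ could take the form of a set of image fingerprints that uploads are checked against. The sketch below uses plain cryptographic hashes for simplicity; real-world systems of this kind rely on perceptual hashing (such as Microsoft's PhotoDNA), which also matches slightly altered copies. All names and data here are hypothetical.

```python
import hashlib

# Hypothetical indicator set, standing in for the 'reliable information
# from identified material' an EU Center might supply. Real systems use
# perceptual hashes, not exact SHA-256 digests as shown here.
KNOWN_INDICATORS = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True if the upload's digest appears in the indicator set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_INDICATORS

print(matches_known_material(b"example-known-image-bytes"))  # True
print(matches_known_material(b"some-other-upload"))          # False
```

The privacy-relevant point of such a design is that the platform only learns whether an upload matches an indicator; it does not need a copy of the abusive material itself.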
Encryption
Platforms using encryption will also be covered by the proposal, emphasizes the European Commission. “Encryption is an important tool for protecting cybersecurity and the confidentiality of communications. At the same time, its function as a protected channel can be exploited by criminals.” If encrypted services were not included in the proposal, the consequences for children would be severe, the Commission writes.
On the other hand, the Commission says it understands the importance of encryption, especially for children. That is why the Commission says it is supporting research together with companies, organizations and scientists into technical solutions that enable the scanning of child abuse images in encrypted services, ‘with full respect for fundamental rights’.
Automatically scan texts
If a platform under a detection order is obliged to scan for grooming, this platform will have to ‘automatically scan texts’, writes the Commission in the bill. “Often automatic scanning is the only way to detect grooming. The technology used does not ‘understand’ the content of the messages, but looks for known, pre-identified patterns that indicate potential grooming.” In addition to scanning algorithms, human oversight and assessment are also needed, says the Commission.
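The Commission describes this pattern-based scanning only in general terms. A minimal sketch of the idea, with invented placeholder patterns (real systems would use expert-curated patterns and trained models, neither of which is public), might look like this:

```python
import re

# Illustrative placeholder patterns only; these are NOT the
# 'known, pre-identified patterns' the Commission refers to.
PATTERNS = [
    re.compile(r"\bdon'?t tell (your|anyone)\b", re.IGNORECASE),
    re.compile(r"\bhow old are you\b", re.IGNORECASE),
]

def flag_for_review(message: str) -> bool:
    """Flag a message for human review if any pattern matches.

    The scanner does not 'understand' the text; a match only queues
    the message for the human oversight the Commission requires.
    """
    return any(p.search(message) for p in PATTERNS)

print(flag_for_review("How old are you?"))     # True
print(flag_for_review("See you at practice"))  # False
```

Note that a match never results in an automatic sanction in this design; it only routes the message to a human assessor, consistent with the Commission's insistence on human oversight.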
The European Commission recognizes that automated scanning for grooming is ‘more drastic’ than scanning for child abuse images. Stricter conditions should therefore apply to grooming detection orders than to child abuse detection orders, and grooming detection orders should also be of shorter duration.
According to the Commission, there are two reasons why there should be European rules on grooming and child abuse images. First, the current voluntary system is said not to be effective: in recent years, 95 percent of all reports came from Facebook, while there are no indications that other platforms host fewer child abuse images, according to the Commission.
The second reason is that some Member States are now taking their own measures against grooming and child abuse images, forcing companies to comply with different rules in different countries. According to the Commission, this is an undesirable situation in a single European market, which is why it is now proposing uniform rules. The European Parliament and the Member States still need to agree to the regulation before it enters into force.