Tech companies must remove violent terrorist content faster


Bennie G. Thompson, a Democratic US Congressman and chairman of the Committee on Homeland Security, has urged four major tech companies to remove violent terrorist content from their platforms more quickly.

Thompson has written a letter to the top executives of Facebook, YouTube, Twitter and Microsoft complaining about how slowly these companies removed the violent video recorded by the perpetrator of the recent attack in Christchurch. The terrorist attacked two mosques in the New Zealand city on Friday, killing 50 people. Thompson believes the companies need to get better at removing this type of content quickly and expects to receive a briefing from them on March 27.

The committee chair said the terrorist “exploited the platforms of these tech companies to spread a horrific video of mass violence around the world.” He emphasized the importance of prioritizing the removal of such sensitive, violent content, pointing to the risk that others could be inspired by such videos, which he says could lead to the “next act of violence.”

The Congressman notes that the companies in question report data about deleted terrorist content on the website of a joint working group set up to prevent the distribution of such material. Thompson argues that while Facebook claims to remove 99 percent of ISIS- and Al Qaeda-related content proactively, before users report it, that kind of transparency is lacking when it comes to other violent extremists, such as those from the far right.

Thompson points to reports that Facebook had to be alerted by the New Zealand police to the presence of the Christchurch terrorist’s video, rather than the platform’s own algorithms detecting it. He also references a report that YouTube was unable to contain the flood of reposts uploaded in the 24 hours following the attack. Thompson acknowledges that the influx of this video content has since been limited, but he speaks of “systematic errors” that allowed the distribution of this terrorist content on YouTube, errors that, according to him, have still not been resolved.

Facebook says the original live video of the Christchurch terrorist was viewed no more than 4,000 times in total before being removed from its platform. The live broadcast itself was viewed only about two hundred times, according to Facebook. The company also states that in the 24 hours after the attack, it deleted about 1.5 million videos of the attack worldwide, roughly 1.2 million of which were blocked during upload.
