Facebook releases tools for detecting harmful video and photo material

Facebook has released the source code of the tools it uses to detect, among other things, child pornography and violent videos and photos. By publishing the source code, Facebook enables other organizations to use the tools as well.

The release consists of two algorithms for detecting harmful photos and videos, called PDQ (for photos) and TMK+PDQF (for videos). According to Facebook, the amount of harmful photo and video content on the internet, such as child pornography or terrorist content, is rising sharply, and the company hopes that open-sourcing these tools will help limit its spread. The source code is available on GitHub, along with an explanation of how the algorithms work.

The software scans a photo or video and compares it against known material. When a match is found, a signal is raised and the content can, for example, be removed. Hashes of the matched videos and photos can also be shared, so that other companies and organizations are alerted as well. Both tools were created by Facebook's own artificial intelligence team.
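PDQ, for instance, is documented as producing a 256-bit perceptual hash that is compared against known hashes by Hamming distance. The sketch below illustrates that matching step in Python; the hex encoding, the hypothetical hash values, and the distance threshold are illustrative assumptions, not Facebook's actual implementation.

```python
from typing import Optional

# Minimal sketch of matching a perceptual hash against a set of known hashes.
# PDQ hashes are 256-bit values; here they are assumed to be hex-encoded strings.
# The threshold of 31 differing bits is illustrative, not an official setting.

def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Count differing bits between two equal-length hex-encoded hashes."""
    return bin(int(hash_a, 16) ^ int(hash_b, 16)).count("1")

def find_match(candidate: str, known_hashes: set[str], threshold: int = 31) -> Optional[str]:
    """Return the first known hash within the distance threshold, if any."""
    for known in known_hashes:
        if hamming_distance(candidate, known) <= threshold:
            return known
    return None

# Example usage: signal a match so the content can be reviewed or removed,
# and so the hash can be shared with other organizations.
known = {"f" * 64}                 # hypothetical database of known-bad hashes
uploaded = "f" * 63 + "e"          # hypothetical hash of an uploaded photo
if find_match(uploaded, known) is not None:
    print("match found: flag content and share the hash with partners")
```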

According to Facebook, the detection tools can process photo and video material in real time and can handle large volumes. This makes the software well suited to social media, for example, where users frequently post videos and photos and large amounts of material have to be scanned.
