Twitter bans users who replace faces with AI in porn videos


Twitter has announced that it is banning accounts whose users share so-called deepfakes. These are porn videos in which a face is swapped, using machine learning, with that of someone else, for example a celebrity.

In an email to the website Motherboard, a Twitter spokesperson said that any account posting "intimate media" created or distributed without the consent of the people depicted will be banned. This has reportedly already happened to the account '@mydeepfakes', which was suspended within hours of posting face-swapped videos.

Twitter's policy on this front states that sharing intimate photos or videos without the permission of the persons depicted is not allowed and constitutes a violation of their privacy. Deepfake creations are not specifically mentioned, but according to Motherboard they fall within this category. Meanwhile, Discord, Gfycat and Pornhub have announced that they will not tolerate fake porn videos created with artificial intelligence. Motherboard has also asked Reddit what it is doing about this, but the website had still not responded 12 days after the request for comment.

The deepfakes subreddit now has 91,000 followers. Founded in mid-December last year, this subreddit is where Reddit users share their creations. Two weeks ago, a Reddit user published a tool called FakeApp. The creator of the tool said he wants to make FakeApp as user-friendly as possible in the future, so that users only need to select two videos and press a button to swap the faces. The tool uses a neural network and trains it on faces detected in videos to learn facial expressions, lighting and position.
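The article does not describe FakeApp's internals, but the face-detection step it mentions can be illustrated with a short, hedged Python sketch. The example below uses OpenCV's pretrained Haar cascade to collect face crops from a video, the kind of training material a face-swapping network would need; the function name `extract_faces` and its parameters are purely illustrative and are not FakeApp's actual interface.

```python
# Illustrative sketch of the face-detection step such a tool might use.
# This does not reproduce FakeApp's actual pipeline.
import cv2

# OpenCV ships with a pretrained Haar cascade for frontal faces.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def extract_faces(video_path, every_nth_frame=10):
    """Collect fixed-size face crops from a video as candidate training data."""
    faces = []
    capture = cv2.VideoCapture(video_path)
    frame_index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if frame_index % every_nth_frame == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
                # Resize each crop so a network can train on uniform inputs.
                faces.append(cv2.resize(frame[y:y + h, x:x + w], (128, 128)))
        frame_index += 1
    capture.release()
    return faces
```

In a face-swapping workflow, crops gathered this way from two source videos would then be used to train the network that maps one person's expressions, lighting and pose onto the other's face.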

The tool is also sometimes used for purposes other than modifying porn videos, according to Motherboard. For example, a user combined a video of Hitler with images of Argentine President Mauricio Macri. In the past, scientists have used artificial intelligence for similar purposes, such as generating realistic mouth movements from audio. Last year, the Defense Cyber Command also expressed its concerns about the manipulation of live video.
