Facebook Bans Deepfakes Ahead Of 2020 Election

But the ban won’t cover all altered videos, despite the misinformation they can spread.

Facebook is banning deepfake videos ahead of the 2020 presidential election—but the ban doesn’t prohibit all doctored videos.

Deepfakes, which have grown in popularity over the past few years, generally take a source video or image and use machine learning techniques to superimpose new content onto it in a way that makes it look real. Though the videos can be used innocently enough to make comedic clips, deepfake videos have also been posted to promote fake news with hoax imagery, and have even been used to create celebrity pornographic videos and revenge porn.
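For readers curious about the underlying mechanics, the sketch below shows the shared-encoder, two-decoder autoencoder design associated with early face-swap tools. It is a minimal illustration only: the layer sizes, class names, and the 64x64 input assumption are hypothetical, not drawn from any production deepfake system.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: compresses any aligned 64x64 face crop to a latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-identity decoder: reconstructs a face from the shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 64, 16, 16)
        return self.net(h)

# One shared encoder, one decoder per identity. Training reconstructs each
# person's faces through their own decoder; the swap happens at inference
# by routing person A's latent code through person B's decoder.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

face_a = torch.rand(1, 3, 64, 64)      # stand-in for an aligned face crop of person A
swapped = decoder_b(encoder(face_a))   # face A rendered with person B's appearance
print(swapped.shape)                   # torch.Size([1, 3, 64, 64])
```

The swap itself is the last two lines: a face from person A is compressed by the shared encoder, then rendered by person B’s decoder, which is what produces B’s appearance with A’s pose and expression.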

To try to curb the spread of fake news and doctored videos on its platform ahead of the 2020 presidential election, Facebook announced on Monday that it is banning videos that are “edited or synthesized” by technologies like AI in ways that average users would not easily spot.

“While these videos are still rare on the internet, they present a significant challenge for our industry and society as their use increases,” Monika Bickert, Facebook’s vice president of global policy management, said in a blog post.

However, the policy will not apply to parody or satire videos. Nor will it apply to videos edited with lesser forms of manipulation, or “shallowfakes,” such as the May 2019 video of House Speaker Nancy Pelosi that was altered to make her appear unwell and slurring her words.

Bill Russo, a spokesman for Joe Biden’s 2020 campaign, criticized the ban on Tuesday, saying it does not go far enough to stop fake news from spreading.

“Facebook’s policy does not get to the core issue of how their platform is being used to spread disinformation,” he said, “but rather how professionally that disinformation is created.”