Facebook Suspends Donald Trump From Its Platform For At Least Two Years
Trump was suspended in January following the deadly U.S. Capitol insurrection.
Facebook announced Friday that it will ban former President Donald Trump from its platform until at least January 2023. After mounting pressure from advisers and years of criticism, the platform is also set to alter its content moderation policy for politicians who violate its rules. The social media giant has faced backlash in recent years for its lack of moderation, particularly before and during the Trump presidency, which culminated in a deadly insurrection and his suspension from the platform.
In a blog post, Nick Clegg, Facebook’s vice president of global affairs, wrote that Trump’s “actions constituted a severe violation of our rules,” and warranted “the highest penalty available.” The penalty, a two-year suspension, means that Trump’s accounts will remain suspended until January 7, 2023, at which point Facebook says it will evaluate whether his accounts may be reinstated; if it finds a reinstatement poses a “serious risk to public safety,” it will extend the suspension.
The Verge first reported Thursday that the Oversight Board, a group of experts that helps oversee some of Facebook’s toughest content decisions, called on the company to change how it moderates posts by people in power on the platform. The board made recommendations for how Facebook should handle “serious risks of harm posed by political leaders” following its decision to uphold Trump’s suspension from the site. In May, the board upheld the company’s decision to suspend Trump but warned that an “indefinite suspension” was “not appropriate.” It gave Facebook six months to decide how long Trump’s suspension should last.
Facebook declined to comment to multiple outlets on the reported content moderation changes.
Trump was suspended from Facebook, Twitter, YouTube, and Snapchat following the January 6 U.S. Capitol insurrection and has not been reinstated on any of those platforms. The former president launched a blog to communicate with his base of followers but shut it down this week after less than a month.
The Oversight Board said Facebook needs to be more transparent with its decision making for political figures.
“While the same rules should apply to all users, context matters when assessing the probability and imminence of harm,” the board wrote. “When posts by influential users pose a high probability of imminent harm, Facebook should act quickly to enforce its rules.”
The board also said Facebook should revise the “newsworthiness” allowance it grants to influential figures who violate the site’s policies, and give an explanation whenever it makes an exemption. According to The Washington Post, Facebook first used the newsworthiness exemption in 2015, when Trump was campaigning and posted a video suggesting the U.S. should ban entry of all Muslims.
In 2019, Clegg confirmed the company exempts politicians from a third-party fact checking system, adding, “From now on we will treat speech from politicians as newsworthy content that should, as a general rule, be seen and heard.”
Despite Facebook’s recent attempts to reduce misinformation and hate speech on its site, the delay in substantial moderation has already had severe consequences; many Trump loyalists plotted the Capitol attack on Facebook, according to multiple reports.
According to The Verge, as part of the board’s recommendations, Facebook also plans to disclose a previously secret strike system it uses when moderating content; the social network will reportedly begin notifying users when they’ve received a strike.
Facebook has a history of picking and choosing which influential figures or pages can remain on the site. According to a BuzzFeed News report in August 2020, Facebook fired a senior engineer who collected data showing the site was helping conservative politicians post false or misleading content without consequence, despite the right wing’s baseless theory that it faces censorship from Big Tech. In late 2020, Facebook banned users perpetuating QAnon, a conspiracy theory movement whose adherents often used the platform to spread baseless claims that some Republican voters and lawmakers have promoted.