Facebook parent company Meta is reviewing whether to allow former President Donald Trump, who was suspended following the Jan. 6 Capitol riots in 2021, back on the platform.
Meta President of Global Affairs Nick Clegg said the company would review its decision to suspend Trump for encouraging the Capitol riots through Facebook posts. Clegg spoke at an event for Semafor, an upcoming media startup, in Washington, D.C., on Thursday, according to Deadline.
“When you make a decision that affects the public realm, you need to act with great caution. You shouldn’t throw your weight about,” Clegg told Semafor editor Steve Clemons, according to Medium.
Trump received an "indefinite" suspension in January 2021, immediately after the riots at the Capitol in Washington, D.C. That June, Meta formalized the penalty as a two-year suspension, barring the former president from using his account.
“Given the gravity of the circumstances that led to Mr. Trump’s suspension, we believe his actions constituted a severe violation of our rules which merit the highest penalty available under the new enforcement protocols,” Clegg said in a June 2021 blog post.
The suspension was set to lift January 7, 2023 — but Clegg said that is dependent on a few factors.
Clegg said Facebook will “look to experts to assess whether the risk to public safety has receded” by the time Trump’s suspension is set to be lifted. The company will also evaluate several “external factors,” Clegg said, “including instances of violence, restrictions on peaceful assembly and other markers of civil unrest,” when making its decision.
Facebook was not the only platform to restrict Trump following Jan. 6. The ban also extended to Instagram, where the former president had amassed 56 million followers, and Twitter imposed a ban of its own.
Twitter permanently suspended Trump's account on Jan. 8, 2021, citing "the risk of further incitement of violence." Trump has said he "probably" would not return to Twitter if allowed to do so, but he has not specifically addressed Facebook or Instagram.
"If we think there is content on our platform which will lead to real-world harm — physical harm — then we feel we have a clear responsibility to act against it," Clegg said at the Semafor event.