Will the Upholding of Facebook’s Trump Ban Have First Amendment Implications?

May 6, 2021

On May 5, 2021, Facebook’s Oversight Board upheld Facebook’s January 2021 ban of former President Trump from Facebook and Instagram for encouraging violence during the January 6, 2021 U.S. Capitol siege. The ban was based on two posts Trump made on Instagram and Facebook that afternoon and evening calling the rioters “great patriots” and “very special.” The Board nonetheless left open the possibility of Trump’s return to Facebook, finding that his indefinite suspension was “standardless” and giving Facebook six months to impose a proportionate penalty based on its existing platform rules. Facebook had asked the 20-member Board to review whether its decision was correct and sought recommendations on future suspensions of political leaders’ accounts.

In its review, the Board agreed with Facebook that Trump’s two posts violated Facebook’s Community Standards and Instagram’s Community Guidelines, which prohibit praise or support of those engaged in violence. Additionally, by continuing his unsupported narrative of electoral fraud and calling people to action while protestors were already at the Capitol, a context of “immediate risk of harm,” Trump’s posts created a “serious risk of violence.” The Board found, however, that rather than imposing an indefinite suspension, Facebook must “apply and justify a defined penalty” based on its rules for severe violations and the risk of future harm. A minority of the Board would have gone further, requiring users who later seek reinstatement of their accounts to actively recognize their wrongdoing and commit to following the rules in the future.

In response to Facebook’s request for recommendations on political leaders’ accounts, the Board emphasized that newsworthiness considerations should not take priority when urgent action is needed to prevent significant harm, and that political leaders need not necessarily be treated differently from other influential users. Among other recommendations, the Board stated that Facebook should publicly explain its rules when it imposes account-level sanctions against influential users and should reassess, before a suspension expires, whether the risk of lawless action still justifies it. High government officials’ accounts should be suspended for a limited time if they have repeatedly created posts that pose a risk of harm under, for example, international human rights norms. The Board also recommended that Facebook escalate political speech from influential users to specialized staff who are familiar with the speech’s context but insulated from undue influence and from political and economic interference.

The Board’s decision will undeniably spur continued debate over a private company’s ability to remove user content and, importantly, over who gets to decide whether to remove it. While the Oversight Board is intended to be independent of Facebook, it is still funded by Facebook. Facebook’s ability to ban a political leader’s speech also raises interesting First Amendment questions, particularly because it is a private actor, rather than the government, regulating the speech of political leaders. At the same time, the Board’s decision implicates Section 230 of the Communications Decency Act, which, as we’ve discussed before here, here, and here, is a hot-button issue for Congress and Trump.

Some characterize the decision as deflecting responsibility back to Facebook, while others wish the Board had imposed clearer standards, at a minimum adopting the Board’s minority view that Facebook should actively seek to prevent the repetition of adverse human rights effects. Still others criticized the decision as an example of Facebook’s enormous power over public expression, fueling further calls for antitrust regulation of the company. In any case, the true impact of the Board’s decision will not be known until Facebook acts in six months and potentially creates new content moderation standards based on the Board’s recommendations. And that decision may reverberate through the entire Internet ecosystem.

It will be crucial to watch how Facebook responds to the Board’s decision and what standards emerge from it in the next few months.

For more information, please contact the professional(s) listed below, or your regular Crowell & Moring contact.

Suzanne Trivette
Associate – New York
Phone: +1.212.895.4312