View from The Center

The Facebook Oversight Board: How to Make It Actually Work



The Facebook Oversight Board is busy. The board, which officially began operating in October of last year, has 20 part-time members, including well-known academics, activists, and attorneys from numerous countries. (One board member, Helle Thorning-Schmidt, is even a former head of state.) The board has reportedly already received 300,000 appeals from users regarding Facebook content enforcement actions; as of April 14, however, it has rendered only eight decisions. By comparison, the Supreme Court of the United States hears approximately 76 cases per year out of more than 7,000 requests. But the Supreme Court sits as the ultimate appeals court, after cases are adjudicated in 94 Federal District Courts (where approximately 400,000 cases are heard annually) and 13 Federal Courts of Appeals (where approximately 45,000 appeals are heard annually). There are no similar lower-level Facebook boards to hear appeals before they reach the Oversight Board.
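
For a sense of scale, here is a rough back-of-the-envelope sketch (in Python) using only the figures cited above; the six-month operating window is an approximation on our part.

    # Rough back-of-the-envelope comparison of appeal volume and decision
    # throughput, using only the figures cited in this article. The six-month
    # window (October through mid-April) is an approximation.
    appeals_received = 300_000   # appeals reportedly sent to the Oversight Board
    decisions_issued = 8         # decisions rendered as of April 14
    months_operating = 6         # roughly October to mid-April

    appeals_per_month = appeals_received / months_operating
    decisions_per_month = decisions_issued / months_operating

    print(f"Appeals per month:    {appeals_per_month:,.0f}")
    print(f"Decisions per month:  {decisions_per_month:,.1f}")
    print(f"Appeals per decision: {appeals_received / decisions_issued:,.0f}")
    # -> roughly 50,000 appeals arriving for every decision issued, which is
    #    the scaling problem this article describes.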

Facebook recently announced that the board would now also consider user appeals to have other users’ content removed, if Facebook has not already removed such content itself. This policy change will dramatically increase the number of appeals sent to the board and further strain its already limited capacity to issue rulings. The board is thus in danger of becoming further clogged and of being used as a tool to expand censorship, particularly if influential users can simply appeal to have content from less popular users taken down. Faced with such a massive (and growing) number of user appeals, and with no intermediate boards to help manage the caseload, the Oversight Board cannot offer an effective appeals mechanism against censorship without changes to how it operates.

To that end, we at First and Fourteenth Institute (FAFI) would like to propose five policy changes to help the Facebook Oversight Board be more effective.

  1. Transparency: Both the speech content rules and related enforcement actions need to be published in a manner that is clear to all users—and with adequate examples and explanation. Enforcement actions, including shadow-banning or limits on how content can be shared, need to be communicated to the user in question. Appeal rulings should also be transparently published.
  2. Equal Standards for All: Speech content rules and enforcement actions need to be applied to all users, not just some. If a user can point to examples of similar speech content being treated differently, the platform should respond by treating that similar content the same way it treated the content of the user in question.
  3. Commensurate Enforcement Actions: Enforcement actions should be reasonable and commensurate with the violations of the speech rules. Policies for first-time offenders (or for content that warrants a warning rather than enforcement actions) should be part of the published rules. 
  4. Speedy and Fair Due Process to Adjudicate Disputes: Users need immediate access to a written explanation of which specific speech rules were violated by their content, as well as a simple method to dispute the enforcement action taken by the platform. The dispute mechanism needs to have the resources and capability to respond quickly (within days) to the user’s dispute. The platform needs to provide an adequate explanation for the decision that is tied specifically to the speech content rules violated and the enforcement actions taken.
  5. Independent Appeals Process: If a user is not happy with the adjudication results of a dispute, Facebook needs to provide access to a mechanism to appeal the decision to an authority independent of the company itself. The Oversight Board is a fine start; however, this appeals mechanism needs to be multi-layered, with the resources and capability to resolve disputes quickly and fairly. One part-time board processing 300,000 appeals and offering eight decisions in six months is clearly not an answer by itself. (It may be reasonable for some appeals to require a filing fee to discourage frivolous filings; fees can be refunded if the user ultimately succeeds in having an enforcement action reversed. One possible shape of such a process is sketched after this list.)
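
To make the proposals above concrete, here is a minimal, hypothetical sketch of how an enforcement record and a multi-layered, fee-refunding appeal might be modeled. The class names, fee amount, and five-day deadline are illustrative assumptions on our part, not anything Facebook or the Oversight Board has specified.

    # Hypothetical sketch only: models the due-process and appeals ideas in
    # items 4 and 5 above. Names, the fee amount, and the response deadline
    # are illustrative assumptions, not Facebook or Oversight Board policy.
    from dataclasses import dataclass
    from datetime import date, timedelta
    from typing import Optional

    RESPONSE_DEADLINE_DAYS = 5   # "within days" per item 4; exact number assumed
    APPEAL_FEE = 10.00           # refundable filing fee from item 5; amount assumed

    @dataclass
    class EnforcementAction:
        content_id: str
        rule_violated: str       # must cite a specific published speech rule (item 1)
        action_taken: str        # e.g. "removal", "shadow-ban", "sharing limit"
        explanation: str         # written explanation owed to the user (item 4)

    @dataclass
    class Appeal:
        action: EnforcementAction
        filed_on: date
        level: int = 1           # 1 = platform dispute, 2+ = independent boards (item 5)
        fee_paid: float = APPEAL_FEE
        decision: Optional[str] = None

        def due_by(self) -> date:
            # Speedy due process: a ruling is owed within the deadline (item 4).
            return self.filed_on + timedelta(days=RESPONSE_DEADLINE_DAYS)

        def resolve(self, upheld: bool) -> float:
            # Returns the refund owed: the fee comes back if the action is reversed (item 5).
            self.decision = "upheld" if upheld else "reversed"
            return 0.0 if upheld else self.fee_paid

The point of the sketch is simply that every enforcement action carries its cited rule and written explanation with it, so an appeal at any layer can be judged against the published standard rather than against an undisclosed internal policy.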

Facebook and other platforms could reduce the controversy and scale of user censorship disputes even more dramatically by focusing enforcement actions on speech that is criminal or dangerous, rather than trying to fact-check which information is true or false. This distinction needs to be spelled out in the published speech content rules. Facebook, YouTube, and others use a combination of algorithms and third parties in an effort to fact-check content. However, newspapers, television news channels, non-expert employees, and so-called fact-checkers all carry significant bias, so relying on these parties inevitably leads to accuracy problems. Fact-checking efforts are simply not workable at the scale of billions of messages and millions of new topics each day, and they will overwhelm the mechanisms for adjudicating dangerous and criminal speech.
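
As a purely illustrative calculation of that scale argument: every number below except the "billions of messages per day" figure cited above is an assumption made only for the sake of the arithmetic.

    # Purely illustrative: the reviewer throughput and the fraction of posts
    # needing review are assumptions, not measured figures.
    daily_messages = 1_000_000_000     # lower bound of "billions of messages each day"
    review_fraction = 0.01             # assume only 1% of posts even need a fact-check
    posts_per_reviewer_per_day = 200   # assumed throughput of one human fact-checker

    reviewers_needed = daily_messages * review_fraction / posts_per_reviewer_per_day
    print(f"Reviewers needed: {reviewers_needed:,.0f}")   # -> 50,000
    # Even reviewing 1% of a billion daily posts would demand a full-time
    # workforce in the tens of thousands, before any dispute or appeal is filed.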

While Facebook is off to a good start with the creation of the Oversight Board, the result so far is clearly neither scalable nor workable, which creates the appearance that the board is merely a public relations effort. Accordingly, we believe that Facebook could streamline its appeals process by following the guidelines suggested above; even dramatically expanding its resources will not allow the Oversight Board to be effective without these changes. Facebook, Twitter, and the other social media platforms may also want to consider creating a single industry-wide, multi-layered appeals entity that would work across all platforms. (Companies would have to agree to abide by its rulings.) Pursuing policies such as these would allow platforms like Facebook to avoid engaging in censorship, while also providing reasonable responses to content that is truly dangerous or illegal.

Mike Matthys is a co-founder at First and Fourteenth Institute (FAFI), which he formed with John Quinn and Brian Jackson. He has spent his career in the technology industry and currently lives in the San Francisco Bay Area.
