
  • Written by Siri Terjesen, Phil Smith Professor of Entrepreneurship & Associate Dean, Research & External Relations, Florida Atlantic University

Facebook’s quasi-independent Oversight Board on May 5, 2021, upheld the company’s suspension of former President Donald Trump[1] from the platform and Instagram. The decision came four months after Facebook CEO Mark Zuckerberg banned Trump “indefinitely” for his role[2] in inciting the Jan. 6 riot at the U.S. Capitol. The board chastised Facebook for failing to either set an end date for the suspension or permanently ban Trump and gave the social media company six months to resolve the matter.

What is this Oversight Board that made one of the most politically perilous decisions Facebook has ever faced? Why did the company create it, and is it a good idea? We asked Siri Terjesen[3], an expert on corporate governance, to answer these and several other questions.

1. What is the Facebook Oversight Board?

The Oversight Board was set up to give users an independent third party to whom they can appeal Facebook moderation decisions, as well as to help set the policies that govern these decisions. The idea was first proposed[4] by Zuckerberg in 2018 after a discussion with Harvard Law Professor Noah Feldman, and the board began work in October 2020, funded by a US$130 million trust provided by Facebook to cover the initial six years of operating expenses.

According to the board[5], it “was created to help Facebook answer some of the most difficult questions around freedom of expression online: what to take down, what to leave up, and why.” The Oversight Board has final decision-making authority, even above the board of directors, and its decisions are binding on Facebook.

The Oversight Board has 20 members[6] from around the world, drawn from a diverse range of disciplines and backgrounds, such as journalism, human rights and law, as well as different political perspectives. It even includes a former prime minister. The goal is to eventually expand the board to 40 members.

In a statement, Trump called the Oversight Board decision a ‘total disgrace.’ AP Photo/Jacquelyn Martin[7]

2. What other decisions has it made?

So far, the board has reviewed 10 Facebook decisions[8], including the one involving Trump. The decisions involved a variety of types of content, such as posts that were removed because they were deemed racist[9], indecent[10] or intended to incite violence[11]. It overturned Facebook’s ruling in six of the cases and upheld it in three of them. In the 10th case, the user deleted the post that Facebook had removed, which ended the board’s review.

In cases where the board overruled Facebook, the posts that had been removed were reinstated. And the board sometimes urged the company to clarify or revise its guidelines.

Given that Facebook is expected to take 20 to 30 billion enforcement actions[12] in 2021 alone, it’s unlikely the Oversight Board will be able to handle more than a handful of the most high-profile cases, like that of Trump. It’s one of the reasons the Oversight Board is dubbed “Facebook’s Supreme Court[13].”

3. Is it a model other social media companies are likely to follow?

As a platform company, Facebook is unique.

It’s a social media giant that must monitor a global operation that generates over $86 billion in revenue[14], employs 58,600 people[15] and serves more than 2.8 billion active monthly users[16] – more than a third of the world’s population – as well as millions of advertisers. Very few companies operate in a space that involves user content moderation, and none at this scale. Other platform companies have considerably less content, and usually only in one language, whereas Facebook is available in 100 languages[17].

Given that Facebook’s shareholder-elected corporate board of directors includes just 10 people, each of whom has a demanding day job of their own, it is not surprising to me that Zuckerberg decided to set up an outside panel to make decisions about speech and online safety.

It’s unlikely, however, that other companies will ever have a similar type of board. The Oversight Board has been extremely resource intensive. It took over two years to establish[18] through a series of 22 roundtable meetings with participants in 88 countries, six in-depth workshops, 250 one-on-one discussions and 1,200 submissions – not to mention its high cost of $130 million, which is meant to last six years.

4. Was it a good idea, from a corporate governance standpoint?

A growing body of research questions whether directors on corporate boards can fulfill their oversight responsibilities[19] on their own, due to the sheer amount of information that must be obtained, processed and shared.

While I think we will see more corporate boards outsource some decisions and processes to external panels – as a small board cannot be expected to have the requisite knowledge and skills on all topics – few corporations are likely to follow Facebook’s lead and grant an outside body the power to make unilateral decisions.

Since only the board of directors is beholden to a company’s shareholders, its directors must ultimately take final responsibility for corporate decisions.

Zuckerberg may still face political blowback because of the Oversight Board’s decision. AP Photo/Andrew Harnik[20]

5. Does the Oversight Board shield Facebook from political or legal fallout?

While it’s likely that some at Facebook hoped shifting its thorniest decisions to an outside body would insulate the company, its executives and its corporate board members from political or legal problems, the Trump decision shows it won’t actually do that.

Certainly the decision to use an outside oversight body might be interpreted as political: All 10 of Facebook’s board directors live and work[21] predominantly in the United States and might be hesitant to restrict the freedom of expression of a former president who still commands support among many Americans[22] – and won 47% of the popular vote[23] in the last election.

But whether Facebook makes the decision itself or outsources it to an independent board, the company will still face the consequences if the decision to uphold the Trump ban alienates Americans, or people around the world, who feel it is an attack on their freedom of expression.

People may leave Facebook for other platforms such as Parler[24], Gab[25] and Signal[26], as many have already done[27] since the initial Trump ban in January – and knowing an outside body made the decision won’t stop them.

And a poor “political” decision could drive away some advertisers[28] and make it harder to hire and retain employees, regardless of who made it.

6. How are other social media companies handling these issues differently?

Twitter CEO Jack Dorsey made an internal decision to permanently suspend Trump[29] from his company’s platform on Jan. 8, 2021. While Dorsey acknowledged that the decision set a “dangerous precedent[30],” Twitter, like other social media companies, doesn’t have an appeals process for that kind of decision.

Some newer companies, such as MeWe[31] and Rumble[32], offer more lax content moderation in order to allow greater freedom of expression for users.

Gab[33] describes itself as “A social network that champions free speech, individual liberty and the free flow of information online. All are welcome.” Parler’s content guidelines[34] are even more basic, keeping content moderation to an “absolute minimum. We prefer to leave decisions about what is seen and who is heard to each individual.”

Gab[35] and Parler[36] are presently banned from the app stores of both Apple and Google due to a lack of content moderation.

[You’re smart and curious about the world. So are The Conversation’s authors and editors. You can read us daily by subscribing to our newsletter[37].]

References

  1. ^ upheld the company’s suspension of former President Donald Trump (www.oversightboard.com)
  2. ^ banned Trump “indefinitely” for his role (www.washingtonpost.com)
  3. ^ Siri Terjesen (scholar.google.com)
  4. ^ idea was first proposed (www.washingtonpost.com)
  5. ^ According to the board (oversightboard.com)
  6. ^ 20 members (www.oversightboard.com)
  7. ^ AP Photo/Jacquelyn Martin (newsroom.ap.org)
  8. ^ the board has reviewed 10 Facebook decisions (oversightboard.com)
  9. ^ racist (oversightboard.com)
  10. ^ indecent (oversightboard.com)
  11. ^ intended to incite violence (oversightboard.com)
  12. ^ 20 to 30 billion enforcement actions (reason.com)
  13. ^ Facebook’s Supreme Court (www.newyorker.com)
  14. ^ generates over $86 billion in revenue (investor.fb.com)
  15. ^ employs 58,600 people (www.statista.com)
  16. ^ more than 2.8 billion active monthly users (www.oberlo.com)
  17. ^ 100 languages (investor.fb.com)
  18. ^ took over two years to establish (oversightboard.com)
  19. ^ questions whether directors on corporate boards can fulfill their oversight responsibilities (www.doi.org)
  20. ^ AP Photo/Andrew Harnik (newsroom.ap.org)
  21. ^ live and work (investor.fb.com)
  22. ^ still commands support among many Americans (www.usnews.com)
  23. ^ won 47% of the popular vote (www.cfr.org)
  24. ^ Parler (techcrunch.com)
  25. ^ Gab (reason.com)
  26. ^ Signal (www.theverge.com)
  27. ^ many have already done (fortune.com)
  28. ^ could drive away some advertisers (www.wsj.com)
  29. ^ permanently suspended Trump (www.nbcnews.com)
  30. ^ set a “dangerous precedent (www.foxnews.com)
  31. ^ MeWe (www.usatoday.com)
  32. ^ Rumble (www.foxbusiness.com)
  33. ^ Gab (gab.com)
  34. ^ Parler’s content guidelines (legal.parler.com)
  35. ^ Gab (www.businessinsider.com)
  36. ^ Parler (qz.com)
  37. ^ You can read us daily by subscribing to our newsletter (theconversation.com)

Authors: Siri Terjesen, Phil Smith Professor of Entrepreneurship & Associate Dean, Research & External Relations, Florida Atlantic University

Read more https://theconversation.com/why-facebook-created-its-own-supreme-court-for-judging-content-6-questions-answered-160349

Metropolitan republishes selected articles from The Conversation USA with permission
