Supreme Court to consider giving First Amendment protections to social media posts

The First Amendment does not protect messages posted on social media platforms.

The companies that own the platforms can – and do – remove, promote or limit the distribution of any posts according to corporate policies[1]. But all that might soon change.

The Supreme Court has agreed to hear five cases[2] during its current term, which ends in June 2024, that collectively give the court the opportunity to reexamine the nature of content moderation – the rules governing discussions on social media platforms such as Facebook and X, formerly known as Twitter – and the constitutional limits on the government’s ability to affect speech on those platforms.

Content moderation, whether done manually by company employees or automatically by a platform’s software and algorithms, affects what viewers can see on a digital media page. Messages that are promoted garner greater viewership and greater interaction; those that are deprioritized or removed will obviously receive less attention. Content moderation policies reflect decisions by digital platforms about the relative value of posted messages.
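
To make those mechanics concrete, the sketch below is a purely illustrative toy example in Python – the posts, rules and thresholds are hypothetical and do not reflect any platform’s actual policy – of how an automated moderation pass might remove, demote or promote posts before ranking them for display.

    # Illustrative sketch only: a toy moderation-and-ranking pass.
    # Every rule, field and threshold here is hypothetical, not any platform's actual policy.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        flagged_terms: int   # hits from a (hypothetical) policy classifier
        engagement: float    # likes, replies and shares rolled into one number

    def moderate_and_rank(posts: list[Post]) -> list[Post]:
        visible = []
        for post in posts:
            if post.flagged_terms >= 3:
                continue                 # "remove": the post never reaches viewers
            score = post.engagement
            if post.flagged_terms > 0:
                score *= 0.2             # "deprioritize": fewer viewers will see it
            else:
                score *= 1.5             # "promote": boosted content gains reach
            visible.append((score, post))
        # Higher-scoring posts are listed first, so they garner more viewership and interaction.
        return [post for score, post in sorted(visible, key=lambda pair: pair[0], reverse=True)]

    if __name__ == "__main__":
        feed = moderate_and_rank([
            Post("Community bake sale this weekend", flagged_terms=0, engagement=10.0),
            Post("Borderline post under a toy rule", flagged_terms=1, engagement=50.0),
            Post("Post a toy rule treats as a clear violation", flagged_terms=4, engagement=90.0),
        ])
        for post in feed:
            print(post.text)

Real systems combine machine-learning classifiers, human reviewers and appeal processes; the point of the toy version is only that these ranking decisions determine which messages viewers ever see.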

As an attorney, professor[3] and author of a book about the boundaries of the First Amendment[4], I believe that the constitutional challenges presented by these cases will give the court the occasion to advise government, corporations and users of interactive technologies what their rights and responsibilities are as communications technologies continue to evolve.

Public forums

In late October 2023, the Supreme Court heard oral arguments on two related cases in which both sets of plaintiffs argued that elected officials who use their social media accounts either exclusively or partially to promote their politics and policies cannot constitutionally block constituents[5] from posting comments on the officials’ pages.

In one of those cases, O’Connor-Ratcliff v. Garnier[6], two school board members from the Poway Unified School District in California blocked a set of parents – who frequently posted repetitive and critical comments on the board members’ Facebook and Twitter accounts – from viewing the board members’ accounts.

In the other case heard in October, Lindke v. Freed[7], the city manager of Port Huron, Michigan, apparently angered by critical comments about a posted picture, blocked a constituent from viewing or posting on the manager’s Facebook page.

Courts have long held that public spaces, like parks and sidewalks, are public forums, which must remain open to free and robust conversation and debate[8], subject only to neutral rules unrelated to the content of the speech expressed[9]. The silenced constituents in the current cases insisted that in a world where a lot of public discussion is conducted in interactive social media, digital spaces used by government representatives for communicating with their constituents[10] are also public forums and should be subject to the same First Amendment rules as their physical counterparts.

If the Supreme Court rules that public forums can be both physical and virtual, government officials will not be able to arbitrarily block users from viewing and responding to their content or remove constituent comments with which they disagree. On the other hand, if the Supreme Court rejects the plaintiffs’ argument, the only recourse for frustrated constituents will be to create competing social media spaces where they can criticize and argue at will.

Content moderation as editorial choices

Two other cases – NetChoice LLC v. Paxton[11] and Moody v. NetChoice LLC[12] – also relate to the question of how the government should regulate online discussions. Florida[13] and Texas[14] have both passed laws that modify the internal policies and algorithms of large social media platforms by regulating how the platforms can promote, demote or remove posts.

NetChoice, a tech industry trade group representing a wide range of social media platforms[15] and online businesses, including Meta, Amazon, Airbnb and TikTok, contends that the platforms are not public forums. The group says that the Florida and Texas legislation unconstitutionally restricts the social media companies’ First Amendment right to make their own editorial choices[16] about what appears on their sites.

In addition, NetChoice alleges that by limiting Facebook’s or X’s ability to rank, repress or even remove speech – whether manually or with algorithms – the Texas and Florida laws amount to government requirements that the platforms host speech they don’t want to host[17], which is also unconstitutional.

NetChoice is asking the Supreme Court to rule the laws unconstitutional so that the platforms remain free to make their own independent choices regarding when, how and whether posts will remain available for view and comment.

In 2021, U.S. Surgeon General Vivek Murthy declared misinformation on social media, especially about COVID-19 and vaccines, to be a public health threat. Chip Somodevilla/Getty Images[18]

Censorship

In an effort to reduce harmful speech that proliferates across the internet – speech that supports criminal and terrorist activity as well as misinformation and disinformation – the federal government has engaged in wide-ranging discussions with internet companies about their content moderation policies[19].

To that end, the Biden administration has regularly advised – some say strong-armed[20] – social media platforms to deprioritize or remove posts the government flagged as misleading, false or harmful. Some of the posts related to misinformation[21] about COVID-19 vaccines or promoted human trafficking. On several occasions, administration officials suggested that the platforms ban users who posted such material from posting again. Sometimes, the corporate representatives themselves asked the government what to do with a particular post.

While the public might be generally aware that content moderation policies exist, people are not always aware of how those policies affect the information to which they are exposed. Specifically, audiences have no way to measure how content moderation policies affect the marketplace of ideas or influence debate and discussion about public issues.

In Missouri v. Biden[22], the plaintiffs argue that government efforts to persuade social media platforms to publish or remove posts were so relentless and invasive that the moderation policies no longer reflected the companies’ own editorial choices. Rather, they argue, the policies were in reality government directives that effectively silenced – and unconstitutionally censored[23] – speakers with whom the government disagreed.

The court’s decision in this case could have wide-ranging effects on the manner and methods of government efforts to influence the information that guides the public’s debates and decisions.

References

  1. ^ according to corporate policies (www.freedomforum.org)
  2. ^ hear five cases (www.nytimes.com)
  3. ^ professor (lynngreenky.com)
  4. ^ boundaries of the First Amendment (press.uchicago.edu)
  5. ^ cannot constitutionally block constituents (www.nytimes.com)
  6. ^ O’Connor-Ratcliff v. Garnier (www.oyez.org)
  7. ^ Lindke v. Freed (www.oyez.org)
  8. ^ remain open to free and robust conversation and debate (www.oyez.org)
  9. ^ unrelated to the content of the speech expressed (firstamendment.mtsu.edu)
  10. ^ communicating with their constituents (www.nytimes.com)
  11. ^ NetChoice LLC v. Paxton (www.oyez.org)
  12. ^ Moody v. NetChoice LLC (www.oyez.org)
  13. ^ Florida (perma.cc)
  14. ^ Texas (perma.cc)
  15. ^ wide range of social media platforms (netchoice.org)
  16. ^ editorial choices (www.oyez.org)
  17. ^ platforms host speech they didn’t want to (www.oyez.org)
  18. ^ Chip Somodevilla/Getty Images (www.gettyimages.com)
  19. ^ content moderation policies (www.nytimes.com)
  20. ^ some say strong-armed (www.nytimes.com)
  21. ^ related to misinformation (www.nytimes.com)
  22. ^ Missouri v. Biden (www.scotusblog.com)
  23. ^ and unconstitutionally censored (www.oyez.org)

Authors: Lynn Greenky, Professor Emeritus of Communication and Rhetorical Studies, Syracuse University

Read more https://theconversation.com/supreme-court-to-consider-giving-first-amendment-protections-to-social-media-posts-217760

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more