Let the community work it out: Throwback to early internet days could fix social media's crisis of legitimacy

Written by Ethan Zuckerman, Associate Professor of Public Policy, Communication, and Information, UMass Amherst

In the 2018 documentary “The Cleaners[1],” a young man in Manila, Philippines, explains his work as a content moderator: “We see the pictures on the screen. You then go through the pictures and delete those that don’t meet the guidelines. The daily quota of pictures is 25,000.” As he speaks, his mouse clicks, deleting offending images while allowing others to remain online.

The man in Manila is one of thousands of content moderators hired as contractors by social media platforms – 10,000 at Google alone[2]. Content moderation on an industrial scale like this is part of the everyday experience for users of social media. Occasionally a post someone makes is removed, or a post someone thinks is offensive is allowed to go viral.

Similarly, platforms add and remove features without input from the people who are most affected by those decisions. Whether you are outraged or unperturbed by this, you probably don't think much about the history of a system in which people in conference rooms in Silicon Valley and Manila determine your experiences online.

But why should a few companies – or a few billionaire owners – have the power to decide everything about online spaces that billions of people use? This unaccountable model of governance has led stakeholders of all stripes to criticize platforms’ decisions as arbitrary[3], corrupt[4] or irresponsible[5]. In the early, pre-web days of the social internet, decisions about the spaces people gathered in online were often made by members of the community. Our examination of the early history of online governance[6] suggests that social media platforms could return – at least in part – to models of community governance in order to address their crisis of legitimacy.

The documentary ‘The Cleaners’ shows some of the hidden costs of Big Tech’s customer service approach to content moderation.

Online governance – a history

In many early online spaces, governance was handled by community members, not by professionals. One early online space, LambdaMOO[7], invited users to build their own governance system, which devolved power from the hands of those who technically controlled the space – administrators known as “wizards” – to members of the community. This was accomplished via a formal petitioning process and a set of appointed mediators[8] who resolved conflicts between users.

Other spaces had more informal processes for incorporating community input. For example, on bulletin board systems, users voted with their wallets[9], removing critical financial support if they disagreed with the decisions made by the system’s administrators. Other spaces, like text-based Usenet newsgroups, gave users substantial power to shape their experiences. The newsgroups left obvious spam in place, but gave users tools to block it if they chose to. Usenet’s administrators argued that it was fairer to allow each user to make decisions that reflected their individual preferences[10] rather than taking a one-size-fits-all approach.

The graphical web expanded use of the internet from a few million users to hundreds of millions within a decade[11] from 1995 to 2005. During this rapid expansion, community governance was replaced with governance models inspired by customer service, which focused on scale and cost.

This switch from community governance to customer service made sense to the fast-growing companies that made up the late 1990s internet boom. Promising their investors that they could grow rapidly and make changes quickly, companies looked for approaches to the complex work of governing online spaces that centralized power and increased efficiency[12].

While this customer service model of governance allowed early user-generated content sites like Craigslist and GeoCities to grow rapidly[13], it set the stage for the crisis of legitimacy facing social media platforms today. Contemporary battles over social media are rooted in the sense that the people and processes governing online spaces are unaccountable to the communities that gather in them.

Paths to community control

Implementing community governance in today’s platforms could take a number of different forms, some of which are already being experimented with.

Advisory boards like Meta’s Oversight Board[14] are one way to involve outside stakeholders in platform governance, providing independent — albeit limited — review of platform decisions. X (formerly Twitter) is taking a more democratic approach with its Community Notes[15] initiative, which allows users to contextualize information on the platform by crowdsourcing notes and ratings.

Some may question whether community governance can be implemented successfully on platforms that serve billions of users. In response, we point to Wikipedia. It is entirely community-governed and has created an open encyclopedia that has become the foremost information resource in many languages. Wikipedia is surprisingly resilient to vandalism and abuse, with robust procedures that ensure a resource used by billions remains accessible, accurate and reasonably civil.

On a smaller scale, total self-governance – echoing early online spaces – could be key for communities that serve specific subsets of users. For example, Archive of Our Own[16] was created after fan-fiction authors – people who write original stories using characters and worlds from published books, television shows and movies – found existing platforms unwelcoming. Many fan-fiction authors had been kicked off social media platforms[17] because of overzealous copyright enforcement or concerns about sexual content.

Fed up with platforms that didn’t understand their work or their culture, a group of authors designed and built their own platform specifically to meet the needs of their community. AO3, as it is colloquially known, serves millions of people a month, includes tools specific to the needs of fan-fiction authors, and is governed by the same people it serves.

X, formerly Twitter, allows people to use Community Notes to append relevant information to posts that contain inaccuracies. Screen capture by The Conversation U.S., CC BY-ND[18][19]

Hybrid models, like Reddit's, mix centralized and self-governance[20]. Reddit hosts a collection of interest-based communities called subreddits, each with its own rules, norms and team of moderators. Underlying every subreddit's governance structure is a platform-wide set of rules, processes and features that apply to everyone. Not every subreddit is a sterling example of a healthy online community, but more are than are not.

There are also technical approaches to community governance. One approach would enable users to choose the algorithms that curate their social media feeds. Imagine that instead of only being able to use Facebook’s algorithm, you could choose from a suite of algorithms provided by third parties – for example, from The New York Times or Fox News.
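To make the idea concrete, here is a minimal sketch – not drawn from the original article – of how "bring your own algorithm" curation could work in code. The post fields and ranker names are hypothetical; the point is simply that a feed-ranking algorithm can be a swappable component the user selects rather than something the platform hard-wires.

    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Post:
        author: str
        text: str
        likes: int
        timestamp: float  # seconds since epoch

    # A curation algorithm is just a function that orders a list of posts.
    Ranker = Callable[[List[Post]], List[Post]]

    def chronological(posts: List[Post]) -> List[Post]:
        # Newest posts first, ignoring engagement entirely.
        return sorted(posts, key=lambda p: p.timestamp, reverse=True)

    def most_engaged(posts: List[Post]) -> List[Post]:
        # Posts with the most likes first.
        return sorted(posts, key=lambda p: p.likes, reverse=True)

    def build_feed(posts: List[Post], ranker: Ranker) -> List[Post]:
        # The platform supplies the posts; the user-chosen ranker decides the order.
        return ranker(posts)

    posts = [
        Post("alice", "breaking news", likes=5, timestamp=1_700_000_300),
        Post("bob", "cat photo", likes=120, timestamp=1_700_000_100),
    ]

    print([p.author for p in build_feed(posts, chronological)])  # ['alice', 'bob']
    print([p.author for p in build_feed(posts, most_engaged)])   # ['bob', 'alice']

In this framing, a third party – a newsroom, a researcher or a community – could publish its own ranker, and switching feeds would mean swapping one function for another.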

More radically decentralized platforms like Mastodon devolve control to a network of servers, similar in structure to email. This makes it easier to choose an experience that matches your preferences: you can pick which Mastodon server to use and switch later if you change your mind – much as you can choose between Gmail and Outlook – all while keeping access to the wider network.

Additionally, advancements in generative AI – which shows early promise in producing computer code[21] – could make it easier for people, even those without a technical background, to build custom online spaces when they find existing spaces unsuitable. This would relieve pressure on online spaces to be everything for everyone and support a sense of agency in the digital public sphere.

There are also more indirect ways to support community governance. Increasing transparency – for example, by providing access to data about the impact of platforms’ decisions – can help researchers, policymakers and the public hold online platforms accountable. Further, encouraging ethical professional norms among engineers and product designers can make online spaces more respectful of the communities they serve.

Going forward by going back

Between now and the end of 2024, national elections are scheduled in many countries, including Argentina, Australia, India, Indonesia, Mexico, South Africa, Taiwan, the U.K. and the U.S. This is all but certain to lead to conflicts over online spaces.

We believe it is time to consider not just how online spaces can be governed efficiently and in service to corporate bottom lines, but how they can be governed fairly and legitimately. Giving communities more control over the spaces they participate in is a proven way to do just that.

References

  1. ^ The Cleaners (gebrueder-beetz.de)
  2. ^ 10,000 at Google alone (www.npr.org)
  3. ^ arbitrary (www.brennancenter.org)
  4. ^ corrupt (nymag.com)
  5. ^ irresponsible (www.oxfordstrategyreview.com)
  6. ^ examination of the early history of online governance (doi.org)
  7. ^ LambdaMOO (thenewstack.io)
  8. ^ formal petitioning process and a set of appointed mediators (doi.org)
  9. ^ voted with their wallets (yalebooks.yale.edu)
  10. ^ to make decisions that reflected their individual preferences (fishbowl.pastiche.org)
  11. ^ a few million users to hundreds of millions within a decade (www.internetworldstats.com)
  12. ^ that centralized power and increased efficiency (doi.org)
  13. ^ to grow rapidly (datasociety.net)
  14. ^ Oversight Board (about.meta.com)
  15. ^ Community Notes (help.twitter.com)
  16. ^ Archive of Our Own (archiveofourown.org)
  17. ^ kicked off social media platforms (www.theverge.com)
  18. ^ Screen capture by The Conversation U.S. (twitter.com)
  19. ^ CC BY-ND (creativecommons.org)
  20. ^ mix centralized and self-governance (www.redditinc.com)
  21. ^ early promise in producing computer code (doi.org)


Read more https://theconversation.com/let-the-community-work-it-out-throwback-to-early-internet-days-could-fix-social-medias-crisis-of-legitimacy-213209

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more