
Written by Amanda Lotz, Fellow, Peabody Media Center; Professor of Media Studies, University of Michigan

For decades, U.S. media companies have limited the content they’ve offered based on what’s good for business. The decisions by Apple, Spotify, Facebook and YouTube[1] to remove content from commentator Alex Jones and his InfoWars platform[2] follow this same pattern.

My research on media industries[3] makes clear that government rules and regulations do little to limit which television shows, films, music albums, video games and social media content are available to the public. Concerns about profitability impose much stronger restrictions. Movies are given ratings based on their content not by government officials but by the Motion Picture Association of America[4], an industry group. Television companies, for their part, often have departments handling what are called “standards and practices[5]” – reviewing content and suggesting or demanding changes to avoid offending audiences or advertisers.

The self-policing by movie studios and TV networks is very similar to YouTube’s and Facebook’s actions: Distributing extremely controversial content is bad for business. Offended viewers will turn away from the program and may choose to boycott the network or service[6] – reducing the size of audiences that can be sold to advertisers. Some alarmed viewers may even urge boycotts of the advertisers whose messages air during controversial programming[7].

Over the decades, television networks have internalized feedback from advertisers and unintended controversies to try to steer clear of negative attention. Social media companies[8] are just beginning[9] to understand that these forces are at work[10] in their own industries as well.

Self-regulation to avoid government intrusion

The practices of media industries to police themselves arose over many years, as companies tried to appease public concern without triggering formal government supervision. This pleased all sides: Elected and appointed officials avoided having to do much of anything that might look like squashing free speech, companies avoided formal restrictions that might be quite severe, and concerned citizens had their objections heard and acted upon.

When concerns about the amount of sex and violence on broadcast television developed in the 1970s, the networks agreed – with strong encouragement from the federal government – to establish a “Family Hour[11]” during the first hour of prime-time programming that was monitored by the National Association of Broadcasters. Music labels agreed to place “Parental Advisory” labels on albums with explicit lyrics[12]. Inspired by moviemakers, video game developers adopted ratings based on evaluations by an industry group, the Entertainment Software Ratings Board[13].

There is, though, a key difference between those industries and the situation of YouTube and Facebook. Movie studios, record labels and TV companies are responsible for making their content as well as distributing it – and are legally liable for any problems that might arise.

Online media companies, though, typically don’t create most of what appears on their platforms, and are expressly protected from legal responsibility[14] for the content of the messages others post. But hosting information publicly viewed as hateful can damage a business, even if it doesn’t run afoul of government rules.

Challenges of social media content regulation

Social media companies have achieved their ubiquity[15] and high profits[16] because they do not have to pay for creating the content that attracts attention to their services. They reap the financial rewards of a technological advantage in which billions of users can create, share and look at different messages and pieces of content every day.

They are just beginning to understand the downside to that technological advantage, which is that the public – even if not the law – considers them at least somewhat responsible for what is said on their sites. And it’s extremely difficult to sort through[17], classify and police all those billions of posts – much less to figure out how to automate some of those tasks[18].

[Image] Alex Jones, banned from many social media platforms. Michael Zimmermann, CC BY-ND[19][20]

So far, social media sites have avoided limiting content except in the most extreme cases, because it is difficult to draw lines of acceptability that don’t produce more controversy themselves. In deciding to ban Jones, the companies likely weighed the objections that would erupt if they did against the damage to their brands if they didn’t[21].

In the past, self-regulation often allowed media companies to evade governmental action. It is unclear whether these latest moves by social media companies are the start of lasting self-regulation or a one-off effort to quell current concern. Either way, their decisions are all about what is good for business.

Their response to outcry may be craven, but it might suggest these companies are recognizing the cultural power of their products. Ultimately, social media companies – like other media companies – are showing that they will respond to pressure from their audiences and the marketplace. In the absence of regulation, consumers will encourage companies to change policies by opting out of social media that enable cesspools of trolling and hate.

Users who want changes made should take note of how audiences have pressured other media industries to make changes in the past. Consumers who want greater privacy controls, environments free of hate speech, and different kinds of algorithms could demand them by leaving flawed services or boycotting the advertisers that support them. As demand for alternatives becomes clearer, services will change or a competitor will arise.

References

  1. ^ Apple, Spotify, Facebook and YouTube (www.vox.com)
  2. ^ remove content from commentator Alex Jones and his InfoWars platform (theconversation.com)
  3. ^ research on media industries (global.oup.com)
  4. ^ Motion Picture Association of America (www.mpaa.org)
  5. ^ standards and practices (www.museum.tv)
  6. ^ boycott the network or service (global.oup.com)
  7. ^ whose messages air during controversial programming (www.nytimes.com)
  8. ^ Social media companies (slate.com)
  9. ^ are just beginning (arstechnica.com)
  10. ^ these forces are at work (apnews.com)
  11. ^ Family Hour (www.museum.tv)
  12. ^ albums with explicit lyrics (www.npr.org)
  13. ^ Entertainment Software Ratings Board (www.esrb.org)
  14. ^ expressly protected from legal responsibility (theconversation.com)
  15. ^ ubiquity (techcrunch.com)
  16. ^ high profits (www.thestreet.com)
  17. ^ extremely difficult to sort through (www.cnbc.com)
  18. ^ automate some of those tasks (theconversation.com)
  19. ^ Michael Zimmermann (commons.wikimedia.org)
  20. ^ CC BY-ND (creativecommons.org)
  21. ^ if they didn’t (apnews.com)


Read more http://theconversation.com/profit-not-free-speech-governs-media-companies-decisions-on-controversy-101292
