Facebook doesn’t fool me – but I worry about how it affects you

  • Written by Joseph B. Walther, Professor of Communication; Director, Center for Information Technology and Society, University of California, Santa Barbara

A number of prominent figures have called for some sort of regulation of Facebook – including one of the company’s co-founders[1] and a venture capitalist[2] who was one of Facebook’s early backers.

Much of the criticism[3] of Facebook relates to how the company’s algorithms target users[4] with advertising, and the “echo chambers[5]” that show users ideologically slanted content.

Despite the public criticism, the company has posted record profits[6]. And billions of people[7] – including more than two-thirds of American adults[8] – continue to use the unregulated version of Facebook that exists now.

I have been studying the social dynamics of the internet[9] for 30 years, and I suspect what’s behind these apparent contradictions is something psychological. People know about Facebook’s problems, but each person assumes he or she is largely immune – even while imagining that everyone else is very susceptible to influence. That paradox helps explain why people keep using the site – which still boasts more than 2 billion monthly active users[10]. And ironically, it also helps explain what’s behind pressure to regulate the social media giant.

It’s not me, it’s them

The psychological tendency at work here is called “the third-person effect[11],” the belief that media don’t fool me, and maybe don’t fool you, but all those other people are sitting ducks for media effects.

Ironically, this dynamic can encourage people to support restrictions on media consumption – by others. If someone uses, say, a social media site and feels immune to its negative influences, that feeling can trigger another psychological phenomenon called the “influence of presumed influence[12]”: the person worries that everyone else falls victim, and supports efforts to protect others, even while believing they themselves don’t need the protection.

This could be why so many Facebook users complain about the site’s danger to others but continue using it nevertheless[13].

Even the Facebook-funding venture capitalist Roger McNamee, who wrote a book about how bad Facebook has become[14], may have fallen prey to this psychological irony. As the Washington Post reports, “despite … his disgust[15] with the worst crimes of social media platforms … McNamee not only still owns Facebook shares … he also still counts himself among the behemoth’s more than 2 billion users. After all, McNamee acknowledges with a shrug and a smile, ‘I’ve got a book to promote.’”

Not everyone can be above average

McNamee may think he’s immune to the echo chambers and other online influences that, he warns, affect the average Facebook user. What if average Facebook users think they’re not the average Facebook user, and therefore also believe that they are immune to Facebook’s pernicious influences?

I explored this possibility in a survey of 515 adults in the U.S. who used Facebook at least once the previous week. Participants were recruited by Qualtrics, a company that administered my survey questions. Respondents resided in all 50 states. Their average age was 39, and they reported an average of just under 10 hours per week on Facebook, which they estimated to be similar to most other Facebook users[16].

The survey asked the respondents three groups of questions. The first group asked how strongly they believed Facebook affects their own views on a number of important social and political topics, including building a wall on the U.S.-Mexico border, expanding or repealing the Affordable Care Act, whether President Trump is doing a good job and other major national issues[17].

The second group of questions asked how much each respondent believed Facebook affects others’ perceptions of those same issues – that is, how much social media shapes the views of “the average person.”

The third group of questions asked how strongly each respondent supported regulating Facebook through a variety of possible strategies, including rulings from the Federal Trade Commission or the Federal Communications Commission, breaking up Facebook using antitrust laws, requiring Facebook to reveal its algorithms and other steps.

Eager to protect others

Respondents believed that Facebook affects other people’s perceptions much more strongly than it affects their own. The more they thought that others were more vulnerable than they were, the more they wanted to rein Facebook in.
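The article does not describe the statistical analysis behind that finding, but the relationship it reports – a “third-person” gap (how much more people think Facebook affects others than themselves) that tracks with support for regulation – can be illustrated with a small sketch. The Python snippet below is only an illustration under assumed column names, rating scales and a simple Pearson correlation; it is not the study’s actual method or data.

    import pandas as pd
    from scipy.stats import pearsonr

    # Hypothetical responses on a 1-7 agreement scale; these values and
    # column names are invented for illustration, not taken from the survey.
    df = pd.DataFrame({
        "effect_on_self":     [2, 3, 1, 4, 2, 3],  # "Facebook affects my views"
        "effect_on_others":   [5, 6, 4, 6, 5, 6],  # "Facebook affects the average person"
        "support_regulation": [4, 6, 3, 7, 5, 6],  # support for regulating Facebook
    })

    # Third-person gap: perceived effect on others minus perceived effect on self.
    df["third_person_gap"] = df["effect_on_others"] - df["effect_on_self"]

    # Does a bigger gap go along with stronger support for regulation?
    r, p = pearsonr(df["third_person_gap"], df["support_regulation"])
    print(f"gap vs. regulation support: r = {r:.2f}, p = {p:.3f}")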

A man misled by online information surrenders to police in Washington, D.C., after firing a rifle in a pizzeria. Sathi Soma via AP[18]

People who thought they were far less affected than others, and who wanted to regulate Facebook, also believed more strongly that the source of the problem with Facebook lies in the power of echo chambers to repeat, amplify and reinforce a user’s beliefs. That was true even though they would be affected by the regulations as well.

Echo chambers do exist, and they do affect people’s perceptions – even leading one person to shoot up a pizza parlor[19] alleged to be a front for child prostitution. But research has called into question[20] the idea that echo chambers are extremely influential over most people’s views.

In my view, it’s more important to help people understand that they are just as much at risk from Facebook as everyone else, whatever the level of risk may actually be. Society may bear some responsibility, but so do individual Facebook users. Otherwise they’ll ignore recommendations about their own media consumption, while supporting calls for sweeping regulations that may be too broad and potentially misdirected. Ultimately, people need to save themselves more, and worry a little less about saving everyone else.

References

  1. ^ one of the company’s co-founders (www.nytimes.com)
  2. ^ venture capitalist (www.penguinrandomhouse.com)
  3. ^ Much of the criticism (www.nytimes.com)
  4. ^ algorithms target users (theconversation.com)
  5. ^ echo chambers (theconversation.com)
  6. ^ posted record profits (newsroom.fb.com)
  7. ^ billions of people (newsroom.fb.com)
  8. ^ more than two-thirds of American adults (www.pewresearch.org)
  9. ^ studying the social dynamics of the internet (scholar.google.com)
  10. ^ more than 2 billion monthly average users (newsroom.fb.com)
  11. ^ the third person effect (doi.org)
  12. ^ influence of presumed influence (doi.org)
  13. ^ continue using it nevertheless (www.pri.org)
  14. ^ how bad Facebook has become (www.penguinrandomhouse.com)
  15. ^ despite … his disgust (www.washingtonpost.com)
  16. ^ most other Facebook users (www.vox.com)
  17. ^ major national issues (www.pewresearch.org)
  18. ^ Sathi Soma via AP (www.apimages.com)
  19. ^ shoot up a pizza parlor (www.cbsnews.com)
  20. ^ called into question (doi.org)


Read more http://theconversation.com/facebook-doesnt-fool-me-but-i-worry-about-how-it-affects-you-117296

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more