Big Tech has a vaccine misinformation problem – here's what a social media expert recommends

  • Written by Anjana Susarla, Omura-Saxena Professor of Responsible AI, Michigan State University

With less than half the United States population fully vaccinated for COVID-19 and as the delta variant sweeps the nation, the U.S. surgeon general issued an advisory that called misinformation an urgent threat to public health[1]. The advisory said efforts by social media companies to combat misinformation are “too little, too late and still don’t go far enough.” The advisory came more than a year after the World Health Organization warned of a COVID-related “infodemic.”[2]

There’s good reason to be concerned. A study in the U.K. and the U.S. found that exposure to online misinformation about COVID-19 vaccines reduced the number of people who said they would get vaccinated[3] and increased the number of people who said they would not.

As a researcher who studies social media[4], I can recommend ways social media companies, in collaboration with researchers, can develop effective interventions against misinformation[5] and help build trust and acceptance of vaccines. The government could intervene, but a bill to curb medical misinformation on social media[6] filed in July is revealing some of the challenges – it’s drawing scorn[7] for leaving to a political appointee decisions about what constitutes misinformation.

The threat

A serious threat in online settings is that fake news spreads faster[8] than verified and validated news from credible sources. Articles connecting vaccines and death have been among the content people engage with most[9].

Algorithms on social media platforms are primed for engagement[10]. Recommendation engines in these platforms create a rabbit-hole effect[11] by pushing users who click on anti-vaccine messages toward more anti-vaccine content. Individuals and groups that spread medical misinformation are well organized to exploit the weaknesses[12] of the engagement-driven ecosystems on social media platforms.
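To make the rabbit-hole dynamic concrete, here is a deliberately minimal Python sketch of an engagement-driven recommender; the topic labels, scores and update rule are illustrative assumptions, not a description of any platform's actual system. Each click raises the weight of the clicked topic, so every subsequent ranking tilts further toward similar content:

```python
# Hypothetical sketch of an engagement-driven recommender feedback loop.
# All names, scores and the update rule are illustrative assumptions.
from collections import defaultdict

user_affinity = defaultdict(lambda: 1.0)  # topic -> one user's affinity weight

def rank(items):
    """Order items by affinity-weighted engagement, highest first."""
    return sorted(
        items,
        key=lambda it: user_affinity[it["topic"]] * it["base_engagement"],
        reverse=True,
    )

def record_click(item):
    """A click boosts the topic's weight, so similar items rank higher next time."""
    user_affinity[item["topic"]] *= 1.5

items = [
    {"id": 1, "topic": "anti-vaccine", "base_engagement": 0.6},
    {"id": 2, "topic": "health-news", "base_engagement": 0.8},
]

print([it["topic"] for it in rank(items)])  # ['health-news', 'anti-vaccine']
record_click(items[0])                      # user clicks one anti-vaccine item
print([it["topic"] for it in rank(items)])  # ['anti-vaccine', 'health-news']
```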

Social media is being manipulated on an industrial scale[13], including a Russian campaign pushing disinformation[14] about COVID-19 vaccines. Researchers have found that people who rely on Facebook as their primary source of news about the coronavirus are less likely to be vaccinated[15] than people who get their coronavirus news from any other source.

While social media companies have actively tagged and removed misinformation about COVID-19 generally, stories about vaccine side effects are more insidious because conspiracy theorists may not be trafficking in outright false information so much as selectively distorting the risks of vaccination. These efforts are part of a well-developed disinformation ecosystem[16] on social media platforms that extends to offline anti-vaccine activism[17].

Anti-vaccine activists use sophisticated misinformation techniques, including hijacking the social media accounts of health care workers, such as physicians, and presenting misinformation via the stolen identities. AP Photo/Sue Ogrocki[18]

Misinformation on social media may also fuel vaccine inequities. There are significant racial disparities[19] among COVID-19 vaccine recipients so far. Vaccine-related misinformation is not the only source of these differences, but it likely plays a part: health-related misinformation is rife on Spanish-language Facebook[20], for example.

Here are two key steps social media companies can take to reduce vaccine-related misinformation.

Block known sources of vaccine misinformation

Anti-vaccine hashtags such as #vaccineskill have been popular. Instagram blocked that hashtag two years ago, but Facebook allowed it until July 2021[21]. Aside from vaccines, misinformation about multiple aspects of COVID-19 prevention and treatment abounds, including misinformation about the health benefits of wearing a mask[22].

Twitter recently suspended U.S. Rep. Marjorie Taylor Greene for a couple of days, citing a post containing COVID-19 misinformation[23]. But social media companies could do a lot more to block disinformation spreaders. Reports suggest that most of the vaccine disinformation on Facebook and Twitter comes from just 12 users, referred to as the disinformation dozen, who are still active on social media[24]. The list is topped by businessman and physician Joseph Mercola and prominent anti-vaccine activist Robert F. Kennedy Jr.[25]

Evidence suggests that infodemic superspreaders engage in coordinated sharing of content[26], which makes them more effective at spreading disinformation and all the more important to block. Social media platforms need to more aggressively flag harmful content[27] and remove people known to traffic in vaccine-related disinformation.
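To illustrate how researchers might detect such coordination, here is a simplified Python sketch in the spirit of published coordinated link-sharing analyses: accounts that repeatedly post the same URL within a short time window are flagged as a candidate network. The data, window size and threshold below are assumptions made for the example:

```python
# Simplified sketch of coordinated link-sharing detection. The share log,
# time window and co-share threshold are illustrative assumptions.
from collections import defaultdict
from itertools import combinations

WINDOW_SECONDS = 60  # "near-simultaneous" cutoff (assumption)
MIN_CO_SHARES = 2    # flag pairs that co-share at least this many URLs

shares = [  # (account, url, unix_timestamp)
    ("acct_a", "http://example.com/vax-story", 1000),
    ("acct_b", "http://example.com/vax-story", 1030),
    ("acct_a", "http://example.com/other-story", 5000),
    ("acct_b", "http://example.com/other-story", 5020),
]

by_url = defaultdict(list)
for account, url, ts in shares:
    by_url[url].append((account, ts))

co_shares = defaultdict(int)  # account pair -> near-simultaneous co-share count
for url, posts in by_url.items():
    for (a1, t1), (a2, t2) in combinations(posts, 2):
        if a1 != a2 and abs(t1 - t2) <= WINDOW_SECONDS:
            co_shares[tuple(sorted((a1, a2)))] += 1

suspected = [pair for pair, n in co_shares.items() if n >= MIN_CO_SHARES]
print(suspected)  # [('acct_a', 'acct_b')]
```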

Disclose more about medical misinformation

Facebook claims that it has taken down 18 million pieces of coronavirus misinformation[28]. However, the company doesn’t share data about misinformation[29] on its platforms. Researchers and policymakers don’t know how much vaccine-related misinformation is on the platforms or how many people are seeing and sharing[30] it.

Another challenge is distinguishing between different types of engagement. My own research studying medical information on YouTube found different levels of engagement[31]: some people simply view information relevant to their interests, while others comment on the information and provide feedback. The question is how vaccine-related misinformation fits into people’s preexisting beliefs and to what extent their skepticism of vaccines is accentuated by what they are exposed to online.
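To make the view-versus-comment distinction concrete, here is a minimal Python sketch separating passive viewing from active engagement; the field names and threshold are illustrative assumptions rather than a real dataset schema:

```python
# Hypothetical sketch: classify engagement with medical videos as passive
# (viewing only) or active (commenting). Schema and threshold are assumptions.
videos = [
    {"id": "v1", "views": 120_000, "comments": 90},
    {"id": "v2", "views": 8_000, "comments": 400},
]

ACTIVE_RATIO = 0.01  # comments per view above this count as "active" (assumption)

for v in videos:
    ratio = v["comments"] / v["views"]
    level = "active engagement" if ratio >= ACTIVE_RATIO else "passive viewing"
    print(f"{v['id']}: {ratio:.4f} -> {level}")  # v1 passive, v2 active
```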

Social media companies can also partner with health organizations, medical journals and researchers to more thoroughly and credibly identify medical misinformation[32].


Researchers working to understand how misinformation spreads depend on social media companies for data about users’ behavior on their platforms. For instance, much of what researchers do know about anti-vaccine disinformation[34] on Facebook comes from CrowdTangle[35], Facebook’s data analysis tool for public information on its platforms.

Video: How to use CrowdTangle to see coronavirus information on Facebook and Instagram.
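For researchers with API access, a CrowdTangle query might look like the following Python sketch. The endpoint, parameters and response fields reflect CrowdTangle's public API as commonly documented, but should be checked against the current documentation; the token is a placeholder:

```python
# Hedged sketch of searching public posts via the CrowdTangle API.
# Endpoint and field names are assumptions based on the public API docs;
# verify against current documentation before use.
import requests

API_TOKEN = "YOUR_CROWDTANGLE_TOKEN"  # placeholder dashboard token

resp = requests.get(
    "https://api.crowdtangle.com/posts/search",
    params={"token": API_TOKEN, "searchTerm": "covid vaccine", "count": 20},
    timeout=30,
)
resp.raise_for_status()

for post in resp.json()["result"]["posts"]:
    stats = post.get("statistics", {}).get("actual", {})
    print(post.get("date"), stats.get("shareCount"), post.get("postUrl"))
```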

Researchers need more information from the companies, including ways to spot bot activity[36]. Facebook could follow its own example from when it provided data to researchers seeking to uncover Russian fake news campaigns[37] targeted at African American voters.
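As one illustration of what spotting bot activity could involve, here is a toy heuristic over per-account posting metadata, assuming researchers had access to such data; the thresholds are arbitrary assumptions for the sketch, not validated detection rules:

```python
# Toy bot-spotting heuristic. Data access, fields and thresholds are all
# illustrative assumptions; real detection uses far richer signals.
from datetime import datetime, timezone

NOW = datetime(2021, 8, 1, tzinfo=timezone.utc)  # fixed reference date for the example

def looks_automated(account):
    """Flag accounts posting at very high volume, especially young ones."""
    age_days = (NOW - account["created"]).days or 1
    posts_per_day = account["post_count"] / age_days
    return posts_per_day > 100 or (age_days < 7 and posts_per_day > 20)

account = {"created": datetime(2021, 7, 1, tzinfo=timezone.utc), "post_count": 40_000}
print(looks_automated(account))  # True: roughly 1,290 posts per day
```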

Data about social media will help researchers answer key questions about medical misinformation, and the answers in turn could lead to better ways of countering it.

References

  1. ^ an urgent threat to public health (www.hhs.gov)
  2. ^ “infodemic.” (www.un.org)
  3. ^ reduced the number of people who said they would get vaccinated (doi.org)
  4. ^ researcher who studies social media (scholar.google.com)
  5. ^ develop effective interventions against misinformation (dx.doi.org)
  6. ^ a bill to curb medical misinformation on social media (www.klobuchar.senate.gov)
  7. ^ drawing scorn (techpolicy.press)
  8. ^ fake news spreads faster (dx.doi.org)
  9. ^ among the content people engage with most (www.npr.org)
  10. ^ primed for engagement (theconversation.com)
  11. ^ create a rabbit-hole effect (doi.org)
  12. ^ are well organized to exploit the weaknesses (techpolicy.press)
  13. ^ manipulated on an industrial scale (www.ox.ac.uk)
  14. ^ a Russian campaign pushing disinformation (www.wsj.com)
  15. ^ less likely to be vaccinated (www.washingtonpost.com)
  16. ^ well-developed disinformation ecosystem (www.nature.com)
  17. ^ extends to offline anti-vaccine activism (www.washingtonpost.com)
  18. ^ AP Photo/Sue Ogrocki (newsroom.ap.org)
  19. ^ significant racial disparities (www.kff.org)
  20. ^ rife on Spanish-language Facebook (www.theguardian.com)
  21. ^ it was allowed on Facebook until July 2021 (www.cnn.com)
  22. ^ misinformation about the health benefits of wearing a mask (dx.doi.org)
  23. ^ post of COVID misinformation (www.nytimes.com)
  24. ^ comes from a dozen users who are still active on social media (www.counterhate.com)
  25. ^ businessman and physician Joseph Mercola and prominent anti-vaccine activist Robert F. Kennedy Jr. (www.businessinsider.com)
  26. ^ coordinated sharing of content (doi.org)
  27. ^ flag harmful content (techpolicy.press)
  28. ^ taken down 18 million pieces of coronavirus misinformation (www.bloomberg.com)
  29. ^ doesn’t share data about misinformation (arstechnica.com)
  30. ^ vaccine-related misinformation is on the platforms and how many people are seeing and sharing (slate.com)
  31. ^ found different levels of engagement (dx.doi.org)
  32. ^ thoroughly and credibly identify medical misinformation (dx.doi.org)
  33. ^ Sign up today (theconversation.com)
  34. ^ what researchers do know about anti-vaccine disinformation (monitoring.bbc.co.uk)
  35. ^ CrowdTangle (www.crowdtangle.com)
  36. ^ spot bot activity (www.bostonglobe.com)
  37. ^ seeking to uncover Russian fake news campaigns (www.bbc.com)


Read more https://theconversation.com/big-tech-has-a-vaccine-misinformation-problem-heres-what-a-social-media-expert-recommends-164987

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more
