
How mainstream media helps weaponize far-right conspiracy theories

Once an anti-Semitic rumor moved from the fringe to the mainstream, it took less than two weeks for violence to erupt. The false allegation that liberal philanthropist George Soros was funding or supporting a caravan of Honduran refugees heading to the U.S. spread wildly from a single tweet posted on Oct. 14.

Along with[1] far-right memes[2], that allegation helped motivate both an alleged[3] mail-bomber[4] and a mass[5] shooter[6] at a Pittsburgh synagogue. The way these messages traveled across the internet in this short time span is just one example of how extremist messages and memes circulate with incredible speed across mainstream social media platforms.

From our vantage point as researchers of visual[7] and digital communication[8], memes – short, often image-based forms of communication – are powerful engines of persuasion, even though they can appear innocuous or even humorous. Perhaps the best-known examples are LOLCats[9] memes, which pair funny pictures of cats with customizable captions. Memes can disseminate information quickly because they invite people to share or remix content with little effort, making widespread dispersal more likely.

Memes need not be humorous or factual to be functional. All they need to do is attract attention online, which often translates into mainstream media coverage. That makes memes potent tools for distributing disinformation. Moreover, the online and mainstream platforms that amplify memes’ circulation can weaponize false claims and encourage conspiracy theorists – sometimes toward violence.

Memes move conspiracies

Understanding how these messages embolden anti-Semitism and other forms of terrorism involves grappling with how white supremacists use digital media. As we detail in our forthcoming book “Make America Meme Again[10],” messages and memes weaponized in far-right networks are deft political tools[11] that move swiftly across social and traditional media. Because memes are stealthy political messages that usually offer rebellious or irreverent humor, they can be easily retweeted, shared or even pasted to the side of a van.

Before the dawn of today’s social media networks, right-wing extremists were more difficult to find, often gathering in local communities and later, discreetly, in online forums unknown to the vast majority of internet users. Paranoid, rabid discourse of this ilk still simmers in those darker corners of the internet. Today, memes help right-wing extremists communicate with one another and with mainstream audiences.

Soros has been demonized by right-wing activists for years, if not decades[12]. Long before the Pittsburgh attack and the mail bombings, conspiracy theories about him were common in all sorts of right-wing discussion spaces – including Infowars, 4chan, Reddit and Gab. Starting in March 2018, the terms “caravan,” “immigrants” and “Soros”[13] were frequently posted together on Twitter and Facebook. Memes depicting Soros as an evil fascist[14] facilitating an invasion were commonplace.

The alleged mail bomber covered his van with “images and slogans[15] often found on fringe right-wing social media accounts.” But the suspect didn’t find them on radical sites where white supremacists hide. Instead, based on his social media activity[16], he likely was radicalized in the same place most people look at cute photos of friends’ kids and check up on Aunt Beatrice – Facebook.

From fringe to network

Social media platforms have tried to push hate speech and uninformed conspiracy theories[17] off their sites, but that’s a difficult task both technologically and ethically[18]. Often, conspiracy promoters find ways to get their ideas into well-trafficked social media[19], where algorithms promote posts[20] that garner lots of responses – whether appreciative or outraged.

Despite repeated[21] fact[22] checking[23], the conspiracy grew. Bots and other automated accounts drove roughly 60 percent[24] of online talk about the caravan – but people were part of it too, often sharing posts without doing any sort of verification[25]. Ultimately, these messages and memes may have inspired terrorism.

By October, discussions of the “caravan of immigrants” had grown beyond social media. Within a week of that Oct. 14 tweet alleging Soros was funding a group of refugees seeking asylum[26], far-right commentator Alex Jones broadcast the conspiracy on Infowars, to his audience of over 1 million daily visitors[27].

The conspiracy grew from there, with the Infowars broadcast and related images popping up on nearly every platform. Eventually the conspiracy reached hundreds of thousands of potential viewers[28] – including the men who allegedly became the mail bomber and the synagogue[29] shooter[30].

The two men may never have known of each other or each other’s plans. But their actions intertwined with a viciously networked conspiracy theory.

Connecting to mainstream media

Once there is enough social media attention on a topic or claim, it may be covered in more traditional news outlets[31]. That can spread the idea even more widely, and lend credence to inaccuracies and lies. Politicians may also notice online discussion and join in, as U.S. Sen. Ted Cruz[32] and a clerk for Texas’ Harris County[33] did with the purported Soros connection to the migrant caravan.

Conspiratorial ideas often circulate in an echo chamber[34], in which each post draws more attention than the last, generating stronger outrage and escalating the conspiracy. The average user who encounters a conspiratorial meme may not believe its message, but many users will. Even people who don’t believe it at first might come to assume it’s true[35] after seeing the idea repeated by several different sources. Still others might spread the conspiracy simply to delight in the distress it causes others.

Demonize, divide, conquer

Memes, tweets and other forms of propaganda are designed to rile up constituents. Scaring people with a purported invasion was one way to inflame voters as they headed to the polls in the midterm elections.

President Donald Trump has historically[36] spread[37] far-right[38] conspiracy[39] theories[40] with little regard[41] for the truth[42]. Just before the election – after the mail bomb attempts and the tragedy in Pittsburgh – Trump himself explicitly repeated[43] the conspiracy about Soros.

When anti-Semitic, racist and xenophobic ideas spread through social media networks, they can infect a host of mainstream information sources – and make fear and violence more likely[44]. That broadens the picture of a dangerous world from which people need protection. Fear appeals of this sort can influence voting, and even push people to take matters into their own violent hands. Until social media platforms or federal agencies find ways to diminish extremism, the proliferation of far-right memes, videos and texts will continue to imperil the citizenry.

References

  1. ^ Along with (www.nytimes.com)
  2. ^ far-right memes (slate.com)
  3. ^ alleged (www.washingtonpost.com)
  4. ^ mail-bomber (www.cjr.org)
  5. ^ mass (www.vox.com)
  6. ^ shooter (www.vox.com)
  7. ^ researchers of visual (www.baylor.edu)
  8. ^ digital communication (scholar.google.com)
  9. ^ LOLCats (knowyourmeme.com)
  10. ^ Make America Meme Again (www.peterlang.com)
  11. ^ deft political tools (www.nytimes.com)
  12. ^ right-wing activists for years, if not decades (www.theatlantic.com)
  13. ^ “caravan,” “immigrants” and “Soros” (medium.com)
  14. ^ depicting Soros as an evil fascist (www.bbc.com)
  15. ^ images and slogans (www.nytimes.com)
  16. ^ based on his social media activity (www.dailydot.com)
  17. ^ tried to push hate speech and uninformed conspiracy theories (www.nytimes.com)
  18. ^ difficult task both technologically and ethically (theconversation.com)
  19. ^ well-trafficked social media (www.dailydot.com)
  20. ^ algorithms promote posts (theconversation.com)
  21. ^ repeated (www.usatoday.com)
  22. ^ fact (www.pbs.org)
  23. ^ checking (www.factcheck.org)
  24. ^ automated accounts drove roughly 60 percent (www.vanityfair.com)
  25. ^ sharing posts without doing any sort of verification (www.washingtonpost.com)
  26. ^ a group of refugees seeking asylum (www.usatoday.com)
  27. ^ 1 million daily visitors (www.statesman.com)
  28. ^ hundreds of thousands of potential viewers (www.usatoday.com)
  29. ^ synagogue (www.wired.com)
  30. ^ shooter (archive.is)
  31. ^ covered in more traditional news outlets (theconversation.com)
  32. ^ Ted Cruz (twitter.com)
  33. ^ Harris County (www.texasobserver.org)
  34. ^ echo chamber (medium.com)
  35. ^ come to assume it’s true (theconversation.com)
  36. ^ has historically (www.aljazeera.com)
  37. ^ spread (www.businessinsider.com)
  38. ^ far-right (www.cnn.com)
  39. ^ conspiracy (time.com)
  40. ^ theories (www.cnn.com)
  41. ^ little regard (www.politifact.com)
  42. ^ the truth (www.newsweek.com)
  43. ^ explicitly repeated (www.washingtonpost.com)
  44. ^ fear and violence more likely (theconversation.com)

Authors: Heather Woods, Assistant Professor of Rhetoric and Technology, Kansas State University

Read more http://theconversation.com/how-mainstream-media-helps-weaponize-far-right-conspiracy-theories-106223

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more