On Twitter, bots spread conspiracy theories and QAnon talking points

Written by Emilio Ferrara, Associate Professor of Computer Science, USC Viterbi School of Engineering; Associate Professor of Communication, USC Annenberg School for Communication and Journalism

Americans who seek political insight and information on Twitter should know how much of what they are seeing is the result of automated propaganda campaigns.

Nearly four years after my collaborators and I revealed how automated Twitter accounts[1] were distorting online election discussions[2] in 2016, the situation appears to be no better. That’s despite the efforts of policymakers, technology companies and even the public to root out disinformation campaigns on social media.

In our latest study, we collected 240 million election-related tweets[3] mentioning presidential candidates and election-related keywords, posted between June 20 and Sept. 9, 2020. We looked for activity from automated (or bot) accounts, and the spread of distorted or conspiracy theory narratives.
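As a rough illustration of that kind of keyword-based collection, the minimal Python sketch below filters a local dump of tweets for election-related terms. The keyword list and file format are placeholders for illustration, not the study's actual tracking terms or pipeline.

```python
import json

# Hypothetical keyword list -- the study's actual tracking terms are not
# listed in this article, so these are illustrative placeholders.
ELECTION_KEYWORDS = {"biden", "trump", "election2020", "vote", "maga"}

def is_election_related(tweet_text: str) -> bool:
    """Return True if the tweet mentions any tracked keyword (case-insensitive)."""
    text = tweet_text.lower()
    return any(keyword in text for keyword in ELECTION_KEYWORDS)

def filter_tweets(path: str):
    """Yield tweets from a JSON-lines dump that match the keyword list."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            tweet = json.loads(line)
            if is_election_related(tweet.get("text", "")):
                yield tweet

if __name__ == "__main__":
    # Example: count matching tweets in a local dump of collected statuses.
    matches = sum(1 for _ in filter_tweets("tweets.jsonl"))
    print(f"{matches} election-related tweets found")
```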

We learned that on Twitter, many conspiracy theories, including QAnon, may not be quite as popular among real people as media reports indicate. But automation can significantly increase the distribution of these ideas, inflating their power by reaching unsuspecting users who may be drawn in not by posts from their fellow humans, but from bots programmed to spread the word.

Bots amplify conspiracy theories

Typically, bots are created by people or groups who want to amplify certain ideas or points of view. We found that bots are roughly equally active in online discussions of both right-wing and left-wing perspectives, making up about 5% of the Twitter accounts active in those threads.

Bots appear to thrive in political groups discussing conspiracy theories, making up nearly 13% of the accounts tweeting or retweeting posts with conspiracy theory-related hashtags and keywords.
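To show how percentages like these can be computed once each account has been assigned a bot score by a detection tool, here is a hedged Python sketch. The CSV schema, column names and the 0.5 threshold are assumptions made for illustration, not the study's actual data format or cutoff.

```python
import csv

BOT_SCORE_THRESHOLD = 0.5  # assumption: accounts scoring above this are treated as bots

def bot_share(path: str, group: str) -> float:
    """
    Compute the fraction of accounts in a discussion group flagged as bots.
    Expects a CSV with columns: account_id, group, bot_score
    (a hypothetical schema; the study's released data may differ).
    """
    total, bots = 0, 0
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["group"] != group:
                continue
            total += 1
            if float(row["bot_score"]) >= BOT_SCORE_THRESHOLD:
                bots += 1
    return bots / total if total else 0.0

if __name__ == "__main__":
    for g in ("right_leaning", "left_leaning", "conspiracy"):
        print(g, f"{bot_share('accounts.csv', g):.1%}")
```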

Then we looked more closely at three major categories of conspiracies. One was a category of alleged scandals described using the suffix “-gate,” such as “Pizzagate” and “Obamagate.” The second was COVID-19-related political conspiracies, such as biased claims that the virus was deliberately spread by China or that it could be spread via products imported from China. The third was the QAnon movement, which has been called a “collective delusion[4]” and a “virtual cult[5].”

These three categories overlap: Accounts tweeting about material in one of them were likely to also tweet about material in at least one of the others.
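That kind of overlap can be measured with simple set arithmetic over the accounts that used each category's hashtags. The sketch below assumes a small, illustrative hashtag dictionary and a hypothetical tweet record format, not the study's full keyword lists.

```python
# Illustrative hashtag lists for each conspiracy category.
CATEGORY_HASHTAGS = {
    "gate": {"#pizzagate", "#obamagate"},
    "covid": {"#chinavirus", "#plandemic"},
    "qanon": {"#qanon", "#wwg1wga"},
}

def accounts_per_category(tweets):
    """Map each category to the set of user IDs that used one of its hashtags."""
    members = {cat: set() for cat in CATEGORY_HASHTAGS}
    for tweet in tweets:
        tags = {t.lower() for t in tweet["hashtags"]}
        for cat, cat_tags in CATEGORY_HASHTAGS.items():
            if tags & cat_tags:
                members[cat].add(tweet["user_id"])
    return members

def multi_category_share(members):
    """Fraction of conspiracy-tweeting accounts active in two or more categories."""
    all_accounts = set().union(*members.values())
    overlapping = {
        uid for uid in all_accounts
        if sum(uid in s for s in members.values()) >= 2
    }
    return len(overlapping) / len(all_accounts) if all_accounts else 0.0
```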

The link to right-wing media

We found that accounts prone to sharing conspiratorial narratives are significantly more likely than other accounts to tweet links to, or retweet posts from, right-leaning media such as One America News Network, Infowars and Breitbart.

[Deep knowledge, daily. Sign up for The Conversation’s newsletter[6].]

Bots play an important role as well: More than 20% of the accounts sharing content from those hyperpartisan platforms are bots. And most of those accounts also distribute conspiracy-related content.
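Identifying which accounts share content from those outlets comes down to matching the domains of shared links against a media list. Here is a minimal Python sketch with an illustrative domain set; the study's actual media classification is documented in the paper and released dataset.

```python
from urllib.parse import urlparse

# Illustrative domain list based on the outlets named above.
HYPERPARTISAN_DOMAINS = {"oann.com", "infowars.com", "breitbart.com"}

def shares_hyperpartisan_link(tweet_urls) -> bool:
    """Check whether any URL in a tweet points to one of the tracked outlets."""
    for url in tweet_urls:
        domain = urlparse(url).netloc.lower().removeprefix("www.")
        if domain in HYPERPARTISAN_DOMAINS:
            return True
    return False
```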

Twitter has recently tried to limit[7] the spread of QAnon[8] and other conspiracy theories on its site. But that may not be enough to stem the tide. To contribute to the global effort against social media manipulation, we have publicly released the dataset[9] used in our work to assist future studies[10].

References

  1. ^ how automated Twitter accounts (theconversation.com)
  2. ^ distorting online election discussions (firstmonday.org)
  3. ^ collected 240 million election-related tweets (firstmonday.org)
  4. ^ collective delusion (www.buzzfeednews.com)
  5. ^ virtual cult (www.cnn.com)
  6. ^ Sign up for The Conversation’s newsletter (theconversation.com)
  7. ^ recently tried to limit (www.washingtonpost.com)
  8. ^ spread of QAnon (mediaschool.indiana.edu)
  9. ^ publicly released the dataset (github.com)
  10. ^ assist future studies (arxiv.org)


Read more https://theconversation.com/on-twitter-bots-spread-conspiracy-theories-and-qanon-talking-points-149039

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more