Disinformation is rampant on social media – a social psychologist explains the tactics used against you
- Written by H. Colleen Sinclair, Associate Research Professor of Social Psychology, Louisiana State University
Information warfare abounds, and everyone online has been drafted[1] whether they know it or not.
Disinformation is deliberately generated misleading content disseminated for selfish or malicious purposes. Unlike misinformation, which may be shared unwittingly or with good intentions, disinformation aims to foment distrust, destabilize institutions, discredit good intentions[2], defame opponents and delegitimize sources of knowledge such as science and journalism.
Many governments engage in disinformation campaigns. For instance, the Russian government has used images of celebrities[3] to attract attention to anti-Ukraine propaganda. Meta, parent company of Facebook and Instagram, warned on Nov. 30, 2023, that China has stepped up its disinformation operations[4].
Disinformation is nothing new[5], and information warfare has been practiced by many countries, including the U.S.[6] But the internet gives disinformation campaigns unprecedented reach. Foreign governments[7], internet trolls[8], domestic and international extremists[9], opportunistic profiteers[10] and even paid disinformation agencies[11] exploit the internet to spread questionable content. Periods of civil unrest[12], natural disasters[13], health[14] crises and wars[15] trigger anxiety[16] and the hunt for information, which disinformation agents take advantage of.
Certainly it’s worth watching for the warning signs of misinformation[17] and dangerous speech[18], but disinformation agents employ additional tactics of their own.
It’s just a joke
Hahaganda[19] is a tactic[20] in which disinformation agents use memes, political comedy from state-run outlets, or speeches to make light of serious matters, attack others, minimize violence[21], dehumanize[22] and deflect blame.
This approach provides an easy defense: If challenged, the disinformation agents can say, “Can’t you take a joke?” often followed by accusations of being too politically correct.
Shhh … tell everyone
Rumor-milling is a tactic in which disinformation agents claim to have exclusive access to secrets[23] they allege are being purposefully concealed. They indicate that you will “only hear this here” and will imply that others are unwilling to share the alleged truth – for example, “The media won’t report this” or “The government doesn’t want you to know” and “I shouldn’t be telling you this … .”
But they do not insist that the information be kept secret, and will instead include encouragement to share it – for example, “Make this go viral” or “Most people won’t have the courage to share this.” It’s important to question how an author or speaker could have come by such “secret” information and what their motive is to prompt you to share it.
People are saying
Often disinformation has no real evidence, so instead disinformation agents will find or make up people[24] to support their assertions. This impersonation can take multiple forms. Disinformation agents will use anecdotes as evidence, especially sympathetic stories from vulnerable groups such as women or children.
Similarly, they may disseminate “concerned citizens’[25]” perspectives. These layperson experts present their social identity as providing the authority to speak on a matter: “As a mother …,” “As a veteran …,” “As a police officer ….” Convert communicators[26], or people who allegedly change from the “wrong” position to the “right” one, can be especially persuasive, such as the woman who got an abortion but regretted it. These people often don’t actually exist or may be coerced[27] or paid.
If ordinary people don’t suffice, fake experts[28] may be used. Some are fabricated, and you can watch out for “inauthentic user[29]” behavior, for example, by checking X – formerly Twitter – accounts using the Botometer[30]. But fake experts can come in different varieties.
- A faux expert is someone cited for their title who doesn’t have actual relevant expertise.
- A pseudoexpert is someone who claims relevant expertise but has no actual training[31].
- A junk expert is a sellout. They may have had expertise once but now say whatever is profitable. You can often find that these people have supported other dubious claims – for example, that smoking doesn’t cause cancer[32] – or work for institutes[33] that regularly produce questionable “scholarship.”[34]
- An echo expert is created when disinformation sources cite each other to lend credence to their claims. China and Russia routinely cite one another’s[35] newspapers.
- A stolen expert is someone who exists but was never actually contacted, and whose research is misinterpreted. Likewise, disinformation agents also steal credibility from known news sources, such as by typosquatting[36], the practice of setting up a domain name that closely resembles a legitimate organization’s.
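Tools like the Botometer mentioned above use trained machine-learning classifiers over many account features. Purely as an illustrative sketch – not Botometer’s actual model, API or thresholds, all of which are hypothetical here – a crude rule-based screen for “inauthentic user” behavior might look like this:

```python
# Crude, illustrative heuristic for "inauthentic user" behavior.
# This is NOT Botometer's model: Botometer uses a trained classifier
# over many account features. The field names and thresholds below
# are made-up examples for demonstration only.

def inauthenticity_flags(account: dict) -> list[str]:
    """Return a list of red flags for a dict of basic account stats."""
    flags = []
    age_days = max(account.get("age_days", 0), 1)
    posts_per_day = account.get("post_count", 0) / age_days
    if posts_per_day > 100:                      # hyperactive posting
        flags.append("extreme posting rate")
    followers = account.get("followers", 0)
    following = account.get("following", 0)
    if following > 0 and followers / following < 0.01:
        flags.append("follows many, followed by few")
    if account.get("default_avatar", False):
        flags.append("default profile image")
    if account.get("handle_digits", 0) >= 8:     # e.g. "user84720194"
        flags.append("auto-generated-looking handle")
    return flags

suspicious = {"age_days": 10, "post_count": 5000, "followers": 3,
              "following": 4000, "default_avatar": True, "handle_digits": 8}
print(inauthenticity_flags(suspicious))
```

No single flag proves an account is fake; real detectors weigh many such signals together, which is why dedicated tools beat eyeballing.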
You can check whether accounts, anecdotal or scientific, have been verified by other reliable sources[37]. Google the name. Check expertise status, source validity and interpretation of research. Remember, one story[38] or interpretation is not necessarily representative.
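As a rough illustration of how typosquatting screening can work, the sketch below flags domains that nearly, but not exactly, match a known legitimate domain, using Python’s standard-library difflib. The domain list and similarity threshold are assumptions for demonstration; real screening tools also check for homoglyphs, added hyphens and alternate top-level domains:

```python
# Minimal typosquatting screen: flag domains that closely resemble,
# but do not exactly match, a known legitimate domain.
from difflib import SequenceMatcher

# Hypothetical allowlist of legitimate domains for this demo.
LEGIT = ["nytimes.com", "bbc.co.uk", "reuters.com"]

def typosquat_suspects(domain: str, threshold: float = 0.85) -> list[str]:
    """Return legitimate domains that `domain` closely resembles
    without matching exactly."""
    return [legit for legit in LEGIT
            if domain != legit
            and SequenceMatcher(None, domain, legit).ratio() >= threshold]

print(typosquat_suspects("nytirnes.com"))   # "rn" mimics "m" → ['nytimes.com']
```

The same near-miss logic is why reading a link’s domain character by character, rather than at a glance, catches lookalike sites.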
It’s all a conspiracy
Conspiratorial narratives involve some malevolent force – for example, “the deep state” – engaged in covert actions[39] with the aim of causing harm to society. That certain conspiracies such as MK-Ultra[40] and Watergate have been confirmed is often offered as evidence for the validity of new unfounded conspiracies.
Nonetheless, disinformation agents find that constructing a conspiracy is an effective means of reminding people of past reasons to distrust governments, scientists or other trustworthy sources[41].
But extraordinary claims require extraordinary evidence. Remember, the conspiracies that were ultimately unveiled had evidence – often from sources like investigative journalists, scientists and government investigations. Be particularly wary of conspiracies that try to delegitimize knowledge-producing institutions[42] like universities, research labs, government agencies and news outlets by claiming that they are in on a cover-up.
Good vs. evil
Disinformation often serves the dual purpose of making the originator look good and their opponents look bad. It can take this further by painting issues as a battle between good and evil, using accusations of evil to legitimize violence[43]. Russia is particularly fond of accusing others of being secret Nazis[44], pedophiles[45] or Satanists[46]. Meanwhile, it often depicts its own soldiers as helping children and the elderly.
Be especially wary of accusations of atrocities[47] like genocide, especially under the attention-grabbing “breaking news” headline. Accusations[48] abound. Verify the facts and how the information was obtained.
Are you with us or against us?
A false dichotomy narrative sets up the reader to believe that they have one of two mutually exclusive options: a good or a bad one, a right or a wrong one, a red pill or a blue pill. You can accept their version of reality or be an idiot or “sheeple.”
There are always more options than those being presented, and issues are rarely so black and white. This is just one of the tactics in brigading[49], where disinformation agents seek to silence dissenting viewpoints by casting them as the wrong choice.
Turning the tables
Whataboutism[50] is a classic Russian disinformation technique used to deflect attention from one’s own wrongdoings by alleging the wrongdoings of others. These allegations about the actions of others may be true or false but are nonetheless irrelevant[51] to the matter at hand. The potential past wrongs of one group do not mean you should ignore the current wrongs of another.
Disinformation agents also often cast their group as the wronged party. They only engage in disinformation because their “enemy” engages in disinformation against them; they only attack to defend; and their reaction was appropriate, while that of others was an overreaction[52]. This type of competitive victimhood[53] is particularly pervasive when groups have been embedded in a long-lasting conflict.
In all of these cases, the disinformation agent is aware that they are deflecting, misleading, trolling or outright fabricating. If you don’t believe them, they at least want to make you question what, if anything, you can believe.
Before you hand over your money, you likely look into the things you buy rather than taking the advertising at face value. The same should go for the information you buy into.
References
- ^ abounds, and everyone online has been drafted (www.nytimes.com)
- ^ discredit good intentions (www.jstor.org)
- ^ used images of celebrities (www.wired.com)
- ^ has stepped up its disinformation operations (www.npr.org)
- ^ nothing new (blogs.lse.ac.uk)
- ^ including the U.S. (warontherocks.com)
- ^ Foreign governments (www.nytimes.com)
- ^ internet trolls (misinforeview.hks.harvard.edu)
- ^ extremists (unicri.it)
- ^ opportunistic profiteers (counterhate.com)
- ^ paid disinformation agencies (phys.org)
- ^ civil unrest (www.psychologytoday.com)
- ^ natural disasters (www.nytimes.com)
- ^ health (publichealth.jhu.edu)
- ^ wars (www.nytimes.com)
- ^ trigger anxiety (www.ncbi.nlm.nih.gov)
- ^ misinformation (theconversation.com)
- ^ dangerous speech (theconversation.com)
- ^ Hahaganda (www.cde.ual.es)
- ^ tactic (stratcomcoe.org)
- ^ minimize violence (euvsdisinfo.eu)
- ^ dehumanize (www.psychologytoday.com)
- ^ claim to have exclusive access to secrets (doi.org)
- ^ find or make up people (press.princeton.edu)
- ^ concerned citizens’ (doi.org)
- ^ Convert communicators (doi.org)
- ^ coerced (www.bbc.com)
- ^ fake experts (www.france24.com)
- ^ inauthentic user (datajournalism.com)
- ^ Botometer (botometer.osome.iu.edu)
- ^ no actual training (doi.org)
- ^ smoking doesn’t cause cancer (dx.doi.org)
- ^ institutes (redwoods.libguides.com)
- ^ scholarship (doi.org)
- ^ cite one another’s (www.brookings.edu)
- ^ typosquatting (stratheia.com)
- ^ have been verified by other reliable sources (www.forbes.com)
- ^ one story (www.ncbi.nlm.nih.gov)
- ^ engaged in covert actions (doi.org)
- ^ MK-Ultra (www.smithsonianmag.com)
- ^ distrust governments, scientists or other trustworthy sources (doi.org)
- ^ delegitimize knowledge-producing institutions (doi.org)
- ^ legitimize violence (doi.org)
- ^ secret Nazis (www.nytimes.com)
- ^ pedophiles (www.voanews.com)
- ^ Satanists (www.atlanticcouncil.org)
- ^ accusations of atrocities (doi.org)
- ^ Accusations (www.bbc.com)
- ^ brigading (doi.org)
- ^ Whataboutism (theconversation.com)
- ^ true or false but are nonetheless irrelevant (hedgehogreview.com)
- ^ overreaction (doi.org)
- ^ competitive victimhood (doi.org)