
  • Written by K. Hazel Kwon, Associate Professor of Journalism and Digital Audiences, Arizona State University
Disinformation is spreading beyond the realm of spycraft to become a shady industry – lessons from South Korea

Disinformation, the practice of blending real and fake information with the goal of duping a government or influencing public opinion, has its origins in the Soviet Union. But disinformation is no longer the exclusive domain of government intelligence agencies.

Today’s disinformation scene has evolved into a marketplace in which services are contracted, laborers are paid and shameless opinions and fake readers are bought and sold. This industry is emerging around the world. Some of the private-sector players are driven by political motives, some by profit and others by a mix of the two.

Public relations firms have recruited social media influencers in France and Germany[1] to spread falsehoods. Politicians have hired staff to create fake Facebook accounts in Honduras[2]. And Kenyan Twitter influencers[3] are paid 15 times more than many people make in a day for promoting political hashtags. Researchers at the University of Oxford have tracked government-sponsored disinformation activities in 81 countries and private-sector disinformation operations in 48 countries[4].

South Korea has been at the forefront of online disinformation. Western societies began to raise concerns about disinformation in 2016, triggered by disinformation related to the 2016 U.S. presidential election and Brexit. But in South Korea, media reported the first formal disinformation operation in 2008. As a researcher who studies digital audiences[5], I’ve found that South Korea’s 13-year-long disinformation history demonstrates how technology, economics and culture interact to enable the disinformation industry.

Most importantly, South Korea’s experience offers a lesson for the U.S. and other countries. The ultimate power of disinformation lies less in the people who perpetrate it or the techniques they use than in the ideas and memories a society is vulnerable to, and in how prone that society is to fueling the rumor mill.

From dirty politics to dirty business

The origin of South Korean disinformation can be traced back to the nation’s National Intelligence Service, which is equivalent to the U.S. Central Intelligence Agency. The NIS formed teams in 2010 to interfere in domestic elections[6] by attacking a political candidate it opposed.

The NIS hired more than 70 full-time workers who managed fake, or so-called sock puppet[7], accounts. The agency recruited a group called Team Alpha, which was composed of civilian part-timers who had ideological and financial interests in working for the NIS. By 2012, the scale of the operation had grown to 3,500 part-time workers[8].

South Korean President Moon Jae-in (left) campaigning in 2014 for Kim Kyoung-soo (right), who became governor of South Gyeongsang Province in 2018 but was subsequently convicted of opinion rigging. Udenjan/WikiCommons, CC BY[9][10]

Since then the private sector has moved into the disinformation business. For example, a shadowy publishing company led by an influential blogger was involved in a high-profile opinion-rigging scandal[11] between 2016 and 2018. The company’s client was a close political aide of the current president, Moon Jae-in.

In contrast to NIS-driven disinformation campaigns, which use disinformation as a propaganda tool for the government, some of the private-sector players are chameleonlike, changing ideological and topical positions in pursuit of their business interests. These private-sector operations have achieved greater cost effectiveness than government operations by skillfully using bots to amplify fake engagements[12], involving social media entrepreneurs like YouTubers[13] and outsourcing trolling to cheap laborers[14].

Narratives that strike a nerve

In South Korea, Cold War rhetoric has been particularly visible across all types of disinformation operations. The campaigns typically portray the conflict with North Korea and the battle against Communism as being at the center of public discourse in South Korea. In reality, nationwide polls have painted a very different picture. For example, even when North Korea’s nuclear threat was at a peak in 2017, fewer than 10 percent of respondents[15] picked North Korea’s saber-rattling as their priority concern, compared with more than 45 percent who selected economic policy.

Across all types of purveyors and techniques, political disinformation in South Korea has amplified anti-Communist nationalism and denigrated the nation’s dovish diplomacy toward North Korea. My research on South Korean social media rumors[16] in 2013 showed that the disinformation rhetoric continued on social media even after the formal disinformation campaign ended, which indicates how powerful these themes are. Today my research team and I continue to see references to the same themes.

Much of the disinformation trafficked in South Korea involves nationalistic anti-Communist narratives similar to this protester’s anti-North Korea message. Photo by Jung Yeon-je/AFP via Getty Images[17]

The dangers of a disinformation industry

The disinformation industry is enabled by the three prongs of today’s digital media industry: an attention economy, algorithmic and computational technologies, and a participatory culture. In online media, the most important currency is audience attention. Metrics such as the number of page views, likes, shares and comments quantify attention, which is then converted into economic and social capital.

Ideally, these metrics reflect networked users’ spontaneous, voluntary participation. More often than not, though, disinformation operations manufacture them by deploying bots, hiring influencers, paying for crowdsourced engagement and devising computational tricks to game a platform’s algorithms.

The expansion of the disinformation industry is troubling because it distorts how public opinion is perceived by researchers, the media and the public itself. Historically, democracies have relied on polls to understand public opinion. Despite their limitations, nationwide polls conducted by credible organizations, such as Gallup[18] and Pew Research[19], follow rigorous methodological standards to represent the distribution of opinions in society in as representative a manner as possible.

Public discourse on social media has emerged as an alternative means of assessing public opinion. Digital audience and web traffic analytic tools are widely available to measure the trends of online discourse. However, people can be misled when purveyors of disinformation manufacture opinions expressed online and falsely amplify the metrics around those opinions.

Meanwhile, the persistence of anti-Communist nationalist narratives in South Korea shows that disinformation purveyors’ rhetorical choices are not random. To counter the disinformation industry wherever it emerges, governments, media and the public need to understand not just the who and the how, but also the what – a society’s controversial ideologies and collective memories. These are the most valuable currency in the disinformation marketplace.

[The Conversation’s science, health and technology editors pick their favorite stories. Weekly on Wednesdays[20].]

References

  1. ^ France and Germany (www.nytimes.com)
  2. ^ Honduras (www.theguardian.com)
  3. ^ Kenyan Twitter influencers (www.wired.com)
  4. ^ private-sector disinformation operations in 48 countries (demtech.oii.ox.ac.uk)
  5. ^ studies digital audiences (scholar.google.com)
  6. ^ to interfere in domestic elections (www.theguardian.com)
  7. ^ sock puppet (doi.org)
  8. ^ 3,500 part-time workers (www.brookings.edu)
  9. ^ Udenjan/WikiCommons (commons.wikimedia.org)
  10. ^ CC BY (creativecommons.org)
  11. ^ opinion-rigging scandal (www.koreaherald.com)
  12. ^ using bots to amplify fake engagements (ojs.aaai.org)
  13. ^ YouTubers (restofworld.org)
  14. ^ outsourcing trolling to cheap laborers (globalvoices.org)
  15. ^ fewer than 10 percent of respondents (www.nytimes.com)
  16. ^ South Korean social media rumors (doi.org)
  17. ^ Photo by Jung Yeon-je/AFP via Getty Images (www.gettyimages.com)
  18. ^ Gallup (www.gallup.com)
  19. ^ Pew Research (www.pewresearch.org)
  20. ^ Weekly on Wednesdays (theconversation.com)


Read more https://theconversation.com/disinformation-is-spreading-beyond-the-realm-of-spycraft-to-become-a-shady-industry-lessons-from-south-korea-168054

Metropolitan republishes selected articles from The Conversation USA with permission
