
If Big Tech has the will, here are ways research shows self-regulation can work

  • Written by Anjana Susarla, Omura-Saxena Professor of Responsible AI, Michigan State University

Governments and observers across the world have repeatedly raised concerns about the monopoly power of Big Tech companies[1] and the role the companies play in disseminating misinformation. In response, Big Tech companies have tried to preempt regulations by regulating themselves[2].

Facebook’s announcement that its Oversight Board will make a decision[3] about whether former President Donald Trump can regain access to his account after the company suspended it, along with other high-profile moves by technology companies to address misinformation, has reignited the debate about what responsible self-regulation by technology companies should look like.

Research shows three key ways social media self-regulation can work: deprioritize engagement, label misinformation and crowdsource accuracy verification.

Deprioritize engagement

Social media platforms are built for constant interaction[4], and the companies design the algorithms that choose which posts people see[5] to keep their users engaged. Studies show falsehoods spread faster than truth on social media[6], often because people find news that triggers emotions to be more engaging[7], which makes it more likely they will read, react to and share such news. This effect gets amplified through algorithmic recommendations. My own work[8] shows that people engage with YouTube videos about diabetes more often when the videos are less informative.

Most Big Tech platforms also operate without the gatekeepers or filters[9] that govern traditional sources of news and information. Their vast troves of fine-grained and detailed demographic data[10] give them the ability to “microtarget” small numbers of users[11]. This, combined with algorithmic amplification of content designed to boost engagement, can have a host of negative consequences for society, including digital voter suppression[12], the targeting of minorities for disinformation[13] and discriminatory ad targeting[14].

Deprioritizing engagement in content recommendations should lessen the “rabbit hole” effect of social media[15], where people look at post after post, video after video. The algorithmic design of Big Tech platforms[16] prioritizes new and microtargeted content, which fosters an almost unchecked proliferation of misinformation. Apple CEO Tim Cook recently summed up the problem[17]: “At a moment of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement – the longer the better – and all with the goal of collecting as much data as possible.”

Apple CEO Tim Cook criticized social media companies for prioritizing engagement over battling misinformation.
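The tradeoff Cook describes can be made concrete with a toy ranking formula. The sketch below is purely illustrative (no platform publishes its real scoring function): a feed score blends predicted engagement with source trustworthiness, and shrinking the hypothetical `engagement_weight` parameter is what “deprioritizing engagement” amounts to.

```python
# Illustrative sketch, not any platform's actual formula: a feed-ranking
# score that blends predicted engagement with source trustworthiness.
# Lowering engagement_weight "deprioritizes engagement".

def rank_score(predicted_engagement: float,
               source_trust: float,
               engagement_weight: float = 0.5) -> float:
    """Both inputs are assumed to be normalized to [0, 1]."""
    return (engagement_weight * predicted_engagement
            + (1.0 - engagement_weight) * source_trust)

# A sensational post from an untrustworthy source vs. a duller post
# from a reliable one:
viral_untrusted = rank_score(0.9, 0.1, engagement_weight=0.9)  # engagement dominates
dull_trusted = rank_score(0.3, 0.9, engagement_weight=0.9)

# With engagement dominating (weight 0.9) the viral post ranks higher;
# deprioritizing engagement (weight 0.2) reverses the ordering.
```

With a high weight the recommender behaves like the engagement-maximizing systems criticized above; with a low weight, trustworthy-but-dull content rises.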

Label misinformation

The technology companies could adopt a content-labeling system to identify whether a news item is verified. During the 2020 U.S. election, Twitter announced a civic integrity policy[18] under which tweets labeled as disputed or misleading would not be recommended by its algorithms[19]. Research shows that labeling works. Studies suggest that applying labels to posts from state-controlled media outlets[20], such as the Russian media channel RT, can mitigate the effects of misinformation.
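As a rough illustration of source-based labeling, the sketch below attaches a “state-controlled media” label to posts whose source domain appears in a maintained list. The list contents, field names and label text here are hypothetical, not any platform’s actual policy data.

```python
# Hypothetical sketch of source-based labeling: attach a label when a
# post's source appears in a maintained list of state-controlled outlets.
# The list and the "source_domain" field are illustrative assumptions.

STATE_CONTROLLED = {"rt.com"}

def label_post(post: dict) -> dict:
    """Return a copy of the post with any applicable labels attached."""
    labels = []
    if post.get("source_domain") in STATE_CONTROLLED:
        labels.append("state-controlled media")
    return {**post, "labels": labels}
```

In practice the label would be rendered alongside the post, and labeled posts could also be excluded from algorithmic recommendation, as in Twitter’s policy above.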

In an experiment, researchers hired anonymous temporary workers to label trustworthy posts[21]. The posts were subsequently displayed on Facebook with the labels the crowd workers had applied. In that experiment, crowd workers from across the political spectrum were able to distinguish between mainstream sources and hyperpartisan or fake news sources, suggesting that crowds often do a good job of telling real news from fake.

Experiments also show that individuals with some exposure to news sources[22] can generally distinguish between real and fake news. Other experiments found that providing a reminder about the accuracy of a post[23] increased the likelihood that participants shared accurate posts more than inaccurate posts.
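The crowdsourcing results above boil down to an aggregation step. Here is a minimal sketch, assuming each worker rates a source as “mainstream” or “fake” and a plain majority decides; real studies weight raters and use finer-grained scales.

```python
from collections import Counter

# Minimal sketch of crowd-label aggregation: each worker submits a
# rating and a simple majority decides the final label. The two-way
# "mainstream"/"fake" scale is an illustrative simplification.

def majority_label(ratings: list[str]) -> str:
    """Return the most common rating among the workers."""
    (label, _), = Counter(ratings).most_common(1)
    return label
```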

In my own work, I have studied how combinations of human annotators, or content moderators, and artificial intelligence algorithms – what is referred to as human-in-the-loop intelligence – can be used to classify health care-related videos on YouTube[24]. While it is not feasible to have medical professionals watch every single YouTube video on diabetes, it is possible to have a human-in-the-loop method of classification. For example, my colleagues and I recruited subject-matter experts to give feedback to AI algorithms, which resulted in better assessments of the content of posts and videos.
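One common way to structure such a human-in-the-loop pipeline is confidence-based triage: trust the algorithm’s label above a confidence threshold, and route uncertain items to experts whose labels can later feed back into retraining. The sketch below is a generic illustration, not the actual system from the study; `demo_model` and the 0.8 threshold are stand-ins.

```python
# Generic human-in-the-loop triage sketch (not the study's actual system):
# the model's label is accepted above a confidence threshold; uncertain
# videos are routed to subject-matter experts for review.

def triage(videos, model, threshold=0.8):
    """Split videos into auto-labeled and expert-review queues."""
    auto, needs_expert = [], []
    for video in videos:
        label, confidence = model(video)
        bucket = auto if confidence >= threshold else needs_expert
        bucket.append((video, label, confidence))
    return auto, needs_expert

# Stand-in model for demonstration: returns a fixed label with the
# video's stored confidence value.
demo_model = lambda v: ("informative", v["confidence"])

auto, needs_expert = triage(
    [{"confidence": 0.95}, {"confidence": 0.55}], demo_model)
```

The expert-reviewed items are exactly the ones where human judgment adds the most value, which is what makes the approach feasible at scale.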

Tech companies have already employed such approaches. Facebook uses a combination of fact-checkers and similarity-detection algorithms[25] to screen COVID-19-related misinformation. The algorithms detect duplications and close copies[26] of misleading posts.
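Similarity detection of this kind can be illustrated with Jaccard similarity over word shingles, a classic near-duplicate technique. Facebook’s production system is far more sophisticated; this sketch only shows how lightly edited copies of a flagged post remain detectable.

```python
# Illustrative near-duplicate detection via Jaccard similarity over word
# trigrams (shingles). A close copy of a flagged post shares most of its
# shingles even after small edits, so its similarity score stays high.

def shingles(text: str, n: int = 3) -> set:
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    """Similarity in [0, 1]: shared shingles over total shingles."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A post flagged by fact-checkers can then be compared against new posts, and anything above a similarity threshold screened automatically.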

Community-based enforcement

Twitter recently announced that it is launching a community forum, Birdwatch[27], to combat misinformation. While Twitter hasn’t provided details about how this will be implemented, a crowd-based verification mechanism that adds up votes or down votes[28] to trending posts, combined with newsfeed algorithms that down-rank content[29] from untrustworthy sources, could help reduce misinformation.
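Since Birdwatch’s mechanics are unspecified, the following is a speculative sketch of how such vote-based down-ranking could work: a smoothed up/down-vote ratio estimates a post’s trustworthiness, and the feed sorts low-trust posts to the bottom. Laplace smoothing keeps scores moderate until enough votes accumulate.

```python
# Speculative sketch of community vote-based down-ranking; Birdwatch's
# actual mechanism has not been disclosed.

def trust(upvotes: int, downvotes: int) -> float:
    """Smoothed vote ratio in (0, 1); 0.5 when there are no votes yet."""
    return (upvotes + 1) / (upvotes + downvotes + 2)

def rank_feed(posts: list[dict]) -> list[dict]:
    """Sort the feed so heavily down-voted posts fall to the bottom."""
    return sorted(posts, key=lambda p: trust(p["up"], p["down"]),
                  reverse=True)
```

A heavily down-voted post is ranked below a well-regarded one, which is the down-ranking effect described above; more robust variants would also weight voters by track record to resist the coordinated manipulation discussed next.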

The basic idea is similar to Wikipedia’s content contribution system[30], where volunteers classify whether trending posts are real or fake. The challenge is preventing people from up-voting interesting and compelling but unverified content, particularly when there are deliberate efforts to manipulate voting[31]. People can game the systems through coordinated action[32], as in the recent GameStop stock-pumping episode[33].

Another problem is how to motivate people to participate voluntarily in a collaborative effort such as crowdsourced fake news detection. Such efforts depend on volunteers annotating the accuracy of news articles[34], akin to Wikipedia, and also on the participation of third-party fact-checking organizations[35] that can help determine whether a piece of news is misleading.

However, a Wikipedia-style model needs robust mechanisms[36] of community governance[37] to ensure that individual volunteers follow consistent guidelines when they authenticate and fact-check posts. Wikipedia recently updated its community standards specifically to stem the spread of misinformation[38]. Whether the Big Tech companies will voluntarily allow their content moderation policies to be reviewed so transparently is another matter.

[Get our best science, health and technology stories. Sign up for The Conversation’s science newsletter[39].]

Big Tech’s responsibilities

Ultimately, social media companies could use a combination of deprioritizing engagement, partnering with news organizations, and AI and crowdsourced misinformation detection. These approaches are unlikely to work in isolation and will need to be designed to work together.

Coordinated actions facilitated by social media can disrupt society, from financial markets[40] to politics[41]. The technology platforms play an extraordinarily large role in shaping public opinion, which means they bear a responsibility to the public[42] to govern themselves effectively.

Calls for government regulation of Big Tech are growing all over the world, including in the U.S., where a recent Gallup poll showed worsening attitudes toward technology companies[43] and greater support for governmental regulation. Germany’s new laws on content moderation[44] push greater responsibility on tech companies for the content shared on their platforms. A slew of regulations in Europe[45] aimed at reducing the liability protections enjoyed by these platforms and proposed regulations in the U.S. aimed at restructuring internet laws[46] will bring greater scrutiny to tech companies’ content moderation policies.

Some form of government regulation is likely in the U.S. Big Tech still has an opportunity to engage in responsible self-regulation – before the companies are compelled to act by lawmakers.

References

  1. ^ raised concerns about the monopoly power of Big Tech companies (www.nytimes.com)
  2. ^ regulating themselves (www.economist.com)
  3. ^ Oversight Board will make a decision (time.com)
  4. ^ constant interaction (dx.doi.org)
  5. ^ the companies design the algorithms that choose which posts people see (www.fastcompany.com)
  6. ^ falsehoods spread faster than truth on social media (dx.doi.org)
  7. ^ news that triggers emotions to be more engaging (doi.org)
  8. ^ My own work (dx.doi.org)
  9. ^ also operate without the gatekeepers or filters (niemanreports.org)
  10. ^ fine-grained and detailed demographic data (theconversation.com)
  11. ^ “microtarget” small numbers of users (doi.org)
  12. ^ digital voter suppression (www.vox.com)
  13. ^ the targeting of minorities for disinformation (www.fastcompany.com)
  14. ^ discriminatory ad targeting (www.shrm.org)
  15. ^ rabbit hole” effect of social media (www.scientificamerican.com)
  16. ^ algorithmic design of Big Tech platforms (doi.org)
  17. ^ Apple CEO Tim Cook recently summed up the problem (www.reuters.com)
  18. ^ civic integrity policy (www.startribune.com)
  19. ^ recommended by their algorithms (www.npr.org)
  20. ^ applying labels to posts from state-controlled media outlets (misinforeview.hks.harvard.edu)
  21. ^ to label trustworthy posts (doi.org)
  22. ^ show that individuals with some exposure to news sources (dx.doi.org)
  23. ^ found that providing a reminder about the accuracy of a post (doi.org)
  24. ^ can be used to classify health care-related videos on YouTube (dx.doi.org)
  25. ^ combination of fact-checkers and similarity-detection algorithms (www.analyticsinsight.net)
  26. ^ detect duplications and close copies (ai.facebook.com)
  27. ^ launching a community forum, Birdwatch (www.nbcnews.com)
  28. ^ up votes or down votes (doi.org)
  29. ^ newsfeed algorithms to down-rank content (doi.org)
  30. ^ Wikipedia’s content contribution system (en.wikipedia.org)
  31. ^ deliberate efforts to manipulate voting (doi.org)
  32. ^ through coordinated action (www.vox.com)
  33. ^ GameStop stock-pumping episode (marker.medium.com)
  34. ^ annotating the accuracy of news articles (www.kdd.org)
  35. ^ third-party fact-checking organizations (www.microsoft.com)
  36. ^ robust mechanisms (doi.org)
  37. ^ community governance (ojs.aaai.org)
  38. ^ to stem the spread of misinformation (www.france24.com)
  39. ^ Sign up for The Conversation’s science newsletter (theconversation.com)
  40. ^ financial markets (www.politico.com)
  41. ^ politics (www.buzzfeednews.com)
  42. ^ responsibility to the public (theconversation.com)
  43. ^ worsening attitudes toward technology companies (news.gallup.com)
  44. ^ new laws on content moderation (www.wsj.com)
  45. ^ regulations in Europe (www.wsj.com)
  46. ^ proposed regulations in the U.S. aimed at restructuring internet laws (www.wsj.com)


Read more https://theconversation.com/if-big-tech-has-the-will-here-are-ways-research-shows-self-regulation-can-work-154248

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more
