
Livestreamed massacre means it's time to shut down Facebook Live

Written by Jennifer Grygiel, Assistant Professor of Communications (Social Media) & Magazine, Syracuse University

When word broke that the massacre in New Zealand was livestreamed on Facebook, I immediately thought of Robert Godwin Sr.[1] In 2017, Godwin was murdered in Cleveland, Ohio, and initial reports indicated that the attacker streamed it on Facebook Live, at the time a relatively new feature[2] of the social network. Facebook later clarified that the graphic video was uploaded after the event[3], but the incident called public attention to the risks of livestreaming violence.

In the wake of Godwin’s murder, I recommended that Facebook Live broadcasts be time-delayed[4], at least for Facebook users who had told the company they were under 18. That way, adult users would have an opportunity to flag inappropriate content before children were exposed to it. Facebook Live has broadcast killings[5], as well as other serious crimes such as sexual assault, torture and child abuse[6]. Though the company has hired more than 3,000 additional human content moderators[7], Facebook is no better at keeping horrifying violence from streaming live online, with no filter or warning for users.

In the 24 hours after the New Zealand massacre, 1.5 million videos and images of the killings[8] were uploaded to Facebook’s servers, the company announced. Facebook highlighted the fact that 1.2 million of them “were blocked at upload[9].” However, as a social media researcher and educator[10], I heard that as an admission that 300,000 videos and images of a mass murder passed through its automated systems and were visible on the platform.

The company recently issued some analytic details and noted that fewer than 200 people viewed[11] the livestream of the massacre, and that, surprisingly, no users reported it to Facebook until after it ended. These details make painfully clear how dependent Facebook is on users to flag harmful content. They also suggest that people don’t know how to report inappropriate content – or don’t have confidence the company will act on the complaint.

The video that remained after the livestream ended was viewed nearly 4,000 times – which doesn’t include copies of the video[12] uploaded to other sites and to Facebook by other users. It’s unclear how many of the people who saw it were minors; youth as young as 13 are allowed to set up Facebook accounts and could have encountered unfiltered footage of murderous hatred. It’s past time for the company to step up and fulfill the promise its founder and CEO, Mark Zuckerberg, made two years ago, after Godwin’s murder: “We will keep doing all we can to prevent tragedies like this from happening[13].”

Video: Facebook founder and CEO Mark Zuckerberg discusses the murder of Robert Godwin Sr.

A simple time-delay

In the television industry, short time-delays of a few seconds are typical[14] during broadcasts of live events. That time allows a moderator to review the content and confirm that it’s appropriate for a broad audience.

Facebook relies on users as moderators, and unlike a TV broadcast, a livestream may have only a small audience, so its delay would need to be longer, perhaps a few minutes. Only then would enough adult users have screened the content and had the chance to report it. Major users, including publishers and corporations, could be permitted to livestream directly after completing a training course. Facebook could even let people request a company moderator[15] for upcoming livestreams.
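To make the mechanics concrete, here is a minimal sketch of how such a review delay could work. It is purely illustrative, assuming a hypothetical delay and invented names (ReviewBuffer, DELAY_SECONDS); it is not a description of Facebook’s actual systems.

```python
# Minimal sketch of a time-delayed moderation buffer (illustrative only;
# all names and the delay value are assumptions, not Facebook's design).
import time
from collections import deque

DELAY_SECONDS = 180  # a hypothetical delay of a few minutes


class ReviewBuffer:
    """Holds live-video segments until the delay elapses, unless flagged."""

    def __init__(self, delay=DELAY_SECONDS):
        self.delay = delay
        self.queue = deque()   # (arrival_time, segment) pairs, oldest first
        self.halted = False    # set True once a screener flags the stream

    def ingest(self, segment):
        """Broadcaster side: segments arrive continuously during the stream."""
        if not self.halted:
            self.queue.append((time.monotonic(), segment))

    def flag(self):
        """Screener side: any report withholds all segments pending release."""
        self.halted = True
        self.queue.clear()

    def release_ready(self):
        """Platform side: emit only segments older than the delay window."""
        ready = []
        now = time.monotonic()
        while self.queue and now - self.queue[0][0] >= self.delay:
            ready.append(self.queue.popleft()[1])
        return ready


# Usage: a flag raised during the delay window keeps footage off the public feed.
buf = ReviewBuffer(delay=3)
buf.ingest("segment-1")
buf.flag()                        # a screener reports the stream
assert buf.release_ready() == []  # nothing is released to viewers
```

The point of the sketch is the ordering: every segment must survive the full delay window unflagged before the public can see it, so a report made within those few minutes stops the footage from ever reaching a wide audience.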

Facebook has not yet taken this relatively simple step – and the reason is clear. Time-delays took hold in TV only because broadcasting regulators penalized broadcasters[16] for airing inappropriate content during live shows. There is effectively no regulation[17] for social media companies; they change only in pursuit of profits[18] or to minimize public outcry[19].

Whether and how to regulate social media is a political question, but many U.S. politicians have developed deep ties with platforms like Facebook[20]. Some have relied on social media to collect donations, target supporters with advertising and help them get elected[21]. Once in office, they continue to use social media to communicate with supporters[22] in hopes of getting reelected.

Federal agencies also use social media to communicate with the public and influence people’s opinions[23] – even in violation of U.S. law[24]. In my view, Facebook’s role as a tool to gain, keep and spread political power makes politicians far less likely to rein it in.

US regulation isn’t coming soon

Congress has not yet taken any meaningful action to regulate social media companies. Despite strong statements from politicians and even calls for hearings about social media in response to the New Zealand attack[25], U.S. regulators aren’t likely to lead the way.

European Union officials[26] are handling much of the work, especially around privacy[27]. New Zealand’s government has stepped up, too, banning the livestream video[28] of the mosque massacre, meaning anyone who shares it could face up to NZ$10,000 in fines and 14 years in prison[29]. At least two people have already been arrested[30] for sharing it online[31].

Facebook could – and should – act now

Much of the discussion about regulating social media has considered using anti-trust and monopoly laws[32] to force enormous technology giants like Facebook to break up into smaller, separate companies. But if that happens at all, it will take a very long time – the breakup of AT&T took a decade[33], from the 1974 lawsuit to the 1984 launch of the “Baby Bell” companies.

In the interim, people will try to livestream many more dangerous and violent incidents. Facebook should evaluate its products’ potential for misuse[34] and discontinue them if the effects are harmful to society.

No child should ever see the sort of “raw and visceral content[35]” that has been produced on Facebook Live – including mass murder. Nor do I think adult users should have to witness such heinous acts, as studies have shown that viewing graphic violence carries health risks[36], such as post-traumatic stress.

That’s why I’m no longer recommending just a livestream delay for adolescent users – that was an appeal to protect children at a time when bigger platform changes seemed unlikely. But all people deserve better, safer social media. I’m now calling on Mark Zuckerberg to shut down Facebook Live in the interest of public health and safety. In my view, the feature should be restored only if the company can prove to the public – and to regulators – that its design is safer.

Handling livestreaming safely requires more than enough professional content moderators to manage the workload. Those workers also must have appropriate access to mental health support[37] and safe working environments, so that even Facebook employees and contractors are not unduly scarred by brutal violence posted online.

References

  1. ^ Robert Godwin Sr. (www.cnn.com)
  2. ^ relatively new feature (www.wsj.com)
  3. ^ graphic video was uploaded after the event (apnews.com)
  4. ^ Facebook Live broadcasts be time-delayed (www.cbsnews.com)
  5. ^ Facebook Live has broadcast killings (www.usatoday.com)
  6. ^ sexual assault, torture and child abuse (www.buzzfeednews.com)
  7. ^ 3,000 additional human content moderators (money.cnn.com)
  8. ^ 1.5 million videos and images of the killings (newsroom.fb.com)
  9. ^ were blocked at upload (www.cnbc.com)
  10. ^ social media researcher and educator (news.syr.edu)
  11. ^ fewer than 200 people viewed (arstechnica.com)
  12. ^ doesn’t include copies of the video (techcrunch.com)
  13. ^ We will keep doing all we can to prevent tragedies like this from happening (www.businessinsider.com)
  14. ^ short time-delays of a few seconds are typical (www.cnet.com)
  15. ^ request a company moderator (slate.com)
  16. ^ broadcasting regulators penalized broadcasters (us.cnn.com)
  17. ^ effectively no regulation (doi.org)
  18. ^ only in pursuit of profits (theconversation.com)
  19. ^ minimize public outcry (www.nytimes.com)
  20. ^ ties with platforms like Facebook (www.theverge.com)
  21. ^ help them get elected (motherboard.vice.com)
  22. ^ communicate with supporters (doi.org)
  23. ^ influence people’s opinions (www.nbcnews.com)
  24. ^ in violation of U.S. law (www.nytimes.com)
  25. ^ in response to the New Zealand attack (www.courant.com)
  26. ^ European Union officials (www.theverge.com)
  27. ^ around privacy (theconversation.com)
  28. ^ banning the livestream video (www.newshub.co.nz)
  29. ^ NZ$10,000 in fines and 14 years in prison (www.newshub.co.nz)
  30. ^ arrested (gizmodo.com)
  31. ^ for sharing it online (thehill.com)
  32. ^ using anti-trust and monopoly laws (www.theverge.com)
  33. ^ breaking up AT&T lasted a decade (articles.latimes.com)
  34. ^ evaluate its products’ potential for misuse (newrepublic.com)
  35. ^ raw and visceral content (www.buzzfeednews.com)
  36. ^ studies have shown that viewing graphic violence has health risks (www.philly.com)
  37. ^ appropriate access to mental health support (www.theverge.com)

Author: Jennifer Grygiel, Assistant Professor of Communications (Social Media) & Magazine, Syracuse University

Read more http://theconversation.com/livestreamed-massacre-means-its-time-to-shut-down-facebook-live-113830

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more