
  • Written by Andrew Guthrie Ferguson, Professor of Law, American University

Video of police[1] in riot gear clashing with unarmed protesters in the wake of the killing of George Floyd by Minneapolis police officer Derek Chauvin has filled social media feeds. Meanwhile, police surveillance of protesters[2] has remained largely out of sight.

Local, state and federal law enforcement organizations use an array of surveillance technologies to identify and track protesters, from facial recognition[3] to military-grade drones[4].

Police use of these national security-style surveillance techniques – justified as cost-effective techniques that avoid human bias and error – has grown[5] hand-in-hand with the increased militarization of law enforcement. Extensive research, including my own[6], has shown that these expansive and powerful surveillance capabilities have exacerbated rather than reduced bias[7], overreach and abuse[8] in policing, and they pose a growing threat to civil liberties[9].

Police reform efforts are increasingly looking at law enforcement organizations’ use of surveillance technologies. In the wake of the current unrest, IBM[10], Amazon[11] and Microsoft[12] have put the brakes on police use of the companies’ facial recognition technology. And police reform bills submitted by the Democrats in the U.S. House of Representatives call for regulating police use of facial recognition[13] systems.

A decade of big data policing

We haven’t always lived in a world of police cameras, smart sensors and predictive analytics. Recession and rage fueled the initial rise of big data policing[14] technologies. In 2009, in the face of federal, state and local budget cuts[15] caused by the Great Recession, police departments began looking for ways to do more with less. Technology companies rushed to fill the gaps, offering new forms of data-driven policing[16] as models of efficiency and cost reduction.

Then, in 2014, the police killing of Michael Brown in Ferguson, Missouri, upended already fraying police and community relationships. The killings of Michael Brown, Eric Garner, Philando Castile, Tamir Rice, Walter Scott, Sandra Bland, Freddie Gray and George Floyd all sparked nationwide protests and calls for racial justice[17] and police reform[18]. Policing was driven into crisis mode as community outrage threatened to delegitimize the existing police power structure.

In response to the twin threats of cost pressures and community criticism, police departments further embraced startup technology companies[19] selling big data efficiencies and the hope that something “data-driven” would allow communities to move beyond the all-too-human problems of policing. Predictive analytics and bodycam video capabilities were sold as objective solutions[20] to racial bias. In large measure, the public relations strategy worked, which has allowed law enforcement to embrace predictive policing[21] and increased digital surveillance[22].

A law enforcement drone flew over demonstrators, Friday, June 5, 2020, in Atlanta. AP Photo/Mike Stewart[23]

Today, in the midst of renewed outrage against structural racism and police brutality, and in the shadow of an even deeper economic recession, law enforcement organizations face the same temptation to adopt a technology-based solution to deep societal problems. Police chiefs are likely to want to turn the page from the current levels of community anger and distrust.

The dangers of high-tech surveillance

Instead of repeating the mistakes of the past 12 years or so, communities have an opportunity to reject the expansion of big data policing. The dangers have only increased, the harms made plain by experience.

Those small startup companies that initially rushed into the policing business have been replaced by big technology companies[24] with deep pockets and big ambitions.

Axon[25] capitalized on the demands for police accountability after the protests in Ferguson and Baltimore to become a multimillion-dollar company providing digital services for police-worn body cameras. Amazon has been expanding partnerships[26] with hundreds of police departments through its Ring cameras[27] and Neighbors App[28]. Other companies like BriefCam, Palantir and ShotSpotter offer a host of video analytics[29], social network analysis[30] and other sensor technologies[31], selling the technology cheaply in the short run in hopes of long-term market advantage.

The technology itself is more powerful. The algorithmic models created a decade ago pale in comparison to machine learning capabilities today. Video camera streams have been digitized and augmented with analytics[32] and facial recognition[33] capabilities, turning static surveillance into a virtual time machine[34] to find patterns in crowds. Adding to the data trove are smartphones[35], smart homes[36] and smart cars[37], which now allow police to uncover individuals’ digital trails with relative ease.

Researchers have been working to overcome widespread racial bias in facial recognition. IBM Research/Flickr, CC BY-ND[38][39]

The technology is more interconnected. One of the natural limiting factors of first generation big data policing technology was the fact that it remained siloed. Databases could not communicate with one another. Data could not be easily shared. That limiting factor has shrunk as more aggregated data systems have been developed within government[40] and by private vendors[41].

The promise of objective, unbiased technology didn’t pan out. Race bias in policing[42] was not fixed by turning on a camera. Instead, the technology created new problems, including highlighting the lack of accountability[43] for high-profile instances of police violence.

Lessons for reining in police spying

The harms of big data policing have been repeatedly exposed. Programs that attempted to predict individuals’ behaviors in Chicago[44] and Los Angeles[45] were shut down after devastating audits[46] cataloged their discriminatory impact and practical failure. Place-based predictive systems[47] have likewise been abandoned[48] in Los Angeles and other cities that initially adopted the technology. Scandals involving facial recognition[49], social network analysis technology[50] and large-scale sensor surveillance[51] serve as a warning that technology cannot address the deeper issues of race, power and privacy that lie at the heart of modern-day policing.

The lesson of the first era[52] of big data policing is that issues of race, transparency and constitutional rights must be at the forefront of design, regulation and use. Every mistake[53] can be traced to a failure to see how the surveillance technology fits within the context of modern police power – a context that includes longstanding issues of racism and social control. Every solution points to addressing that power imbalance at the front end[54], through local oversight, community engagement[55] and federal law[56], not after the technology has been adopted.

The debates about defunding[57], demilitarizing[58] and reimagining[59] existing law enforcement practices must include a discussion about police surveillance. There is a decade of missteps to learn from and era-defining privacy and racial justice challenges ahead. How police departments respond to the siren call of big data surveillance[60] will reveal whether they’re on course to repeat the same mistakes.


References

  1. ^ Video of police (time.com)
  2. ^ police surveillance of protesters (www.marketplace.org)
  3. ^ facial recognition (www.google.com)
  4. ^ military-grade drones (www.vox.com)
  5. ^ has grown (theintercept.com)
  6. ^ including my own (nyupress.org)
  7. ^ bias (www.axios.com)
  8. ^ overreach and abuse (www.theatlantic.com)
  9. ^ threat to civil liberties (www.pbs.org)
  10. ^ IBM (www.technologyreview.com)
  11. ^ Amazon (www.wired.com)
  12. ^ Microsoft (www.nbcnews.com)
  13. ^ regulating police use of facial recognition (www.protocol.com)
  14. ^ big data policing (thecrimereport.org)
  15. ^ budget cuts (www.ncdsv.org)
  16. ^ new forms of data-driven policing (www.siliconvalley.com)
  17. ^ racial justice (www.cbsnews.com)
  18. ^ police reform (www.theatlantic.com)
  19. ^ startup technology companies (www.smithsonianmag.com)
  20. ^ objective solutions (www.governing.com)
  21. ^ predictive policing (time.com)
  22. ^ digital surveillance (theintercept.com)
  23. ^ AP Photo/Mike Stewart (www.apimages.com)
  24. ^ big technology companies (www.nytimes.com)
  25. ^ Axon (theappeal.org)
  26. ^ Amazon has been expanding partnerships (www.nbcnews.com)
  27. ^ Ring cameras (www.washingtonpost.com)
  28. ^ Neighbors App (www.vice.com)
  29. ^ video analytics (www.vice.com)
  30. ^ social network analysis (www.wired.com)
  31. ^ sensor technologies (www.govtech.com)
  32. ^ analytics (slate.com)
  33. ^ facial recognition (www.nbcnews.com)
  34. ^ virtual time machine (scholarship.law.upenn.edu)
  35. ^ smartphones (www.nytimes.com)
  36. ^ smart homes (papers.ssrn.com)
  37. ^ smart cars (www.washingtonpost.com)
  38. ^ IBM Research/Flickr (www.flickr.com)
  39. ^ CC BY-ND (creativecommons.org)
  40. ^ within government (www.fbi.gov)
  41. ^ by private vendors (www.thetrace.org)
  42. ^ Race bias in policing (www.propublica.org)
  43. ^ lack of accountability (www.vox.com)
  44. ^ Chicago (www.chicagotribune.com)
  45. ^ Los Angeles (www.courthousenews.com)
  46. ^ audits (www.cnn.com)
  47. ^ Place-based predictive systems (www.theatlantic.com)
  48. ^ shut down (www.buzzfeednews.com)
  49. ^ facial recognition (www.nytimes.com)
  50. ^ social network analysis technology (www.nola.com)
  51. ^ sensor surveillance (www.citylab.com)
  52. ^ lesson of the first era (www.economist.com)
  53. ^ mistake (papers.ssrn.com)
  54. ^ the front end (www.thealiadviser.org)
  55. ^ community engagement (www.measureaustin.org)
  56. ^ federal law (www.vice.com)
  57. ^ defunding (www.thenation.com)
  58. ^ demilitarizing (www.nytimes.com)
  59. ^ reimagining (www.theatlantic.com)
  60. ^ big data surveillance (www.asanet.org)
  61. ^ Sign up for The Conversation’s science newsletter (theconversation.com)


Read more https://theconversation.com/high-tech-surveillance-amplifies-police-bias-and-overreach-140225

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more