Metropolitan Digital


  • Written by Anne Toomey McKenna, Affiliated Faculty Member, Institute for Computational and Data Sciences, Penn State

On a Saturday morning, you head to the hardware store. Your neighbors’ Ring cameras film[1] your walk to the car. Your car’s sensors, cameras and microphones record[2] your speed, how you drive, where you’re going, who’s with you, what you say, and biological metrics such as facial expression, weight and heart rate. Your car may also collect text messages and contacts from your connected smartphone.

Meanwhile, your phone continuously senses[3] and records your communications, info about your health, what apps you’re using, and tracks your location[4] via cell towers, GPS satellites and Wi-Fi and Bluetooth.

As you enter the store, its surveillance cameras[5] identify your face and track your movements through the aisles. If you then use Apple or Google Pay to make your purchase, your phone tracks what you bought and how much you paid.

All this data quickly becomes commercially available[6], bought and sold by data brokers. Aggregated and analyzed by artificial intelligence, the data reveals detailed, sensitive information about you that can be used to predict and manipulate your behavior[7], including what you buy, feel, think and do[8].

Companies unilaterally collect data from most of your activities. This “surveillance capitalism[9]” is often unrelated to the services device manufacturers, apps and stores are providing you. For example, Tinder is planning to use AI to scan[10] your entire camera roll. And despite their promises, “opting out” doesn’t actually stop[11] companies’ data collection.

While companies can manipulate you, they cannot put you in jail. But the U.S. government can[12], and it now purchases massive quantities of your information[13] from commercial data brokers. The government is able to purchase Americans’ sensitive data because the information it buys is not subject to the same restrictions[14] as information it collects directly[15].

The federal government is also ramping up its abilities to directly collect data through partnerships with private tech companies. These surveillance tech partnerships are becoming entrenched[16], domestically and abroad, as advances in AI take surveillance to unprecedented levels[17].

As a privacy, electronic surveillance and tech law attorney, author and legal educator[18], I have spent years researching, writing and advising about privacy and legal issues related to surveillance and data use. To understand the issues, it is critical to know how these technologies function, who collects what data about you, how that data can be used against you, and why the laws you might think are protecting your data do not apply or are ignored.

Store security cameras can be used to collect demographic and location data that is sold on the commercial market. Sebastian Willnow/picture alliance via Getty Images[19]

Big money for AI-driven tech and more data

Congressional funding is supercharging[20] huge government investments in surveillance tech and data analytics driven by AI, which automates analysis of very large amounts of data. The massive 2025 tax-and-spending law[21] netted the Department of Homeland Security an unprecedented US$165 billion[22] in yearly funding. Immigration and Customs Enforcement, part of DHS, got about $86 billion[23].

Disclosure of documents allegedly hacked from Homeland Security[24] reveals a massive surveillance web[25] that has all Americans in its scope.

DHS is expanding its AI surveillance capabilities[26] with a surge in contracts to private companies. It is reportedly funding companies that provide[27] more AI-automated surveillance in airports; adapters to convert agents’ phones into biometric scanners; and an AI platform that acquires all 911 call center data to build geospatial heat maps to predict incident trends[28]. Predicting incident trends can be a form of predictive policing[29], which uses data to anticipate where, when and how crime may occur.

DHS has also spent millions on AI-driven software used to detect sentiment and emotion[30] in users’ online posts. Have you been complaining about Immigration and Customs Enforcement policies online? If so, social media companies including Google, Reddit, Discord, and Facebook and Instagram owner Meta may have sent identifying data, such as your name, email address, phone number and activity, to DHS in response to hundreds of DHS subpoenas[31] served on the companies.

Meanwhile, the Trump administration’s national policy framework for artificial intelligence[32], released on March 20, 2026, urges Congress to use grants and tax incentives to fund “wider deployment of AI tools across American industry” and to allow industry and academia to use federal datasets to train AI.

Using federal datasets[33] this way raises privacy law[34] concerns because they contain a lifetime of sensitive details[35] about you, including biographical, employment and tax[36] information.

Blurring lines and little oversight

In foreign intelligence work, the funding, development and controlled use of certain AI-driven data gathering makes sense. The CIA’s new acquisition framework[37] to turbocharge collaboration with the private sector may be legal with proper oversight. But the line between collaborating for lawful national security purposes and unlawful domestic spying[38] is becoming dangerously blurred or ignored.

For example, the Pentagon has declared a contractor, Anthropic[39], a national security risk[40] because Anthropic insisted that its powerful agentic AI model, Claude, not be used for[41] mass domestic surveillance of Americans or fully autonomous weapons.

On March 18, 2026, FBI Director Kash Patel confirmed to Congress that the FBI is buying Americans’ data from data brokers[42], including location histories, to track American citizens.

As the federal government accelerates the use of and investment in AI-driven spy tech, it is mandating less oversight around AI technology. In addition to the national AI policy framework, which discourages state regulation of AI, the president has issued executive orders to accelerate federal government adoption of AI systems[43], remove state law AI regulation barriers[44] and require that the federal government not procure the use of AI models that attempt to adjust for bias[45]. But using advanced AI systems is risky, given reports of AI agents going rogue[46], exposing sensitive data and becoming a threat[47], even during routine tasks.

Your data

The surveillance capitalism system requires people to unwittingly participate in a manipulative cycle[48] of group- and self-surveillance. Neighborhood doorbell cameras[49], Flock license plate readers[50] and hyperlocal social media sites like Nextdoor create a crowdsourced record of all people’s movements in public spaces[51].

Flock cameras, which take pictures of license plates as cars drive by, are used to collect and sell data to third parties – including the U.S. government. Justin Sullivan via Getty Images[52]

Sensors in phones and wearable devices, such as earbuds and rings, collect ever more sensitive details[53]. These include[54] health data such as your heart rate and heart rate variability, blood oxygen, sweat and stress levels, behavioral patterns, neurological changes and even brain waves[55]. Smartphones can be used to diagnose, assess and treat Parkinson’s disease[56]. Earbuds could be used to monitor brain health[57].

This data is not protected under HIPAA[58], the law that prohibits health care providers and those working with them from disclosing your health information without your permission, because HIPAA does not consider tech companies to be health care providers or these wearables to be medical devices.

Legal protections

People have little choice when buying devices, using apps or opening accounts but to agree to lengthy terms that include consent for companies to collect and sell[59] their personal data. This “consent” allows their data to end up in the largely unregulated[60] commercial data market.

The government claims it can lawfully[61] purchase this data from data brokers. But in buying your data in bulk on the commercial market, the government is circumventing the Constitution[62], Supreme Court decisions and federal laws[63] designed to protect your privacy from unwarranted government overreach.

The Fourth Amendment[64] prohibits unreasonable search and seizure by the government. Supreme Court cases require police to get a warrant to search a phone[65] or use cellular[66] or GPS location information to track[67] someone. The Electronic Communications Privacy Act[68]’s Wiretap Act prohibits unauthorized interception of wire, oral and electronic communications.

Despite some efforts, Congress has failed to enact legislation to protect data privacy[69], to regulate the use of sensitive data by AI systems[70] or to restore the intent of the Electronic Communications Privacy Act. Courts have allowed the broad electronic privacy protections in the federal Wiretap Act[71] to be eviscerated by companies claiming consent[72].

In my opinion, the way to begin to address these problems is to restore the Wiretap Act and related laws to their intended purposes of protecting Americans’ privacy in communications, and for Congress to follow through on its promises and efforts[73] by passing legislation that secures Americans’ data privacy and protects them from AI harms.

This article is part of a series on data privacy[74] that explores who collects your data, what and how they collect, who sells and buys your data, what they all do with it, and what you can do about it.

References

  1. ^ neighbors’ Ring cameras film (www.uclalawreview.org)
  2. ^ sensors, cameras and microphones record (natlawreview.com)
  3. ^ continuously senses (papers.ssrn.com)
  4. ^ tracks your location (doi.org)
  5. ^ its surveillance cameras (www.homedepot.com)
  6. ^ becomes commercially available (www.intelligence.gov)
  7. ^ predict and manipulate your behavior (doi.org)
  8. ^ what you buy, feel, think and do (doi.org)
  9. ^ surveillance capitalism (www.hbs.edu)
  10. ^ Tinder is planning to use AI to scan (www.404media.co)
  11. ^ opting out” doesn’t actually stop (www.404media.co)
  12. ^ government can (nyupress.org)
  13. ^ purchases massive quantities of your information (www.cato.org)
  14. ^ not subject to the same restrictions (www.washingtonpost.com)
  15. ^ information it collects directly (www.theguardian.com)
  16. ^ are becoming entrenched (www.theguardian.com)
  17. ^ unprecedented levels (dx.doi.org)
  18. ^ attorney, author and legal educator (www.annetoomeymckenna.com)
  19. ^ Sebastian Willnow/picture alliance via Getty Images (www.gettyimages.com)
  20. ^ is supercharging (fedscoop.com)
  21. ^ massive 2025 tax-and-spending law (www.congress.gov)
  22. ^ unprecedented US$165 billion (www.dhs.gov)
  23. ^ $86 billion (www.npr.org)
  24. ^ allegedly hacked from Homeland Security (techcrunch.com)
  25. ^ massive surveillance web (www.npr.org)
  26. ^ expanding its AI surveillance capabilities (fedscoop.com)
  27. ^ funding companies that provide (www.theguardian.com)
  28. ^ predict incident trends (www.theguardian.com)
  29. ^ can be a form of predictive policing (www.brennancenter.org)
  30. ^ used to detect sentiment and emotion (www.404media.co)
  31. ^ DHS subpoenas (www.nytimes.com)
  32. ^ national policy framework for artificial intelligence (www.whitehouse.gov)
  33. ^ federal datasets (catalog.data.gov)
  34. ^ privacy law (www.justice.gov)
  35. ^ lifetime of sensitive details (epic.org)
  36. ^ including biographical, employment and tax (www.nytimes.com)
  37. ^ new acquisition framework (www.cia.gov)
  38. ^ versus unlawful domestic spying (www.americanbar.org)
  39. ^ Anthropic (www.anthropic.com)
  40. ^ national security risk (www.nbcnews.com)
  41. ^ not be used for (www.anthropic.com)
  42. ^ buying Americans’ data from data brokers (techcrunch.com)
  43. ^ accelerate federal government adoption of AI systems (trumpwhitehouse.archives.gov)
  44. ^ remove state law AI regulation barriers (www.whitehouse.gov)
  45. ^ AI models that attempt to adjust for bias (www.whitehouse.gov)
  46. ^ AI agents going rogue (techcrunch.com)
  47. ^ becoming a threat (www.irregular.com)
  48. ^ manipulative cycle (nyupress.org)
  49. ^ doorbell cameras (ring.com)
  50. ^ Flock license plate readers (www.404media.co)
  51. ^ record of all people’s movements in public spaces (www.uclalawreview.org)
  52. ^ Justin Sullivan via Getty Images (www.gettyimages.com)
  53. ^ ever more sensitive details (doi.org)
  54. ^ These include (iapp.org)
  55. ^ brain waves (www.diagnosticsworldnews.com)
  56. ^ diagnose, assess and treat Parkinson’s disease (doi.org)
  57. ^ monitor brain health (eng.unimelb.edu.au)
  58. ^ protected under HIPAA (www.law.cornell.edu)
  59. ^ consent for companies to collect and sell (www.gsulawreview.org)
  60. ^ largely unregulated (epic.org)
  61. ^ government claims it can lawfully (www.theguardian.com)
  62. ^ circumventing the Constitution (www.aclu.org)
  63. ^ decisions and federal laws (www.pennstatelawreview.org)
  64. ^ Fourth Amendment (www.law.cornell.edu)
  65. ^ search a phone (supreme.justia.com)
  66. ^ cellular (www.supremecourt.gov)
  67. ^ GPS location information to track (epic.org)
  68. ^ Electronic Communications Privacy Act (www.law.cornell.edu)
  69. ^ protect data privacy (iapp.org)
  70. ^ use of sensitive data by AI systems (www.theguardian.com)
  71. ^ Wiretap Act (www.law.cornell.edu)
  72. ^ eviscerated by companies claiming consent (kleinmoynihan.com)
  73. ^ promises and efforts (lofgren.house.gov)
  74. ^ series on data privacy (theconversation.com)


Read more https://theconversation.com/us-government-ramps-up-mass-surveillance-with-help-of-ai-tech-data-brokers-and-your-apps-and-devices-277440