Data insecurity leads to economic injustice – and hits the pocketbooks of the poor most
- Written by Michele Gilman, Venable Professor of Law, University of Baltimore
Congress may finally be on the verge of passing a comprehensive federal privacy law after almost a half-century of trying[1]. Even the tech lobby[2] is on board[3] following years of resistance.
The growing bipartisan support[4] for privacy legislation seems to be responding to the public “techlash”[5] against a drumbeat of data breaches and social media misinformation campaigns. It also appears aimed at preventing a patchwork of state laws after California passed its own privacy legislation[6] in 2018.
While the time is right to enact a new law, what you may not realize is that data privacy is actually an important economic justice issue. As a clinical law professor[7] representing low-income people for the last 20 years, I have seen how one’s digital privacy experience varies[8] depending on social class.
And poorer Americans are among those who have the most at risk.
Data targeting
Take data brokers[9], which are companies that sell personal data collected from sources[10] such as public records, internet browsing activity, social media posts, emails, app usage and retail loyalty cards.
This industry is one reason why you are barraged with online ads for a product you may have glanced at only briefly. For most of us, this is simply an annoying fact of life. For low-income people, the harms extend beyond this shared sense of creepiness.
For example, the digital dossiers assembled by data brokers are used to target low-income Americans[11] for predatory products such as payday loans, high-interest mortgages and for-profit educational scams. These brokers segment consumers[12] into highly specific categories, such as “rural and barely making it” and “credit crunched: city families.”
While a slew of lawsuits pushed Facebook[13] to stop allowing its advertisers to target groups based on gender, race, zip code and age, advertisers can continue to discriminate against people simply because they are poor. Poverty is not a protected category[14] under our civil rights laws or the Constitution[15].
Meanwhile, police are using big data to predict criminal activity, particularly in low-income and minority neighborhoods. The problem is this creates a vicious cycle[16] in which communities that are already heavily policed trigger predictive software[17] that urges more aggressive policing.
(AP Photo/Seth Perlman[18])

Data exclusion
Targeting is not the only problem. Big data can also exclude people[19] living in material poverty from opportunities that would foster their economic stability.
Employers are using applicant tracking systems[20] to predict whether potential employees will perform on the job. Colleges[21] are using algorithms to determine which prospective students are likely to persist through graduation. Landlords[22] are scouring credit reports to predict whether prospective tenants will pay the rent.
And while these can be legitimate objectives, society puts too much faith[23] in the algorithms used to predict human behavior. Computer outputs may have the veneer of objectivity, but human beings impart their own conscious and implicit biases[24] into the software that fuels these predictions. This can reinforce longstanding prejudices[25].
In addition, much of the data fed into algorithms is erroneous[26]. Since these algorithms increasingly include information[27] pulled from social networks, you could be judged on the posts and conduct of your “friends.”
A lack of transparency[28] means that people never learn why they are denied a job, a home or an education. Mechanisms to correct faulty data either do not exist or are so Kafkaesque[29] that people give up in frustration.
Not surprisingly, then, in states that rely on algorithms to assess eligibility for public benefits such as Medicaid, thousands of qualified people have been kicked out of programs[30], imperiling their health and costing lives.
Automated decision-making strips social service delivery of needed nuance.
(Reuters/Brian Snyder[31])

Data security
Data security is another area of concern for low-income Americans.
Recently, researchers found a database online[32] containing identifying information on 80 million American households. This follows years of data breaches[33] that have put everyone’s data at risk of identity theft.
While always a nightmare, such breaches can be especially devastating[34] for people living on the financial edge. They generally can’t afford the costly and complicated measures needed to clean their credit after someone else steals their identity. Economic losses resulting from a breach can push low-income people over a financial cliff.
Furthermore, identity theft can result[35] in low-income people facing wrongful arrests, utility cut-offs or aggressive debt collection tactics. Unsurprisingly, low-income people report lower confidence[36] in their ability to protect their data.
Data privacy gaps
All these harms are in part because the U.S. still lacks an overarching privacy law.
Although all 50 states now require companies[37] to notify consumers about data breaches, California is the only state to pass a comprehensive privacy law governing how data is collected and used. However, multiple states are considering similar legislation[38].
There are a few federal sectoral laws[39] that protect certain pieces of Americans’ financial and health information. But mostly, a notice-and-consent regime[40] puts the onus on individuals to safeguard their own online privacy.
Do you actually read those lengthy notices[41] that flash before you when you log on to a new website? Companies count on the fact[42] that you probably do not.
For its part, the overburdened[43] Federal Trade Commission has tried to push companies to improve their data security. But its resources and enforcement power are limited[44] under current law.
Lessons from Europe
Lawmakers working on a federal privacy law should look to Europe for inspiration.
About a year ago, the European Union began implementing the General Data Protection Regulation, which gives its citizens a bevy of rights to control their data[45]. It also includes provisions that could serve the particular data privacy needs of low-income people.
For instance, the GDPR prohibits certain kinds of automated profiling[46]. This could put the brakes on profiling that limits people’s access to jobs, housing and other life necessities for illegitimate reasons. The law also gives people a right to an explanation[47] about automated decision-making, which could open the current “black box” to help people understand and challenge denials of goods and services.
The law includes a right to be forgotten[48], which requires that personal data be erased when it is no longer needed for its original purpose or when a person asks for it to be scrubbed. Fundamentally, it means people can get a clean data slate as their financial condition improves.
And to top it off, the law has a meaningful enforcement regime[49] and requires public participation[50] in the data policies set by large companies.
In the United States, I believe the time is right to adopt similar provisions to enhance Americans’ control over their personal data. Data privacy is an issue of economic justice, and Congress should legislate accordingly.
References
- ^ almost a half-century of trying (www.economist.com)
- ^ tech lobby (www.rollcall.com)
- ^ on board (www.cnbc.com)
- ^ growing bipartisan support (www.lawfareblog.com)
- ^ “techlash” (www.wsj.com)
- ^ California passed its own privacy legislation (iapp.org)
- ^ clinical law professor (papers.ssrn.com)
- ^ experience varies (www.nytimes.com)
- ^ data brokers (theconversation.com)
- ^ sources (theconversation.com)
- ^ target low-income Americans (bigdata.fairness.io)
- ^ segment consumers (www.businessinsider.com)
- ^ pushed Facebook (www.vox.com)
- ^ not a protected category (virginialawreview.org)
- ^ Constitution (scholarship.law.duke.edu)
- ^ creates a vicious cycle (motherboard.vice.com)
- ^ trigger predictive software (www.economist.com)
- ^ AP Photo/Seth Perlman (www.apimages.com)
- ^ exclude people (openscholarship.wustl.edu)
- ^ applicant tracking systems (www.fastcompany.com)
- ^ Colleges (slate.com)
- ^ Landlords (www.theverge.com)
- ^ puts too much faith (hbr.org)
- ^ impart their own conscious and implicit biases (medium.com)
- ^ reinforce longstanding prejudices (www.nytimes.com)
- ^ erroneous (theconversation.com)
- ^ increasingly include information (papers.ssrn.com)
- ^ lack of transparency (slate.com)
- ^ Kafkaesque (newrepublic.com)
- ^ kicked out of programs (www.theguardian.com)
- ^ Reuters/Brian Snyder (pictures.reuters.com)
- ^ found a database online (fortune.com)
- ^ follows years of data breaches (www.usatoday.com)
- ^ especially devastating (www.americanbar.org)
- ^ can result (www.americanbar.org)
- ^ people report lower confidence (datasociety.net)
- ^ all 50 states now require companies (www.ncsl.org)
- ^ are considering similar legislation (iapp.org)
- ^ sectoral laws (teachprivacy.com)
- ^ notice and consent regime (theconversation.com)
- ^ read those lengthy notices (theconversation.com)
- ^ count on the fact (www.theatlantic.com)
- ^ overburdened (www.propublica.org)
- ^ are limited (news.bloomberglaw.com)
- ^ rights to control their data (gdpr-info.eu)
- ^ automated profiling (www.lexology.com)
- ^ right to an explanation (papers.ssrn.com)
- ^ right to be forgotten (gdpr-info.eu)
- ^ enforcement regime (www.gdpreu.org)
- ^ public participation (gdpr-info.eu)