
Written by Michele Gilman, Venable Professor of Law, University of Baltimore

Congress may finally be on the verge of passing a comprehensive federal privacy law after almost a half-century of trying[1]. Even the tech lobby[2] is on board[3] following years of resistance.

The growing bipartisan support[4] for privacy legislation seems to be responding to the public “techlash”[5] against a drumbeat of data breaches and social media misinformation campaigns. It also appears aimed at preventing a patchwork of state laws after California passed its own privacy legislation[6] in 2018.

While the time is right to enact a new law, what you may not realize is that data privacy is actually an important economic justice issue. As a clinical law professor[7] representing low-income people for the last 20 years, I have seen how one’s digital privacy experience varies[8] depending on social class.

And poorer Americans are among those who have the most at risk.

Data targeting

Take data brokers[9], which are companies that sell personal data collected from sources[10] such as public records, internet browsing activity, social media posts, emails, app usage and retail loyalty cards.

This industry is one reason why you are barraged with online ads for a product you may have glanced at only briefly. For most of us, this is simply an annoying fact of life. For low-income people, the harms extend beyond this shared sense of creepiness.

For example, the digital dossiers assembled by data brokers are used to target low-income Americans[11] for predatory products such as payday loans, high-interest mortgages and for-profit educational scams. These brokers segment consumers[12] into highly specific categories, such as “rural and barely making it” and “credit crunched: city families.”

While a slew of lawsuits pushed Facebook[13] to stop allowing its advertisers to target groups based on gender, race, zip code and age, advertisers can continue to discriminate against people simply because they are poor. Poverty is not a protected category[14] under our civil rights laws or the Constitution[15].

Meanwhile, police are using big data to predict criminal activity, particularly in low-income and minority neighborhoods. The problem is that this creates a vicious cycle[16] in which communities that are already heavily policed trigger predictive software[17] that urges still more aggressive policing.

Image: Digital dossiers are used to target low-income Americans with high-interest payday loans. AP Photo/Seth Perlman[18]

Data exclusion

Targeting is not the only problem. Big data can also exclude people[19] living in material poverty from opportunities that would foster their economic stability.

Employers are using applicant tracking systems[20] to predict whether potential employees will perform on the job. Colleges[21] are using algorithms to predict which prospective students are likely to stay enrolled through graduation. Landlords[22] are scouring credit reports to predict whether prospective tenants will pay the rent.

And while these can be legitimate objectives, society puts too much faith[23] in the algorithms used to predict human behavior. Computer outputs may have the veneer of objectivity, but human beings build their own conscious and implicit biases[24] into the software that fuels these predictions. This can reinforce longstanding prejudices[25].

In addition, much of the data fed into algorithms is erroneous[26]. Since these algorithms increasingly include information[27] pulled from social networks, you could be judged on the posts and conduct of your “friends.”

A lack of transparency[28] means that people never learn why they are denied a job, a home or an education. Mechanisms to correct faulty data either do not exist or are so Kafkaesque[29] that people give up in frustration.

Not surprisingly, then, in states that rely on algorithms to assess eligibility for public benefits such as Medicaid, thousands of qualified people have been kicked out of programs[30], imperiling their health and costing lives.

Automated decision-making strips social service delivery of needed nuance.

Image: Algorithms that help colleges figure out which prospective students will graduate might contain implicit biases that harm low-income applicants. Reuters/Brian Snyder[31]

Data security

Data security is another area of concern for low-income Americans.

Recently, researchers found a database online[32] containing identifying information on 80 million American households. This follows years of data breaches[33] that have put everyone’s data at risk of identity theft.

While always a nightmare, such breaches can be especially devastating[34] for people living on the financial edge. They generally can’t afford the costly and complicated measures needed to clean their credit after someone else steals their identity. Economic losses resulting from a breach can push low-income people over a financial cliff.

Furthermore, identity theft can result[35] in low-income people facing wrongful arrests, utility cut-offs or aggressive debt collection tactics. Unsurprisingly, low-income people report lower confidence[36] in their ability to protect their data.

Data privacy gaps

All these harms persist in part because the U.S. still lacks an overarching privacy law.

Although all 50 states now require companies[37] to notify consumers about data breaches, California is the only state to pass a comprehensive privacy law governing how data is collected and used. However, multiple states are considering similar legislation[38].

There are a few federal sectoral laws[39] that protect certain pieces of Americans’ financial and health information. But for the most part, a notice-and-consent regime[40] puts the onus on individuals to safeguard their own online privacy.

Do you actually read those lengthy notices[41] that flash before you when you log on to a new website? Companies count on the fact[42] that you probably do not.

For its part, the overburdened[43] Federal Trade Commission has tried to push companies to improve their data security. But its resources and enforcement power are limited[44] under current law.

Lessons from Europe

Lawmakers working on a federal privacy law should look to Europe for inspiration.

About a year ago, the European Union began implementing the General Data Protection Regulation, which gives its citizens a bevy of rights to control their data[45]. In particular, it includes provisions that could address the data privacy needs of low-income people.

For instance, the GDPR prohibits certain kinds of automated profiling[46]. This could put the brakes on profiling that limits people’s access to jobs, housing and other life necessities for illegitimate reasons. The law also gives people a right to an explanation[47] about automated decision-making, which could open the current “black box” to help people understand and challenge denials of goods and services.

The law includes a right to be forgotten[48], which requires that personal data be erased when it’s no longer needed for the original purpose or when a person asks for it to be scrubbed. Fundamentally, it means people can get a clean data slate as their financial condition improves.

And to top it off, the law has a meaningful enforcement regime[49] and requires public participation[50] in the data policies set by large companies.

In the United States, I believe the time is right to adopt similar provisions to enhance Americans’ control over their personal data. Data privacy is an issue of economic justice, and Congress should legislate accordingly.

References

  1. ^ almost a half-century of trying (www.economist.com)
  2. ^ tech lobby (www.rollcall.com)
  3. ^ on board (www.cnbc.com)
  4. ^ growing bipartisan support (www.lawfareblog.com)
  5. ^ “techlash” (www.wsj.com)
  6. ^ California passed its own privacy legislation (iapp.org)
  7. ^ clinical law professor (papers.ssrn.com)
  8. ^ experience varies (www.nytimes.com)
  9. ^ data brokers (theconversation.com)
  10. ^ sources (theconversation.com)
  11. ^ target low-income Americans (bigdata.fairness.io)
  12. ^ segment consumers (www.businessinsider.com)
  13. ^ pushed Facebook (www.vox.com)
  14. ^ not a protected category (virginialawreview.org)
  15. ^ Constitution (scholarship.law.duke.edu)
  16. ^ creates a vicious cycle (motherboard.vice.com)
  17. ^ trigger predictive software (www.economist.com)
  18. ^ AP Photo/Seth Perlman (www.apimages.com)
  19. ^ exclude people (openscholarship.wustl.edu)
  20. ^ applicant tracking systems (www.fastcompany.com)
  21. ^ Colleges (slate.com)
  22. ^ Landlords (www.theverge.com)
  23. ^ puts too much faith (hbr.org)
  24. ^ impart their own conscious and implicit biases (medium.com)
  25. ^ reinforce longstanding prejudices (www.nytimes.com)
  26. ^ erroneous (theconversation.com)
  27. ^ increasingly include information (papers.ssrn.com)
  28. ^ lack of transparency (slate.com)
  29. ^ Kafkaesque (newrepublic.com)
  30. ^ kicked out of programs (www.theguardian.com)
  31. ^ Reuters/Brian Snyder (pictures.reuters.com)
  32. ^ found a database online (fortune.com)
  33. ^ follows years of data breaches (www.usatoday.com)
  34. ^ especially devastating (www.americanbar.org)
  35. ^ can result (www.americanbar.org)
  36. ^ people report lower confidence (datasociety.net)
  37. ^ all 50 states now require companies (www.ncsl.org)
  38. ^ are considering similar legislation (iapp.org)
  39. ^ sectoral laws (teachprivacy.com)
  40. ^ notice and consent regime (theconversation.com)
  41. ^ read those lengthy notices (theconversation.com)
  42. ^ count on the fact (www.theatlantic.com)
  43. ^ overburdened (www.propublica.org)
  44. ^ are limited (news.bloomberglaw.com)
  45. ^ rights to control their data (gdpr-info.eu)
  46. ^ automated profiling (www.lexology.com)
  47. ^ right to an explanation (papers.ssrn.com)
  48. ^ right to be forgotten (gdpr-info.eu)
  49. ^ enforcement regime (www.gdpreu.org)
  50. ^ public participation (gdpr-info.eu)


Read more http://theconversation.com/data-insecurity-leads-to-economic-injustice-and-hits-the-pocketbooks-of-the-poor-most-116231

Metropolitan republishes selected articles from The Conversation USA with permission
