Worker-protection laws aren't ready for an automated future

  • Written by Jeffrey Hirsch, Geneva Yeargan Rand Distinguished Professor of Law, University of North Carolina at Chapel Hill

Science fiction has long imagined a future in which humans constantly interact with robots and intelligent machines. This future is already happening in warehouses[1] and manufacturing businesses[2]. Other workers use virtual or augmented reality as part of their employment training, to assist them in performing their job[3] or to interact with clients[4]. And lots of workers are under automated surveillance[5] from their employers.

All that automation yields data that can be used to analyze workers’ performance. Those analyses, whether done by humans or software programs, may affect who is hired, fired, promoted and given raises[6]. Some artificial intelligence programs can mine and manipulate the data to predict future actions, such as who is likely to quit their job, or to diagnose medical conditions[7].

If your job doesn’t currently involve these types of technologies, it likely will in the very near future[8]. This worries me – a labor and employment law scholar[9] who researches the role of technology in the workplace[10] – because unless significant changes are made to American workplace laws, these sorts of surveillance and privacy invasions will be perfectly legal.

New technology disrupting old workplace laws

The United States has long been an outlier among much of the world in how it regulates the workplace. Especially for private, nonunionized workers, the U.S. largely allows companies and workers to figure out the terms and conditions of work on their own.

In general, for all but the most in-demand workers or those at the highest corporate levels, the lack of regulation means companies can behave however they want – although they are subject to laws preventing discrimination[11], setting minimum wages, requiring overtime pay[12] and ensuring worker safety[13].

Many farm workers in the U.S. are completely excluded from most workplace protection rules. AP Photo/Robert F. Bukaty[14]

But most of those laws are decades old and are rarely updated. They certainly haven’t kept up with technological advances, the increase in temporary or “gig” work[15] and other changes in the economy. Faced with these new challenges, the old laws leave many workers without adequate protections against workplace abuses[16], or even totally exclude some workers from any protections at all[17]. For instance, two Trump administration agencies have recently declared that Uber drivers are not employees[18], and therefore not entitled to minimum wage, overtime or the right to engage in collective action such as joining a union.

Emerging technologies like artificial intelligence, robotics, virtual reality and advanced monitoring systems have already begun altering workplaces in fundamental ways[19] that may soon become impossible to ignore[20]. That progress highlights the need for meaningful changes to employment laws.

Consider Uber drivers

Like other companies in what has been called the “gig economy,” Uber has spent considerable amounts of money and time litigating and lobbying[21] to keep its drivers classified as independent contractors, rather than employees. Uber set a federal lobbying record for the fifth consecutive year in 2018, spending US$2.3 million[22] on issues including keeping its drivers from being classified as employees.

The distinction is a crucial one. Uber does not have to pay employment taxes – or unemployment insurance premiums – on independent contractors. In addition, nonemployees are completely excluded from any workplace protection laws. These workers are not entitled to a minimum wage or overtime; they can be discriminated against based on their race, sex, religion, color, national origin, age, disability and military status; they lack the right to unionize; and they are not entitled to a safe working environment.

Companies have tried to classify workers as independent contractors ever since there have been workplace laws, but technology has greatly expanded companies’ ability to hire labor in ways that blur the line between employees and independent contractors.

Employees aren’t protected, either

Even for workers who are considered employees, technology allows employers to take advantage of the gaps in workplace laws like never before. Many workers already use computers, smartphones and other equipment that allows employers to monitor their activity and location, even when off duty[23].

A doctor implants an RFID chip in a patient’s hand. Paul Hughes/Wikimedia Commons, CC BY-SA[24][25]

And emerging technology permits far greater privacy intrusions. For instance, some employers already have badges that track and monitor workers’ movements and conversations[26]. Japanese employers use technology to monitor workers’ eyelid movements and lower the room temperature if the system identifies signs of drowsiness[27].

Another company implanted RFID chips[28] into the arms of employee “volunteers[29].” The purpose was to make it easier for workers to open doors, log in to their computers, and purchase items from a break room, but a person with an RFID implant can be tracked 24 hours a day. Also, RFID chips are susceptible to unauthorized access or “skimming” by thieves[30] who are merely physically close to the chip.

No privacy protections for workers

The monitoring that’s possible now will seem simplistic compared to what’s coming: a future in which robotics and other technologies capture huge amounts of personal information to feed artificial intelligence software that learns which metrics are associated with things such as workers’ moods and energy levels, or even diseases like depression.

One health care analytics firm, whose clients include some of the biggest employers in the country, already uses workers’ internet search histories and medical insurance claims to predict who is at risk of getting diabetes or considering becoming pregnant[31]. The company says it provides only summary information to clients, such as the number of women in a workplace who are trying to have children, but in most instances it could probably legally identify[32] specific workers.

Except for some narrow exceptions – like in bathrooms and other specific areas where workers can expect to be in relative privacy – private-sector employees have virtually no way, nor any legal right, to opt out of this sort of monitoring. They may not even be informed that it is occurring. Public-sector employees have more protection, thanks to the Fourth Amendment’s prohibition against unreasonable searches, but in government workplaces the scope of that prohibition is quite narrow.

AI discrimination

In contrast to the almost total lack of privacy laws protecting workers, employment discrimination laws – while far from perfect – can provide some important protections for employees. But those laws have already faced criticism for their overly simplistic and limited view of what constitutes discrimination, which makes it very difficult for victims to file and win lawsuits or obtain meaningful settlements[33]. Emerging technology, particularly AI, will exacerbate this problem.

AI software programs used in the hiring process are marketed as eliminating or reducing biased human decision-making[34]. In fact, they can create more bias, because these systems learn from large collections of data[35], which can themselves be biased.

For instance, Amazon recently abandoned a multi-year project to develop an AI hiring program because it kept discriminating against women[36]. Apparently, the AI program learned from Amazon’s male-dominated workforce that being a man was associated with being a good worker. To its credit, Amazon never used the program for actual hiring decisions, but what about employers who lack the resources, knowledge or desire to identify biased AI?
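
To see how that can happen, consider a deliberately simplified sketch in Python. The résumé terms, outcomes and numbers below are invented for illustration and have nothing to do with Amazon’s actual system; the point is only that a naive “hiring score” learned from past decisions can end up penalizing a term that serves as a proxy for gender.

    # Toy illustration only: invented data, not any real employer's model.
    from collections import Counter

    # Historical hiring decisions; most past hires were men.
    history = [
        ({"led team", "python"}, "hired"),
        ({"python", "chess club"}, "hired"),
        ({"led team", "java"}, "hired"),
        ({"python", "women's chess club"}, "rejected"),
        ({"java", "women's coding group"}, "rejected"),
    ]

    # "Training": count how often each resume term appears among hires vs. rejections.
    hired_terms, rejected_terms = Counter(), Counter()
    for terms, outcome in history:
        (hired_terms if outcome == "hired" else rejected_terms).update(terms)

    def score(resume_terms):
        """Higher score means the candidate looks more like past hires."""
        return sum(hired_terms[t] - rejected_terms[t] for t in resume_terms)

    # Two equally qualified candidates; only the gendered term differs.
    print(score({"python", "chess club"}))          # prints 2 - favored
    print(score({"python", "women's chess club"}))  # prints 0 - penalized

Nothing in this sketch mentions gender, yet the word “women’s” drags the second candidate’s score down simply because it rarely appeared on the résumés of people hired in the past. Real systems are far more sophisticated, but the underlying failure mode – learning historical bias as if it were merit – is the same one Amazon reportedly encountered.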

How discrimination law applies to decisions driven by computer algorithms remains unclear, just as other new technologies stretch employment laws and regulations well beyond the situations they were written to address. Without an update to the rules, more workers will continue to fall outside traditional worker protections – and may not even be aware of how vulnerable they really are.

References

  1. ^ in warehouses (www.nytimes.com)
  2. ^ manufacturing businesses (www.npr.org)
  3. ^ employment training, to assist them in performing their job (qctimes.com)
  4. ^ to interact with clients (fortune.com)
  5. ^ automated surveillance (www.newscientist.com)
  6. ^ may affect who is hired, fired, promoted and given raises (www.cmswire.com)
  7. ^ who is likely to quit their job, or to diagnose medical conditions (fortune.com)
  8. ^ it likely will in the very near future (www.mckinsey.com)
  9. ^ a labor and employment law scholar (www.law.unc.edu)
  10. ^ role of technology in the workplace (ssrn.com)
  11. ^ preventing discrimination (www.eeoc.gov)
  12. ^ setting minimum wages, requiring overtime pay (www.dol.gov)
  13. ^ ensuring worker safety (www.osha.gov)
  14. ^ AP Photo/Robert F. Bukaty (www.apimages.com)
  15. ^ temporary or “gig” work (www.marketwatch.com)
  16. ^ the old laws leave many workers without adequate protections against workplace abuses (ssrn.com)
  17. ^ even totally exclude some workers from any protections at all (www.pbs.org)
  18. ^ Uber drivers are not employees (www.forbes.com)
  19. ^ altering workplaces in fundamental ways (theconversation.com)
  20. ^ impossible to ignore (theconversation.com)
  21. ^ litigating and lobbying (www.vox.com)
  22. ^ spending US$2.3 million (www.bloomberg.com)
  23. ^ monitor their activity and location, even when off duty (arstechnica.com)
  24. ^ Paul Hughes/Wikimedia Commons (commons.wikimedia.org)
  25. ^ CC BY-SA (creativecommons.org)
  26. ^ badges that track and monitor workers’ movements and conversations (www.economist.com)
  27. ^ lower the room temperature if the system identifies signs of drowsiness (www.weforum.org)
  28. ^ RFID chips (theconversation.com)
  29. ^ the arms of employee “volunteers (iapp.org)
  30. ^ unauthorized access or “skimming” by thieves (www.npr.org)
  31. ^ predict who is at risk of getting diabetes or considering becoming pregnant (fortune.com)
  32. ^ probably legally identify (www.wsj.com)
  33. ^ which makes it very difficult for victims to file and win lawsuits or obtain meaningful settlements (ssrn.com)
  34. ^ eliminating or reducing biased human decision-making (www.theverge.com)
  35. ^ large collections of data (theconversation.com)
  36. ^ kept discriminating against women (www.reuters.com)


Read more http://theconversation.com/worker-protection-laws-arent-ready-for-an-automated-future-119051

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more