
NASA's Mars rovers could inspire a more ethical future for AI

  • Written by Janet Vertesi, Associate Professor of Sociology, Princeton University

Since ChatGPT’s release in late 2022, many news outlets have reported on the ethical threats posed by artificial intelligence. Tech pundits have issued warnings of killer robots bent on human extinction[1], while the World Economic Forum predicted that machines will take away jobs[2].

The tech sector is slashing its workforce[3] even as it invests in AI-enhanced productivity tools[4]. Writers and actors in Hollywood are on strike[5] to protect their jobs and their likenesses[6]. And scholars continue to show how these systems heighten existing biases[7] or create meaningless jobs – amid myriad other problems.

There is a better way to bring artificial intelligence into workplaces. I know, because I’ve seen it, as a sociologist[8] who works with NASA’s robotic spacecraft teams.

The scientists and engineers I study are busy exploring the surface of Mars[9] with the help of AI-equipped rovers. But their job is no science fiction fantasy. It’s an example of the power of weaving machine and human intelligence together, in service of a common goal.

An artist's rendition of the Perseverance rover, made of metal with six small wheels, a camera and a robotic arm.
Mars rovers act as an important part of NASA’s team, even while operating millions of miles away from their scientist teammates. NASA/JPL-Caltech via AP[10]

Instead of replacing humans, these robots partner with us to extend and complement human qualities. Along the way, they avoid common ethical pitfalls and chart a humane path for working with AI.

The replacement myth in AI

Stories of killer robots and job losses illustrate how a “replacement myth” dominates the way people think about AI. In this view, humans can and will be replaced by automated machines[11].

Amid the existential threat is the promise of business boons like greater efficiency[12], improved profit margins[13] and more leisure time[14].

Empirical evidence shows that automation does not cut costs. Instead, it increases inequality by cutting out low-status workers[15] and increasing the salary cost[16] for high-status workers who remain. Meanwhile, today’s productivity tools inspire employees to work more[17] for their employers, not less.

Alternatives to straight-out replacement are “mixed autonomy” systems, where people and robots work together. For example, self-driving cars must be programmed[18] to operate in traffic alongside human drivers. Autonomy is “mixed” because both humans and robots operate in the same system, and their actions influence each other.

A zoomed-in shot of a white car with a bumper sticker reading 'self-driving car.' Self-driving cars, while operating without human intervention, still require training from human engineers and data collected by humans. AP Photo/Tony Avelar[19]

However, mixed autonomy is often seen as a step along the way to replacement[20]. And it can lead to systems where humans merely feed, curate or teach AI tools[21]. This saddles humans with “ghost work[22]” – mindless, piecemeal tasks that programmers hope machine learning will soon render obsolete.

Replacement raises red flags for AI ethics. Work like tagging content to train AI[23] or scrubbing Facebook posts[24] typically features traumatic tasks[25] and a poorly paid workforce[26] spread across[27] the Global South[28]. And legions of autonomous vehicle designers are obsessed with “the trolley problem[29]” – determining when or whether it is ethical to run over pedestrians.

But my research with robotic spacecraft teams at NASA[30] shows that when companies reject the replacement myth and opt for building human-robot teams instead, many of the ethical issues with AI vanish.

Extending rather than replacing

Strong human-robot teams[31] work best when they extend and augment[32] human capabilities instead of replacing them. Engineers craft machines that can do work that humans cannot. Then, they weave machine and human labor together intelligently, working toward a shared goal[33].

Often, this teamwork means sending robots to do jobs that are physically dangerous for humans. Minesweeping[34], search-and-rescue[35], spacewalks[36] and deep-sea[37] robots are all real-world examples.

Teamwork also means leveraging the combined strengths of both robotic and human senses or intelligences[38]. After all, there are many capabilities that robots have that humans do not – and vice versa.

For instance, human eyes on Mars can only see dimly lit, dusty red terrain stretching to the horizon. So engineers outfit Mars rovers with camera filters[39] to “see” infrared wavelengths that are invisible to human eyes, returning pictures in brilliant false colors[40].

A false-color photo from the point of view of a rover standing at the cliff overlooking a brown, sandy desert-like area that looks blue in the distance. Mars rovers capture images in near infrared to show what Martian soil is made of. NASA/JPL-Caltech/Cornell Univ./Arizona State Univ[41]
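The false-color trick described above can be sketched in a few lines: capture separate frames through different filters, then map three of those bands onto the red, green and blue channels of one displayable image. The band data and function below are illustrative assumptions, not NASA's actual processing pipeline.

```python
# False-color compositing: a minimal sketch of the idea behind
# rover multispectral imaging. The "filter frames" here are
# synthetic stand-ins; real rovers capture separate images
# through narrowband filters, including near-infrared.
import numpy as np

def false_color_composite(bands):
    """Map three single-band images (e.g. near-IR, red, blue-green)
    onto the R, G, B channels of one image."""
    stacked = np.stack(bands, axis=-1).astype(float)
    # Stretch each channel independently to the full 0-1 range so
    # faint spectral differences become visible color contrasts.
    lo = stacked.min(axis=(0, 1), keepdims=True)
    hi = stacked.max(axis=(0, 1), keepdims=True)
    return (stacked - lo) / np.maximum(hi - lo, 1e-12)

# Three synthetic 4x4 frames standing in for camera filter bands.
rng = np.random.default_rng(0)
bands = [rng.uniform(100, 4000, size=(4, 4)) for _ in range(3)]
rgb = false_color_composite(bands)  # shape (4, 4, 3), values in [0, 1]
```

Because each channel is stretched independently, terrain that looks uniformly red to a human eye can show up in vivid, contrasting hues, which is exactly what makes these images scientifically useful.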

Meanwhile, the rovers’ onboard AI cannot generate scientific findings. It is only by combining colorful sensor results with expert discussion that scientists can use these robotic eyes to uncover new truths about Mars[42].

Respectful data

Another ethical challenge to AI is how data is harvested and used. Generative AI is trained on artists’ and writers’ work without their consent[43], commercial datasets are rife with bias[44], and ChatGPT “hallucinates”[45] answers to questions.

The real-world consequences of this data use in AI range from lawsuits[46] to racial profiling[47].

Robots on Mars also rely on data, processing power and machine learning techniques to do their jobs. But the data they need is visual and distance information, which they use to generate drivable pathways[48] or suggest cool new images[49].

By focusing on the world around them instead of our social worlds, these robotic systems avoid the questions around surveillance[50], bias[51] and exploitation[52] that plague today’s AI.

The ethics of care

When integrated seamlessly, robots can elicit human emotions and unite the groups[53] that work with them. For example, seasoned soldiers mourn broken drones on the battlefield[54], and families give names and personalities to their Roombas[55].

I saw NASA engineers break down in anxious tears[56] when the rovers Spirit and Opportunity were threatened by Martian dust storms.

A hand petting a light blue, circular Roomba vacuum. Some people feel a connection to their robot vacuums, similar to the connection NASA engineers feel to Mars rovers. nikolay100/iStock / Getty Images Plus via Getty Images[57]

Unlike anthropomorphism[58] – projecting human characteristics onto a machine – this feeling is born from a sense of care for the machine. It is developed through daily interactions, mutual accomplishments and shared responsibility.

When machines inspire a sense of care, they can underline – not undermine – the qualities that make people human.

A better AI is possible

In industries where AI could be used to replace workers, technology experts might consider how clever human-machine partnerships could enhance human capabilities instead of detracting from them.

Script-writing teams may appreciate an artificial agent that can look up dialogue or cross-reference material on the fly. Artists could write or curate their own algorithms to fuel creativity[59] and retain credit for their work. Bots to support software teams might improve meeting communication and find errors that emerge from compiling code.

Of course, rejecting replacement does not eliminate all ethical concerns[60] with AI. But many problems associated with human livelihood, agency and bias shift when replacement is no longer the goal.

The replacement fantasy is just one of many possible futures for AI and society. After all, no one would watch “Star Wars” if the ‘droids replaced all the protagonists. For a more ethical vision of humans’ future with AI, you can look to the human-machine teams that are already alive and well, in space and on Earth.


  1. ^ human extinction (www.theverge.com)
  2. ^ will take away jobs (www.weforum.org)
  3. ^ slashing its workforce (www.computerworld.com)
  4. ^ invests in AI-enhanced productivity tools (www.forbes.com)
  5. ^ are on strike (theconversation.com)
  6. ^ their jobs and their likenesses (www.theguardian.com)
  7. ^ heighten existing biases (www.rollingstone.com)
  8. ^ as a sociologist (janet.vertesi.com)
  9. ^ the surface of Mars (mars.jpl.nasa.gov)
  10. ^ NASA/JPL-Caltech via AP (newsroom.ap.org)
  11. ^ replaced by automated machines (ntrs.nasa.gov)
  12. ^ like greater efficiency (hbr.org)
  13. ^ improved profit margins (www.forbes.com)
  14. ^ more leisure time (www.aspeninstitute.org)
  15. ^ cutting out low-status workers (doi.org)
  16. ^ increasing the salary cost (www.jstor.org)
  17. ^ work more (press.uchicago.edu)
  18. ^ self-driving cars must be programmed (doi.org)
  19. ^ AP Photo/Tony Avelar (newsroom.ap.org)
  20. ^ along the way to replacement (doi.org)
  21. ^ feed, curate or teach AI tools (www.prospectmagazine.co.uk)
  22. ^ ghost work (ghostwork.info)
  23. ^ tagging content to train AI (www.bbc.com)
  24. ^ scrubbing Facebook posts (ir.lib.uwo.ca)
  25. ^ traumatic tasks (hbr.org)
  26. ^ a poorly paid workforce (dl.acm.org)
  27. ^ spread across (dl.acm.org)
  28. ^ the Global South (giswatch.org)
  29. ^ the trolley problem (www.moralmachine.net)
  30. ^ with robotic spacecraft teams at NASA (press.uchicago.edu)
  31. ^ Strong human-robot teams (doi.org)
  32. ^ extend and augment (digitalreality.ieee.org)
  33. ^ working toward a shared goal (doi.org)
  34. ^ Minesweeping (www.popsci.com)
  35. ^ search-and-rescue (theconversation.com)
  36. ^ spacewalks (ntrs.nasa.gov)
  37. ^ deep-sea (news.stanford.edu)
  38. ^ both robotic and human senses or intelligences (doi.org)
  39. ^ with camera filters (mars.nasa.gov)
  40. ^ false colors (pancam.sese.asu.edu)
  41. ^ NASA/JPL-Caltech/Cornell Univ./Arizona State Univ (mars.nasa.gov)
  42. ^ uncover new truths about Mars (press.uchicago.edu)
  43. ^ without their consent (theconversation.com)
  44. ^ rife with bias (nyupress.org)
  45. ^ ChatGPT “hallucinates” (www.cnn.com)
  46. ^ lawsuits (www.theverge.com)
  47. ^ racial profiling (www.propublica.org)
  48. ^ generate driveable pathways (www.nasa.gov)
  49. ^ suggest cool new images (mars.nasa.gov)
  50. ^ questions around surveillance (doi.org)
  51. ^ bias (doi.org)
  52. ^ and exploitation (haveibeentrained.com)
  53. ^ unite the groups (shapingscience.net)
  54. ^ mourn broken drones on the battlefield (www.washington.edu)
  55. ^ to their Roombas (faculty.cc.gatech.edu)
  56. ^ break down in anxious tears (press.uchicago.edu)
  57. ^ nikolay100/iStock / Getty Images Plus via Getty Images (www.gettyimages.com)
  58. ^ anthropomorphism (www.britannica.com)
  59. ^ to fuel creativity (computerhistory.org)
  60. ^ eliminate all ethical concerns (www.cambridge.org)


Read more https://theconversation.com/nasas-mars-rovers-could-inspire-a-more-ethical-future-for-ai-211162

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more
