Are tomorrow’s engineers ready to face AI’s ethical challenges?

Written by Elana Goldenkoff, Doctoral Candidate in Movement Science, University of Michigan

A chatbot turns hostile[1]. A test version of a Roomba vacuum[2] collects images of users in private situations. A Black woman is falsely identified as a suspect[3] on the basis of facial recognition software, which tends to be less accurate at identifying women and people of color[4].

These incidents are not just glitches, but examples of more fundamental problems. As artificial intelligence and machine learning tools become more integrated into daily life, ethical considerations are growing, from privacy issues[5] and race and gender biases in coding[6] to the spread of misinformation[7].

The general public depends on software engineers and computer scientists to ensure these technologies are created in a safe and ethical manner. As a sociologist[8] and a doctoral candidate[9] interested in science, technology, engineering and math education, we are researching how engineers in many different fields learn and understand their responsibilities to the public.

Yet our recent research[10], as well as that of other scholars[11], points to a troubling reality: The next generation of engineers often seems unprepared to grapple with the social implications of its work. What’s more, some appear apathetic[12] about the moral dilemmas their careers may bring – just as advances in AI intensify such dilemmas.

Aware, but unprepared

As part of our ongoing research[13], we interviewed more than 60 electrical engineering and computer science master’s students at a top engineering program in the United States. We asked students about their experiences with ethical challenges in engineering, their knowledge of ethical dilemmas in the field and how they would respond to scenarios in the future.

First, the good news: Most students recognized potential dangers of AI and expressed concern about personal privacy[14] and the potential to cause harm – like how race and gender biases[15] can be written into algorithms, intentionally or unintentionally.

One student, for example, expressed dismay at the environmental impact of AI, saying AI companies are using “more and more greenhouse power, [for] minimal benefits.” Others discussed concerns about where and how AIs are being applied, including for military technology[16] and to generate falsified information and images.

When asked, however, “Do you feel equipped to respond in concerning or unethical situations?” students often said no.

“Flat out no. … It is kind of scary,” one student replied. “Do YOU know who I’m supposed to go to?”

Another was troubled by the lack of training: “I [would be] dealing with that with no experience. … Who knows how I’ll react.”

Many students are worried about ethics in their field – but that doesn’t mean they feel prepared to deal with the challenges. The Good Brigade/DigitalVision via Getty Images[17]

Other researchers have similarly found that many engineering students do not feel satisfied[18] with the ethics training they do receive. Common training usually emphasizes professional codes of conduct, rather than the complex socio-technical factors underlying ethical decision-making. Research suggests that even when presented with particular scenarios or case studies, engineering students often struggle to recognize ethical dilemmas[19].

‘A box to check off’

Accredited engineering programs[20] are required to “include topics related to professional and ethical responsibilities” in some capacity.

Yet ethics training[21] is rarely emphasized in formal curricula. A study assessing undergraduate STEM curricula in the U.S. found that coverage of ethical issues varied greatly in content, amount and how seriously it was presented[22]. Additionally, an analysis of academic literature[23] about engineering education found that ethics is often considered nonessential training.

Many engineering faculty express dissatisfaction[24] with students’ understanding, but report feeling pressure from engineering colleagues and students themselves to prioritize technical skills in their limited class time.

Researchers in one 2018 study interviewed over 50 engineering faculty and documented hesitancy – and sometimes even outright resistance[25] – toward incorporating public welfare issues into their engineering classes. More than a quarter of professors they interviewed saw ethics and societal impacts as outside “real” engineering work[26].

About a third of students we interviewed in our ongoing research[27] project share this seeming apathy[28] toward ethics training, referring to ethics classes as “just a box to check off.”

“If I’m paying money to attend ethics class as an engineer, I’m going to be furious,” one said.

These attitudes sometimes extend to how students view engineers’ role in society. One interviewee in our current study, for example, said that an engineer’s “responsibility is just to create that thing, design that thing and … tell people how to use it. [Misusage] issues are not their concern.”

One of us, Erin Cech, followed a cohort of 326 engineering students[29] from four U.S. colleges. This research, published in 2014, suggested that engineers actually became less concerned[30] over the course of their degree about their ethical responsibilities and understanding the public consequences of technology. Following them after they left college, we found that their concerns regarding ethics did not rebound once these new graduates entered the workforce.

Joining the work world

When engineers do receive ethics training as part of their degree, it seems to work[31].

Along with engineering professor Cynthia Finelli[32], we conducted a survey of over 500 employed engineers[33]. Engineers who had received formal ethics and public welfare training in school were more likely to understand their responsibility to the public in their professional roles and to recognize the need for collective problem-solving. Compared with engineers who received no such training, they were 30% more likely to have noticed an ethical issue in their workplace and 52% more likely to have taken action.

The next generation needs to be prepared for ethical questions, not just technical ones. Qi Yang/Moment via Getty Images[34]

Over a quarter[35] of these practicing engineers reported encountering a concerning ethical situation at work. Yet approximately one-third said they have never received training in public welfare – not during their education, and not during their career.

This gap in ethics education[36] raises serious questions about how well-prepared the next generation of engineers will be to navigate the complex ethical landscape of their field, especially when it comes to AI[37].

To be sure, the burden of watching out for public welfare is not shouldered by engineers, designers and programmers alone. Companies and legislators share the responsibility.

But the people who are designing, testing and fine-tuning this technology are the public’s first line of defense. We believe educational programs owe it to them – and the rest of us – to take this training seriously.

References

  1. ^ chatbot turns hostile (www.npr.org)
  2. ^ Roomba vacuum (www.technologyreview.com)
  3. ^ falsely identified as a suspect (www.cbsnews.com)
  4. ^ less accurate at identifying women and people of color (proceedings.mlr.press)
  5. ^ privacy issues (papers.ssrn.com)
  6. ^ biases in coding (doi.org)
  7. ^ spread of misinformation (www.axios.com)
  8. ^ a sociologist (erinacech.com)
  9. ^ doctoral candidate (www.kines.umich.edu)
  10. ^ recent research (onlinelibrary.wiley.com)
  11. ^ other scholars (doi.org)
  12. ^ appear apathetic (doi.org)
  13. ^ our ongoing research (nemo.asee.org)
  14. ^ personal privacy (english.elpais.com)
  15. ^ race and gender biases (www.worldcat.org)
  16. ^ including for military technology (doi.org)
  17. ^ The Good Brigade/DigitalVision via Getty Images (www.gettyimages.com)
  18. ^ do not feel satisfied (deepblue.lib.umich.edu)
  19. ^ struggle to recognize ethical dilemmas (peer.asee.org)
  20. ^ Accredited engineering programs (www.abet.org)
  21. ^ ethics training (doi.org)
  22. ^ content, amount and how seriously it is presented (onlinelibrary.wiley.com)
  23. ^ academic literature (link.springer.com)
  24. ^ express dissatisfaction (www.liebertpub.com)
  25. ^ even outright resistance (doi.org)
  26. ^ outside “real” engineering work (peer.asee.org)
  27. ^ ongoing research (www.nsf.gov)
  28. ^ seeming apathy (journals.sagepub.com)
  29. ^ followed a cohort of 326 engineering students (doi.org)
  30. ^ less concerned (doi.org)
  31. ^ it seems to work (doi.org)
  32. ^ Cynthia Finelli (scholar.google.com)
  33. ^ a survey of over 500 employed engineers (doi.org)
  34. ^ Qi Yang/Moment via Getty Images (www.gettyimages.com)
  35. ^ Over a quarter (doi.org)
  36. ^ gap in ethics education (doi.org)
  37. ^ when it comes to AI (theconversation.com)


Read more https://theconversation.com/are-tomorrows-engineers-ready-to-face-ais-ethical-challenges-213826

Metropolitan republishes selected articles from The Conversation USA with permission
