Boeing crashes and Uber collision show passenger safety relies on corporate promises, not regulators’ tests

Written by Adam Gabriele, Ph.D. Student in Sustainability, Arizona State University

Advanced technologies deliver benefits every day. But sometimes interactions with technology can go awry and lead to disaster.

On March 10, the pilots aboard Ethiopian Airlines Flight 302 were unable to correct a failure[1] in one of the Boeing 737 Max 8’s automated systems, resulting in a crash and the deaths of all passengers and crew. A year earlier, almost to the day, another automated vehicle – not an airplane but an Uber self-driving car – struck and killed Elaine Herzberg[2] in Tempe, Arizona.

As experts in how humans[3] and technologies interact[4], we know that it is impossible to completely eliminate risk[5] in complex technological systems. Tragedies like these result when regulators and industry experts overlook the complexities and risks of interactions between technologies and humans, and rely increasingly on companies’ voluntary self-assessments rather than on objective, independent tests. That appears to be what happened with Boeing’s aircraft[6] and the Uber car.

[Image: Inside the cockpit of a Boeing 737 Max 8.]

Risky business

The crash of Ethiopian Airlines Flight 302, as well as that of Lion Air Flight 610[7] in 2018, happened despite oversight from one of the most technologically capable regulators in the world. Air travel is remarkably safe[8] in light of the potential risks.

Before the 737 Max 8 took to the air, it had to pass a series of Federal Aviation Administration inspections. Over the course of that process, Boeing convinced the FAA[9] that the automated system was safer than it actually was[10], and that pilots would need very little training[11] on the new plane.

The FAA cleared the 737 Max 8 and its flight control system to fly – and retained that clearance not only after the Lion Air crash, but also for three days after[12] the Ethiopian Airlines tragedy.

From airplanes to automobiles

Automation is increasing in cars just as it is in airplanes. Various companies are testing autonomous vehicles on roads all around the country[13] – and with far less oversight than in the aviation industry. Local and federal rules are limited[14], often in the name of promoting innovation. Federal safety guidelines[15] for autonomous vehicles require them to pass only the same performance tests as any other car, such as minimum fuel economy standards, seat belt configurations and how well they’ll protect occupants in a rollover crash.

There’s no reliability testing of their sensors, much less their algorithms. Some states do require companies to report “disengagements”[16] – incidents in which the so-called “safety driver” takes back control of the car from the automated system. But mostly the self-driving car companies are allowed to do what they want[17], so long as there is a person behind the wheel.
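
To make that reporting concrete, here is a minimal sketch – in Python, with entirely hypothetical companies and figures – of the miles-per-disengagement number that typically headlines those state filings:

    # Hypothetical disengagement reports, for illustration only --
    # invented names and numbers, not any company's actual filings.
    reports = [
        {"company": "ExampleAV", "miles": 120_000, "disengagements": 60},
        {"company": "DemoCar", "miles": 15_000, "disengagements": 150},
    ]

    for r in reports:
        rate = r["miles"] / r["disengagements"]
        print(f"{r['company']}: one disengagement every {rate:,.0f} miles")

Note what such a number leaves out: nothing about road type, weather or why each disengagement happened.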

In the months before the March 2018 collision, Uber was under pressure[18] to catch up with GM Cruise and Waymo. Uber’s cars had a sensitive object-recognition system, which at times would be deceived by a shadow on the road and brake to avoid an obstacle that wasn’t actually there. That resulted in a rough, stop-and-start ride. To smooth things out, Uber’s engineers disabled the car’s emergency braking system[19]. The company appears to have assumed the single safety driver would always be able to stop the car in time if there really was a danger of hitting something.
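
The tradeoff in that decision can be sketched in a few lines of Python. This is a toy illustration under stated assumptions – the threshold, confidence values and structure are invented, not Uber’s actual control logic – but it shows how suppressing automatic braking hands every obstacle, phantom or real, to the human driver:

    # Toy model of the sensitivity tradeoff described above.
    # All values are invented for illustration; this is not Uber's code.
    BRAKE_CONFIDENCE_THRESHOLD = 0.3  # low threshold = sensitive, stop-and-start ride
    AUTO_BRAKING_ENABLED = False      # the configuration change reported above

    def should_auto_brake(confidence: float) -> bool:
        """Decide whether the car itself applies emergency braking."""
        if not AUTO_BRAKING_ENABLED:
            # Real or phantom, every obstacle is left to the safety driver.
            return False
        return confidence >= BRAKE_CONFIDENCE_THRESHOLD

    # A shadow (0.35) and a pedestrian (0.95) are treated identically
    # once automatic braking is disabled.
    for obstacle, confidence in [("shadow", 0.35), ("pedestrian", 0.95)]:
        print(obstacle, "-> auto-brake:", should_auto_brake(confidence))

In this toy, the entire safety margin rests on the assumption that the human behind the wheel is watching and ready.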

That’s not what happened as Elaine Herzberg crossed the road. The Uber self-driving car that hit and killed her did see her with its sensors and cameras[20], but was unable to stop on its own. The safety driver appears to have been distracted by her phone[21] – in violation of Uber’s policies, though it’s unclear how the company briefed its safety drivers about the change to the automated system.

Policing themselves

Regulators are relying on safety self-assessment practices, whereby private companies vouch for their own products’ compliance with federal standards. The best assurances regulators – and members of the public – have for the safety and reliability of these vehicles are the guarantees of the companies that intend to sell them.

The reports companies do provide[22] can be slim on hard evidence, touting the number of real and simulated miles driven[23] without detailing how the cars perform under various conditions. And car companies are constantly releasing new models and upgrading their software[24], forcing human drivers to keep learning new features.

This is all the more unnerving because there are far more cars on the roads than there are planes in the air – 270 million cars registered[25] in the U.S. alone, compared with 25,000 commercial aircraft worldwide[26]. In addition, self-driving cars have to handle not just weather conditions but also close-range interactions with other cars, pedestrians, cyclists and e-scooters. Safety drivers don’t get nearly the amount of training that pilots do, either.

Arizona, where we’re based, is a popular place for public testing of autonomous vehicles, in part because of looser oversight[27] than in other states. In the Phoenix area, however, there is growing public concern about safety. Some citizens are harassing autonomous vehicles[28] in efforts to discourage them from driving through their neighborhoods. As one Arizona resident told The New York Times, the autonomous vehicle industry “said they need real-world examples, but I don’t want to be their real-world mistake[29].”

Connecting with the public, innovating responsibly

In the absence of federal safety standards for autonomous vehicles, states and local governments are left to protect the public – often without the expertise and resources to do so effectively. In our view, this doesn’t mean banning the technology, but rather insisting on corporate transparency and true regulatory oversight.

Engaging the public about what’s happening and who is – and isn’t – protecting their safety can help officials at all levels of government understand what their citizens expect, and push them to ensure that technological innovation is done responsibly.

Universities can play an important role[30] in supporting responsible innovation on these issues. The Arizona State University Center for Smart Cities and Regions is working with the Consortium for Science, Policy and Outcomes to host public forums on self-driving cars in cities across the U.S. and Europe[31].

Airplane and car passengers need to trust their vehicles and understand what risks are unavoidable – as well as what can be prevented. Relying on industry to self-regulate when lives and public trust are at stake is not a viable path to ensure that rapidly emerging innovations are developed and deployed responsibly. To the riders, customers and others sharing the road and the skies, there is only one bottom line – and it doesn’t have a dollar sign attached to it.

References

  1. unable to correct a failure (www.theguardian.com)
  2. killed Elaine Herzberg (www.nytimes.com)
  3. experts in how humans (ifis.asu.edu)
  4. technologies interact (scholar.google.com)
  5. impossible to completely eliminate risk (press.princeton.edu)
  6. with Boeing’s aircraft (www.washingtonpost.com)
  7. Lion Air Flight 610 (theconversation.com)
  8. remarkably safe (www.ntsb.gov)
  9. Boeing convinced the FAA (www.wsj.com)
  10. automated system was safer than it actually was (gizmodo.com)
  11. pilots would need very little training (www.reuters.com)
  12. three days after (www.nytimes.com)
  13. testing autonomous vehicles on roads all around the country (www.bloomberg.org)
  14. federal rules are limited (www.caranddriver.com)
  15. safety guidelines (www.rand.org)
  16. report “disengagements” (www.theverge.com)
  17. allowed to do what they want (www.nytimes.com)
  18. Uber was under pressure (arstechnica.com)
  19. disabled the car’s emergency braking system (www.reuters.com)
  20. did see her with its sensors and cameras (www.wired.com)
  21. distracted by her phone (www.theverge.com)
  22. What reports companies do provide (www.nhtsa.gov)
  23. real and simulated miles driven (www.govtech.com)
  24. upgrading their software (www.theverge.com)
  25. 270 million cars registered (hedgescompany.com)
  26. 25,000 commercial aircraft worldwide (www.planestats.com)
  27. looser oversight (www.npr.org)
  28. harassing autonomous vehicles (www.azcentral.com)
  29. their real-world mistake (www.nytimes.com)
  30. important role (meetingoftheminds.org)
  31. public forums on self-driving cars in cities across the U.S. and Europe (themobilitydebate.net)

Read more http://theconversation.com/boeing-crashes-and-uber-collision-show-passenger-safety-relies-on-corporate-promises-not-regulators-tests-115034

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more