
Opinion: Fatal Uber accident should not be the end of self-driving cars

Human-driven vehicles can be more dangerous than autonomous vehicles


"The crash took place near ASU's Tempe campus on Mill Avenue and Curry Road." Illustration published on Monday, March 19, 2018.  


On Sunday night, 49-year-old Elaine Herzberg died after being hit by a self-driving Uber in autonomous mode. 

Following the incident, Uber halted testing of its autonomous vehicles in Phoenix, San Francisco, Pittsburgh and Toronto, meaning ASU students will no longer see these cars around campus.

While a self-driving Uber was involved in a collision with another vehicle in March 2017, this is believed to be the first known fatality involving an autonomous vehicle.

As unfortunate as the accident was, it should not be viewed as a reason to end the research and development of autonomous vehicles, but rather as a reason for further discussion regarding the risks of such technologies. 

Andrew Maynard, the director of ASU’s Risk Innovation Lab, said that with the innovation of new technology, failure in some aspect is inevitable.

“Something like this was going to happen at some stage. No matter how safe we make vehicles on the roads, they’re never going to be 100 percent safe,” Maynard said. "But it raises questions about whether we’re taking enough cautions and whether we’re being responsible enough with the development of this technology. It also raises questions around how much we can trust current regulations around the technology that are there to ensure safety at the moment.”

Currently, Uber requires all of its autonomous vehicles to have a driver behind the wheel because the technology is still developing.

Uber’s autonomous vehicles are fitted with a multitude of sensors and cameras, allowing them to detect road markings, buildings, traffic cones and pedestrians. The vehicles are also designed to slow to a stop before colliding with any kind of potential obstruction.

Ram Pendyala, a professor in ASU’s School of Sustainable Engineering and the Built Environment, said the extent to which the technology is to blame for the incident remains unclear.

The accident certainly raises questions about the safety of self-driving cars, but it is also important to consider their safety in comparison to human-operated vehicles.

“From what we know at the moment, with good driving conditions and standard driving conditions, the indications are that (self-driving cars) are very safe, simply because they can monitor their environments far more effectively than a human driver,” Maynard said. “What is not known is how effective they are when things aren’t normal … and that is one of the biggest challenges with these technologies.”

According to the Arizona Department of Transportation, 23.63 percent of car accidents in 2016 were associated with distracted driving and 3.89 percent were associated with alcohol consumption. 

Self-driving cars are immune to the weaknesses that plague human drivers, such as distraction, drunk driving, drug use and drowsiness.

“I think the technology is safer,” Pendyala said. “However, there are still some caveats. If you take a very alert human driver, who has the knowledge of all the quirks and the randomness of all the other drivers, bicyclists, pedestrians, and so on that navigate the transportation system, you could question whether the technology has reached the place where it can account for every random quirk that is encountered out in the real world.”

ASU students who use Uber, whether self-driving or not, should consider their own safety as they directly experience evolving technologies such as autonomous vehicles.

In comparison to the millions of fatalities caused by human negligence, a single fatality from an autonomous vehicle, while unfortunate, should not discourage continued innovation as the technology improves.

“I would be very surprised if this was the end of testing,” Maynard said. “But I think that this is a decision point for the manufacturers, for regulators and for consumers. I think this is a wake-up call to say that if we want to see the benefits of this technology, we’ve got to have everybody at the table to decide how exactly we want this technology to go forward or what is an acceptable level of risk.”


Reach the columnist at kalbal@asu.edu or follow @KarishmaAlbal on Twitter.

Editor’s note: The opinions presented in this column are the author’s and do not imply any endorsement from The State Press or its editors.

Want to join the conversation? Send an email to opiniondesk.statepress@gmail.com. Keep letters under 500 words and be sure to include your university affiliation. Anonymity will not be granted.






