Self-driving Uber vehicle that struck woman in Tempe was not programmed to brake

Federal report finds that Uber was relying on the driver to apply the brakes

The autonomous Uber vehicle that struck and killed a Tempe pedestrian in March was not programmed to brake automatically, a federal report found. 

The National Transportation Safety Board released the preliminary report of its investigation into the March 18 crash. The initial findings shed light on several aspects of the accident. 

"Over the course of the last two months, we’ve worked closely with the NTSB," Uber said in a statement last week. "As their investigation continues, we’ve initiated our own safety review of our self-driving vehicles program. We’ve also brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture, and we look forward to sharing more on the changes we’ll make in the coming weeks."

Read more: Uber suspends self-driving car program after fatal accident in Tempe

Although the vehicle was equipped with “forward- and side-facing cameras, radars, LIDAR, navigation sensors, and a computing and data storage unit,” according to the report, the system did not classify 49-year-old Elaine Herzberg as a pedestrian until about a second before the crash. 

“At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision,” according to the report, but, “The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.”

The driver did not brake until about a second after the impact.

Toxicology tests later found that Herzberg had methamphetamine and marijuana in her system. Police determined the driver was sober just after the incident, but a video clip shows her eyes diverted toward a screen near her lap just before impact. 

“In a postcrash interview with NTSB investigators, the vehicle operator stated that she had been monitoring the self-driving system interface,” according to the report. 

The report does not judge the truth of the driver’s claim; the NTSB investigation is ongoing. 

Professor Gary Marchant studies the ethical and legal aspects of emerging technologies, such as autonomous vehicles. Marchant said the crash and the huge layoffs by Uber have been disheartening for him to see. 

“We’re seeing a lot of things like a plastic bag going by and stopping the car,” Marchant said. “So we’re counting on the human driver to be the actual brake, rather than the autonomous system. The problem is humans are not as good as autonomous systems.”

Marchant repeatedly emphasized that self-driving technology has the potential to be much safer than human-driven cars.

“People tend to be much more wary of planes than cars, despite planes being far safer,” he said. 

For now, there is no real regulation of autonomous vehicles. It will take the state, the federal government and private companies some time to work out the kinks of this emerging technology. In the meantime, Marchant said, self-driving car companies need to be ethical and smart about their operations. 

Waymo, Google’s self-driving car project, is still testing in Arizona, despite Uber recently pulling all of its self-driving operations out of the state and laying off about 300 employees. 

Read more: Uber permanently ends self-driving car tests in Arizona

In a statement to The State Press, Uber said it intends to conduct self-driving tests in a more limited way and is working with various stakeholders to get back on the road, whatever form that takes.

Reach the reporter at or follow @laconicshamanic on Twitter.
