Image credit: Just_Super via Getty Images

Uber's self-driving policies, tech face questions after fatal crash

Internal docs revealed by the NYT show its performance trailing competitors.

In the wake of a fatal crash in which one of Uber's self-driving SUVs struck and killed a pedestrian in Arizona, a New York Times report has dug into the company's program and found it significantly trailing the competition. Specifically, while cars from the former Google project Waymo averaged 5,600 miles between incidents where a test driver needed to take control, and GM's Cruise averaged some 1,200 miles, Uber's internal documents reveal it wasn't consistently meeting its own goal of averaging just 13 miles.

The report also confirms what Jalopnik found: unlike every other company testing self-driving car technology, Uber uses only a single driver for both safety and performance monitoring. Toyota, Nissan and Ford all confirmed that using two operators is their policy, while Waymo said that since 2015 it has used a single driver when testing "validated" hardware and software, but adds a second tester when any of that changes, or for new drivers, cities and types of roads.

The NYT report also notes that unlike California, which requires publicly available disengagement reports, Arizona has no such requirement, and Uber's tests in California haven't been running long enough for it to have filed one there. Additionally, new CEO Dara Khosrowshahi is said to have considered shutting down the program.

Another major question is whether and when the car's sensors picked up the victim, Elaine Herzberg. Velodyne Lidar makes the sensors that self-driving cars use, alongside cameras, to "see" their surroundings, and its president addressed the question in an email to Bloomberg after video of the crash was released. "Certainly, our Lidar is capable of clearly imaging Elaine and her bicycle in this situation. However, our Lidar doesn't make the decision to put on the brakes or get out of her way... The problem lies elsewhere," Thoma Hall said. As part of the transparency that could make or break public trust in autonomous tech after an incident like this, Uber will need to explain more about which systems were active during the crash, and what they responded to.
