
Google self-driving car crashes into a bus (update: statement)

It may be the first instance of a Google autonomous car being at fault in an accident.

Google's self-driving cars have been in accidents before, but always on the receiving end... at least, until now. The company has filed a California DMV accident report confirming that one of its autonomous vehicles (a Lexus RX450h) collided with a bus in Mountain View. The crash happened when the robotic SUV, which had hugged the right side of a wide lane to set up a right turn, angled back toward the center of the lane to get around some sandbags -- both the vehicle and its test driver incorrectly assumed that a bus approaching from behind would slow or stop to let the car through. The Lexus smacked into the side of the bus at low speed, damaging the car's front fender, wheel and sensor in the process.

This was a minor incident, and we're happy to report that there were no injuries. However, this might be the first instance where one of Google's self-driving cars caused an accident. If so, the Mountain View crew can no longer claim its cars are always the innocent party on the roads -- this wasn't a glitch, but the software made a decision that led to a crash. We've reached out to Google to see if it can elaborate on what happened.

No matter what the response, it was always going to be difficult to avoid this kind of incident. Until self-driving cars can anticipate every possible road hazard, there's always a chance that they'll either be confused or make choices with unexpected (and sometimes unfortunate) consequences. However, the hope at this early stage isn't to achieve a flawless track record. Instead, it's to show that self-driving cars can be safer overall than their human-piloted counterparts.

Update: Google has provided us with its take on the incident from its February monthly report. It sees the accident as the result of that "normal part of driving" where there's mutual blame: both sides made assumptions about what the other would do. So yes, Google acknowledges that it's partly at fault for what happened. In the wake of the crash, it has already tweaked its software to accept that buses are "less likely to yield" and prevent issues like this in the future. Read the full copy below.

Our self-driving cars spend a lot of time on El Camino Real, a wide boulevard of three lanes in each direction that runs through Google's hometown of Mountain View and up the peninsula along San Francisco Bay. With hundreds of sets of traffic lights and hundreds more intersections, this busy and historic artery has helped us learn a lot over the years. And on Valentine's Day we ran into a tricky set of circumstances on El Camino that's helped us improve an important skill for navigating similar roads.

El Camino has quite a few right-hand lanes wide enough to allow two lines of traffic. Most of the time it makes sense to drive in the middle of a lane. But when you're teeing up a right-hand turn in a lane wide enough to handle two streams of traffic, annoyed traffic stacks up behind you. So several weeks ago we began giving the self-driving car the capabilities it needs to do what human drivers do: hug the rightmost side of the lane. This is the social norm because a turning vehicle often has to pause and wait for pedestrians; hugging the curb allows other drivers to continue on their way by passing on the left. It's vital for us to develop advanced skills that respect not just the letter of the traffic code but the spirit of the road.

On February 14, our vehicle was driving autonomously and had pulled toward the right-hand curb to prepare for a right turn. It then detected sandbags near a storm drain blocking its path, so it needed to come to a stop. After waiting for some other vehicles to pass, our vehicle, still in autonomous mode, began angling back toward the center of the lane at around 2 mph -- and made contact with the side of a passing bus traveling at 15 mph. Our car had detected the approaching bus, but predicted that it would yield to us because we were ahead of it. (You can read the details below in the report we submitted to the CA DMV.)

Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.

This is a classic example of the negotiation that's a normal part of driving -- we're all trying to predict each other's movements. In this case, we clearly bear some responsibility, because if our car hadn't moved there wouldn't have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.

We've now reviewed this incident (and thousands of variations on it) in our simulator in detail and made refinements to our software. From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.