When I heard that the first ever video of a crash caused by a self-driving car had just been released I could barely contain my excitement. This was history! As it turned out, very boring history.

The passengers on the transit bus that Google’s LIDAR-equipped Lexus SUV collided with last month barely seemed to notice, which surprised me. Now that we live in the future, when something happens to a vehicle I expect people to jiggle around like they do on the bridge in Star Trek.

This was the first time Google took the blame for a crash. In the dozen accidents its cars had been involved in before now, both Google and the police officers called to the scene blamed the human drivers interacting with the robot car, rather than the car itself. This time, the crash happened because the car failed to realize that buses don’t yield in traffic the way regular cars do; in fact, buses never yield to anything.

Google often trots out the story of how, while driving itself around Silicon Valley, a Google car once successfully avoided a wild turkey being chased across the street by a woman in a wheelchair waving a broom. But in many ways, your wild turkeys and wheelchair women are the easiest things for both human drivers and algorithms to detect: they’re a clear deviation from the norm. What caused this particular crash was the opposite, a situation so normal it inspired robot overconfidence.

In its Self-Driving Car Project Monthly Report for February, Google described the car’s re-education process thus:

We’ve now reviewed this incident (and thousands of variations on it) in our simulator in detail and made refinements to our software. Our cars will more deeply understand that buses and other large vehicles are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.

But in the case of this bus accident, there’s human failure as well as the robot kind. The self-driving car had a human driver who failed to engage manual mode and take over, because the driver, like the robot, was a little naïve about buses. Driving may feel antisocial, but doing it well requires profound skill at reading social cues. Who’s drunk? Who’s not paying attention? Who’s looking for a place to park and apt to cut you off unexpectedly? Who is (I actually saw this once) eating corn on the cob and steering their car with their elbows?

If self-driving cars exist to compensate for the failings of humans, how often might Google’s human drivers be compensating for the innocence of robots trying to predict human behavior on the road? These are likely professional drivers, not the average kind, who have a disconcerting tendency to doze off when a vehicle switches to autonomous mode. Look at Google’s own stats on road hours logged by its autonomous vehicle project, and you’ll see that a human has overridden the autonomous system and taken the wheel for nearly half of those hours. It raises the question: If humans are still doing nearly half the driving on the self-driving car project, how close are we, really, to letting the robot drive?