Accident Report: The Google Self-Driving Car
When developing the algorithms for its self-driving car, Google had assimilated almost every rule of the road. But it missed at least one: Don’t tangle with municipal bus drivers.
In a presentation at the SXSW Interactive conference, currently being held in Austin, Chris Urmson, director of Google’s self-driving car project, offered a glimpse into the AI-driven program, including an account of the first accident in which a Google self-driving car was at least partially at fault, after more than 1.4 million miles on the road.
“Our car made an assumption about what the bus driver was going to do. The bus driver made an assumption about what our car was going to do. And they weren’t the same assumption,” Urmson said.
The fledgling science of self-driving cars, built on the reasoning capabilities of onboard computers, may well become a huge industry — General Motors just purchased self-driving car technology company Cruise for an undisclosed sum. But Google’s work, steeped in artificial intelligence, has repercussions beyond self-driving cars. It probes the limits of how far computers can go in mimicking, or even improving on, human reasoning.
Google started its self-driving car project in 2009. The company set out to build prototypes that would drive at least 100,000 miles on public roads, “more than ten times what anyone had done before,” Urmson said.
The company bought a fleet of Toyota Priuses, equipped them with cameras and sensors, and sent them out accompanied by engineers to observe what happened.
Until February, the autos never caused an accident, even though they traveled more than a million miles. Like good drivers, the vehicles were programmed to be on guard for all possibilities, even those two or three steps ahead. The cars are cautious almost to a fault: they slow when cut off by another vehicle, and slow down if they detect they are in another vehicle’s blind spot.
The company built “an unprecedented level of sensing” into the vehicles, Urmson said, using a combination of lasers, radar and cameras. The car can sense up to 200 meters ahead and has 360-degree peripheral vision. Taking in all this data, the onboard computer makes a decision about 10 times a second about how to proceed.
On February 14, in Mountain View, California, a Google self-driving car, a Lexus, sideswiped a bus at approximately 15 mph. No one was hurt, though in the accident the car’s front left side was crumpled, its tire flattened, and a radar affixed to the car was torn off by the bus, according to a report in The Daily Mail. The Santa Clara Valley Transportation Authority, which operated the bus, investigated the incident and found its bus driver was not responsible.
At SXSW, Urmson offered an accident report.
Attempting to make a right-hand turn at a busy intersection, the self-driving car moved over to the right-hand side of the extra-wide lane. This is a common practice among drivers who want to get around cars waiting to go straight through the intersection, and one the self-driving car had only recently been programmed to follow. “This is an important part of driving in the community in a natural way,” he said.
However, the vehicle encountered some sandbags on the shoulder, forcing it to stop. When the light turned, cars in the right lane started moving forward while the self-driving car sat behind the sandbags.
“At this point, our car looks back with its sensors, and sees this bus,” Urmson said. It looks at the size of the bus. It looks at the width of the lane and decides the bus will not fit. “At least, it thinks it will not fit,” Urmson said. “In that moment, it had assumed that that bus driver would slow down.”
The bus driver, eyeing the gap, had other ideas on the matter, and so proceeded. And the bus would have made it through, Urmson acknowledged, had the self-driving car stayed put. But the car moved forward and bumped into the bus.
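The collision, as Urmson tells it, came down to two road users applying different decision rules to the same gap. The toy sketch below illustrates that mismatch; the function names, widths and thresholds are all hypothetical, not anything from Google's actual planning software.

```python
# Hypothetical illustration of the mismatched assumptions Urmson described.
# None of these names or numbers come from Google's real planner.

LANE_WIDTH_M = 4.5   # assumed width of the extra-wide lane
BUS_WIDTH_M = 2.6    # typical transit-bus width
CAR_OFFSET_M = 2.2   # space the stopped car occupies on the right


def bus_fits(free_space: float, bus_width: float) -> bool:
    """Would the bus fit through the remaining gap?"""
    return free_space >= bus_width


def car_decision(lane_width: float, bus_width: float) -> str:
    """The car's (hypothetical) rule: if the bus cannot fit,
    assume its driver will slow down, and pull out."""
    free_space = lane_width - CAR_OFFSET_M
    if bus_fits(free_space, bus_width):
        return "yield"   # bus can pass; stay put behind the sandbags
    return "merge"       # assume the bus will slow; move into the lane


def bus_driver_decision(gap_looks_passable: bool) -> str:
    """The bus driver's rule: if the gap looks passable, keep going."""
    return "proceed" if gap_looks_passable else "slow"


# The car concluded the bus would not fit and assumed it would slow:
print(car_decision(LANE_WIDTH_M, BUS_WIDTH_M))   # "merge"
# The bus driver, eyeing the same gap, judged it passable:
print(bus_driver_decision(True))                  # "proceed"
# Both moving at once is exactly the low-speed collision that resulted.
```

The point of the sketch is only that each rule is individually reasonable; the accident arises from the two agents modeling each other's behavior incorrectly.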
“This was a tough day for us,” Urmson admitted. “We don’t like our cars bumping into things.” Subsequently, Google merged the experience into its algorithms, running similar scenarios through 3,500 tests to ensure something like it would not happen again, he said.
Still, humans can be, as Spock knows all too well, unpredictable creatures. Urmson also recounted an incident in which a self-driving vehicle, traveling down a road posted with a warning that ducks might be crossing, did indeed encounter some ducks, but also a woman in an electric wheelchair chasing after them with a broom.
“Now we have a team of people whose job it is to come up with weird stuff, but they didn’t come up with this,” Urmson said.