A self-driving car being tested by Google struck a public bus on a Silicon Valley street, a fender-bender that appears to be the first time one of the tech company’s vehicles caused a crash during testing.
Google accepted at least some responsibility for the collision, which occurred on Valentine’s Day when one of the Lexus SUVs it has outfitted with sensors and cameras hit the side of the bus near the company’s headquarters in Mountain View, California.
No one was injured, according to an accident report Google wrote and submitted to the California Department of Motor Vehicles. It was posted online Monday.
According to the report, Google’s car intended to turn right off a major boulevard when it detected sandbags around a storm drain at the intersection.
The right lane was wide enough to let some cars turn and others go straight, but the Lexus needed to slide to its left within the right lane to get around the obstruction.
The Lexus was going 2 mph when it made the move and its left front struck the right side of the bus, which was going straight at 15 mph.
The car’s test driver – who under state law must be in the front seat to grab the wheel when needed – thought the bus would yield and did not have control before the collision, Google said.
While the report does not address fault, Google said in a written statement, “We clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision.”
Chris Urmson, the head of Google’s self-driving car project, said in a brief interview that he believes the Lexus was moving before the bus started to pass.
“We saw the bus, we tracked the bus, we thought the bus was going to slow down, we started to pull out, there was some momentum involved,” Urmson told The Associated Press.
He acknowledged that Google’s car did have some responsibility but said it was “not black and white.”
The Santa Clara Valley Transportation Authority said none of the 15 passengers or the driver of the bus was injured.
The transit agency is reviewing the incident and hasn’t reached any conclusions about liability, spokeswoman Stacey Hendler Ross said in a written statement.
There may never be a legal decision on fault, especially if damage was negligible – as both sides indicated it was – and neither Google nor the transit authority pushes the case.
Still, the collision could be the first time a Google car in autonomous mode caused a crash.
Google cars have been involved in nearly a dozen collisions in or around Mountain View since starting to test on city streets in the spring of 2014. In most cases, Google’s cars were rear-ended. No one has been seriously injured.
Google’s written statement called the Feb. 14 collision “a classic example of the negotiation that’s a normal part of driving – we’re all trying to predict each other’s movements.”
Google said its computers have reviewed the incident and engineers changed the software that governs the cars to understand that buses may not be as inclined to yield as other vehicles.
Jessica Gonzalez, a spokeswoman for California’s DMV, which regulates Google’s testing of about two dozen Lexus SUVs in the state, said agency officials spoke Monday with Google but would have no comment. Under state law, Google must retain data from the moments before and after any collision.
“As far as he-said she-said, there shouldn’t be any of that. It’s all there,” said Robert W. Peterson, an insurance law expert at Santa Clara University who has studied self-driving cars.
A critic of Google’s self-driving car efforts said the collision shows the tech giant should not be allowed to take onto public streets the self-driving prototypes it built without a steering wheel or pedals.
Google sees that as the next natural step for the technology, and has pressed California’s DMV and federal regulators to authorize cars in which humans have limited means of intervening.
“Clearly Google’s robot cars can’t reliably cope with everyday driving situations,” said John M. Simpson of the nonprofit Consumer Watchdog. “There needs to be a licensed driver who can take over, even if in this case the test driver failed to step in as he should have.”
The Associated Press contributed to this article.
Constitutionalist says
The underlying assumption about robot vehicles is that humans can’t be trusted to pilot a car, truck, or motorcycle.
Mark my words: if/when Google robot vehicles are approved for use, the People will be prisoners, unable to go where they wish, when they wish, how they wish, or how often they wish… and apparently, according to the article, these robots will have no steering wheels or pedals, so no man or woman can take control away, either.
I say, shoot ’em. Blow ’em all to hell from whence they came. Stop it now, before it grows like a cancer.
Bruce says
You best be moving, my man. ’Cause it ain’t stopping!
Justin W says
Wrecks happen every day because one driver thought another driver was going to respond in a different way. Automated cars would hopefully be able to communicate with each other to prevent this sort of accident from happening.
Steve says
Would those cars have to obey the “No Texting” laws?
Jeff says
I think it’s fairly obvious that the self-driving car made a simple move any human driver would make as well. Sandbags around a storm drain on the right are exactly the kind of thing most people would swerve around, just as the Google car did. The bus driver wasn’t aware the Google car was going to move over, probably because he couldn’t see the sandbags it was trying to avoid. People will always try to steer around something in the road that isn’t normally there – in this case, sandbags.
No one did anything wrong in this situation. Not the bus driver, not the human in the Google car, nor the Google car itself. Sometimes stuff just happens that isn’t anyone’s fault.
I’m not someone who thinks we should have self-driving cars, really. I can’t believe companies have put so much effort into building them, and I think these cars have other drawbacks. However, in this case the car did nothing wrong. Any human would have done the same thing. Trying to blame the car or Google is stupid in this situation. I’m sure there will be dozens of situations where these cars fail to react properly and cause an accident. This isn’t one of them.
Steve says
Any human or robot driver who “thinks” they have to pull to the left to turn right is an IDIOT and should have their license pulled until they learn how to drive and how their car handles. If you’re doing that, why don’t you go to the right to turn left???
There is also no need to ‘drive’ in the #1 lane on the freeway while going 55 with your left turn signal on. JERKS!
ALLAN says
Gathering up every piece of experimentation, whether pro or con, will eventually reveal to the automotive industry the engineering concepts needed to prevent mishaps of every conceivable nature. I’d love to see the emblem “Proudly Made in America” flooding the world markets again.
SailingNewYorkCity says
This is really a non-story. It was a very minor event and these happen thousands of times each week with drivers. Anything made by man will eventually fail and automated cars will be no exception. Now we have cars that automatically park, slow down, stop, etc. Just wait until these vehicles get old and the sensors stop working. The more that is automated the more there is to go wrong. Nonetheless, this technology will inexorably move forward and self driving cars are coming, whether we like it or not. All we can do is get used to it and remember that drivers are not infallible either.
RichardH says
These guys at Google must have heads full of mush when it comes to traffic laws. They apparently have a “feel good,” very fuzzy understanding of the laws. I have practiced law for 35 years, and Google needs to be introduced to the basic traffic laws that we have used for over a century (and, with horses and buggies, hundreds of years before that). It was 100% the fault of the Google car. The bus had the right-of-way and IT STAYED IN ITS LANE!! It WAS a “black and white” situation. A classic example of Google’s fuzzy thinking is its claim that this was “negotiation that’s a normal part of driving – we’re all trying to predict each other’s movements.” That is total B.S. We are NOT supposed to “predict” the other’s movements!!
What should the auto-car have done? It did not have room to proceed in its lane, so it should have come to a stop, and then, after its sensors said the lane on its left was clear, it could have eased over into that lane.
If several people had been killed because of this same scenario, Google would have been found 100% liable and would be paying out millions of dollars.
Hugiaino says
Seems to me that safety measures can be put in place such that the sensors, computer, etc., self-test before operation and, if faulty, prevent operation.
But computers don’t get road rage, macho moments, or lapses in attention or judgement. On the contrary, they react faster than a teenager and can be programmed with the experience of a long-time, mature driver. For that reason, I believe that self-driving cars will result in fewer accidents and far fewer fatalities.
As for who will commandeer the technology for their own nefarious purposes, we need to trust God, and make sure we elect true, Constitutional conservatives who respect human rights and will do what they say they will do. Otherwise, self-driving cars or no, we’ll all be in trouble.
By the way, rudimentary self-driving technology has been used for decades in airport trams and other vehicles.
Stephen Russell says
Better to fix it now than pay more later: unify the insurers on the issue, make plans, and think ahead, like 25 years out.
Address it NOW or suffer later.
Yes, I’d still drive a driverless car.
SJ Jolly says
What weary commuter hasn’t wished the car could drive itself? And who wouldn’t be willing to pay somewhat more for the feature? Self-driving cars are coming.