Self-driving cars are more likely to hurt than help public safety because of unsolved technical issues, engineers and safety advocates told the government Friday, countering a push by innovators to speed government approval.
Even a trade association for automakers cautioned the National Highway Traffic Safety Administration that a slower, more deliberative approach may be needed than the agency’s aggressive plan to issue guidance for deploying the vehicles within just six months. The decision to produce the guidance was announced in January, and officials have promised to complete it by July.
There are risks to deviating from the government’s traditional process of issuing regulations and standards, Paul Scullion, safety manager at the Association of Global Automakers, said at a public meeting on self-driving cars hosted by NHTSA.
Issuing new regulations takes an average of eight years, NHTSA has said. Regulations are also enforceable, while guidance is usually more general and open to interpretation.
“While this process is often time-consuming, these procedural safeguards are in place for valid reasons,” Scullion said. Working outside that process might allow the government to respond more quickly to rapidly changing technology, but that approach would likely come at the expense of thoroughness, he said.
Mark Rosekind, NHTSA’s administrator, said the agency can’t wait because early self-driving technologies are already in cars on the road, including automatic emergency braking that can stop or slow a car to avoid or mitigate a collision. Another safety option on some vehicles automatically steers them back into their lanes if they start to drift without the driver first using a turn signal.
“Everybody asks, ‘When are they going to be ready?’ I keep saying they’re not coming; they are here now,” Rosekind said.
Without federal instructions, “people are just going to keep putting stuff out on the road with no guidance on how do we do this the right way.”
Rosekind emphasized that he sees self-driving cars as game-changing technology that can someday save the lives of many of the more than 30,000 people killed each year on the nation’s roads.
A General Motors official recently told a Senate committee that the automaker expects to deploy self-driving cars within a few years through a partnership with the ride-sharing service Lyft. Google, a pioneer in the development of self-driving cars, is pushing Congress to give NHTSA new powers to grant it special, expedited permission to sell cars without steering wheels or pedals.
But many of those who addressed the meeting, the first of two the agency has scheduled as it works on the guidelines, described a host of situations that self-driving cars still can’t handle:
—Poorly marked pavement, including parking lots and driveways, could foil the technology, which relies on clear lane markings.
—Bad weather can interfere with vehicle sensors.
—Self-driving cars can’t take directions from a policeman.
—Inconsistent traffic-control devices, such as horizontal versus vertical traffic lights, can confuse the technology.
Until the technology has advanced beyond the point where ordinary conditions are problematic, “it is dangerous, impractical and a major threat to the public health, safety and welfare to deploy them,” said Mark Golden, executive director of the National Society of Professional Engineers.
There have been thousands of “disengagements” reported in road tests of self-driving cars in which the vehicles automatically turned control over to a human being, said John Simpson, privacy project director of Consumer Watchdog.
“Self-driving cars simply aren’t ready to safely manage too many routine traffic situations without human intervention,” he said.
Rosekind said automakers are learning from the unanticipated situations the vehicles encounter and adapting their software. At the same time, he acknowledged that self-driving cars, like other systems that rely on wireless technology, can be vulnerable to hacking.
James Niles, president of Orbit City Lab, a New York think tank, told the meeting that there is a complete absence of federal regulations and standards to prevent self-driving cars from being turned into weapons by “bad actors.”
“The concern that an autonomous vehicle could be used as a weapon has gone unnoticed by the general public and probably by the majority of government officials,” he said.
President Barack Obama has proposed a 10-year, $3.9 billion automated-technologies program that includes large-scale pilot deployments of self-driving cars around the country and additional funding for cybersecurity research.
The Associated Press contributed to this article.
Well… you can’t think of everything?
Here is my question. If my vehicle is doing the driving and not me, and it is involved in a serious accident that it caused, who is at fault and who bears the financial responsibility? Me or the designers of the vehicle? I may own the vehicle, but I was just a passenger and a victim in the accident. If there is a death involved, who gets charged with the resulting death? Until all vehicles are on some kind of chain-driven system that prevents them from coming in contact with each other, you’re safer with a trained monkey in control of your vehicle!
You do, because you made the decision to allow it to function autonomously. And good luck proving the car did something you didn’t want it to do, assuming you survive the resulting crash.
Dan,
Trains run on tracks and still require an engineer. I’ll bet there’s a reason for that.
And commercial planes require two pilots. And speaking of planes, there’s SERIOUS disagreement as to whether increasing automation has made pilots’ jobs easier, and several examples of pilots ending up unprepared when the plane gets into a situation the AI can’t handle (Air France and the more recent Malaysian crashes, for example). And yet, the government/automakers STILL want to shove automatic overrides down our collective throats!
Self-driving cars are an insane solution to a problem that can’t be solved. Humans are not perfect. And computers on a desk or in a car can never be programmed to solve unknown future situations. If they are made fail-safe to stop when an unknown condition arises, then they will go about 10 feet and stop, and the former driver turned passenger will have to override the auto-stop and drive on. Cars can never be programmed to be a set-it-and-forget-it type of device. So quit wasting my tax dollars on a new pursuit of folly. They are singing a song. Hello, Folly! But it is not entertaining at all.
What are these self-driven vehicles going to do when there is a power outage and all the traffic signals stop working? And when the lights are working, is this technology going to look both ways after the light turns green, or is it just going to take off as if no one is running a red light?
What if I want to stop and grab something to eat or use the restroom? Can it be programmed to know what I need to do?
This is just too much. Soon people will be too lazy to breathe and will want a machine to do that for them. This is making people lazier than ever.
What if you call a self-driven cab? How does it know it was you who called, and how is it going to force you to pay up before you get out of the car?
This is a stupid idea that’s going to cause more problems than it solves.
The naysayers are out in force. But all innovation has come despite those who fear it, mostly starting small and improving over time. Look at automobiles 100 years ago, for example. And look up Fulton’s Folly or Seward’s Ice Box or even the trouble Galileo and Copernicus had.