Google is telling the California Department of Motor Vehicles that if proposed regulations are enacted that require a human driver behind the wheel, capable of taking control of a self-driving robot car, the company will take the vehicles elsewhere.
That’s despite the fact that Google’s own numbers show that the robot technology cannot reliably handle some real-road scenarios. Speaking at the DMV’s autonomous vehicle public workshop in Sacramento, Chris Urmson, director of the self-driving project, put it like this:
“We’re focused on developing a fully autonomous car … that requires no monitoring by a human. It would [perform] the whole of the driving task, and on the basis of the proposed DMV regulations we are discussing here today, it would not be available in California.”
So far, at workshops in Sacramento and Los Angeles, the DMV is continuing to put safety first and sticking to the licensed driver requirement.
Maybe someday in the distant future a vehicle like the one Urmson envisions will exist. The problem is that it doesn’t yet, and the DMV is writing regulations for now, not then. Correctly, the DMV is putting safety first as it proposes the rules for the general deployment of self-driving robot cars.
Currently, regulations are in place that cover testing the vehicles in California. They require that a driver be behind the steering wheel and pedals capable of taking control. Moreover, the testing regulations require companies to file reports detailing when humans had to take control from the autonomous technology.
Seven companies had to file these “disengagement reports” covering the period from September 2014 through November 2015. Google, with the most vehicles and miles driven using robot technology, filed the most detailed report. It shows Google’s robot cars aren’t ready for unchaperoned access to our highways without a stand-by driver.
The cars are not always capable of “seeing” pedestrians and cyclists, traffic lights, low-hanging branches, or the proximity of parked cars, suggesting too great a risk of serious accidents involving pedestrians and other cars. The cars also are not always capable of reacting to reckless behavior of other drivers and cyclists quickly enough to avoid the consequences.
Google, which logged 424,331 “self-driving” miles over the 15-month reporting period, said a human driver had to take over 341 times, an average of 22.7 times a month. The robot car technology failed 272 times and ceded control to the human driver; the driver felt compelled to intervene and take control 69 times.
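The reported figures are internally consistent; a quick sketch of the arithmetic, using only the numbers as stated in the report (the variable names here are illustrative, not Google’s):

```python
# Consistency check of the disengagement figures as reported:
# 272 technology failures plus 69 driver interventions over the
# 15-month period (September 2014 through November 2015).

tech_failures = 272      # robot technology ceded control
driver_takeovers = 69    # driver chose to intervene
months = 15

total = tech_failures + driver_takeovers
print(total)                     # 341 disengagements in all
print(round(total / months, 1))  # 22.7 per month, on average
```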
Google’s robot technology quit 13 times because it couldn’t handle the weather conditions. Twenty-three times the driver took control because of reckless behavior by another driver, cyclist or pedestrian. The report said the robot car technology disengaged for a “perception discrepancy” 119 times.
Google defines such a discrepancy as occurring when the car’s sensors don’t correctly perceive an object, for instance overhanging branches. The robot technology was disengaged 55 times for “an unwanted maneuver of the vehicle,” such as coming too close to a parked car. The human took over from Google’s robot car three times because of road construction. Software glitches forced 80 disengagements, and hardware failures another 39.
A number of disabled people spoke at both the Sacramento and Los Angeles DMV workshops expressing the hope that they will be able to operate self-driving cars. Someday that might be possible, but we’re not there yet.
Indeed, do we ever want to cede complete control over life and death decisions to an opaque algorithm without the possibility of human intervention?
That question aside, for now there are too many everyday, routine traffic situations with which the self-driving robot cars simply can’t cope. Just as the draft regulations require, it’s imperative that a human be behind the wheel, capable of taking control when necessary.
It’s essential that the DMV ignore Google’s threats, emphasize safety and continue to insist that self-driving robot cars have a licensed driver behind the wheel capable of taking control when necessary.