We now know a little bit more about the Valentine’s Day crash of a Google self-driving robot car into a transit bus, but not because of anything Google or the Department of Motor Vehicles did. That needs to change.
This week, in response to a Public Records Act request from the Associated Press, the Santa Clara Valley Transportation Authority released its video and still photos of the incident. Though Google has acknowledged that it caused the crash, it has filed only the required report with the DMV.
The report presents Google’s account of what happened, and that’s precisely the problem. Explaining what happened is too important to be left entirely up to the company that is testing the robot car and, in this case, caused the accident.
As Consumer Watchdog has said in a formal petition, the DMV needs to amend its autonomous vehicle regulations to require that every robot car crash be investigated by the police and that any video taken by the car, along with the technical data associated with the crash, be made public.
Meantime, even though it’s not yet required, Google should release its own video and the technical data associated with the crash. The Alphabet subsidiary is using our public roads as its own private laboratory, and when something goes wrong it needs to release everything. It’s the morally right thing to do.
California law requires that self-driving robot cars being tested in the state have a driver behind the steering wheel and brake pedal, capable of taking control when necessary. The DMV has just proposed regulations covering the general use of self-driving cars in the state, and the proposal continues the requirement that a driver be behind the steering wheel, able to take control. Google opposes the proposed regulations.
Google’s own test results demonstrate the need for a driver who can intervene. A required report filed with the DMV showed that the self-driving robot car technology failed 341 times during the reporting period: the self-driving technology could not cope and turned over control 272 times, and the test driver felt compelled to intervene another 69 times.
If Google and the other companies testing robot cars want us to accept their cars on the road, there must be complete transparency when something inevitably goes wrong. Increasingly, it’s clear that won’t happen voluntarily, and the DMV must require it.