Consumer Watchdog says steering wheels must be required in robot cars

(April 28, 2016) STANFORD, Calif. — Consumer Watchdog has called on the National Highway Traffic Safety Administration to require, in the guidelines it is developing on automated vehicle technology, that self-driving robot cars have a steering wheel, brake and accelerator so a human driver can take control when safety demands it.

To dramatize the point, Consumer Watchdog's John M. Simpson gave Chris Urmson, chief technology officer of Google's self-driving car project, a steering wheel. Simpson and a Google representative spoke at a NHTSA public meeting Wednesday about automated vehicle technology. Simpson, the nonprofit, nonpartisan public interest group's Privacy Project director, also gave Google 10 questions about its self-driving project.

"Deploying a vehicle today without a steering wheel, brake, accelerator and a human driver capable of intervening when something goes wrong is not merely foolhardy. It is dangerous," said Simpson.

Google wants a self-driving robot car without a steering wheel or brake, leaving no way for a human driver to take control.

NHTSA's meeting came the day after the announcement of a new lobbying group including Google, Lyft, Uber, Ford and Volvo called the Self-Driving Coalition for Safer Streets.

"If these manufacturers genuinely cared about Safer Streets, rather than pushing self-serving laws and regulations, they would be transparent about what they're doing on our public roads," said Simpson. "When something goes wrong, the technical details should be released to the public. It's not happening."

He noted that a Google robot car crashed into a bus on Valentine's Day. Video recorded on the bus by the transit company was released to the public. Google says it has no plans to release its video or technical data.

The need to require a driver behind the wheel is obvious after a review of the results from seven companies that have been testing self-driving cars in California since September 2014, Consumer Watchdog said.

Under California's self-driving car testing requirements, these companies were required to file "disengagement reports" explaining when a test driver had to take control. The reports show that the cars are not always capable of "seeing" pedestrians and cyclists, traffic lights, low-hanging branches, or the proximity of parked cars, suggesting too great a risk of serious accidents involving pedestrians and other cars. The cars also are not capable of reacting to reckless behavior of others on the road quickly enough to avoid the consequences, the reports showed.

"Google, which logged 424,331 'self-driving' miles over the 15-month reporting period, said a human driver took over 341 times, an average of 22.7 times a month," Simpson said. "The robot car technology failed 272 times and ceded control to the human driver; the driver felt compelled to intervene and take control 69 times."

"What the disengagement reports show is that there are many everyday routine traffic situations with which the self-driving robot cars simply can't cope," said Simpson. "It's imperative that a human be behind the wheel, capable of taking control when necessary. There are simply too many routine traffic situations that self-driving robot cars aren't yet ready to manage safely without human intervention."