U.S. Transportation Secretary Anthony Foxx said Tuesday that he wants government regulators and the auto industry to work more closely together to test self-driving technology before people entrust their vehicle's steering and brakes to a robot.
Foxx told about 1,200 people at a self-driving convention in San Francisco that a more rigorous review of robotic controls is needed to make sure the Department of Transportation and manufacturers are "in sync" about the safety of autonomous vehicles before they hit the road.
"This could help assure consumers that the vehicles that they are getting into are stress tested," Foxx said.
He also advised automakers to assume people will be tempted to take foolhardy risks when they activate the autonomous features in a car, making it imperative to design vehicles that minimize the chances of irresponsible behavior.
"Sometimes the coolness of technology may drive people to try to push the limits beyond what the manufacturers intended," Foxx said.
His remarks came less than three weeks after government regulators revealed that Joshua Brown, a Canton, Ohio, entrepreneur, died on May 7 after his Tesla Model S crashed into a truck in Florida while using a semi-autonomous feature called "Autopilot." The driver of the truck said he heard a Harry Potter video playing in Brown's car after the crash.
Foxx didn't specifically mention Tesla or the crash, which is under investigation.
Brown, 40, was killed after neither he nor the car's Autopilot system braked for a truck making a left-hand turn near a highway, according to the automaker and federal investigators.
Consumer activists have cited the Tesla fatality as a glaring example of a technology that was allowed on the road before it was properly tested.
Tesla released Autopilot last year, prompting some people to begin posting videos showing them cruising down streets and highways without anyone in the driver's seat. Tesla says drivers should be in a position to take over from Autopilot at any time.
Consumer Watchdog, a group critical of self-driving technology, and Joan Claybrook, a former head of the National Highway Traffic Safety Administration, released a letter this week calling for more stringent regulation of self-driving technology as it's developed.
Autopilot's failure is "a poster child for why enforceable safety standards are needed, not useless voluntary guidelines," they wrote in the letter addressed to Tesla Motors CEO Elon Musk and Mark Rosekind, the current head of the National Highway Traffic Safety Administration.
Musk and his management have defended Autopilot as safe if used properly, citing Brown's crash as the first death in more than 130 million miles of driving with the feature activated. Rosekind is scheduled to give a speech Wednesday at the same self-driving conference that welcomed Foxx on Tuesday.
Even with more testing, Foxx said it's unrealistic to expect self-driving cars to eliminate all accidents. The goal, he said, is an 80 percent reduction in the current frequency of traffic accidents, which are mostly caused by human error or negligence. "Autonomous does not mean perfect," he said.
Foxx plans to propose federal government guidelines for self-driving vehicles later this summer. In the meantime, self-driving cars are still being tested in several states.
Google, for instance, currently has a fleet of 58 self-driving vehicles being tested on public streets in Silicon Valley; Kirkland, Washington; Austin, Texas; and Phoenix. The company hopes to have its self-driving technology in cars that consumers can purchase by 2020.
In remarks after his Tuesday speech, Foxx said he expects "some variation" of robotic cars to be widely available within the next five to 10 years. But he predicted it will probably take "a couple of decades, maybe more, before full integration of the system."