The downtown strip in Mountain View is a delight. There are bookstores, gelato spots, taquerias, and tea shops. Kids ride their bikes, birds chirp, and the mountains really do command your eye. It’s all quite pleasant.
But I’m not here to taste the good life. I’m here to glimpse the future. Because of its abundant sunshine—more than 250 days each year—and ties to Silicon Valley, this Northern California town has become an unofficial test track for Google’s self-driving cars. They roam the streets day and night, collecting data to make them smarter at navigating intersections, pedestrians, construction zones, and more.
As much of the rest of the nation ponders what a driverless world might look like, the people of Mountain View—roughly 80,000 strong—are watching it unfold firsthand, discovering what it means to share the streets with those computers-on-wheels.
To date, Mountain View hasn’t adopted any special rules for the cars, explains Michael Kasperzak, chair of Mountain View’s transportation council—at least not beyond those drafted by the state of California, which requires that all autonomous vehicles have a driver who can take the controls at any time. Often, there is even a second passenger, also an engineer, for research purposes.
On the corner of Castro and Villa, the heart of downtown, a young boy points out how easy it is to spot the cars, thanks to the “black thing on top.” That would be the laser range-finding system (also known as LIDAR) used to map the immediate surroundings.
“It’s just like a robot that is a car,” adds a young girl he is with. In other words, no big deal.
The hometown tech giant started building out its fleet of test cars in 2009. It now has 24 sensor-laden Lexus SUVs and 34 two-door, bubble-shaped prototypes that roam Mountain View, as well as Phoenix; Kirkland, Washington; and Austin, Texas. Companies such as BMW have taken to testing their driverless cars in California, too.
Some citizens of Mountain View say the vehicles tend to operate more like hyper-vigilant senior citizens than reckless teens. “They seem to go a little slower than the flow,” says Eric Latter, a Santa Clara Valley Transportation Authority (VTA) bus driver.
In some instances, Google is working to make its cars drive more like people, flaws and all. For example, the cars have been programmed to turn corners slightly wider than is technically ideal as a way of mimicking human behavior and signaling their intentions to other drivers, explains Nathaniel Fairfield, lead engineer for the self-driving cars’ planning and control team. This is something a human does out of instinct. “There’s a very good reason to follow certain paths,” Fairfield says.
Though drivers in Mountain View are used to sharing the roads with robots that are cars, there have been a few curious incidents—and yes, accidents, too—that remind everyone that autonomous vehicles may not be as evolved or nuanced as human drivers.
Beware of the Bumps
Last February, one of Google’s self-driving Lexus SUVs tried to merge with traffic and collided with a VTA bus. That was no high-speed crash. The bus was moving at 15 mph, and the robot car was creeping along at 2. No injuries were reported. The Lexus sustained damage to its left front fender and wheel, and one of its driver’s-side sensors.
Google is not the only company dealing with car accidents that involve autonomous technology. A Tesla driver was killed in May traveling along a Florida highway while using the Autopilot mode on his Tesla Model S. In that instance, a tractor trailer turned left in front of the Tesla and neither driver nor Autopilot activated the brake. The Tesla slid beneath the truck, shearing off the roof, according to the accident report.
While Google is aiming for and testing toward complete autonomy in its driverless prototype fleet here and elsewhere, Tesla’s Autopilot is being developed by remotely recording the experiences of real people as they use the technology; it’s a suite of safety features that includes autosteer, lane-departure warning, and emergency braking.
Tesla says that its vehicles have recorded more than 130 million miles driving on Autopilot with real people in the driver’s seat, and there has been one confirmed fatality. For Google, the test log is a far more modest 1.8 million miles, with millions more miles of simulated driving in which the cars’ computer systems respond to data produced by a virtual trip without ever leaving the lab. In addition to the run-in with the VTA bus, Google reports that its cars have been involved in 24 other accidents, mostly rear-endings determined to be the fault of other drivers—basically human error. Many transportation experts believe that autonomous vehicles will ultimately be safer on the road than those piloted by people. In a 2015 report, the consulting firm McKinsey & Company estimated that autonomous vehicles could one day reduce auto accidents in the U.S. by 90 percent.
Though Consumer Reports supports technology that advances the consumer interest, the independent, nonprofit organization believes in a careful balance of innovation and safety.
A world with driverless vehicles raises new questions for law enforcement and insurance companies. How exactly does one determine liability in an accident involving two vehicles operated by consumer software? And if the projections are correct and accidents become far less frequent, will that mean lower premiums?
Thorny questions like those haven’t dampened the enthusiasm for autonomous cars in Mountain View. “I see them every day,” says UPS driver Eric Bates. “I love them.” He’d like one of his own, he adds, so that he could sleep during his 84-mile round-trip commute to the UPS office where he picks up his truck for work. There’s another reason he wants one: Bates, who is deaf, says a self-driving car would let him hold sign language conversations more safely when riding with friends and family.
As for a robot taking over his role as a delivery driver? “I’m not worried,” he says. “Any time they invent new technology, it creates new jobs and new ideas.”
If anything, Google’s bubble cars can be almost too restrained. “They’re about as cautious as you can get,” says Lieutenant Saul Jaeger of the Mountain View Police Department. Last November, a motorcycle officer in the traffic unit pulled one over after clocking it driving 24 mph in a 35 mph zone. Cars were stacking up behind it. In the end, Jaeger says, the officer issued a warning for impeding traffic but no ticket.
“Driving too slowly?” Google posted on Google+. “Bet humans don’t get pulled over for that too often.”
The company’s prototype is currently programmed to top out at 25 mph on Mountain View roads, but it can drive faster. Google has programmed the bubble cars to drive in the right lane when possible so that faster traffic can move around them, Fairfield says. He adds that the cars are programmed, first and foremost, to be safe. That means they’re conservative, he says, and they tend to slow down when they encounter a new situation, such as a cyclist riding the wrong way in a bike lane. But once the cars have seen a wrong-way cyclist many times, they learn to maintain their speed because the cyclist is not likely to suddenly cross the street. Instead, the cars just give the bike a little extra space.
The Rules of the Road
Learning to be a nuanced driver takes time and practice, and even if one strictly adheres to the laws, there are other subtle behaviors that contribute to a sort of driving culture, which can vary by city, state, and region, even neighborhood. If drivers get those wrong, they could confuse—or annoy—other drivers.
When a fellow driver signals a lane change, people tend to speed up or slow down to make room for the merging vehicle. Google’s cars almost never respond to such courtesies, says Carrie Lampman, CEO of Bay Area Driving Academy. They maneuver in a very “by-the-book” way—one that doesn’t really jibe with the unspoken rules of the road.
By law, drivers in California must give cyclists at least 3 feet of clearance when passing. But Lampman once saw a driverless car linger in the space behind a cyclist long past the time when an ordinary human would have hit the gas pedal and zipped around it.
“They drive slowly and obey the law, and that creates frustration,” says Kasperzak, of Mountain View’s transportation council. He recalls crossing a street with no painted crosswalk on his way home one night when a driverless car approached and stopped. Kasperzak, a pedestrian, yielded as well, but he didn’t know how to wave on the vehicle. “I was trying to let it go,” he says, “but it was going to wait me out.”
Jaeger has observed pedestrians at crosswalks—familiar with the cautious nature of the cars—having a little fun with them. Robot taunting. “People pretend to step out, and the car will stop. Then they’ll step back and wait, and the car will start to go again,” he says. “I’ve seen it a bunch of times. Most of us would be honking at the guy, but the car will continue to play the game.”
A year ago, The New York Times reported that this conscientiousness kept one of Google’s driverless cars from making it through a four-way stop in 2009. The human drivers kept inching forward, and the autonomous vehicle—detecting the motion—remained still, paralyzed with prudence.
Harsh Gill, a Mountain View taxi driver, has witnessed a similar standoff between an autonomous car and three autos with far less intelligence.
Or so it seemed.
“It was the Google car’s turn to go, but it didn’t do it,” Gill explains. “You see two people sitting inside, but you don’t know who’s driving, who has the brain.”
Many challenges remain. Google doesn’t give an official date for when its cars will be consumer-ready. Fairfield notes that the company wants to continue to make sure that the cars are less accident prone than the average driver. “At this point our main focus is really on testing and improving the quality of the driving, and demonstrating to ourselves that we know the cars are ready from a safety perspective,” he says.
Still, if Google’s self-driving cars are going to be widely accepted by human drivers, they’re going to have to get a whole lot smarter. There have been some improvements over the years, such as recognizing hand signals used by cyclists and automatically pulling over to the side of the road when emergency vehicles are detected by sight or by sound (then safely merging into traffic once the vehicle has passed). But the cars will still have to recognize hazards—like, say, potholes—that don’t appear on maps. They will have to follow winding roads even when the lines are obscured by rain and snow.
Ultimately, they have to see the world a bit more like humans do.
Copyright © 2005-2016 Consumers Union of U.S., Inc. No reproduction, in whole or in part, without written permission. Consumer Reports has no relationship with any advertisers on this site.