"Self-Driving" Cars Have a Dirty Secret
Behind the Screen
You may be surprised to learn that "self-driving" cars don't always drive themselves.
Because these systems are far from infallible, companies like Amazon's robotaxi service Zoox often depend on a behind-the-scenes team of human technicians to remotely control their cars when they struggle to pilot themselves.
Today, this facet of self-driving operations is an open, if closely guarded, secret. But as The New York Times reports, for years these companies avoided mentioning that their supposedly autonomous vehicles were occasionally controlled by humans at all.
It's only been relatively recently, following several high-profile debacles in the autonomous driving industry, that leaders have begun acknowledging the role of human technicians, as Waymo did this May.
And according to the NYT, none of these companies has shared just how many of these remote technicians they employ, or how often they depend on them. In short, we don't know how deep this practice goes, and it's possible that the smoothest "self-driving" experiences out there are substantially undergirded by hidden human drivers.
Remote Intervention
General Motors' Cruise division effectively embarrassed the entire industry when one of its robotaxis injured a pedestrian last October, leading to a federal investigation into just how widespread these accidents were, and eventually to the company pulling all 400 of its vehicles off the streets.
To some degree, we can probably blame that blunder for the small degree of insight we're now afforded into the robotaxi industry's reliance on human intervention.
According to the NYT's sources, Cruise staffed about 1.5 workers per vehicle, including remote assistant techs. Zoox staffs at least one team of around three dozen people overseeing its handful of fully driverless robotaxis.
That seemingly undermines one of the economic selling points of robotaxi services compared to ride-hailing services like Uber: that they don't need humans behind the wheel.
"It may be cheaper just to pay a driver to sit in the car and drive it," Thomas W. Malone, a professor at the Massachusetts Institute of Technology Center for Collective Intelligence, told the NYT.
Automotive Turk
It would be unfair to fault robotaxi companies for having these kinds of safety fallbacks. The problem is that they're not upfront about the fact that humans are still in the loop, creating a facade of full autonomy that doesn't actually exist.
In the tech sector, we've seen this in everything from AI-powered drive-thrus, some of which turned out to depend heavily on cheap, outsourced laborers correcting orders behind the scenes, to the marketing of Tesla's Full Self-Driving system, which doesn't actually fully drive itself, despite what its name implies.
"That is just how things work in Silicon Valley," journalist Cade Metz, in his accompanying piece about the NYT report he co-authored. "By creating the illusion of complete autonomy, companies can fuel interest in their technology and raise the billions of dollars they need to build a viable robot taxi service."
More on self-driving tech: Waymo Giving 100,000 Robotaxi Rides Per Week But Not Making Any Money