Waymo doesn't like California's benchmark for self-driving research

Nick Summers
Senior Editor

Waymo is the latest company to criticize "disengagements," a metric that indicates how often a human driver is forced to take over from a fully autonomous driving system. At the moment, every company with a self-driving car program in California must report its disengagements to the state's Department of Motor Vehicles (DMV). That includes Alphabet subsidiary Waymo, the General Motors-owned Cruise, Aurora and Nuro.

The metric wasn't meant to create a public-facing leaderboard. However, industry onlookers have inevitably used disengagements to compare the maturity of these companies and the sophistication of their self-driving software. Why? Because it's rare for startups to give out lots of meaningful data, especially in a way that can be directly compared to their competition. Disengagements aren't perfect, but as the age-old saying goes, something is better than nothing.

As the latest report from the DMV shows, Waymo logged 1.45 million self-driven miles in California last year, with a disengagement rate of 0.076 per 1,000 miles. Cruise, for comparison, clocked 831,040 miles with a disengagement rate of 0.082. In a Twitter thread, however, Waymo stressed that the metric "does not provide relevant insights" into its self-driving technology — known as Waymo Driver — or "distinguish its performance from others in the self-driving space."
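The rates themselves are simple to unpack: a disengagement rate is just the number of disengagement events divided by autonomous miles driven, normalized per 1,000 miles. The short sketch below illustrates the arithmetic; the event counts (roughly 110 for Waymo and 68 for Cruise) are inferred from the figures above, not taken from an official tally.

```python
def disengagement_rate(events: int, miles: float) -> float:
    """Disengagements per 1,000 autonomous miles driven."""
    return events / miles * 1000

# Figures from the DMV report cited above; event counts are back-calculated
# from the reported rates, so treat them as approximate.
waymo_rate = disengagement_rate(110, 1_450_000)
cruise_rate = disengagement_rate(68, 831_040)
print(round(waymo_rate, 3))   # → 0.076
print(round(cruise_rate, 3))  # → 0.082
```

The normalization per 1,000 miles is what lets onlookers compare companies that logged very different total mileages, which is exactly the comparison Waymo and Cruise object to.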

The reasons are plentiful: for one, Waymo's technology is built on a combination of simulated and real-world driving. Most of the latter is completed outside California, the company explained, in cities like Detroit (where it owns a factory of sorts) and Phoenix (where it currently offers a limited consumer service, Waymo One). "Most of our large-scale real-world driving, which is critical for full-system validation (including validating the realism of our simulator) comes from Phoenix," the company added in a tweet.

California, meanwhile, is predominantly used for "engineering development," Waymo explained, and not the final "production releases" that power consumer trips in Phoenix. "This is even more the case in 2019 and 2020, as we are now developing our 5th-generation Waymo Driver in Silicon Valley, San Francisco and Los Angeles," the company said on Twitter. "We don't think California disengagement data should be used to compare performance, or judge readiness or competency."


Waymo isn't the first to make these complaints. Last month, Cruise co-founder Kyle Vogt penned a Medium blog post titled "The Disengagement Myth." He explained that it's easy for companies to offer autonomous trips on a quiet route with few cyclists, pedestrians, and other hard-to-navigate hazards. If you only "tested" your technology on this route, you would finish the year with a near-perfect disengagement record. "Because after all, an autonomous vehicle is only ready for primetime if it can do dozens, hundreds, or even thousands of these kinds of trips without a human touching the wheel," Vogt wrote. "That's the ultimate sign that the technology is ready, right? Wrong."


It's in a company's best interest, he said, to test in difficult conditions. And in the most complex urban environments, there will always be moments when a human wants to take the wheel. "Said another way, even if the autonomous vehicle was 100x better at driving than a human, rides through places like downtown San Francisco will still regularly generate disengagements," he said. "As a result, disengagement-free driving is not actually a prerequisite for commercial deployment of autonomous vehicles."

Cruise encourages passengers to "use their judgement" and take the wheel, Vogt said, if a dangerous situation suddenly occurs, "even if it turns out to have been unnecessary." That's because safety is the company's top priority, and it wants human riders to feel comfortable and, when necessary, in control on the road. There will also be moments when a driver is wary of a car's planned route or decision-making, he said, or simply has a preference for how the trip should continue. Sometimes a human will also want to take over out of politeness to other road users. If another driver is late for work, for instance, their anger might increase if they have to stop and wait for a self-driving car to park itself.

Instead, he argued, the industry needs reliable data on the performance of human and autonomous vehicles in a specific environment, as well as "an objective, apples-to-apples comparison with statistically significant results." Until that happens, though, curious minds are likely to keep browsing and comparing disengagement metrics, regardless of their usefulness and whether they paint a misleading picture.

Images: Andrei Stanescu via Getty Images (Waymo); AP Photo/Paul Sancya (Cruise)