Waymo is the Wolf in Sheep’s Clothing — On Lower Ground than Tesla

Several months ago I wrote an article for LinkedIn and Medium, and made quite a few posts, lauding Waymo and John Krafcik for moving away from L3 handover and public shadow driving to develop autonomous vehicles. They said they would skip L3 entirely when passengers were involved and would use simulation to perform most of the development. John Krafcik went as far as admonishing Tesla for continuing the practice, saying that the casualties it will cause will hurt the progress of the whole industry. It now appears that I, and everyone else, was duped. This unfortunate revelation was brought to my attention by a Waymo engineer on a LinkedIn thread, which led me to read Waymo's Safety Report with a fine-tooth comb. That in turn led me to see that Waymo speaks out of both sides of its mouth and apparently never intended to fully move away from handover, nor to actually walk its loud talk. Waymo is still employing the practice to develop the technology, which puts its drivers and the public outside the vehicle at risk, both during development AND while in L3.

From Waymo’s Safety Report — https://waymo.com/safety/

First, the passage clearly stating that the handover process is dangerous and should be skipped:

  • The Case for Full Autonomy: Allowing Passengers to Stay Passengers — Advanced driver-assist technologies were one of the first technologies our teams explored. In 2012 we developed and tested a Level 3 system that would drive autonomously on the freeway in a single lane but would still require a driver to take over at a moment’s notice. During our internal testing, however, we found that human drivers over-trusted the technology and were not monitoring the roadway carefully enough to be able to safely take control when needed. As driver-assist features become more advanced, drivers are often asked to transition from passenger to driver in a matter of seconds, often in challenging or complex situations with little context of the scene ahead. The more tasks the vehicle is responsible for, the more complicated and vulnerable this moment of transition becomes. Avoiding this “handoff problem” is part of the reason why Waymo is working on fully self-driving vehicles. Our technology takes care of all of the driving, allowing passengers to stay passengers.

Now, the passage describing Waymo's use of the exact same process for AV development:

  • Testing on Public Roads — Waymo has a comprehensive on-road testing program that has been improved and refined continuously over our nine-year history. It’s a critical step that allows us to validate the skills we have developed, uncover new challenging situations, and develop new capabilities. The safety of our on-road testing program begins with highly-trained drivers. Our test drivers undergo extensive classroom training, learning about the overall system and how to monitor the vehicle safely on public roads, including taking defensive driving courses. After this training, our drivers are responsible for monitoring the system and if needed, taking control of the vehicle while we test on public roads.

This makes Waymo worse than Elon Musk and Tesla. While Elon Musk and Tesla are duping their customers and the world into believing that handover can be safe for L3 and AV development, and state that public shadow driving, their “shadow mode,” is the best or only way to create these systems, they at least do not claim to be avoiding these practices or moving away from them to any significant degree. Not only are Waymo and John Krafcik nowhere near high enough ground to chastise Elon Musk or Tesla, they owe them an apology and stand much lower on the ethical, moral, and professional ladder. (Volvo is not much better. It said the same things, yet let Uber use its vehicles for AV development, even after the very tragic and very avoidable death of Elaine Herzberg.)

All the AV makers using these practices need to understand that it is a myth that public shadow driving will ever come close to actually producing an autonomous vehicle. You cannot drive the one trillion miles or spend the over $300B it would require, nor harm as many people as this process would harm in trying, especially when thousands of complex and dangerous scenarios, including actual accident scenarios, must each be run thousands of times. These deaths are NOT a case of the ends justifying the means. The answer is to leverage FAA practices and aerospace/DoD-level simulation, and to integrate that with a universally accepted L4/5 scenario matrix.

