I was a fan of yours a while back because you announced that handover, or SAE Level 3, was dangerous and that you would end the practice. You were right to say that. Handover cannot be made safe in critical scenarios, regardless of the monitoring and notification systems used, because those systems cannot give the driver enough time to regain the situational awareness needed to do the right thing the right way. But then, in your “Safety Report,” you stated you would continue the practice in your development, which is Public Shadow Driving. You did, however, say that you would not use it in passenger vehicles. But recently you reneged on that too.
I wrote about that here — Waymo is the Wolf in Sheep’s Clothing — On Lower Ground than Tesla
Now I see you are increasing the hype by pushing the miles-driven metric. That metric, like disengagements, means absolutely nothing without knowing the scenarios being run. The only metrics that matter are scenarios learned and scenarios that remain to be learned. (Recently Chris Urmson said as much. I suppose he is wrong? Don’t worry though, he and Aurora still use the miles metric as well. As do Ford and Volvo, who also said it was dangerous.) Where is that scenario data? I would suggest the reason you don’t put it out is that it would acknowledge your systems, as recent articles have stated, are nowhere near as far along as your hype and the industry echo chamber have advertised. “10 years of experience,” and what do you have? Problems with unprotected lefts, merging and intersections in a well-marked, well-lined, gridded city in daylight and good weather? How much of that $3B have you spent to get there?
Then I see you say, “Safety is at the core of everything we do.” That is absolute rubbish. You, and most AV makers, risk people’s lives needlessly, using a process to create your technology that will never get you remotely close to creating a true autonomous vehicle. Said differently, you are doing the exact opposite of what you say your mission or intention is. You will never save anywhere near the lives a true AV would, because you will never get close to making one, and you will take more and more lives, needlessly, in your futile efforts trying. It is an absolute myth that public shadow driving is a viable method to create AVs. That process can never come close to creating one. You cannot stumble and re-stumble on all the scenarios needed. Nor can you drive and redrive the one trillion miles needed, or spend the over $300B it would take to do so. You also cannot run thousands of accident cases thousands of times each, or cause many more casualties in this industry, especially a child or a family, before public perception tanks. In your continual outpouring of hype and misleading metrics, you are preying on people’s ignorance of this technology and their misplaced trust, giving them false confidence and setting them and their families up to be completely unnecessary victims.
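The mileage and cost figures above imply a simple back-of-envelope calculation. The sketch below, in Python, uses only the letter’s own numbers plus two hypothetical fleet assumptions (fleet size and annual per-vehicle mileage are illustrative, not sourced) to show why driving your way to validation does not scale:

```python
# Back-of-envelope: cost and time to drive the cited validation miles.
# MILES_NEEDED and the implied per-mile cost come from this letter's own
# figures (one trillion miles, ~$300B); the fleet numbers are hypothetical.

MILES_NEEDED = 1_000_000_000_000       # one trillion miles
COST_PER_MILE = 0.30                   # implied: $300B / 1T miles
FLEET_SIZE = 1_000                     # hypothetical test fleet
MILES_PER_VEHICLE_PER_YEAR = 100_000   # hypothetical, ~270 miles/day

total_cost = MILES_NEEDED * COST_PER_MILE
years = MILES_NEEDED / (FLEET_SIZE * MILES_PER_VEHICLE_PER_YEAR)

print(f"Total cost: ${total_cost / 1e9:.0f}B")      # prints "Total cost: $300B"
print(f"Fleet-years to finish: {years:,.0f}")       # prints "Fleet-years to finish: 10,000"
```

Even granting a fleet of a thousand vehicles each driving 100,000 miles a year, the arithmetic lands at ten thousand years of driving, which is the point: scenarios, not miles, are the unit that matters.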
Exactly what do you think will happen when you move from the benign, hyped scenarios you run in the public domain now to those that are complex and dangerous, especially accident scenarios? What happens when you run thousands of accident cases thousands of times each? How many deaths will that create? I assume that, like Elon Musk and Mark Rosekind of Zoox, the former head of NHTSA, you believe these are necessary evils, a means to an end, and that we should get used to them? (Odd that you called out Elon Musk in the press for being reckless a while back, given what you are doing now. Help me understand how you are on higher ground.) I would also like to know exactly why you don’t run 99% of this in simulation for development. And why you are not using aerospace, DoD and FAA levels of simulation, engineering practice and testing rigor. And why, if you are all about safety, you don’t release your scenario data for those scenarios where your vehicles are engaged in the public domain. I am hopeful that, in the interest of public safety and openness, you will respond to everything I have said. As a matter of fact, a public conversation might be helpful. Possibly on Autonocast or some other forum?
For more information on the issues I mentioned and how to resolve them, please see my article here
Impediments to Creating an Autonomous Vehicle