Waymo says it doesn’t cut corners and that’s a big problem

Michael DeKort
Apr 26, 2021


The story is from Forbes (who, like most of the press in this industry, are groupies, not investigative reporters):

https://www.forbes.com/sites/alanohnsman/2021/04/25/amid-tesla-crash-concerns-waymos-new-co-ceos-say-no-shortcuts-to-safe-autonomy/?sh=3489d35b4bdb

So why does it appear I am suddenly reversing myself and asking an autonomous vehicle maker to cut corners? Because in this case it would save lives.

In order for Waymo or any other AV maker to get near L4 in the public domain while relying on public shadow and safety driving, supplemented by gaming-based simulation, they literally have to harm or fatally injure their needless human guinea pig "safety drivers" at some point. Many of them, especially if deep learning is used. Why? Because the machine learning has to experience and learn scenarios to know what to do when they occur. Since machine learning is extremely inefficient, it requires a significant amount of repetitive experiencing, testing and neural network adjustment to learn scenarios. This is true no matter what the outcome of a scenario is, or whether we want that outcome. A scenario is a scenario. That includes corner and edge cases as well as accident scenarios. (I would argue those scenarios matter the most, for hopefully obvious reasons.) While many accidents are avoided as a result of learning scenarios as the benign progresses to the complex, there are many that cannot be avoided. And in many cases, just being able to tell whether they can be avoided requires the kamikaze driver to hang in there and not disengage until there is no longer time to do so. This is where "cutting corners" would help. If Waymo stays with its current development method AND gets anywhere near L4, it will have to harm or fatally injure people. If it cut corners, it might save lives, if only by never getting to L4.
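To make the repetition point concrete, here is a deliberately toy sketch in plain Python. It is not Waymo's system or anything like it; the scenario names, reward values and learning parameters are all invented for illustration. It trains a trivial value learner purely from experienced outcomes: with only a handful of episodes its choices for the rare, dangerous scenarios are still wrong or unstable, and they only become reliable after those scenarios have been experienced over and over.

```python
# Toy sketch only (nothing to do with Waymo's actual stack): scenario names,
# reward values and hyperparameters are invented. The point is that a learner
# which only sees outcomes of scenarios it has experienced needs many
# repetitions of each scenario, including the dangerous ones, before its
# policy for them is reliable.
import random

SCENARIOS = ["clear_road", "cut_in", "pedestrian_darts_out"]  # hypothetical
ACTIONS = ["maintain", "brake_hard", "swerve"]

# Hidden "true" outcome of each (scenario, action) pair. The learner never
# sees this table; it only observes one noisy outcome per episode.
TRUE_REWARD = {
    ("clear_road", "maintain"): 1.0,
    ("clear_road", "brake_hard"): -0.5,
    ("clear_road", "swerve"): -0.5,
    ("cut_in", "maintain"): -1.0,                 # collision
    ("cut_in", "brake_hard"): 1.0,
    ("cut_in", "swerve"): 0.2,
    ("pedestrian_darts_out", "maintain"): -10.0,  # the outcome we cannot allow on public roads
    ("pedestrian_darts_out", "brake_hard"): 1.0,
    ("pedestrian_darts_out", "swerve"): -2.0,
}


def train(episodes, lr=0.1, epsilon=0.2, noise=0.5):
    """Estimate action values purely from repeated, noisy experience."""
    q = {(s, a): 0.0 for s in SCENARIOS for a in ACTIONS}
    for _ in range(episodes):
        s = random.choice(SCENARIOS)  # a scenario has to occur to be learned
        if random.random() < epsilon:
            a = random.choice(ACTIONS)                 # explore
        else:
            a = max(ACTIONS, key=lambda x: q[(s, x)])  # exploit current estimate
        r = TRUE_REWARD[(s, a)] + random.gauss(0, noise)  # noisy observed outcome
        q[(s, a)] += lr * (r - q[(s, a)])                 # incremental value update
    return q


def policy(q):
    """Greedy action per scenario given the learned value estimates."""
    return {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in SCENARIOS}


if __name__ == "__main__":
    random.seed(0)
    for n in (10, 100, 10_000):
        print(f"{n:>6} episodes -> {policy(train(n))}")
    # With few episodes, the choices for the rare, dangerous scenarios are
    # still wrong or unstable; they only settle after many repetitions of
    # those exact scenarios.
```

Real systems are vastly more complex, but the dependency is the same: the only way this development approach "learns" a crash scenario is by running it, repeatedly.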

More in my articles below, including how to avoid all of this:

If Waymo is really L4, why are they hiding data from Chandler residents and the rest of us? Autonowashing?

· https://imispgh.medium.com/if-waymo-is-really-l4-why-are-they-hiding-data-from-chandler-residents-and-the-rest-of-us-5dce0de45e31

Be Wary of Waymo’s New Safety Record and Brad Templeton’s Declaration the System is Superhuman and should be Deployed Today

· https://imispgh.medium.com/be-wary-of-waymos-new-safety-record-and-brad-templeton-s-declaration-the-system-is-superhuman-and-ea59f5739567

The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now

· https://medium.com/@imispgh/the-autonomous-vehicle-industry-can-be-saved-by-doing-the-opposite-of-what-is-being-done-now-b4e5c6ae9237

SAE Autonomous Vehicle Engineering Magazine — Simulation’s Next Generation (featuring Dactle)

· https://www.sae.org/news/2020/08/new-gen-av-simulation

My name is Michael DeKort. I am a former systems engineer, engineering manager and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.

Key Autonomous Vehicle Industry Participation

- Founder SAE On-Road Autonomous Driving Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)

- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee

- Presented with the IEEE Barus Ethics Award for Post-9/11 DoD/DHS Whistleblowing Efforts

My company is Dactle

We are building an aerospace/DoD/FAA Level D, full L4/L5 simulation-based testing and AI system with an end-state scenario matrix to address several of the critical issues in the AV/OEM industry I mentioned in my articles above. This includes replacing 99.9% of public shadow and safety driving, as well as dealing with the significant real-time, model-fidelity and loading/scaling issues caused by using gaming engines and other architectures. (These are issues Unity will confirm; we are now working together. We are also working with UAV companies.) If not remedied, these issues will lead to false confidence and to performance differences between what the plan believes will happen and what actually happens. If someone would like to see a demo or discuss this further, please let me know.
