Autonomous Vehicles Need to Have Accidents to Develop this Technology

Michael DeKort
Aug 22, 2019

I write about this a lot. But I need people to get it. This may be the most absurd and reckless engineering approach mankind has ever used. And the worst part of all is that it is unnecessary. This is the biggest dirty secret in the industry.

You cannot divorce the unintended negative consequences from the intended ones when humans are in and around the vehicles.

In order for driverless vehicles to train their machine learning systems to execute scenarios, they have to experience those scenarios over and over and over, thousands of times each in many cases, whether imitation learning or reinforcement learning is used. This means that in order to avoid or best navigate accident scenarios, those scenarios have to be experienced. That means safety drivers literally have to be kamikaze drivers. They have to let the system run through the scenario threads so engineers can apply various methods to lower the error rates until the rate is so low the scenario has been "learned". That activity involves having accidents, which will cause injuries and deaths. As I said, safety drivers will need to sacrifice their lives, actually commit suicide, for this to work. And of course that involves not just the drivers but the passengers and the public around them: men, women, children, families, the elderly, the handicapped. It is absolutely insane.

Worst of all, not only does this not have to happen, it is part of a process that can never result in a legitimate autonomous vehicle or save the lives that technology would save. The industry ends up doing the exact opposite of its stated mission: the tech cannot exist, the lives it would save will not be saved, and lives are taken needlessly in the futile process. This is the whole "the last 10%, or even 2%, is the hardest" problem. If you cannot learn the relevant accident scenarios, the entire effort is useless. And by learn, I mean become as good as and then better than a human; both levels have to be learned for all relevant scenarios. This is a quality issue, not a quantity one.
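To make the repetition point concrete, here is a minimal sketch, a hypothetical toy and not any AV maker's actual pipeline: tabular Q-learning on a single "stopped vehicle ahead" scenario. The state buckets, actions, rewards, and episode count are all illustrative assumptions. The point is that the collision outcome has to be experienced many times before the error rate drops, which is harmless in simulation and lethal on a public road.

```python
import random

ACTIONS = ["coast", "brake"]           # toy action set
N_STATES = 10                          # distance-to-obstacle buckets (9 = far, 0 = impact)
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def run_episode(epsilon=0.1, alpha=0.1, gamma=0.95):
    """One simulated pass through the scenario; returns True if it ended in a collision."""
    state, speed = N_STATES - 1, 2     # start far away, closing 2 buckets per step
    crashed = False
    while state > 0 and speed > 0:
        # epsilon-greedy action selection over the learned Q-values
        action = random.choice(ACTIONS) if random.random() < epsilon \
            else max(ACTIONS, key=lambda a: q[(state, a)])
        if action == "brake":
            speed = max(speed - 1, 0)
        next_state = max(state - speed, 0)
        crashed = next_state == 0 and speed > 0
        reward = -100.0 if crashed else 1.0
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = next_state
    return crashed

# Thousands of repetitions of ONE scenario. Each early failure is a harmless
# simulated collision here, but a real crash under public shadow/safety driving.
crashes = sum(run_episode() for _ in range(5000))
print(f"simulated collisions while learning this one toy scenario: {crashes}")
```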

(Why is public shadow and safety driving untenable? The process being used by most AV makers to develop these systems, public shadow and safety driving, is untenable. It has killed six people to date for no reason and will kill thousands more when accident scenarios are learned. It is impossible to drive the one trillion miles, or spend over $300B, needed to stumble and restumble on all the scenarios necessary to complete the effort. In addition, the process harms people for no reason, in two ways. The first is handover or fallback, a process that cannot be made safe for most complex scenarios by any monitoring and notification system, because such systems cannot provide the time needed to regain proper situational awareness and do the right thing the right way, especially in time-critical scenarios. The other dangerous area is training the systems to handle accident scenarios, which I discussed above. The solution is to use DoD/aerospace simulation technology, informed and validated by real-world data, for most of the development and testing. Prove you are worthy of going into the real world.)
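As a rough illustration of the mileage problem, here is a back-of-envelope sketch. The fleet size and per-vehicle annual mileage are assumptions chosen for illustration, not figures from this article.

```python
# Back-of-envelope sketch of the mileage problem (fleet size and per-vehicle
# mileage below are illustrative assumptions, not figures from the article).
TARGET_MILES = 1_000_000_000_000      # the ~one trillion miles cited above
FLEET_SIZE = 1_000                    # assumed safety-driver fleet
MILES_PER_VEHICLE_PER_YEAR = 100_000  # assumed near-continuous operation

years = TARGET_MILES / (FLEET_SIZE * MILES_PER_VEHICLE_PER_YEAR)
print(f"years to accumulate the miles with this fleet: {years:,.0f}")
# ~10,000 years -- and rare scenarios must be re-driven (stumbled on again)
# after every software change, which is why the real-world-only approach fails.
```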

More on my POV, including the solution, in the articles below:

Proposal for Successfully Creating an Autonomous Ground or Air Vehicle

Using the Real World is better than Proper Simulation for Autonomous Vehicle Development — NONSENSE

Autonomous shuttles are hurting people needlessly — It will get much worse

The Hype of Geofencing for Autonomous Vehicles

SAE Autonomous Vehicle Engineering Magazine-End Public Shadow Driving

My name is Michael DeKort. I am a former systems engineer, engineering manager and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.

Key Industry Participation

- Lead — SAE On-Road Autonomous Driving Model and Simulation Task

- Member SAE ORAD Verification and Validation Task Force

- Member DIN/SAE International Alliance for Mobility Testing & Standardization (IAMTS) Sensor Simulation Specs

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee (AI&ASPC)

- Presented with the IEEE Barus Ethics Award for post-9/11 efforts

My company is Dactle

We are building an aerospace/DoD/FAA Level D, full L4/L5 simulation-based testing and AI system with an end-state scenario matrix to address several of the critical issues in the AV/OEM industry I mentioned in the articles above. This includes replacing 99.9% of public shadow and safety driving, as well as dealing with significant real-time, model-fidelity, and loading/scaling issues caused by using gaming engines and other architectures. (These are issues Unity will confirm; we are now working together. We are also working with UAV companies.) If not remedied, these issues will lead to false confidence and to performance differences between what the plan believes will happen and what actually happens. If someone would like to see a demo or discuss this further, please let me know.
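As a purely hypothetical illustration, and not Dactle's actual design, the sketch below shows one way a scenario-matrix entry could be represented: scenario parameters to sweep in simulation, paired with pass/fail criteria that must be met before any real-world exposure. The class, parameter names, and values are all invented for this example.

```python
from dataclasses import dataclass
from itertools import product

@dataclass
class ScenarioEntry:
    """One row of a hypothetical scenario matrix: parameters to sweep plus pass/fail criteria."""
    name: str
    parameters: dict     # parameter name -> list of values to vary in simulation
    pass_criteria: dict  # metric name -> required threshold before real-world exposure

    def variations(self):
        """Enumerate every combination of parameter values for this scenario."""
        keys = list(self.parameters)
        for combo in product(*(self.parameters[k] for k in keys)):
            yield dict(zip(keys, combo))

# Illustrative entry; names and values are invented for this sketch.
cut_in = ScenarioEntry(
    name="highway cut-in with hard braking",
    parameters={
        "lead_gap_m": [5, 10, 20],
        "cut_in_speed_delta_kph": [-20, -10, 0],
        "road_friction": [0.4, 0.7, 1.0],
    },
    pass_criteria={"collisions": 0, "min_time_to_collision_s": 1.0},
)

print(f"{cut_in.name}: {sum(1 for _ in cut_in.variations())} simulated variations")
```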

