Sensor companies would benefit greatly from the use of Proper Simulation for most Autonomous Vehicle Development

Michael DeKort
Sep 15, 2019


Assuming the revenue model of sensor vendors, and of any other company supporting this industry, relies on mass-production sales, they should be driving the industry to switch to proper simulation to develop and test these systems. If they do not, the limbo period they are in now will be perpetual, bankrupting most of them.

Except for those involved with Tesla, which uses no LiDAR, there are only 1,400 autonomous vehicles on the roads. Since the current shadow and safety driving approach can never yield anything close to an L4 system, for time, cost and safety/liability reasons, the sales and the associated revenue and profits will never come. Not even from Tesla, which may produce more vehicles with more sensors; that is a fraction of the possible whole, and they too will stop when the first child or family is killed by the reckless and unnecessary safety driving process. The reason for all of this is below. While supporting simulation would lower near-term sales, it would bring L4 in from never to 5–10 years.

Supporting information

Why is public shadow and safety driving untenable? The process being used by most AV makers to develop these systems is untenable: it has killed six people to date for no reason and will kill thousands more when accident scenarios are learned. It is impossible to drive the one trillion miles, or spend over $300B, to stumble and re-stumble on all the scenarios necessary to complete the effort. In addition, the process harms people for no reason, in two ways. The first is through handover or fallback, a process that cannot be made safe for most complex scenarios by any monitoring and notification system, because such systems cannot provide the time needed to regain proper situational awareness and do the right thing the right way, especially in time-critical scenarios. The other dangerous area is training the systems to handle accident scenarios.
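A back-of-the-envelope sketch makes the mileage point concrete. The fleet size and the one-trillion-mile figure are from this article; the annual mileage per vehicle is my own illustrative assumption, so treat the output as an order of magnitude only.

```python
# Back-of-the-envelope sketch of the trillion-mile problem.
# The fleet size and trillion-mile target come from the article;
# the per-vehicle annual mileage is an illustrative assumption.

TARGET_MILES = 1_000_000_000_000      # one trillion miles (from the article)
FLEET_SIZE = 1_400                    # AVs on public roads today (from the article)
MILES_PER_VEHICLE_PER_YEAR = 100_000  # assumption: heavy, near-continuous use

fleet_miles_per_year = FLEET_SIZE * MILES_PER_VEHICLE_PER_YEAR
years_needed = TARGET_MILES / fleet_miles_per_year

print(f"Fleet miles per year: {fleet_miles_per_year:,}")
print(f"Years to reach one trillion miles: {years_needed:,.0f}")
# With these assumptions the current fleet would need roughly 7,000 years.
```

Even if the assumed annual mileage is off by a factor of ten, the current fleet is still centuries away from the target, which is the point.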

In order for driverless vehicles to train their machine learning systems to execute scenarios, they have to experience those scenarios over and over, thousands of times each in many cases, whether imitation learning or reinforcement learning is used. This means that in order to avoid or best navigate accident scenarios, the accident scenarios themselves have to be experienced. That means safety drivers literally have to be kamikaze drivers: they have to let the system run through the scenario threads so engineers can apply various methods to lower the error rates until the rate is so low the scenario has been "learned". This activity involves having accidents, which will cause injuries and deaths. As I said, safety drivers would need to sacrifice their lives, actually commit suicide, for this to work. That of course involves the drivers, the passengers and the public around them: men, women, children, families, the elderly, the handicapped and so on. It is absolutely insane.

Worst of all, not only does this not have to happen, it is part of a process that can never result in a legitimate autonomous vehicle or save the lives that technology would save. The industry ends up doing the exact opposite of its stated mission: the tech cannot come to exist, the lives it would save will not be saved, and lives are taken needlessly in the futile process. This is the "whole last 10%, or even 2%, is the hardest part" problem. If you cannot learn the relevant accident scenarios, this whole effort is useless. And by learn, I mean become as good as, and then better than, a human. Both levels must be learned for all relevant scenarios. This is a quality issue, not a quantity one.
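To make the scale of the repetition concrete, here is a minimal, hypothetical sketch. It assumes a simple per-exposure error-decay model; the decay rate, threshold and scenario count are my assumptions, not figures from any AV program. It is only meant to show why a single accident scenario can require thousands of experiences before its error rate is low enough to count as "learned", and why running those repetitions in simulation rather than on public roads matters.

```python
# Minimal, hypothetical sketch of per-scenario learning by repeated exposure.
# ERROR_DECAY, LEARNED_THRESHOLD and the scenario count are illustrative
# assumptions, not measured values.

ERROR_DECAY = 0.999        # assumed fractional error reduction per exposure
LEARNED_THRESHOLD = 0.001  # assumed error rate at which a scenario counts as "learned"


def exposures_to_learn(initial_error: float = 1.0) -> int:
    """Count how many times one scenario must be experienced before its
    error rate falls below the 'learned' threshold."""
    error, exposures = initial_error, 0
    while error > LEARNED_THRESHOLD:
        error *= ERROR_DECAY  # each exposure lets engineers shave the error rate a little
        exposures += 1
    return exposures


if __name__ == "__main__":
    per_scenario = exposures_to_learn()
    accident_scenarios = 10_000  # illustrative count of relevant accident scenarios
    print(f"Exposures needed per scenario: {per_scenario:,}")
    print(f"Total exposures for {accident_scenarios:,} scenarios: "
          f"{per_scenario * accident_scenarios:,}")
    # On public roads every one of those exposures to an accident scenario is a
    # real crash risk; in simulation the same repetitions carry no physical risk.
```

Under these assumptions a single scenario needs roughly 6,900 exposures and ten thousand accident scenarios need tens of millions; the exact numbers do not matter, the order of magnitude does.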

Using the Real World is better than Proper Simulation for Autonomous Vehicle Development — NONSENSE

Autonomous shuttles are hurting people needlessly — It will get much worse

The Hype of Geofencing for Autonomous Vehicles

SAE Autonomous Vehicle Engineering Magazine-End Public Shadow Driving

My name is Michael DeKort. I am a former systems engineer, engineering manager and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.

Key Industry Participation

- Lead — SAE On-Road Autonomous Driving (ORAD) Model and Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Member DIN/SAE International Alliance for Mobility Testing & Standardization (IAMTS) Sensor Simulation Specs

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee (AI&ASPC)

- Presented with the IEEE Barus Ethics Award for post-9/11 efforts

My company is Dactle

We are building an aerospace/DoD/FAA Level D, full L4/5 simulation-based testing and AI system with an end-state scenario matrix to address several of the critical issues in the AV/OEM industry mentioned in the articles linked above. This includes replacing 99.9% of public shadow and safety driving, as well as dealing with the significant real-time, model-fidelity and loading/scaling issues caused by using gaming engines and other architectures. (These are issues Unity will confirm; we are now working together, and we are also working with UAV companies.) If not remedied, these issues will lead to false confidence and to performance differences between what the plan says will happen and what actually happens. If you would like to see a demo or discuss this further, please let me know.

