Safety Driving is not Legal as there is no Qualified Driver in Control

I would like to make the case that when the safety driver is allowing the autonomous vehicle to drive itself, usually when testing, there is no legal or qualified driver in control of the vehicle.

It is assumed that because the safety driver can take over in all scenarios, the vehicle is actually under the control of, and being driven by, the human safety driver. This is a fatally flawed assumption. In complex and time-critical scenarios, many of which are accident scenarios, the safety driver is not given enough time to regain the situational awareness needed to do the right thing the right way. Training, monitoring, and alarm systems help, but they do not resolve the issue. This means that in exactly those scenarios where a human driver could cope, there is no qualified, and therefore no legal, operator of the vehicle. Said another way: since the safety driver cannot take over in many time-critical or accident scenarios as well as an ordinary human driver could, the vehicle is being driven illegally, because neither the safety driver nor the autonomous system under test is competent.

To make matters worse, most monitoring and alarm systems allow four or more seconds before notifying a distracted driver; Tesla waits twenty-four seconds. That is an eternity. Do the math on how far a vehicle travels, including laterally, in that time at various speeds. I believe these times are chosen so the safety driver loses situational awareness. That allows the AV to experience longer or complete scenario threads for testing, especially accident scenarios. Said differently: these companies do not want the human to avoid accidents, because they want to learn from them. This makes the safety drivers suicide or kamikaze drivers, and of course it also puts at risk the lives of others inside and outside the vehicle.
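To make the math concrete, here is a minimal sketch that computes the distance a vehicle covers during a notification delay. The speeds chosen are illustrative; the 4-second and 24-second figures are the notification windows discussed above.

```python
# Distance traveled during a driver-notification delay.
# The delay values (4 s, 24 s) are the notification windows discussed above;
# the speeds are illustrative examples, not figures from any study.

MPH_TO_MPS = 0.44704  # meters per second per mile per hour


def distance_m(speed_mph: float, delay_s: float) -> float:
    """Distance in meters covered at speed_mph over delay_s seconds."""
    return speed_mph * MPH_TO_MPS * delay_s


if __name__ == "__main__":
    for speed in (25, 45, 65):
        for delay in (4, 24):
            print(f"{speed} mph, {delay} s delay: "
                  f"{distance_m(speed, delay):.0f} m")
```

At highway speed, a 24-second delay corresponds to several hundred meters of travel with no situationally aware human in control, which is the core of the argument above.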

More on my POV here:

Autonomous Vehicles need to Have Accidents to Develop this Technology

Will Kamikaze drivers do their part to create Driverless Vehicles?

Using the Real World is better than Proper Simulation for Autonomous Vehicle Development — NONSENSE

SAE Autonomous Vehicle Engineering Magazine: End Public Shadow Driving

My name is Michael DeKort. I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.

Key Industry Participation

- Lead — SAE On-Road Autonomous Driving (ORAD) Model and Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Member DIN/SAE International Alliance for Mobility Testing & Standardization (IAMTS) Sensor Simulation Specs

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee (AI&ASPC)

- Presented with the IEEE Barus Ethics Award for post-9/11 efforts

My company is Dactle

We are building an aerospace/DoD/FAA Level D, full L4/L5 simulation-based testing and AI system with an end-state scenario matrix to address several of the critical issues in the AV/OEM industry I mention in the articles above, including replacing 99.9% of public shadow and safety driving. The system also deals with significant real-time, model-fidelity, and loading/scaling issues caused by using gaming engines and other architectures (issues Unity will confirm; we are now working together, and we are also working with UAV companies). If not remedied, these issues will lead to false confidence and to performance differences between what the plan predicts and what actually happens. If you would like to see a demo or discuss this further, please let me know.

Systems Engineer, Engineering/Program Management -- DoD/Aerospace/IT - Autonomous Systems Air & Ground, FAA Simulation, UAM, V2X, C4ISR, Cybersecurity
