NTSB confirms Tesla Autopilot was used in Banner death, neglects to mention similarity to the Brown tragedy almost 3 years ago

Michael DeKort
3 min read · May 16, 2019

https://www.theverge.com/2019/5/16/18627766/tesla-autopilot-fatal-crash-delray-florida-ntsb-model-3

The NTSB confirmed today that Autopilot (AP) was used in the Jeremy Banner tragedy on March 1, 2019. (It takes 10 weeks to confirm this?) What neither they nor most of the press, especially the pro-Tesla outlets, are mentioning is that this accident is extremely similar to the one that killed Joshua Brown almost three years ago. In both accidents the Tesla drove right under a trailer, on a clear day, with plenty of time to react.

Most importantly, the NTSB and most reporters also neglected to mention that after the Brown investigation the NTSB determined that any autonomous system should function in location-relevant scenarios, or the feature should not engage. Of course, this makes no sense, given that these AV makers use public shadow and safety driving to develop and test these systems. That means they have to stumble, and re-stumble, on scenarios exactly like this one in order to learn them and handle them in those locations. The NTSB seems to fail to recognize this. Worse yet, I KNOW they understand most of this should be done in simulation, because the NTSB head of investigations, Robert Molloy, said as much at an event I attended in DC last year. Someone should explain exactly why the real world is needed for this rather than test tracks or proper simulation. (Proper simulation means using aerospace/DoD technology, not gaming-based technology; the latter has significant real-time, model-fidelity, and loading/scaling issues.)

Finally, all of this is made far worse by the fact that NHTSA enables safety-driving/handover issues by virtue of its 2015 L3 study. In that study it determined that handover can be made safe in all scenarios with monitoring and alarm systems. The problem is that it chose to ignore situational-awareness issues. Had they done their jobs, they would know it is not possible to provide the time needed to regain proper situational awareness, and effect the right outcome, in complex scenarios; that time ranges from 3 to 45 seconds.
(That study was sanctioned by Mark Rosekind, who led NHTSA at the time. He is now the “Safety Innovation Officer” at Zoox and has stated that lives need to be lost in the process in order to save more later. That, of course, is complete nonsense.)

If the NTSB follows what it said in the Brown determination almost three years ago, that scenarios germane to a location must work or the feature should not engage, then public shadow/safety driving (L2/L3/handover) is done, especially given that this looks to be the same scenario. The reason is that once you are no longer allowed to put an AV in the public domain that cannot handle the relevant scenarios, you cannot then use those same locations to train the systems to operate there. That will force a paradigm shift to simulation. That simulation needs to be based on aerospace/DoD technology, since gaming-based systems have significant real-time, model-fidelity, and loading/scaling issues.

(If you don’t think aerospace/DoD simulation technology can replace 99.9% of public shadow/safety driving I will be glad to prove it to you with a demo.)

Please see more on this below:

Former NHTSA head and NTSB member putting people’s lives at risk to further an untenable and reckless process to develop Autonomous Vehicles

NHTSA saved children from going to school in autonomous shuttles and leaves them in danger everywhere else

SAE Autonomous Vehicle Engineering Magazine-End Public Shadow/Safety Driving

Using the Real World is better than Proper Simulation for Autonomous Vehicle Development — NONSENSE

How Driverless Vehicle Makers Should Prove their Technology Works

The Hype of Geofencing for Autonomous Vehicles


Michael DeKort

Non-Tribal Truth Seeker-IEEE Barus Ethics Award/9–11 Whistleblower-Aerospace/DoD Systems Engineer/Member SAE Autonomy and eVTOL development V&V & Simulation