Calling for a Tesla Autopilot recall is actually calling for an industry-wide Autonomous Vehicle recall

Now I see the industry starting to call for Tesla’s Autopilot (AP) to be recalled because the NTSB (finally) confirmed Jeremy Banner had his Tesla in AP when he was killed. Killed in a scenario that appears to be just like the one that killed Joshua Brown almost 3 years ago. Be careful what you wish for, though. What makes you think Tesla is the only AV maker who deserves, or will soon deserve, to have their systems recalled? Tesla uses the exact same approach virtually every autonomous vehicle maker uses to develop and test their systems — public shadow and safety driving. Tesla has killed more people because of how aggressive they are and because they do not use LiDAR (and their camera system clearly doesn’t make up the gap). At some point every AV maker will have to tackle highways. Yes, those with better sensor technology or sensor fusion will kill fewer people, but they will kill people nonetheless. And the body count for all of them will get far worse when they stumble and restumble on complex scenarios and accident scenarios. Thousands of them, thousands of times each.

Try thinking outside of the conventional wisdom and the echo chamber. How is it possible to use this approach and not injure and kill people? Beyond that is the fact that you can never come close to stumbling and restumbling on the equivalent miles you would need to get through (500 billion to one trillion), nor afford to spend over $300 billion to try.
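To make that scale concrete, here is a rough back-of-envelope sketch in Python. Only the 500 billion to one trillion miles and the $300 billion budget come from the text above; the fleet size and per-vehicle annual mileage are illustrative assumptions, not figures from this article.

```python
# Back-of-envelope sketch of the public shadow/safety driving problem.
# Assumed (hypothetical) inputs -- NOT from the article:
FLEET_SIZE = 1_000_000           # assumed number of test vehicles on the road
MILES_PER_VEHICLE_YEAR = 15_000  # assumed annual mileage per test vehicle

def years_to_cover(total_miles, fleet=FLEET_SIZE, per_year=MILES_PER_VEHICLE_YEAR):
    """Calendar years a fleet needs to accumulate total_miles of driving."""
    return total_miles / (fleet * per_year)

def implied_cost_per_mile(budget, total_miles):
    """Average cost per mile if a budget is spread evenly over total_miles."""
    return budget / total_miles

# Figures cited in the article:
low_miles, high_miles = 500e9, 1e12   # 500 billion to one trillion miles
budget = 300e9                        # over $300 billion

print(f"Years at 500B miles:  {years_to_cover(low_miles):,.1f}")
print(f"Years at 1T miles:    {years_to_cover(high_miles):,.1f}")
print(f"Implied $/mile at 500B miles: ${implied_cost_per_mile(budget, low_miles):.2f}")
```

Even with an (optimistically large) million-vehicle fleet, the low end of the mileage range takes decades to accumulate, which is the point: the mileage cannot realistically be driven in the real world.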

The solution is a paradigm shift. Replace 99.9% of that public shadow/safety driving with proper simulation. Use the real world to inform and validate the simulation, then apply a progressive approach to test tracks and real-world safety driving if needed. To use a test track for safety driving, you must prove simulation cannot be effective. Then, to use real-world safety driving, you must prove neither of those is effective. And if that ever happens, it must be a controlled situation, not unlike a movie set.

So . . . let that epiphany happen. Keep calling for Tesla’s AP to be recalled. That will hasten the paradigm shift we need and hopefully avoid the first death of a child or family for no reason.

(This also goes for folks saying that driver cognition needs to be verified before this is allowed. Since that includes critical/accident scenarios, for which it is not possible to provide enough time to regain proper situational awareness, you would also effectively be stopping the Autonomous Vehicle industry from developing their systems using public shadow/safety driving.)

(Note — Proper simulation means aerospace/DoD simulation technology, not gaming technology. Gaming architectures have too many fidelity and scaling issues, which will result in avoidable flawed training, false confidence, and real-world tragedies.)

Please see more information here:

NTSB confirms Tesla Autopilot used in Banner death: Neglects to mention similarity to Brown tragedy almost 3 years ago

SAE Autonomous Vehicle Engineering Magazine: End Public Shadow/Safety Driving

Using the Real World is better than Proper Simulation for Autonomous Vehicle Development — NONSENSE

How Driverless Vehicle Makers Should Prove their Technology Works

The Hype of Geofencing for Autonomous Vehicles

Systems Engineer, Engineering/Program Management -- DoD/Aerospace/IT - Autonomous Systems Air & Ground, FAA Simulation, UAM, V2X, C4ISR, Cybersecurity
