NTSB states L3 should be Conditional — Now eliminate Public Shadow Driving

Today the NTSB countered NHTSA, finding that Tesla bore considerable blame in the tragedy and that policy regarding L3 vehicles should be modified. I believe that, by extension, this includes public shadow driving for AI development, engineering and testing. It is functionally the same and far more dangerous.

Key Statements/Summaries from the NTSB and Press Reports

· “The result was a collision that should not have happened. System safeguards were lacking.”

· “Tesla allowed the driver to use the system outside of the environment for which it was designed … and the system gave far more leeway to the driver to divert his attention to something other than driving,” he said. “The result was a collision that, frankly, should have never happened.”

· “A world of perfect self-driving cars would eliminate tens of thousands of deaths and millions of injuries each year,” Sumwalt said. “It is a long route from being partially automated vehicles to self-driving cars. Until we get there, somebody still has to drive.”

· “A driver could have difficulties interpreting exactly [on] which roads it might be appropriate”

· The staff recommended that Tesla and makers of other semiautonomous vehicles use satellite data to determine the type of road their vehicles are on and that they allow Autopilot-style technology to be used only where appropriate.

(While it was also determined that Joshua Brown spent too much time not attending to the vehicle, the NTSB’s findings reflect situations where that would not be the case.)

What is clear here is that the NTSB and NHTSA are now in conflict over whether AV capabilities must match the driving environment. The remaining problem, though, is that both agencies wrongly believe people can be coerced or forced to pay attention properly in L3 or shadow-driving conditions.

This stems from NHTSA’s 2015 L2/L3 study, which determined that L3 could be made safe using a notification system. The problem is that the study deemed “total control” of the vehicle regained the moment a distracted driver touches the wheel and pedals. It neglected to assess the quality of the actions taken after that point, as well as the time needed to regain the situational awareness required to perform those actions correctly. NASA, Clemson and the University of Southampton determined that time to be 7 to 24 seconds, meaning alarm or notification systems cannot reliably and consistently enable, prod or force drivers to acquire the proper amount of situational awareness. Those researchers believe, as do Volvo, Ford, Waymo and Chris Urmson, that L3 should be skipped. Ford has stated that its professional drivers fell asleep and it could not keep that from happening. NASA can back up the study data with real-world air-tragedy data.
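To make the 7-to-24-second finding concrete, here is a back-of-the-envelope sketch of how far a vehicle travels while its driver is still regaining situational awareness. The takeover-time range comes from the studies cited above; the highway speed is purely an illustrative assumption.

```python
# Distance traveled while a distracted driver regains situational awareness.
# The 7-24 second range is from the NASA/Clemson/Southampton findings cited
# above; the 65 mph speed is an illustrative assumption, not a sourced figure.

MPH_TO_FPS = 5280 / 3600  # feet per second, per mile per hour

def blind_distance_feet(speed_mph: float, takeover_seconds: float) -> float:
    """Feet covered before the driver has full situational awareness."""
    return speed_mph * MPH_TO_FPS * takeover_seconds

for seconds in (7, 24):
    feet = blind_distance_feet(65, seconds)
    print(f"At 65 mph, {seconds}s of re-orientation covers {feet:.0f} ft")
```

At highway speed the car covers roughly an eighth to nearly half a mile before the driver is truly back in the loop, which is why a notification alone cannot guarantee a safe handover.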

Whether or not L3 notification systems work, the “appropriate” capabilities have not yet been created, so public shadow driving for AI development, engineering and testing should be restricted to extremely rare and tightly controlled conditions. The industry should use aerospace-level simulation and test tracks instead. (This also resolves the problem that roughly one trillion miles of public shadow driving would be needed, at a cost of over $300B; a situation that would make getting to L4 impossible.)
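To show why the trillion-mile figure is prohibitive, here is a rough calculation. The trillion-mile and $300B figures are from the text above; the per-mile cost is simply the rate implied by those two numbers, and the fleet size and annual mileage are hypothetical assumptions for illustration.

```python
# Back-of-the-envelope: scale of one trillion public shadow-driving miles.
# Fleet size and per-car mileage are hypothetical; the per-mile cost is
# merely what the $300B total in the text implies, not a sourced figure.

TOTAL_MILES = 1_000_000_000_000   # ~1 trillion miles, as cited above
COST_PER_MILE = 0.30              # implied by the $300B total

FLEET_SIZE = 10_000               # hypothetical test fleet
MILES_PER_CAR_PER_YEAR = 100_000  # hypothetical; very heavy utilization

total_cost = TOTAL_MILES * COST_PER_MILE
years_needed = TOTAL_MILES / (FLEET_SIZE * MILES_PER_CAR_PER_YEAR)
culled_miles = TOTAL_MILES * 0.01  # if simulation culls 99% of the miles

print(f"Total cost: ${total_cost / 1e9:.0f}B")
print(f"Fleet-years needed: {years_needed:.0f}")
print(f"Real-world miles left after a 99% cull: {culled_miles / 1e9:.0f}B")
```

Even a very large hypothetical fleet would need on the order of a thousand years of driving, which is why culling the mileage and scenarios through simulation is the only tractable path.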

If these approaches are not taken, the first child or family will perish needlessly. Worse still are the thousands of fatalities we will see if AV makers continue public shadow driving and are permitted to move beyond the hyped progress of their benign scenario driving to complex and dangerous scenarios, especially crash scenarios: scenarios they will run in thousands of variations, thousands of times each. When the public, lawmakers, the press and lawyers figure this out, they will realize they have been duped. They will conclude that those involved are incompetent and/or unethical. The result will be a brick wall going up, and far more regulation than if the industry had ended the hype and Wild West chaos and policed itself.

The only reason we have not switched to simulation is that much of the industry believes the epitome of simulation is Grand Theft Auto, or at the very least has no idea how much more advanced aerospace simulation is. If the players in the industry could abandon the false-confidence-building, overly hyped, uncoordinated, self-defeating and selfish path they are on, and come together to self-educate, self-regulate and bring simulation to where it needs to be, all of this could be readily resolved. Maybe then that first child or family need not perish, and we can have autonomous vehicles in 5 to 10 years. (I say 5–10 because anything less is hype, and because simulation allows the one trillion miles and the associated scenarios to be culled by over 99%. Notably, Waymo recently announced it is moving to simulation, after having said L3 had to be skipped. Think there is a reason for this?)

For more on this, please see these articles:

Who will get to Autonomous Level 5 First and Why (This includes use of aerospace best practices)

Letter to Congress — Handling of minimum standards for Autonomous industry (This article includes all of the AV issues, how to resolve them, and their root causes)

NASA saved SpaceX — NHTSA is doing the opposite for Tesla, the AV Industry and the Public

The Dangers of Inferior Simulation for Autonomous Vehicles

Systems Engineer, Engineering/Program Management -- DoD/Aerospace/IT - Autonomous Systems Air & Ground, FAA Simulation, UAM, V2X, C4ISR, Cybersecurity