DoT, NHTSA and NTSB are Enabling Autonomous Vehicle Tragedies

It was announced today that NHTSA and the NTSB will investigate the Culver City, CA Tesla Model S firetruck accident. It is reported that Autopilot was engaged, and that the Automatic Emergency Braking (AEB) may not have worked properly. While it may be proven that neither of these is true, this kind of event will become not just commonplace but far, far worse as AV makers move from the hyped, benign scenarios they run now to the thousands of accident scenarios they will have to run thousands of times each to train the AI and “test” it. All of this would be avoidable if the industry and government did their due diligence regarding several critical engineering practice flaws in the industry.

Industry and Government Wide Problems

  • Most AV makers are using public shadow driving for AI development and testing. That practice will never come close to producing an autonomous vehicle, and it will cause completely avoidable tragedies as the scenarios move from the benign through the complex to the dangerous, especially the thousands of accident cases that will have to be run thousands of times each. It has also been found to be infeasible to drive and redrive the estimated one trillion miles required to stumble, and restumble, on all the scenarios needed to train the AI. (And it is impossible for almost every AV maker to spend the over $300B required to do so. A rough back-of-envelope sketch follows this list.)
  • Handover for L2/L3 — This practice cannot be made reliably safe, especially when the driver is allowed to let go of the wheel, take their eyes off the road and lose situational awareness. NASA, Waymo, Toyota, Ford, Chris Urmson and a plethora of studies have all shown that no combination of monitoring and notification systems makes this practice consistently safe. That is because you cannot provide the 5–45 seconds required to regain situational awareness and do the right thing the right way after being distracted, especially when the situation is complex and the speed is high.
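
To put rough numbers on both bullets above, here is a minimal back-of-envelope sketch in Python. The trillion-mile figure, the roughly $300B total and the 5–45 second window come from this article; the per-mile cost and the 70 mph handover speed are illustrative assumptions of mine, not figures from any study.

```python
# Back-of-envelope sketch of the two claims above.
# Assumptions (mine, not the article's): ~$0.30 per shadow-driven mile,
# and a 70 mph highway speed for the handover example.

MILES_REQUIRED = 1_000_000_000_000   # ~1 trillion miles (the article's estimate)
COST_PER_MILE_USD = 0.30             # assumed: driver, vehicle, data, overhead

total_cost = MILES_REQUIRED * COST_PER_MILE_USD
print(f"Shadow-driving cost: ${total_cost / 1e9:,.0f}B")  # ~$300B

# Distance covered while a distracted driver regains situational awareness.
SPEED_MPH = 70
FEET_PER_MILE = 5280
feet_per_second = SPEED_MPH * FEET_PER_MILE / 3600  # ~102.7 ft/s

for seconds in (5, 45):             # the article's 5-45 second range
    distance_ft = feet_per_second * seconds
    print(f"{seconds:>2} s of handover at {SPEED_MPH} mph = {distance_ft:,.0f} ft")
```

At 70 mph even the best case, 5 seconds, means over 500 feet traveled before the driver is fully back in the loop; the worst case is nearly a mile.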

Government Specific Issues

  • Note — States and Local Governments — while the new law makes the federal government the lead on this, you need to do your part. Putting your citizens at risk so you can draw the AV makers to your cities is not going to end well if you stay on the course you are on now.

NHTSA

  • They published an L2/L3 handover study in 2015 that stated this practice could be made safe using monitoring and alert systems. They even said they tested it to prove it worked well. The problem is that their test was beyond flawed. They determined that full control was regained, after the driver had been distracted, simply because the driver grabbed the wheel and looked straight ahead. They never even attempted to assess the quality of the action taken or whether proper situational awareness could be regained in time.
  • They determined that Tesla had no fault in the Joshua Brown accident, and that the term “Autopilot” was not misleading. They blamed Joshua Brown for the tragedy, determining that he was not paying proper attention and that he ignored or created a condition in which he could not hear or see the warnings the vehicle was providing. (I do agree that Mr. Brown was careless.)

NTSB

  • First, I want to say that I have great respect for what the NTSB has done regarding air and rail accidents. They risk squandering that stellar reputation in the AV world if they do not regroup.
  • The NTSB did find that NHTSA was wrong and that Tesla bore some blame in the Joshua Brown accident. It determined that the Tesla should not have allowed Autopilot to be engaged if it could not handle situations that could reasonably occur in the area where it was being used.
  • Like NHTSA, they also found that Joshua Brown contributed to the accident by letting go of the wheel and not paying attention for up to 33 minutes at a time.
  • Where the NTSB went tragically wrong was in agreeing with NHTSA that handover/L2/L3 can be made reliably safe.

DoT

  • When the GAO recently admonished the DoT for not creating tests to ensure AV systems can drive as well as, and then better than, a human, the DoT stated it could not create those tests until the technology was worked out. This is patently ridiculous. While the tech is not sorted out, especially since none of the current sensors can detect objects well in bad weather, there is virtually nothing stopping the DoT from determining what scenarios need to be run to test these systems. In the vast majority of cases the DoT can determine the “what” and the “how well” of the testing without knowing how it is done. (A minimal sketch of such a scenario matrix follows.)
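
As an illustration only, here is a minimal sketch of how a scenario test matrix could be expressed without knowing anything about how a given AV implements its driving task. Every field name, scenario and threshold below is a hypothetical example of mine, not a proposed standard: the regulator specifies the situation and the pass criteria, and the AV maker supplies the “how”.

```python
from dataclasses import dataclass, field

@dataclass
class Scenario:
    """One row of a hypothetical scenario test matrix.

    The regulator defines the 'what' (the situation) and the
    'how well' (pass criteria); the AV maker supplies the 'how'.
    """
    name: str
    description: str
    weather: str                 # e.g. "clear", "heavy rain", "fog"
    speed_limit_mph: int
    pass_criteria: dict = field(default_factory=dict)

# A few illustrative rows; a real matrix would hold thousands of these,
# including the accident scenarios discussed above.
MATRIX = [
    Scenario(
        name="stopped-firetruck-in-lane",
        description="Stationary emergency vehicle partially blocking the lane",
        weather="clear",
        speed_limit_mph=65,
        pass_criteria={"collision": False, "min_stop_margin_ft": 10},
    ),
    Scenario(
        name="pedestrian-crossing-fog",
        description="Pedestrian enters a crosswalk in low visibility",
        weather="fog",
        speed_limit_mph=35,
        pass_criteria={"collision": False, "max_decel_g": 0.8},
    ),
]

for s in MATRIX:
    print(f"{s.name}: pass if {s.pass_criteria}")
```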

Solutions

Public Shadow Driving for AI and Testing — Switch to aerospace-level simulation. Those folks have had the technology to do what is needed for 20 years.

YES I KNOW WHAT YOU ARE THINKING

I can hear you now: the scenarios in commercial air travel are much less complex. Yes, but in the DoD they hold very complex simulated war games in urban areas where hundreds of entities interact, all federated from many simulation systems, simulators and even real, live vehicles and personnel.

Regarding the industry’s current simulation capabilities and products: I have yet to see or even hear about anyone doing what is needed to get anywhere near L4 properly. You have to integrate the AV sensor simulation with a full-motion Driver-in-the-Loop (DIL) simulator, and that has to be done in proper real time. If this is not done right, the latency and the lack of motion cues will lead to flawed AI in many cases. In some cases those flaws will not be found in testing because the testing is based on the same flawed design; you will not figure out you were wrong until real-world tragedies occur. (Be very wary of cloud-based simulation. I do not believe it can work with a full-motion DIL in proper real time.) A sketch of the kind of real-time budget involved follows.
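
To illustrate why “proper real time” matters, here is a hedged sketch of a fixed-timestep DIL loop that flags frames whose sensor-simulation work overruns the motion-cueing deadline. The 60 Hz rate and the placeholder workload are illustrative assumptions of mine, not a statement of what any product does.

```python
import time

FRAME_HZ = 60                    # assumed motion-cueing rate; real rigs vary
FRAME_BUDGET_S = 1.0 / FRAME_HZ  # ~16.7 ms per frame

def simulate_sensors_and_vehicle():
    """Placeholder for sensor simulation plus vehicle dynamics work."""
    time.sleep(0.012)  # pretend this frame's work took 12 ms

overruns = 0
for frame in range(300):  # ~5 seconds of simulated session
    start = time.perf_counter()
    simulate_sensors_and_vehicle()
    elapsed = time.perf_counter() - start
    if elapsed > FRAME_BUDGET_S:
        # A missed deadline means stale motion cues reach the driver;
        # enough of these and the human inputs (and any AI trained on
        # them) no longer reflect real driving.
        overruns += 1
    else:
        time.sleep(FRAME_BUDGET_S - elapsed)  # hold the fixed timestep

print(f"{overruns} of 300 frames missed the {FRAME_BUDGET_S*1000:.1f} ms budget")
```

The point of the sketch is the budget check: cloud round trips or heavy sensor models that blow the per-frame deadline corrupt the motion cues, which is exactly the failure mode described above.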

Handover/L2/L3 — This practice and these levels need to be skipped. As Waymo, Ford and Chris Urmson have discovered, you need to go from L1 straight to L4. (Yes, there are situations where handover may be the best option, but that is not using it as a normal function. And for those of you who think you can remote-control vehicles in trouble: you had better solve the latency issues and use a full-motion simulator. If you do not, you will get into tragic delay loops and drive improperly in many situations because you lack motion cues.) A small latency calculation follows.
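
As a hedged illustration of the remote-operation latency problem, the sketch below computes how far a vehicle travels while one round trip of video uplink, operator reaction and command downlink plays out. Every latency figure is an assumption of mine for illustration, not a measurement of any system.

```python
# How far does a remotely operated vehicle travel "blind" during one
# control round trip? All latency values below are illustrative assumptions.

SPEED_MPH = 65
METERS_PER_SECOND = SPEED_MPH * 1609.34 / 3600  # ~29.1 m/s

latencies_s = {
    "video encode + uplink": 0.100,
    "operator reaction": 0.750,
    "command downlink + actuation": 0.100,
}

round_trip = sum(latencies_s.values())
distance_m = METERS_PER_SECOND * round_trip
print(f"Round trip: {round_trip*1000:.0f} ms -> {distance_m:.1f} m traveled "
      f"before the vehicle responds at {SPEED_MPH} mph")
```

Under these assumptions the vehicle covers roughly 28 meters before any corrective command takes effect, and the operator made that correction without the motion cues a driver in the seat would have.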

DoT/NHTSA and the NTSB — You folks need to do your homework, do your due diligence and lead, just as the FAA figured out how to do after going through many of the same growing pains. HINT: you don’t have to repeat their mistakes. Create that scenario test matrix, follow the FAA’s example in qualifying and using simulation and simulators, and lead the effort to stop public shadow driving and handover/L2/L3. (With regard to creating those tests, you have to include a top-down effort.)

For more on this, please see these articles:

Autonomous Levels 4 and 5 will never be reached without Simulation vs Public Shadow Driving for AI

Corner or Edge Cases are not Most Complex or Accident Scenarios

The Dangers of Inferior Simulation for Autonomous Vehicles
