NTSB Chair falsely states “We have done all we can do” with regard to Tesla’s “Autopilot” debacle

NTSB Chair Jennifer Homendy appeared on this week’s Autonocast — https://www.autonocast.com/blog/2021/9/22/236-ntsb-chair-jennifer-homendy — where she stated “we have done all we can do” with respect to Tesla’s “Autopilot” and “Full Self-Driving” performance, the crashes they have had, and the people they have injured or killed. That is factually incorrect. The NTSB has not done anywhere near what it can and should do. While it has rightly pushed back on Tesla for its misleading capability naming, it has ignored automatic emergency braking issues and, in tandem, Tesla’s issues with stationary and crossing objects. It also believes “driver monitoring” is a panacea. Worst of all, its conclusion regarding what these systems should be capable of in the public domain WHILE they are literally learning to do so is incoherent, paradoxical, and reckless. Please see more on this in my article here.

The NTSB frets over human Guinea pigs then chastises and punts to the even more reckless NHTSA

· https://imispgh.medium.com/the-ntsb-frets-over-human-guinea-pigs-then-chastises-and-punts-to-the-even-more-reckless-nhtsa-f406046ddd3

While discussing Tesla further, there was a discussion of what data the NTSB can get and use from Tesla. The Chairwoman stated the NTSB actually got a car from Tesla. That would allow it to use the car to duplicate the plethora of crash and poor driving scenarios available on the internet or from its own investigations. Why does it need more than that to determine the system’s engineering is grossly incompetent and suggest NHTSA shut them down? Or at least impose a moratorium while they investigate further?

Moving beyond Tesla, and demonstrating more proof of the NTSB’s flawed understanding of the issues, Chairwoman Homendy lauded Uber for its performance after the Herzberg tragedy and the subsequent investigation. She stated Uber went above and beyond after that to take safety seriously. How is that possible if it relied on human Guinea pigs inside and outside the car? Tell me exactly why simulation could not have been used in EVERY Tesla failure or crash, or in the Uber crash. While I constantly hammer on gaming technology not being good enough to get near L4, it is good enough for most of these benign or non-complex scenarios, especially from a multiple-sensor interaction POV. Why isn’t the NTSB recommending NHTSA demand the AV makers prove why they need to use human Guinea pigs on public roads? Why isn’t there a progressive due diligence process that leverages simulation and tracks first? (Part of this issue is the gaming technology shortfalls. Many AV makers know they have issues, have no idea DoD/aerospace simulation technology solves them, or don’t want to admit they get it now and were wrong.)

With regard to driver monitoring, the Chairwoman listed better driver monitoring (DM) as a fix to the Tesla issue. Once again, she fails to get it. Yes, DM would reduce crashes, including those involving human Guinea pigs; especially since Tesla’s DM alarm delay is 20 seconds or more, and as such is useless, they would benefit from real DM. However, that does not solve handover issues in the time-critical scenarios that many crashes comprise. And it does not address the needless use of human Guinea pigs inside and outside of the vehicles.

As the podcast continued, Chairwoman Homendy also stated that the government should be ahead of crashes. That it should be proactive and not wait for crashes to happen to do the right thing. How is that going to happen if the NTSB has all of the issues I listed above?

After this the Chairwoman addressed self-certification and the capability determinations and advertising made by the manufacturers. She rightly stated that companies should not be able to make their own determinations, and certainly not in a bubble. There should be testable criteria that determine whether these declarations and marketing pitches are legitimate.

Finally, the panel discussed the danger of everyone acting like Tesla. Everyone relying on the real world and human Guinea pigs in the public domain MUST, by definition, harm or kill people as they learn many crash scenarios, especially how to best handle the ones they can’t avoid. Yes, most will harm or kill fewer people than Tesla for a variety of reasons, but they will harm or kill people needlessly nonetheless. And it’s avoidable.

More on my POV here:

NHTSA should impose an immediate “Autopilot” moratorium and report initial investigation findings in 30 days

· https://imispgh.medium.com/nhtsa-should-impose-an-immediate-autopilot-moratorium-and-report-initial-investigation-findings-de5b6da4d704

The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology

· https://medium.com/@imispgh/the-autonomous-vehicle-industry-can-be-saved-by-doing-the-opposite-of-what-is-being-done-now-b4e5c6ae9237

SAE Autonomous Vehicle Engineering Magazine — Simulation’s Next Generation (featuring Dactle)

· https://www.sae.org/news/2020/08/new-gen-av-simulation

How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry

· https://imispgh.medium.com/how-the-failed-iranian-hostage-rescue-in-1980-can-save-the-autonomous-vehicle-industry-be76238dea36

USDOT introduces VOICES Proof of Concept for Autonomous Vehicle Industry-A Paradigm Shift?

· https://imispgh.medium.com/usdot-introduces-voices-proof-of-concept-for-autonomous-vehicle-industry-a-paradigm-shift-87a12aa1bc3a

Tesla “autopilot” development effort needs to be stopped and people held accountable

· https://medium.com/@imispgh/tesla-autopilot-development-effort-needs-to-be-stopped-and-people-arrested-f280229d2284

Systems Engineer, Engineering/Program Management -- DoD/Aerospace/IT - Autonomous Systems Air & Ground, FAA Simulation, UAM, V2X, C4ISR, Cybersecurity