Tesla “Elon Musk Crash Course” documentary gets the main points across but falls significantly and avoidably short to protect the industry

Michael DeKort
May 21, 2022

I just finished watching the documentary “Crash Course” on Tesla’s “Autopilot” and “Full Self-Driving”.

I am glad the documentary was made, and it likely opened some people’s eyes. But it not only fell well short, it could easily have avoided doing so. The reporters and filmmakers did little actual research and fell prey to the same echo chamber that got us here, simply regurgitating small shifts in the new “conventional wisdom” gained from incremental epiphanies instead of digging just a bit deeper to tell the whole story, which could have been done in a couple of sentences. Having said that, the real reason is likely far more nefarious than intellectual laziness. They are splitting hairs to go after Tesla while keeping anyone from figuring out that there is an industry-wide problem here, with Tesla only being the most egregious offender.

Some specific points and observations

· The film mentioned only 3 of the 12 confirmed avoidable deaths.

· The film NEVER mentioned that the development and testing approach requires the machine and deep learning processes to experience scenarios, fail many of them, adjust the neural networks, and repeat that process many times over to learn the scenarios and objects. (I think this is an example of splitting hairs and protecting the industry.) And that every AV maker does this. Nor did they mention the term “safety driver”, explain what it is, or state that human beings are the test subjects, or guinea pigs.

· Former NTSB Chairman Robert Sumwalt stated that one of the root causes of, and resulting solutions for, Tesla crashes is that the systems allow autonomy capabilities in environments or Operational Design Domains (ODDs) they are not designed for. This is a ridiculous, paradoxical, and reckless comment. Why? Because the systems cannot operate properly in a new ODD until they experience, fail, update, and re-experience the events until they are learned. So, if the systems were kept from engaging in the (development) autonomous mode, they would never learn those ODDs well enough to drive them successfully. Meaning the Brown, Banner, and Huang death scenarios NEED to occur over and over so they stop occurring.

· There was no mention that the “driver monitoring” (DM) system now has an alarm delay of at least 20 seconds. Yes, that is way down from the minutes it allowed before. But 20 seconds is still effectively useless, especially at speed. Why is this happening? If the human “safety driver” guinea pig disengages before the crucial crash data is gathered, the system cannot learn to handle that scenario.

· Jon McNeill, the former Tesla president, who is clearly grossly negligent, misleading the audience, and trying to protect himself, Tesla, and Musk, stated, “Stationary objects are vexing. Everyone has the problem.” While most do have the problem, it is NOT vexing. The technology to handle this has existed for decades. The industry uses radars with low lateral and vertical fidelity to keep costs down. (Hopefully that is being fixed by imaging radars.)

· Zero mention of simulation as the right development and testing alternative. (Setting aside that the gaming-based simulation technology the industry uses is inadequate.)
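The experience-fail-update loop described in the bullets above can be illustrated with a toy simulation. This is a minimal sketch under illustrative assumptions (the scenario names, learning rate, and competence threshold are invented for the example, not any vendor’s actual pipeline); the point it demonstrates is that a scenario’s handling only improves after the system fails it, which on public roads means the failure happens with real people present:

```python
import random

def train_until_learned(scenarios, learn_rate=0.3, threshold=0.95, seed=0):
    """Toy model of the experience -> fail -> adjust -> repeat loop.

    Each scenario starts at low 'competence'. Competence only rises when
    the scenario is attempted AND failed, mimicking how the neural
    networks are adjusted after failures. Returns attempts per scenario.
    """
    rng = random.Random(seed)
    competence = {s: 0.1 for s in scenarios}  # untrained on everything
    attempts = {s: 0 for s in scenarios}
    while any(c < threshold for c in competence.values()):
        s = rng.choice(scenarios)
        attempts[s] += 1
        if rng.random() > competence[s]:      # the scenario was failed
            # "adjust the neural networks": improvement requires failure
            competence[s] += learn_rate * (1.0 - competence[s])
    return attempts

counts = train_until_learned(["stationary_object", "crossing_truck", "lane_split"])
```

With these illustrative numbers, every scenario must be failed at least nine times before it is “learned”, which is exactly why the same crash scenarios recur during public-road development.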

More on my POV below, including how to do this right.

SAE Autonomous Vehicle Engineering magazine editor calling me “prescient” regarding my position on Tesla and the overall driverless vehicle industry’s untenable development and testing approach — (Page 2) https://assets.techbriefs.com/EML/2021/digital_editions/ave/AVE-202109.pdf

NHTSA should impose an immediate “Autopilot” moratorium and report initial investigation findings in 30 days

· https://imispgh.medium.com/nhtsa-should-impose-an-immediate-autopilot-moratorium-and-report-initial-investigation-findings-de5b6da4d704

The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology

· https://medium.com/@imispgh/the-autonomous-vehicle-industry-can-be-saved-by-doing-the-opposite-of-what-is-being-done-now-b4e5c6ae9237

How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry

· https://imispgh.medium.com/how-the-failed-iranian-hostage-rescue-in-1980-can-save-the-autonomous-vehicle-industry-be76238dea36

By not providing any meaningful proof of being driverless, even fighting doing so through a lawsuit, Waymo, Cruise and Gatik are misleading the public, putting their lives at risk, and collapsing

· https://imispgh.medium.com/by-not-providing-any-meaningful-proof-of-being-driverless-even-fighting-doing-through-a-lawsuit-f07790d6f96a

My name is Michael DeKort — I am a Navy veteran (ASW C4ISR) and a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, as the software engineering manager for all of NORAD, as a software project manager on an Aegis Weapon System baseline, and as a C4ISR systems engineer for DoD/DHS and the US State Department (counterterrorism). I was also a Senior Advisory Technical Project Manager for FTI to the Army AI Task Force at CMU NREC (National Robotics Engineering Center).

Autonomous Industry Participation — Air and Ground

- Founder SAE On-Road Autonomous Driving Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Member UNECE WP.29 SG2 Virtual Testing

- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)

- Member SAE G-35, Modeling, Simulation, Training for Emerging AV Tech

- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation

- Member Teleoperation Consortium

- Member CIVATAglobal — Civic Air Transport Association

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee

Presented the IEEE Barus Ethics Award for Post 9/11 DoD/DHS Whistleblowing Efforts
