Forcing Tesla to Change the “Autopilot” Name is a Disingenuous Band-Aid

Michael DeKort
4 min read · Aug 22, 2021


Of course, the “Autopilot” and “Full Self-Driving” names should be changed because they are factually incorrect, misleading, unethical, and immoral, and they have led and will lead to more people being harmed.

But, IT’S NOT THE REAL PROBLEM

THE INDUSTRY WANTS YOU TO THINK IT IS

THE AUTONOWASHING FOLKS ARE WOLVES IN SHEEP’S CLOTHING

THE NTSB DOESN’T GET IT EITHER

If we had changed the names “Hara-kiri” or “Kamikaze” in World War II, it may have saved a couple of lives. But would it really have mattered?

The real problem is we are

USING HUMAN GUINEA PIGS NEEDLESSLY AND WILL INJURE OR KILL MANY OF THEM

(NO MORE CAPITAL LETTERS)

The current development approach every public road AV maker uses relies on the real world and human test subjects to experience scenarios so the machine and deep learning systems can test and learn them, particularly crash scenarios and edge cases. And the process is massively inefficient. The scenarios not only have to be run hundreds or thousands of times each, they have to be rerun for very small changes in the scenarios and objects. That could lead to millions of test events spread across all the AV makers. For crash scenarios that means injuries and deaths for each one of them. Yes, with a bottom-up incremental approach there will be some relative scenario leveraging, and the simulation technology being used now will lessen those occurrences. But this still leaves a significant number of injuries and deaths to come.
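To make that scale concrete, here is a minimal sketch of how scenario variations compound. Every number in it is an illustrative assumption, not a figure from any AV program: the variation axes, step counts, and repetitions are made up purely to show the arithmetic.

```python
# Illustrative only: all dimensions and counts below are invented for this
# sketch. The point is how quickly small variations compound into test events.

speed_steps         = 14   # e.g. ego speed from 10 to 75 mph in 5 mph steps
weather_conditions  = 4    # clear, rain, fog, snow
lighting_conditions = 3    # day, dusk, night
actor_types         = 4    # pedestrian, cyclist, car, truck
actor_trajectories  = 10   # small geometric changes in the other actor's path

variants_per_family = (speed_steps * weather_conditions * lighting_conditions
                       * actor_types * actor_trajectories)

runs_per_variant = 500     # "hundreds or thousands of times each"

print(f"{variants_per_family:,} variants for a single crash-scenario family")
print(f"{variants_per_family * runs_per_variant:,} real-world runs to learn that one family")
```

Even with these modest made-up numbers, one crash-scenario family works out to millions of real-world runs before you multiply across scenario families and AV makers.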

Now comes the mantra I hear when I say all of that: why haven't there been more deaths to date, then? First of all, in the US there is roughly one crash every 165k miles and one death every 10 million miles, and that is without restricting location or weather ODDs. Most of the AV makers are beneficiaries of those odds right now. But the key issue here is that the other AV makers are disengaging away from crashes, which means those scenarios cannot be learned. At some point that must end. The Tesla debacle demonstrates this. A significant reason they have so many crashes is that their drivers are not disengaging out of these scenarios. (While the naming issue is a key contributor to that, there are quite a few others. They have 250X more cars at L2, their sensor system is fatally flawed, their Driver Monitoring alarm time is 20 seconds or more, which is useless, they have no ODD limit, and their drivers are not trained. BTW, fixing all of that would save far more lives than the name change.)
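As a rough illustration of why those odds currently flatter the AV makers, the sketch below applies the human-driving rates cited above (one crash per ~165k miles, one death per ~10 million miles) to a few fleet mileages. The mileages are hypothetical placeholders, not any company's actual numbers.

```python
# Human-driving baselines cited above; the fleet mileages are hypothetical.
MILES_PER_CRASH = 165_000
MILES_PER_DEATH = 10_000_000

def expected_events(fleet_miles: float) -> tuple[float, float]:
    """Expected crashes and deaths at human-driver rates over the given mileage."""
    return fleet_miles / MILES_PER_CRASH, fleet_miles / MILES_PER_DEATH

for fleet_miles in (1_000_000, 20_000_000, 100_000_000):  # hypothetical fleet totals
    crashes, deaths = expected_events(fleet_miles)
    print(f"{fleet_miles:>11,} miles -> ~{crashes:,.0f} crashes, ~{deaths:.1f} deaths at human rates")
```

Even 100 million fleet miles implies only about ten deaths at human-driver rates, so a small body count to date says very little about whether the development approach itself is safe.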

All the disingenuous “autonowashing” folks are doing here is diverting your attention to the big bad wolf. What happens if Tesla changes the names tomorrow? Nothing fundamentally changes. These folks KNOW (oops) the current development process relies on human sacrifice and either buy into the myth that this is for the greater good or, worse and much more likely, know it’s wrong but repeat the mantra for selfish personal and professional reasons.

(BTW, why isn’t anyone jumping on Waymo for saying they are fully autonomous in Chandler with zero proof and with “safety drivers” still being used in 90% of the vans? Or on Cruise for saying they have fully autonomous Fridays, again with zero proof beyond the data they let us see?)

The solution, which USDOT VOICES is brave enough to espouse, is to shift most of the real-world shadow and “safety driving” to proper simulation that is informed and validated by the real world. This solves the safety, time, and cost issues with the current approach. Proper simulation is not the gaming-technology-based systems being used now. While the visual engine folks do a great job, the core host architectures do not have proper real-time capabilities. Nor do folks federate models or parts of models. And the active sensor modeling approaches do not have high enough fidelity.
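As a minimal sketch of what “informed and validated by the real world” can mean in practice, the snippet below compares a simulated sensor trace against a logged real-world trace for the same scenario and checks it against a fidelity tolerance. The channel, values, and tolerance are hypothetical placeholders, not a description of how VOICES or any specific tool does it.

```python
# Minimal sketch: validate one simulated sensor channel against real-world logs.
# The data, channel, and tolerance are hypothetical placeholders.
import numpy as np

def validate_channel(real: np.ndarray, sim: np.ndarray, tolerance: float) -> bool:
    """Return True if the simulated trace tracks the logged trace within tolerance (RMSE)."""
    rmse = float(np.sqrt(np.mean((real - sim) ** 2)))
    print(f"RMSE = {rmse:.3f} m (tolerance {tolerance} m)")
    return rmse <= tolerance

# Range-to-lead-vehicle samples (meters) for the same replayed scenario
real_range = np.array([42.0, 40.8, 39.5, 38.1, 36.6])  # logged on the road
sim_range  = np.array([42.2, 40.9, 39.2, 38.4, 36.9])  # produced by the sensor model

if not validate_channel(real_range, sim_range, tolerance=0.5):
    print("Sensor model fidelity insufficient; do not rely on it for this scenario.")
```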

More details here:

Chairperson Homendy, Please Reverse Untenable Brown/Banner Investigation Finding, Support USDOT VOICES and an “Autopilot” Development Moratorium

· https://imispgh.medium.com/chairperson-homendy-please-reverse-untenable-brown-banner-investigation-finding-support-usdot-11a53e8b0da8

The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology

· https://medium.com/@imispgh/the-autonomous-vehicle-industry-can-be-saved-by-doing-the-opposite-of-what-is-being-done-now-b4e5c6ae9237

How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry

· https://imispgh.medium.com/how-the-failed-iranian-hostage-rescue-in-1980-can-save-the-autonomous-vehicle-industry-be76238dea36

We need a USDOT VOICES Vaccine against the Autonomous Vehicle Industry

· https://imispgh.medium.com/we-need-a-usdot-voices-vaccine-against-the-autonomous-vehicle-industry-6b067165bb13

Tesla “autopilot” development effort needs to be stopped and people held accountable

· https://medium.com/@imispgh/tesla-autopilot-development-effort-needs-to-be-stopped-and-people-arrested-f280229d2284

Judge rules Tesla “autopilot” false advertising lawsuit can proceed- And to consider an injunction to stop “Autopilot” and “FSD”

· https://imispgh.medium.com/judge-rules-tesla-autopilot-false-advertising-lawsuit-can-proceed-cfd51e85aa5b

Tesla “Autopilot” killed a teenager in 2019 — Do we really need it to be a young child or family?

· https://imispgh.medium.com/tesla-autopilot-killed-a-teenager-in-2019-do-we-really-need-it-to-be-a-young-child-or-family-39d684f998c9

My name is Michael DeKort. I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.

Industry Participation — Air and Ground

- Founder SAE On-Road Autonomous Driving Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Member UNECE WP.29 SG2 Virtual Testing

- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)

- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation

- Member Teleoperation Consortium

- Member CIVATAglobal — Civic Air Transport Association

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee

- Presented with the IEEE Barus Ethics Award for Post-9/11 DoD/DHS Whistleblowing Efforts
