New California law against Tesla using the term “Full Self Driving” will barely address the problem and covers for the industry

Michael DeKort
Sep 1, 2022

The state of California is about to pass a measure forcing Tesla to make changes regarding its use of the term “Full Self Driving”.

Reference article — CA wants to make it illegal for Tesla to label FSD as FSD

First, I am all for dealing with the name “Full Self Driving” by eliminating it until the system is proven to work. (And “Autopilot” as well.) However, doing that, and adding more education on top of what Tesla already provides, will likely save some lives but barely make a dent in the problem. Most people are well aware the system is in development and that they are testing it. They simply choose to be guinea pigs for a variety of reasons. (And in some cases use their children as test targets.)

Overall, this is largely a feel-good measure orchestrated by an industry that wants to shut down the grossly negligent Tesla in a way that insulates everyone else and avoids the real root cause. They don’t want Tesla to harm more people faster than they will and spoil their party. That party, by the way, includes several companies claiming to be driverless when they are not and refusing to provide any data proving otherwise — including Waymo, Cruise, and Argo. Tesla has actually never claimed to be driverless, while these folks do. And where are the press, industry, and government on this? Silent and complicit.

The real root cause is the needless and untenable process of using human guinea pigs to experience scenarios in order to train machine and deep learning systems. Not only does that process REQUIRE many crashes, and the associated injuries and deaths, to complete, but because current machine and deep learning have no significant ability to infer, the number of scenarios, objects, and variations the process must run over and over is so vast that no one has the time or money to finish. Meaning people will be killed needlessly in a futile effort that will never save the lives the industry convinced the public it needs to sacrifice other lives for. And the industry has yet to shift from gaming technology to aerospace/DoD simulation, which would resolve the safety issue and much of the time and cost problem. Why? Because they don’t want to admit they had it so wrong. Maslow’s Triangle issues, including future funding risks.
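To get a rough feel for why the scenario space is untenable to cover by driving alone, consider a toy count of variation axes. All of the numbers below are made-up for illustration; the point is only that, without the ability to infer across cases, the combinations multiply:

```python
# Illustrative sketch only: hypothetical variation axes and counts showing
# how scenario combinations multiply when a learning system cannot infer
# across cases and must encounter each combination separately.
import math

variation_axes = {
    "scenario_type": 500,     # cut-ins, occluded pedestrians, debris, ...
    "object_kinds": 200,      # vehicle/pedestrian/animal variants
    "weather_lighting": 20,
    "road_geometry": 50,
    "speeds_and_gaps": 30,
}

total = math.prod(variation_axes.values())
print(f"{total:,}")  # 3,000,000,000 combinations even at these toy counts
```

Even these deliberately small counts yield billions of combinations, which is why encountering them with public-road guinea pigs, rather than in simulation, cannot complete in any reasonable time or budget.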

A real PARTIAL fix, for Tesla alone, would be forcing it to use a competent sensor system that includes radar and LiDAR. The problem there is that many others use the same poor-fidelity radar technology, including for AEB, which cannot properly detect stationary or crossing objects. And many don’t use LiDAR to detect objects and create tracks the way a radar does. Given this, my guess is they stayed away from this fix so as not to expose that. (Imaging radar and that LiDAR upgrade would solve this.)
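As a loose illustration of why poor-fidelity radar misses stationary objects (this is a generic sketch, not any vendor’s actual implementation, and all thresholds are hypothetical): legacy automotive radars commonly gate out returns whose implied ground speed is near zero, to suppress roadside clutter, which also discards a stopped vehicle ahead:

```python
# Illustrative sketch only: a doppler-style clutter gate that drops
# stationary returns. EGO_SPEED_MPS and CLUTTER_GATE_MPS are hypothetical.

EGO_SPEED_MPS = 25.0      # ego vehicle speed
CLUTTER_GATE_MPS = 2.0    # returns within this ground-speed band are discarded

def radar_keeps(detection):
    """Keep a return only if its implied ground speed is non-trivial.
    Radial speed is relative to the ego vehicle (negative = closing)."""
    ground_speed = detection["radial_speed"] + EGO_SPEED_MPS
    return abs(ground_speed) > CLUTTER_GATE_MPS

# A stopped truck ahead closes at exactly ego speed, so its implied
# ground speed is ~0 and the clutter gate silently drops it.
stopped_truck = {"range_m": 80.0, "radial_speed": -25.0}
moving_car = {"range_m": 60.0, "radial_speed": -10.0}

print(radar_keeps(stopped_truck))  # False: stationary object filtered out
print(radar_keeps(moving_car))     # True: moving object kept
```

A second, independent sensor such as LiDAR, which ranges objects regardless of their speed, is one way to catch exactly the returns this kind of gate throws away.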

Another real PARTIAL fix would be forcing Tesla to shorten its Driver Monitoring alarm time. It is often 20 seconds or more, which makes it useless. (That is likely the point. After all, if the crash isn’t experienced, it has no value to Tesla and Elon.)

More on my POV here, including how to do this right.

I told you Tesla was only the most egregious — BMW driverless vehicle under test kills 1 and injures 9 in Germany


The same folks who now rail against Tesla thought they were a shining star and don’t know or care they are only the worst of the bunch


SAE Autonomous Vehicle Engineering magazine editor calling me “prescient” regarding my position on Tesla and the overall driverless vehicle industry’s untenable development and testing approach — (Page 2)

Tesla “autopilot” development effort needs to be stopped and people held accountable


NHTSA should impose an immediate “Autopilot” moratorium and report initial investigation findings in 30 days


The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology


How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry


My name is Michael DeKort — I am a Navy veteran (ASW C4ISR) and a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, a software project manager on an Aegis Weapon System baseline, and a C4ISR systems engineer for DoD/DHS and the US State Department (counterterrorism). I was also a Senior Advisory Technical Project Manager for FTI to the Army AI Task Force at CMU NREC (National Robotics Engineering Center).

Autonomous Industry Participation — Air and Ground

- Founder SAE On-Road Autonomous Driving Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Member UNECE WP.29 SG2 Virtual Testing

- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)

- Member SAE G-35, Modeling, Simulation, Training for Emerging AV Tech

- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation

- Member Teleoperation Consortium

- Member CIVATAglobal — Civic Air Transport Association

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee

Presented the IEEE Barus Ethics Award for Post 9/11 DoD/DHS Whistleblowing Efforts
