The dangerous, undefined, undisclosed, self-certification and licensing of driverless vehicles

Michael DeKort
8 min read · Apr 16, 2022


Yet another cosmic timing miracle has occurred in the autonomous vehicle industry. Companies are declaring themselves "fully driverless" at SAE Level 4 (L4) one after another. It is interesting how, when one company announces a supposed breakthrough or advancement, the competition announces the same thing soon after. In this case it's Waymo, Gatik, Cruise and, most recently, Mobileye. Watch for Motional, Argo, Aurora, Zoox, etc. any minute now. (I don't mention Tesla because their human guinea pigs are not employees. So, no matter what Elon says, we know they are nowhere near driverless.)

Now, though, the reckless hype has led to the removal of the "safety driver" (a process that largely should not exist anyway; see my articles below). That leaves the overly trusting public, in and around the vehicles, in the most dangerous position of all: alone with a system they are told is SAE L4, which means the system is supposed to drive better than a human. Unlike when there is a human driver, though, there are zero regulations in place to test these systems or to establish that the machine driver is properly licensed. There are no laws mandating that the associated standards and test details be made public, nor the results of those tests, nor even a review by any state or government authority. What we have is the ultimate fox-and-henhouse debacle. The fox gets to secretly self-certify to an unknown standard and unknown tests. All it has to do is state in writing that it did something.

The CA DMV regulation requires only:

· Verifying the technology is capable of detecting and responding to roadway situations in compliance with the California Vehicle Code, and a description of how the vehicle meets the definition of an SAE Level 3, 4 or 5 autonomous technology.

· Verifying the vehicles meet federal Motor Vehicle Safety Standards or have an exemption from the National Highway Traffic Safety Administration.

· Certifying the manufacturer has conducted test and validation methods and is satisfied that the autonomous vehicles are safe for deployment on California public roads.

This is astoundingly reckless and counterproductive. First, the industry has shown it hypes continuously. Second, machine, general, and deep learning are nascent technologies that cannot infer well. That means they must hyper-memorize billions of scenarios, objects, and their variances, including many crash and edge cases. Since that requires experiencing those cases over and over to tweak and retest, where do you think that happened? These companies will cite miles driven, especially in simulation. But when it came to releasing data beyond that, Waymo sued to avoid providing anything meaningful. Not only did no one else object to that; no one offered to supply the data Waymo refuses to produce, or to go beyond the scant set the CA DMV asked for and provide something meaningful.

The only quasi-bright spot I am aware of, beyond my own calls for due diligence and openness, is Jack Weast, an Intel Fellow, the CTO of Mobileye's corporate strategy office, and its vice president for automated vehicle standards. He leads a global team working on automated vehicle safety technology and related standards. Below, he calls for a "driver's test" and for proof that it has been passed. Of course, there is no mention of what the test comprises or what proof will be provided. And since Mobileye recently announced it will be "fully driverless" soon, he has been silent on this.

What does “driving safely” mean in today’s rapidly evolving mobility landscape?

Driving safely means driving at a societally acceptable risk balance and proving you are doing so. For example, when we take a driver’s test, we are demonstrating not just that we understand the rules of the road but also that we understand how to drive in a societally acceptable manner. However, with machines, we don’t have to guess; we can be precise.

Coincidentally, I am on SAE's (Society of Automotive Engineers) On-Road Autonomous Driving (ORAD) Verification and Validation (V&V) Task Force. This is an industry group whose charter is to write a testing standard the industry and governments can use as a guide to create laws, regulations, etc. The text below is from an email I sent the leadership. I have not received a response. In it you will see my detailed description of the issues, including a parallel to the post-9/11 DHS/DoD events I blew the whistle on in 2006, as well as detailed suggestions for what needs to be done to remedy this.


I assume you have seen Waymo, Cruise, and Gatik declaring they are fully driverless in certain ODDs. Waymo and Cruise are carrying passengers with no safety driver. It also appears the CA DMV does not require that the machine have a driver's license or take any kind of driver's test. The DMV also appears to rely not only on the AV maker self-certifying, but on its self-certifying to an internal test that is not based on any government standard and does not have to be disclosed to the DMV or the public. As I am sure you know, Waymo sued and won the right to release only a very minimal set of safety data, which no one in the industry either complained about or improved on. This is a very dangerous situation. The conflict of interest is off the scale.

My Lockheed Martin/Northrop Grumman whistleblowing ordeal centered on self-certification. After 9/11 it was determined that the USCG could not staff the positions needed on its side to effect, in a timely manner, the massive upgrade its ships, aircraft, etc. required. To remedy this, it took a historic leap and allowed the contractors to act as trusted agents of the USCG. The process started out as it normally does: the USCG created mission requirements, then the contractors created the system functional specs and designed, built, and tested to tests they created. Usually, the government approves those tests and either reuses them or uses them to create its own tests so it can V&V what the contractor did. Here, however, things took a historic turn: the contractors ran those tests and self-certified them on behalf of the government. The problem was that the contractors abused that privilege. I stopped them, and they lost the lead role in what is now a $32B program. And a law was written never allowing this again in DHS or DoD. (It's all in the book "Complex Contracting".)

I raise this example because the significant safety and security issues the contractors caused could have been avoided had they not cut corners. That is exactly what is happening in the AV industry. None of these companies has anything close to a legitimate "driverless" L4 system, especially with regard to relevant crash and edge cases. The development and test approach simply can never yield it. But let's set aside the "never" part; they surely don't have it now. What they have is enough capability to play the odds, along with NDAs, hidden data, etc., to make it look like they have all of this covered, instilling false confidence in the public and governments. Companies who do the right thing the right way do not run from the light by hiding data, cherry-picking videos to release, and forcing passengers to sign NDAs. They seek the light.

This brings me to the V&V document. Where is it in the process? What is the expected release date? If it cannot be released, at least in draft for all to see, within the next 30 days, I would like to suggest a near-term remedy: SAE should release to the public a statement with a summary of the coming standard, covering the areas it suggests should be tested and the data that should be created for delivery and review by the government and the public (to at least the level this is currently done by the OEMs, aircraft makers, etc.). Trust but verify.

I would like to suggest the following, at a minimum:

· A list of learned scenarios, including their characteristics as delineated in UL 4600, and proof of V&V of the performance of AV subsystems, including the sensor/perception and planning systems, to ensure the final outcome was not in error or by chance.

· A list of learned objects and their variations. (This is to ensure the machine/general/deep learning processes all did what they should have.)

· A list of disengagements with all crash or potential-crash cases delineated. (Right now only real-world crashes are delineated; those that would or could have happened are labeled as having various system issues or errors.)

· A list of all simulation models and the performance curves used to create them.

· A list of all software or system defects.
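To make the shape of such a disclosure concrete, here is a minimal illustrative sketch of how one disengagement record, with crash and potential-crash cases explicitly delineated, might be structured. All field and type names here are hypothetical assumptions for illustration, not any actual standard's or regulator's schema:

```python
from dataclasses import dataclass, asdict
from enum import Enum

class Severity(Enum):
    """Illustrative severity tiers; 'potential crash' gets its own label
    rather than being buried under generic 'system error'."""
    NO_RISK = "no_risk"
    POTENTIAL_CRASH = "potential_crash"  # would or could have crashed without intervention
    ACTUAL_CRASH = "actual_crash"

@dataclass
class DisengagementRecord:
    """One publicly reported disengagement entry (hypothetical fields)."""
    vehicle_id: str
    odd_description: str     # operational design domain in effect at the time
    scenario_id: str         # cross-reference into the learned-scenario list
    severity: Severity       # explicitly delineated, per the suggestion above
    subsystem_at_fault: str  # e.g. perception, planning, control
    narrative: str

# Example entry as a regulator or the public might receive it
record = DisengagementRecord(
    vehicle_id="AV-042",
    odd_description="urban, daytime, clear weather",
    scenario_id="unprotected-left-turn-with-cyclist",
    severity=Severity.POTENTIAL_CRASH,
    subsystem_at_fault="planning",
    narrative="Safety driver took over to avoid a conflict with a cyclist.",
)
report = asdict(record)  # serializable form for publication
```

The point of the sketch is not the particular fields but that a machine-readable, cross-referenced format would let governments and the public verify claims rather than accept a written self-certification.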

I believe SAE has enough clout to nudge the industry, and especially the various levels of government, to do the right thing here and reverse the very dangerous course it is on. If we don't do this, the progressively growing hype cycle will force all the rest to keep up and do the same thing. That will lead to some cutting worse corners, which will lead to avoidable injuries and deaths. (Absent a major catastrophe, it will likely take many injuries and deaths for the industry to stop the cycle, because NTSB reviews take a very long time and the companies will mislead the public and continue to play the odds until we reach critical mass.)

(Note: the Deepwater debacle is detailed in the book "Complex Contracting".)

Clearly, there is no trust-but-verify here. No engineering or safety due diligence. The grossly negligent cognitive dissonance of the industry echo chamber is deafening. People who do the right thing the right way seek the light; they do not run from it.

More here:

- The driverless press refuses to ask Cruise if their "fully driverless" machine driver has a license, what test it passed, and if their system/service is legal

- By not providing any meaningful proof of being driverless, even fighting doing so through a lawsuit, Waymo, Cruise, and Gatik are misleading the public, putting their lives at risk, and collapsing

- Cognitive Dissonance and the Driverless Vehicle Industry

- The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now

- How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry

My name is Michael DeKort. I am a Navy veteran (ASW, C4ISR) and a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD and a software project manager on an Aegis Weapon System baseline, and served as a C4ISR systems engineer for DoD/DHS and the US State Department (counter-terrorism). I was also a Senior Advisory Technical Project Manager for FTI to the Army AI Task Force at CMU NREC (National Robotics Engineering Center).

Autonomous Industry Participation — Air and Ground

- Founder SAE On-Road Autonomous Driving Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Member UNECE WP.29 SG2 Virtual Testing

- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)

- Member SAE G-35, Modeling, Simulation, and Training for Emerging AV Tech

- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation

- Member Teleoperation Consortium

- Member CIVATAglobal — Civic Air Transport Association

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee

The SAE Autonomous Vehicle Engineering magazine editor called me "prescient" regarding my position on Tesla and the overall driverless vehicle industry's untenable development and testing approach (page 2)

Presented with the IEEE Barus Ethics Award for post-9/11 DoD/DHS whistleblowing efforts



Michael DeKort

Non-Tribal Truth Seeker-IEEE Barus Ethics Award/9–11 Whistleblower-Aerospace/DoD Systems Engineer/Member SAE Autonomy and eVTOL development V&V & Simulation