The driverless vehicle press is purposefully ignoring a whistleblower whose inside information shows a major AV maker is putting the public at risk
Several months ago, an engineer from one of the major autonomous vehicle makers came to me with company documentation showing that their system is extremely unsafe and puts the public at risk. The documentation clearly showed that a flawed perception and object-tracking system leads to inadequate braking, including for pedestrians, cyclists, and other vulnerable road users. Since this company develops and tests its system on public roads, this poses an extreme danger to the public. The system will either not brake in time for objects or not brake at all, leaving the “safety driver” as the backup. Given that humans cannot properly disengage in many time-critical scenarios, many of which are crash scenarios, because they cannot regain proper situational awareness in time, it is easy to see where this will wind up. And keep in mind this is only one documented example. (I would also note that the odds this issue is confined to just one major AV maker are very low.)
I took this proof to several members of the industry press. Several passed outright, saying they were too busy. Yet clearly not too busy to write stories on Tesla in the blink of an eye. One took the time to investigate the issue and the proof, then fell back on some “expert” who denied it, without telling me what that denial actually was, and said the problem would be fixed by release. The problem here is that the system is effectively released already, because it is using humans as guinea pigs as we speak.

If you look at the industry press, you will see a very common pattern. They protect their benefactors until the stories become so egregious that printing them becomes not only easy, but not doing so would cause the opposite problem. Of course, that usually requires the injury or death of someone, or a pattern of clearly egregious behavior. Tesla is an example of this. Years ago, no one would touch Tesla either. Then their clearly incompetent sensor design and negligent behavior, crashes, and deaths began to pile up. Now it is fashionable to go after Tesla and Elon Musk. Clearly, that coverage is deserved. The problem is that the rest of the industry uses the same basic reckless and untenable development and testing approach as Tesla. While they have a better sensor system, a better ODD approach, and trained guinea pigs, they too will, by design, have to injure and kill people to learn many crash scenarios. They cannot avoid those scenarios through benign ODDs, disengagements, and hiding data forever. What do you think Waymo is signaling when it sued the CA DMV to avoid providing even the inadequate safety data the DMV asked for? Or the entire industry staying silent on that? Because the press is aware of this, they tie themselves in knots, along with the industry and NHTSA, trying to split hairs and isolate the rest of the industry from their Tesla reporting efforts. That is why we see them mention only that Tesla has untrained guinea pigs and a naming problem.
The real root cause is overuse of the real world and use of the wrong simulation technology to mitigate that. In time, though, backbone, ethical fortitude, and incrementally building epiphanies will bring all of this more and more to light, and the press will begin to report on it. However, it will likely take the death of the first child or family for that to happen. And quite unfortunately, that will need to happen with a Tesla competitor so the unethical and immoral hair-splitting can stop.
(I should also note the press has refused to ask the AV makers to confirm that many crash scenarios require the machine learning to experience them over and over to learn them, which would harm or fatally injure test subjects inside and outside the vehicles. Nor have they asked why the AV makers will not supply the safety data required to actually know whether they are driverless: the scenarios learned, especially crash scenarios; disengagements with likely crashes called out, including in simulation; and proof of simulation-model fidelity, especially for sensors.)
I have chosen not to name the company in order to protect the whistleblower. The “reporters” do not know the whistleblower’s name because I told them I was withholding it, and a meeting with them, until they did their due diligence and agreed to report the story. Since that never happened, they do not know who it is. I am also not naming the reporters at this time, out of concern that they would publish a story trying to counter me and name the company. Should they take that tack anyway, I would suggest I am not someone who would take it lightly. (The issue has been filed with the DoT IG. This makes us whistleblowers who would seek whistleblower protection.) Having said this, I would be glad to admit I was wrong about the industry press if someone steps up.
More here
By not providing any meaningful proof of being driverless, even fighting doing so through a lawsuit, Waymo, Cruise and Gatik are misleading the public, putting their lives at risk, and collapsing
Driverless industry press is still misreading the root cause of the industry’s collapse
The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now
How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry
Autonomous Vehicle Industry’s Self-Inflicted and Avoidable Collapse — Ongoing Update
My name is Michael DeKort. I am a Navy veteran (ASW-C4ISR) and a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, a software project manager on an Aegis Weapon System baseline, and a C4ISR systems engineer for DoD/DHS and the US State Department (counter-terrorism). I am currently the Senior Advisory Technical Project Manager for FTI to the Army AI Task Force at CMU NREC (National Robotics Engineering Center).
Autonomous Industry Participation — Air and Ground
- Founder SAE On-Road Autonomous Driving Simulation Task Force
- Member SAE ORAD Verification and Validation Task Force
- Member UNECE WP.29 SG2 Virtual Testing
- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)
- Member SAE G-35, Modeling, Simulation, Training for Emerging AV Tech
- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation
- Member Teleoperation Consortium
- Member CIVATAglobal — Civic Air Transport Association
- Stakeholder for UL4600 — Creating AV Safety Guidelines