CEO of Green Hills Software Dan O’Dowd demonstrates he doesn’t understand the technology as he singles out Tesla for lapses he himself has, and incorrectly thinks Waymo is L4

Michael DeKort
5 min read · Apr 3, 2022


Dan O’Dowd, the founder of Green Hills Software, has been going after Tesla, saying its “Autopilot” and “Full Self-Driving” are unsafe and do not come anywhere near the capabilities those names imply. He has taken out ads in newspapers and made heavy use of Twitter, often by referring to YouTube videos he posts, to make this point. He does all of this through an organization he created called the “Dawn Project,” through which, along with his software, he says he is “making computers safe for humanity.”

Examples

https://twitter.com/RealDanODowd/status/1482624891308605440?s=20&t=uwD1iQaiT0jrmTu3q5JxCA

https://twitter.com/RealDanODowd/status/1510641499544993798?s=20&t=uwD1iQaiT0jrmTu3q5JxCA

https://twitter.com/RealDanODowd/status/1508939605025996802?s=20&t=uwD1iQaiT0jrmTu3q5JxCA

I agree with his efforts regarding Tesla.

The problem is that he believes Waymo has a “fully driverless” L4 vehicle driving paying customers around San Francisco, with no safety driver. This is where he is not only wrong but clearly shows he doesn’t understand the technology: no one is close to L4, nor will they ever be, and every one of these companies will harm people needlessly.

It is a myth that public shadow and safety driving can create a legitimate autonomous vehicle, and a myth that the lives the process takes are necessary sacrifices for the greater good. It is impossible to drive the trillion miles, or spend the $300B, required to stumble and restumble on all the scenarios necessary to complete the effort. The process also harms people for no reason. The first safety issue is handover: the time needed to regain proper situational awareness and do the right thing, especially in time-critical scenarios, cannot be provided. Another dangerous area is learning accident scenarios. AV makers will have to run thousands of accident scenarios thousands of times to accomplish this, which will cause thousands of injuries and deaths. The next issue is the use of gaming-based simulation technology, which has too many technical limitations to facilitate the creation of a legitimate real-world digital twin.
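
To make the handover and mileage points concrete, here is a rough, back-of-the-envelope sketch in Python. The speed, handover time, and fleet numbers are my own illustrative assumptions, not figures from Dan, Tesla, or any specific study; they are only meant to show the orders of magnitude involved.

```python
# Rough, illustrative arithmetic only. The speed, handover time, and fleet
# figures below are my assumptions, not measured data.

MPH_TO_MPS = 0.44704  # miles per hour to meters per second

# Handover: distance covered while a disengaged safety driver regains
# situational awareness. Assume highway speed and a several-second handover.
speed_mph = 70          # assumption
handover_seconds = 5    # assumption; published estimates vary widely
handover_distance_m = speed_mph * MPH_TO_MPS * handover_seconds
print(f"At {speed_mph} mph, a {handover_seconds} s handover covers "
      f"~{handover_distance_m:.0f} m before the human is back in the loop")

# Scale of "driving your way" to autonomy: years a large fleet would need
# to log one trillion real-world miles.
fleet_size = 1_000_000                # assumption
miles_per_vehicle_per_year = 15_000   # assumption
years_needed = 1e12 / (fleet_size * miles_per_vehicle_per_year)
print(f"A fleet of {fleet_size:,} vehicles logging "
      f"{miles_per_vehicle_per_year:,} miles each per year needs "
      f"~{years_needed:.0f} years to reach one trillion miles")
```

Even under these generous assumptions, the fleet needs decades to stumble on the necessary scenarios, and a handover of only a few seconds puts the vehicle well past most hazards before the human is effective.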

Yes, Tesla is the most egregious, reckless, and incompetent of the bunch, but the others will harm people too. As I have said before, an F-35 can fly much higher than a paper airplane, but when the moon is the intended destination, does it matter?

Dan also seems to have no problem declaring Waymo safer than a human, by virtue of the “L4” rating he bestows on it. He does so with zero proof of capabilities, and without Waymo’s driverless system being a licensed driver that has passed a “driver’s test,” particularly where relevant crash cases are involved. That makes the system illegal. This makes Dan not only incompetent and a hypocrite, but exactly the kind of person the Dawn Project should rail against.

Dan also demonstrates his lack of technical understanding when he tries to defend his Tesla position, including in a blog post where he tries to counter Warren Redlich’s YouTube video. Had Dan been more aware of how Tesla’s technology works, he would have been able to counter Warren’s ad hominem, ill-informed, and baseless attacks and points, and would have avoided having to defend himself personally so often. Examples of technical aspects of Tesla’s system Dan appears not to understand, beyond the common issues I describe above, include the problems with camera-only systems, the negligent 20-second or longer driver monitoring (DM) alarm delay, the lack of HD mapping, and the overreliance on, and severe capability gaps in, the current state of general and deep learning, namely the lack of inference and the system errors and confusion that causes, let alone its contribution to the debilitating time, cost, and safety issues in development and testing.

The blog — https://dawnproject.com/what-is-the-credibility-and-character-of-warren-redlich/
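
As one illustration of why that driver monitoring delay matters, here is a minimal sketch of the distance a vehicle covers during a 20-second alarm delay. The 20-second figure is the one cited above; the speeds are my assumptions.

```python
# Minimal sketch: distance covered during a delayed driver-monitoring alarm.
# The 20-second delay is the figure cited above; the speeds are assumptions.
MPH_TO_MPS = 0.44704  # miles per hour to meters per second
alarm_delay_s = 20

for speed_mph in (25, 45, 70):  # assumed city, arterial, and highway speeds
    distance_m = speed_mph * MPH_TO_MPS * alarm_delay_s
    print(f"{speed_mph} mph for {alarm_delay_s} s: ~{distance_m:.0f} m "
          "traveled before the driver is even alerted")
```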

Another observation here is that Dan does not seem to be much different from Elon in how he handles criticizing and being criticized. Each is thin-skinned and avoids details and direct discussions with those who disagree with him. I tried several times to chat with and assist Dan before I decided to write this article. (Dan did challenge a Tesla devotee to a debate, with another lemming, Lex Fridman, as mediator. I found that whole thing odd. Why a devotee as host? Why just another devotee to defend Tesla instead of Musk himself? In response to that tweet I suggested he contact Joe Rogan and ask to have all of them on the podcast. While Rogan is a supporter of Elon and Fridman, and has had them both on, he would likely allow Dan to make his case. I offered to help Dan with this. I received no response. And of course, I would love to be on the show and discuss the issues with all of them.)

My suggestion to Dan is that he do his homework and employ the Dawn Project as part of the whole fix, rather than just scapegoating Tesla. I provide much of the courseware for free below.

Below are a couple of articles that explain my point of view in more detail, including how to do this right.

How are Waymo and Cruise “Fully Driverless” Vehicles Legal?

· https://imispgh.medium.com/how-are-waymo-and-cruise-fully-driverless-vehicles-legal-5a25a495fdf1

Tesla “autopilot” development effort needs to be stopped and people held accountable

· https://medium.com/@imispgh/tesla-autopilot-development-effort-needs-to-be-stopped-and-people-arrested-f280229d2284

The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology

· https://medium.com/@imispgh/the-autonomous-vehicle-industry-can-be-saved-by-doing-the-opposite-of-what-is-being-done-now-b4e5c6ae9237

How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry

My name is Michael DeKort. I am a Navy veteran (ASW-C4ISR) and a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, served as the software engineering manager for all of NORAD, was a software project manager on an Aegis Weapon System baseline, and was a C4ISR systems engineer for DoD/DHS and the US State Department (counter-terrorism). I was also a Senior Advisory Technical Project Manager for FTI supporting the Army AI Task Force at CMU’s National Robotics Engineering Center (NREC).

Autonomous Industry Participation — Air and Ground

- Founder SAE On-Road Autonomous Driving Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Member UNECE WP.29 SG2 Virtual Testing

- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)

- Member SAE G-35, Modeling, Simulation, Training for Emerging AV Tech

- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation

- Member Teleoperation Consortium

- Member CIVATAglobal — Civic Air Transport Association

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee

The editor of SAE’s Autonomous Vehicle Engineering magazine called me “prescient” regarding my position on Tesla and the overall driverless vehicle industry’s untenable development and testing approach (page 2): https://assets.techbriefs.com/EML/2021/digital_editions/ave/AVE-202109.pdf

Presented with the IEEE Barus Ethics Award for post-9/11 DoD/DHS whistleblowing efforts
