Chairperson Homendy, Please Reverse Untenable Brown/Banner Investigation Finding, Support USDOT VOICES and an “Autopilot” Development Moratorium

Michael DeKort
4 min read · Aug 17, 2021


Chairperson Homendy,

I respectfully request that you review the Joshua Brown and Jeremy Banner investigation report and reverse the finding regarding operation in relevant Operational Design Domains (ODDs). The finding suggests that Tesla’s “Autopilot” should be able to safely handle the scenarios relevant to the area and environment in which the system is traveling. While that makes perfect sense for a production system, these are not production systems; they are in development. As such, your suggestion sets up an unsolvable paradox. For the systems to reach the point you rightly suggest, they must experience many relevant edge and crash cases, over and over, so the machine learning and deep learning systems can learn them. Because the location, environment, and object characteristics are extremely specific to the machine learning system, the systems cannot yet infer well; and because machine learning is extremely inefficient, hundreds if not thousands of these events must take place, with many not covered by inference from similar scenarios or events. This includes “safety drivers” not disengaging and allowing the crashes to occur. If you do not allow the repeated experimentation and crashes, the systems cannot learn to avoid them or to handle them properly.

Beyond that is the ultimately untenable and reckless nature of the process I just described. For it to succeed, the process requires that thousands of people, in and outside the cars, be injured or killed. And it requires that so many scenarios be stumbled upon, so many times, that it is impossible for any autonomous vehicle maker to spend the money or time to finish. This is the fallacy of driver monitoring systems for L2. If the human test subjects disengage to save themselves and others, the scenarios cannot be learned. The loop can never end. (It should also be noted that no driver monitoring system can provide enough time to regain proper situational awareness in time-critical scenarios.) This means SAE L4 will never be reached, the relevant lives will never be saved, and those sacrificed will have been sacrificed for naught. The only way to remedy all of this is to invert the current development paradigm and use proper aerospace/DoD-level simulation, informed and validated by the real world, to replace most real-world, shadow, and “safety” driving. USDOT’s own VOICES group fully supports this approach and could use your support. (They currently utilize DoD assistance.)

USDOT introduces VOICES Proof of Concept for Autonomous Vehicle Industry-A Paradigm Shift?

https://imispgh.medium.com/usdot-introduces-voices-proof-of-concept-for-autonomous-vehicle-industry-a-paradigm-shift-87a12aa1bc3a

(A key factor in Tesla’s crashes and the resulting deaths is its fatally flawed sensor system, now made far worse by the removal of radar. I did not mention this above because remedying it would not change the core untenable and reckless nature of the development and testing process.)

Finally, I request that you support a moratorium on all “safety driving” for development and testing, to protect the public, while the NHTSA investigations proceed and the VOICES process is vetted as a development and testing substitute. Given the avoidable deaths that have occurred so far, the serious nature of the recent NHTSA announcement that it will investigate Tesla crashes into stationary emergency vehicles, which you rightfully supported, and the fact that it is only by chance that a small child or family has not yet been killed, I believe the moratorium is clearly warranted.

Respectfully,

Michael DeKort

More detail, including how to do this right, is available here:

The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology

· https://medium.com/@imispgh/the-autonomous-vehicle-industry-can-be-saved-by-doing-the-opposite-of-what-is-being-done-now-b4e5c6ae9237

How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry

· https://imispgh.medium.com/how-the-failed-iranian-hostage-rescue-in-1980-can-save-the-autonomous-vehicle-industry-be76238dea36

Tesla “autopilot” development effort needs to be stopped and people held accountable

· https://medium.com/@imispgh/tesla-autopilot-development-effort-needs-to-be-stopped-and-people-arrested-f280229d2284

Judge rules Tesla “autopilot” false advertising lawsuit can proceed- And to consider an injunction to stop “Autopilot” and “FSD”

· https://imispgh.medium.com/judge-rules-tesla-autopilot-false-advertising-lawsuit-can-proceed-cfd51e85aa5b

Tesla’s “rapid improvement with pure vision” “autopilot” update is another grossly negligent dud

· https://imispgh.medium.com/teslas-rapid-improvement-with-pure-vision-autopilot-update-is-another-grossly-negligent-dud-10e83ceb8fb3

Tesla “Autopilot” killed a teenager in 2019 — Do we really need it to be a young child or family?

· https://imispgh.medium.com/tesla-autopilot-killed-a-teenager-in-2019-do-we-really-need-it-to-be-a-young-child-or-family-39d684f998c9

Elon admits “autopilot” development was harder than expected, but he, nor anyone else actually gets it yet

· https://imispgh.medium.com/elon-admits-autopilot-development-was-harder-than-expected-but-he-nor-anyone-else-actually-gets-d44120af2f65

My name is Michael DeKort. I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.

Industry Participation — Air and Ground

- Founder SAE On-Road Autonomous Driving Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Member UNECE WP.29 SG2 Virtual Testing

- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)

- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation

- Member Teleoperation Consortium

- Member CIVATAglobal — Civic Air Transport Association

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee

- Presented with the IEEE Barus Ethics Award for Post-9/11 DoD/DHS Whistleblowing Efforts
