Consumer Reports, “safety experts”, the government, most of the press, and the driverless vehicle industry want us to believe Tesla is the only one needlessly using human Guinea pigs

Michael DeKort
4 min read · Jul 20, 2021


The Consumer Reports article — Tesla’s ‘Full Self-Driving’ Beta Software Used on Public Roads Lacks Safeguards — https://www.consumerreports.org/car-safety/tesla-full-self-driving-beta-software-lacks-safeguards-a6698414036/

It is very unfortunate that more people do not have the intellectual, ethical, and moral courage to go directly from A to Z on common sense and data alone. Instead, they require minute, unnecessary shifts in the “conventional wisdom” and echo-chamber permission to slowly inch toward reality. The problem is that this exceedingly, and avoidably, slow pace involves the injuries and deaths of other people, not just companies going bankrupt or bruised egos.

Here we see Consumer Reports and Bryan Reimer performing Olympic-level hair splitting. They pretend Tesla is the only driverless vehicle maker that must use people inside and outside of the vehicle as Guinea pigs for development and testing, and the only autonomous vehicle maker that has harmed or killed people, or will, in order to do it. Reimer actually stated he doesn’t think anyone on the road should be subjected to the risks of a test vehicle. (Then he hedges by saying Tesla is advancing quickly. It is not.) OK . . . how can that risk be eliminated if the development and testing process literally requires it?

Yes, the other developers’ use of (still inadequate) simulation, stepping through progressively more complex and dangerous ODDs, competent sensor systems, better (yet untenable) driver monitoring, trained rather than customer Guinea pigs, and far fewer vehicles under test will result in fewer tragedies than Tesla will produce. But it cannot eliminate them. The reason is two-fold. First, the gaming-based simulation being used cannot provide the fidelity or real-time operation required to replace enough of the real world (versus DoD/aerospace/FAA simulation technology), and the driverless vehicle makers have the same lack of courage I described above. That means at some point they must avoid disengaging so the system can experience many edge and crash cases over and over and learn them. Second, handover cannot be made safe in many time-critical scenarios, which make up a large portion of crash cases, no matter what driver monitoring system is used. So even if they try to disengage, there will be times when the “safety driver” is not given enough time to regain proper situational awareness and do the right thing the right way, particularly if there is pressure to hang in until the last moment. (There are also the issues of time and money. There is not enough of either for anyone to get near L4.)
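
To make the handover time-budget argument concrete, here is a minimal illustrative sketch. The numbers and names in it (SITUATIONAL_AWARENESS_TIME_S, REACTION_AND_ACTION_TIME_S, handover_is_plausibly_safe) are my own assumptions for illustration, not measurements from Tesla, Consumer Reports, or any AV program: if the time-to-collision at disengagement is shorter than the time a human needs to regain awareness and then act, no driver monitoring system can close that gap.

```python
# Illustrative sketch of the handover time budget.
# All numbers are assumed placeholders for illustration only.

# Assumed time (seconds) for a disengaged "safety driver" to regain proper
# situational awareness after an alert. Human-factors estimates vary widely.
SITUATIONAL_AWARENESS_TIME_S = 3.0

# Assumed time to decide on and execute the right maneuver (brake/steer).
REACTION_AND_ACTION_TIME_S = 1.5


def handover_is_plausibly_safe(time_to_collision_s: float) -> bool:
    """Return True only if the remaining time-to-collision exceeds the
    assumed time needed to regain awareness and act correctly."""
    return time_to_collision_s > (SITUATIONAL_AWARENESS_TIME_S +
                                  REACTION_AND_ACTION_TIME_S)


if __name__ == "__main__":
    # Many crash scenarios unfold in roughly 1-3 seconds; under these assumed
    # numbers, handover in that window cannot be made safe regardless of how
    # good the driver monitoring system is.
    for ttc in (1.0, 2.0, 3.0, 5.0, 8.0):
        print(f"TTC {ttc:>4.1f} s -> handover plausibly safe: "
              f"{handover_is_plausibly_safe(ttc)}")
```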

Do we really need someone, likely Tesla, to kill that first small child or family for that final epiphany? Or to firm up those backbones enough to do the right thing?

More detail here

The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology

· https://medium.com/@imispgh/the-autonomous-vehicle-industry-can-be-saved-by-doing-the-opposite-of-what-is-being-done-now-b4e5c6ae9237

SAE Autonomous Vehicle Engineering Magazine — Simulation’s Next Generation

· https://www.sae.org/news/2020/08/new-gen-av-simulation

Tesla “autopilot” development effort needs to be stopped and people held accountable

· https://medium.com/@imispgh/tesla-autopilot-development-effort-needs-to-be-stopped-and-people-arrested-f280229d2284

Tesla’s “rapid improvement with pure vision” “autopilot” update is another grossly negligent dud

· https://imispgh.medium.com/teslas-rapid-improvement-with-pure-vision-autopilot-update-is-another-grossly-negligent-dud-10e83ceb8fb3

Tesla “Autopilot” killed a teenager in 2019 — Do we really need it to be a young child or family?

· https://imispgh.medium.com/tesla-autopilot-killed-a-teenager-in-2019-do-we-really-need-it-to-be-a-young-child-or-family-39d684f998c9

Elon admits “autopilot” development was harder than expected, but he, nor anyone else actually gets it yet

· https://imispgh.medium.com/elon-admits-autopilot-development-was-harder-than-expected-but-he-nor-anyone-else-actually-gets-d44120af2f65

The NTSB frets over human Guinea pigs then chastises and punts to the even more reckless NHTSA

· https://imispgh.medium.com/the-ntsb-frets-over-human-guinea-pigs-then-chastises-and-punts-to-the-even-more-reckless-nhtsa-f406046ddd3

Using the Real World is better than Proper Simulation for AV Development — NONSENSE

My name is Michael DeKort — I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.

Industry Participation — Air and Ground

- Founder SAE On-Road Autonomous Driving Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Member UNECE WP.29 SG2 Virtual Testing

- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)

- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation

- Member CIVATAglobal — Civic Air Transport Association

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee

- Presented with the IEEE Barus Ethics Award for Post-9/11 DoD/DHS Whistleblowing Efforts
