Tesla “Autopilot” killed a teenager in 2019 — Do we really need it to be a young child or family?
NY Times Article — Tesla Says Autopilot Makes Its Cars Safer. Crash Victims Say It Kills.
https://www.nytimes.com/2021/07/05/business/tesla-autopilot-lawsuits-safety.html
“Benjamin Maldonado and his teenage son were driving back from a soccer tournament on a California freeway in August 2019 when a truck in front of them slowed. Mr. Maldonado flicked his turn signal and moved right. Within seconds, his Ford Explorer pickup was hit by a Tesla Model 3 that was traveling about 60 miles per hour on Autopilot.
A six-second video captured by the Tesla and data it recorded show that neither Autopilot — Tesla’s much-vaunted system that can steer, brake and accelerate a car on its own — nor the driver slowed the vehicle until a fraction of a second before the crash. Fifteen-year-old Jovani, who had been in the front passenger seat and not wearing his seatbelt, was thrown from the Ford and died, according to a police report.”
Yes, of course he should have been wearing a seat belt. But that is just as much of a diversion as Elaine Herzberg crossing in front of the Uber at night. These are diversions because the fundamental development process itself should never have been used in the first place.
It is a myth that public shadow and safety driving can create a legitimate autonomous vehicle, and that the lives the process takes are necessary and for the greater good. It is impossible to drive the trillion miles, or spend the $300B, required to stumble and restumble on all the scenarios necessary to complete the effort. The process also harms people for no reason. The first safety issue is handover: the time needed to regain proper situational awareness and do the right thing, especially in time-critical scenarios, cannot be provided. Another dangerous area is learning accident scenarios. AV makers will have to run thousands of accident scenarios thousands of times to accomplish this. That will cause thousands of injuries and deaths. The next issue is the use of gaming-based simulation technology, which has too many technical limitations to facilitate the creation of a legitimate real-world digital twin. The solution is to use DoD/aerospace simulation technology, informed and validated by the real world, and shift most of the autonomous system development and testing over to it.
The immediate question is, do we need a young child or family to die in order to make this paradigm shift? Many of them?
More detail here:
Tesla “autopilot” development effort needs to be stopped and people held accountable
The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology
SAE Autonomous Vehicle Engineering Magazine — Simulation’s Next Generation
https://www.sae.org/news/2020/08/new-gen-av-simulation
Elon admits “autopilot” development was harder than expected, but neither he nor anyone else actually gets it yet
The NTSB frets over human guinea pigs, then chastises and punts to the even more reckless NHTSA
Using the Real World is better than Proper Simulation for AV Development — NONSENSE
My name is Michael DeKort. I am a former systems engineer, engineering manager and program manager for Lockheed Martin. I worked in aircraft simulation, served as the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.
Industry Participation — Air and Ground
- Founder SAE On-Road Autonomous Driving Simulation Task Force
- Member SAE ORAD Verification and Validation Task Force
- Member UNECE WP.29 SG2 Virtual Testing
- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)
- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation
- Member CIVATAglobal — Civic Air Transport Association
- Stakeholder for UL4600 — Creating AV Safety Guidelines
- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee
- Presented with the IEEE Barus Ethics Award for Post-9/11 DoD/DHS Whistleblowing Efforts