Tesla, not the “safety driver”, should likely be charged in the first “Full Self Driving” manslaughter case
Reference article — Tesla driver first to be charged in fatal crash involving Autopilot — https://nypost.com/2022/01/18/tesla-driver-first-to-be-charged-in-fatal-crash-involving-autopilot/
A Tesla driver involved in a fatal wreck in California over two years ago while using Autopilot has been charged with two counts of vehicular manslaughter.
The charges against limousine service driver Kevin George Aziz Riad, 27, represent the first felony charges in the US for a deadly crash involving a motorist who was using Tesla’s popular partially automated driving system, the Associated Press reported.
Riad was allegedly behind the wheel of a Tesla Model S that careened off a freeway in the Los Angeles suburb of Gardena, blew a red light and struck a Honda Civic in December 2019.
Two occupants in the Civic, Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez, were killed.
Los Angeles County prosecutors filed the charges against Riad in October, though they just came to light last week.
It is likely Tesla should be charged here.
The reason is that Tesla's development and test method is untenable and grossly negligent from a time, cost, safety, and engineering point of view. Human "safety drivers" are largely unnecessary if proper simulation is used (aerospace/DoD/FAA simulation technology, not gaming technology). And the expectation that humans can maintain or regain proper situational awareness in many time-critical scenarios, which include many crash scenarios, is technically unreasonable. Tesla's driver monitoring, with its 20-second or longer alarm delay, is useless. Why does this matter? If "safety drivers" continue to disengage and decline to sacrifice themselves and others, many crash scenarios will never be learned. To be clear, that means thousands more will be injured or killed. (And it's not just Tesla. They are only the most egregious, for several reasons.)
More on my POV, including how to do this right:
SAE Autonomous Vehicle Engineering magazine editor calling me “prescient” regarding my position on Tesla and the overall driverless vehicle industry’s untenable development and testing approach — (Page 2) https://assets.techbriefs.com/EML/2021/digital_editions/ave/AVE-202109.pdf
- Tesla “Autopilot” development effort needs to be stopped and people held accountable
- Tesla “Autopilot” “Safety Score” is Creating Better Kamikaze Drivers
- NHTSA should impose an immediate “Autopilot” moratorium and report initial investigation findings in 30 days
- The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology
- How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry
My name is Michael DeKort — I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, served as a software project manager on an Aegis Weapon System baseline, and worked on C4ISR for DoD/DHS.
Industry Participation — Air and Ground
- Founder SAE On-Road Autonomous Driving Simulation Task Force
- Member SAE ORAD Verification and Validation Task Force
- Member UNECE WP.29 SG2 Virtual Testing
- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)
- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation
- Member Teleoperation Consortium
- Member CIVATAglobal — Civic Air Transport Association
- Stakeholder for UL4600 — Creating AV Safety Guidelines
- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee
Presented the IEEE Barus Ethics Award for Post 9/11 DoD/DHS Whistleblowing Efforts