Tesla “Autopilot” has killed 3 more people in past month — It will get far worse from here
Three more people have died in Teslas in the past month, and each driver was likely using “Autopilot”.
As I predicted, the deaths have continued and the rate is rising. This is due to Elon’s Pied Piper mantra, which inflates the false confidence of Tesla’s test-subject owners: routinely pushing out updates in anticipation of declaring L4 capability, in most scenarios, at any moment, and “Full Self Driving” (FSD) by the end of the year. That means no steering wheel needed.
On the back of this we have articles about NHTSA assisting Tesla in committing this reckless and needless fraud. If you listened to the head of NHTSA, Dr. Owens, testify before the Senate recently, after the close of the Uber investigation, you could clearly see he is either incompetent or grievously unethical. This was especially evident when he stated that safety regulations should not be put in place until the technology is sorted out. As I state in my article below, Tesla’s routine crashes into stationary objects, caused by its refusal to use LiDAR, are an excellent example of where a regulation could be put in place that has nothing to do with which technology is used. Example regulation: properly detect stationary objects and do not hit them.
From here, false confidence will grow and deaths will rise. We will see the first of many needless deaths of children and families. That will give Elon the crash data he supposedly needs to avoid tragedies in the future. It means thousands more people, inside and outside these vehicles, must be injured and killed so that extremely inefficient and error-prone machine and deep learning processes can run through those scenarios, with correction data, thousands of times each in order to learn them.
There is a way forward that resolves all of this.
Proposal for Successfully Creating an Autonomous Ground or Air Vehicle
More in my articles here
Tesla hits Police Car — How much writing on the wall does NHTSA need?
Autonomous Vehicles Need to Have Accidents to Develop this Technology
Using the Real World is better than Proper Simulation for AV Development — NONSENSE
- https://medium.com/@imispgh/using-the-real-world-is-better-than-proper-simulation-for-autonomous-vehicle-development-nonsense-90cde4ccc0ce
Simulation can create a Complete Digital Twin of the Real World if DoD/Aerospace Technology is used
- https://medium.com/@imispgh/simulation-can-create-a-complete-digital-twin-of-the-real-world-if-dod-aerospace-technology-is-used-c79a64551647
How NHTSA and the NTSB can save themselves and the Driverless Vehicle Industry
NHTSA saved children from going to school in autonomous shuttles and leaves them in danger everywhere else
The Hype of Geofencing for Autonomous Vehicles
My name is Michael DeKort. I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.
Key Industry Participation
- Lead — SAE On-Road Autonomous Driving (ORAD) Model and Simulation Task
- Member SAE ORAD Verification and Validation Task Force
- Member DIN/SAE International Alliance for Mobility Testing & Standardization (IAMTS) Sensor Simulation Specs
- Stakeholder for UL4600 — Creating AV Safety Guidelines
- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee (AI&ASPC)
- Presented with the IEEE Barus Ethics Award for post-9/11 efforts
My company is Dactle
We are building an aerospace/DoD/FAA Level D, full L4/L5 simulation-based testing and AI system with an end-state scenario matrix to address several of the critical issues in the AV/OEM industry that I mention in my articles above. This includes replacing 99.9% of public shadow and safety driving, as well as dealing with the significant real-time, model-fidelity, and loading/scaling issues caused by using gaming engines and other architectures. (These are issues Unity will confirm; we are now working together. We are also working with UAV companies.) If not remedied, these issues will lead to false confidence and to gaps between the performance the plan expects and what actually happens. If you would like to see a demo or discuss this further, please let me know.