Tesla, NHTSA and NTSB should not allow “Autopilot” use without LiDAR

Michael DeKort
Jan 18, 2020


Several people have been injured or killed when Tesla’s “Autopilot” failed to stop the vehicle from hitting a stationary object. The cause is the system’s inability to determine the lateral position of stationary objects. Because it cannot distinguish a stopped object in its own lane from one beside the road, it ignores both; reacting to every such return would generate false positives and stop the car for no reason. Virtually all of this would be avoidable if Tesla (and comma.ai and Wayve) used LiDAR to augment their camera and radar systems. How could such a system be as safe as a human, let alone safer, with this massive, glaring hole? With so many crashes over three years, the root cause well known, and Tesla showing zero interest in fixing it, one has to conclude that Tesla, NHTSA and the NTSB have migrated from ignorant to grossly negligent for not fixing this or shutting down “Autopilot” until it is fixed. And keep in mind that NHTSA has already been caught falsifying data to make Teslas look safer than they are. And they, with NTSB’s help, permit the reckless and needless use of “safety drivers”, or human Guinea pigs.
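To make that false-positive trade-off concrete, here is a minimal, hypothetical sketch — not Tesla’s actual code — of how a radar-centric stack ends up discarding stationary obstacles. The RadarReturn fields, the is_brake_candidate helper and the thresholds are illustrative assumptions only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RadarReturn:
    range_m: float                 # distance to the detection
    relative_speed_mps: float      # Doppler: target speed minus ego speed along the road
    lateral_m: Optional[float]     # lateral offset; None when the radar cannot resolve it

def is_brake_candidate(ret: RadarReturn, ego_speed_mps: float,
                       lane_half_width_m: float = 1.8) -> bool:
    """Decide whether a radar return should trigger braking (illustrative logic only)."""
    # A target that is stationary in the world frame closes at roughly the ego speed.
    stationary = abs(ret.relative_speed_mps + ego_speed_mps) < 0.5
    if stationary and ret.lateral_m is None:
        # Cannot tell an in-lane stopped vehicle from roadside clutter or an
        # overhead sign, so the return is dropped to avoid phantom braking --
        # which is exactly how a stopped fire truck in the lane gets ignored.
        return False
    if ret.lateral_m is not None and abs(ret.lateral_m) > lane_half_width_m:
        return False  # resolved laterally, but outside the ego lane
    return True

# Example: a stationary object 60 m ahead with no lateral fix is ignored.
print(is_brake_candidate(RadarReturn(60.0, -30.0, None), ego_speed_mps=30.0))  # False
```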

The bigger question is: why isn’t AEB (automatic emergency braking) kicking in?

Why isn’t Tesla fixing this? Because it would require a massive hardware and software change, with the hardware change requiring both the computer systems and the physical vehicle to be modified. And Elon has an extremely hard time admitting he is wrong. Yes, some aspects of Teslas are extremely safe, such as their crashworthiness. But others, like “Autopilot” and battery fires so dangerous that firefighters cannot get close enough to put them out, are examples of where these designs are harmful. Finally, Tesla is always on the edge of not existing. The world finding out that “Autopilot” and FSD are scams, because of the missing LiDAR and the public shadow and safety driving approach, would tank the stock and the company.

Technical issues — The current system cannot detect objects properly because the camera, radar and ultrasonic sensors are incapable of doing so. The camera system is not stereo. And even if it were, stereo cameras struggle in low-light or direct-light scenarios, and when objects appear two-dimensional or take up most of the field of view. Remember that cameras are passive: they derive distance or depth by comparing stereo views of objects at various distances. The radars Tesla uses have a beam and sweep pattern so large they cannot discern lateral position unless the object moves. And the ultrasonic system cannot even do that.
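As a concrete illustration of the passive-stereo limitation described above, here is a minimal sketch of the standard rectified-stereo depth relation Z = f·B/d. The camera parameters are hypothetical, chosen only to show how quickly a one-pixel disparity error swamps the depth estimate at range.

```python
def stereo_depth_m(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from disparity for a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("No measurable disparity: depth cannot be recovered")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical camera parameters for illustration only.
f_px, B_m = 1400.0, 0.12

# A flat, texture-poor object (e.g. a truck side filling the frame) may yield no
# reliable disparity at all; and at range, one pixel of error shifts depth a lot.
print(stereo_depth_m(f_px, B_m, 4.0))  # ~42 m
print(stereo_depth_m(f_px, B_m, 3.0))  # ~56 m -- one pixel changes the estimate by ~14 m
```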

More in my articles here

Tesla “Autopilot” has killed 3 more people in past month — It will get far worse from here

· https://medium.com/@imispgh/tesla-autopilot-has-killed-3-more-people-in-past-month-it-will-get-far-worse-from-here-dc32f42ae47e

Proposal for Successfully Creating an Autonomous Ground or Air Vehicle

· https://medium.com/@imispgh/proposal-for-successfully-creating-an-autonomous-ground-or-air-vehicle-539bb10967b1

Tesla hits Police Car — How much writing on the wall does NHTSA need?

· https://medium.com/@imispgh/tesla-hits-police-car-how-much-writing-on-the-wall-does-nhtsa-need-8e81e9ab3b9

Autonomous Vehicles Need to Have Accidents to Develop this Technology

Using the Real World is better than Proper Simulation for AV Development — NONSENSE

Simulation can create a Complete Digital Twin of the Real World if DoD/Aerospace Technology is used

How NHTSA and the NTSB can save themselves and the Driverless Vehicle Industry

· https://medium.com/@imispgh/how-nhtsa-and-the-ntsb-can-save-themselves-and-the-driverless-vehicle-industry-8c6febe0b8ef

NHTSA saved children from going to school in autonomous shuttles and leaves them in danger everywhere else

· https://medium.com/@imispgh/nhtsa-saved-children-from-going-to-school-in-autonomous-shuttles-and-leaves-them-in-danger-4d77e0db731

The Hype of Geofencing for Autonomous Vehicles

My name is Michael DeKort — I am a former systems engineer, engineering manager and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.

Key Industry Participation

- Lead — SAE On-Road Autonomous Driving (ORAD) Model and Simulation Task

- Member SAE ORAD Verification and Validation Task Force

- Member DIN/SAE International Alliance for Mobility Testing & Standardization (IAMTS) Sensor Simulation Specs

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee (AI&ASPC)

- Presented with the IEEE Barus Ethics Award for post-9/11 efforts

My company is Dactle

We are building an aerospace/DoD/FAA Level D, full L4/5 simulation-based testing and AI system with an end-state scenario matrix to address several of the critical issues in the AV/OEM industry that I mention in the articles above. This includes replacing 99.9% of public shadow and safety driving, as well as dealing with the significant real-time, model-fidelity and loading/scaling issues caused by using gaming engines and other architectures (issues Unity will confirm — we are now working together, and we are also working with UAV companies). If not remedied, these issues will lead to false confidence and to differences between the performance the plan assumes and what actually happens. If you would like to see a demo or discuss this further, please let me know.
