IIHS defies widespread public data and states "Autopilot" is great at object detection, and even better since the radar was removed
IIHS reference links
Top Safety Pick — 2022 Tesla Model 3
Test Criteria for the 2022 Model 3
If you look at the test criteria in my third link, for the vehicle and pedestrian crash avoidance scenarios, you will see they test for stationary and crossing objects. That is a plus, especially given most companies, including Tesla, warn these scenarios are not covered, or not covered well, because they want to avoid the false braking created by their poor-fidelity radars, which cannot determine an object's lateral position well. (The object could be on the road in front of them or be a bridge pylon.)

Beyond this I would like to point out several things. The speeds are under 25 mph, and these are very simple scenarios in good weather and in the daytime. But let's set that aside. Two things make this extremely troubling. First, there is a plethora of data available from users, mostly on YouTube and in related press reports, showing this system is a debacle, including in these exact scenarios. Overall, the system does nothing well, meaning consistently. Second, there is zero proof camera-only systems are reliable, especially for determining object location, and no one else uses this approach except Wayve in the UK.

IIHS comments on this by extrapolating that Tesla's performance got better since they got rid of their radar. I may surprise some folks here, but I think that is true, though only in the macro. This stems from my belief that the reason Tesla ditched radar, the radar they earlier said was crucial, is that their main board cannot process a second sensor source properly. (Or they have no idea how to tune a Kalman filter. I think that is far less likely.) Meaning their implementation of radar is the issue, not the use of radar. Since admitting that and fixing it would doom them, they said they can do better with the camera alone. (I would note here that most folks use the same poor-fidelity radar tech and have crossing and stationary object issues. They, however, have limited ODDs. And many are starting to use LiDAR for object location through track creation and are switching to imaging radars.)

There is one more item I would like to raise. I have received information that Teslas are using GPS to determine where objects might be versus relying on sensors each time. Meaning they are learning to stop just in case an object appears, and this apparently cannot be unlearned. (This may be a cause of the current worsening false-braking issues.) This is extremely dangerous. Why would Tesla do this? Because a false brake is preferred to hitting the child.

All of this now brings me to the IIHS. Did Tesla miraculously fix their system so that it actually works great? Highly unlikely. So how is it IIHS gets such "great results"? I would have to see more data to know where the issue lies. I believe it is some combination of benign conditions, the luck of running the test only a few times, and the GPS issue. The problem here, of course, is that this is the Insurance Institute for Highway Safety saying Tesla's "Autopilot" and "Full Self-Driving" are very good at vehicle and pedestrian detection and related AEB. That will give folks even more false confidence. That is made worse by NHTSA being a lap dog and "investigating" forever, and by the NTSB pulling up way short, though they have singled out Tesla, including in areas described here. (See more below.)

I call on the IIHS to release more data and to respond to these criticisms. At the very least, they should state that their testing, rating, and recommendation cover only a sliver of scenarios and that people should still be very cautious.
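Since I mention Kalman filter tuning above, here is a minimal sketch of what fusing a camera range estimate with a radar range estimate through a scalar Kalman filter looks like. All function names and noise values are purely illustrative assumptions of mine, not any vendor's actual implementation; the point is only that a properly tuned filter weighs each sensor by its fidelity and ends up more certain than either sensor alone.

```python
# Illustrative sketch: fusing two range measurements of one object
# with a scalar (1-D) Kalman update. Numbers are made up for the example.

def kalman_update(x, p, z, r):
    """Fuse one measurement z (variance r) into estimate x (variance p)."""
    k = p / (p + r)          # Kalman gain: how much to trust the measurement
    x = x + k * (z - x)      # corrected estimate, pulled toward z
    p = (1.0 - k) * p        # uncertainty shrinks after fusing
    return x, p

# Prior belief: object roughly 50 m ahead, fairly uncertain (variance 25).
x, p = 50.0, 25.0

# Camera measurement: noisy in range (assumed variance 9.0).
x, p = kalman_update(x, p, 47.0, 9.0)

# Radar measurement: precise in range (assumed variance 1.0),
# even though its lateral fidelity is poor.
x, p = kalman_update(x, p, 45.5, 1.0)

# The fused estimate lands near the radar's range reading, and the
# final variance is lower than either sensor's alone.
print(round(x, 2), round(p, 2))
```

If the noise variances are mis-tuned, the filter either ignores the radar or chases its false returns, which is exactly the kind of implementation problem, rather than a problem with radar itself, that I am suggesting above.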
Below are a couple of articles that explain my POV in more detail, including why the industry would rather harm people and go down with the ship than change.
The Most Dangerous, Deceitful and Deadly days in the Autonomous Vehicle Industry are upon us
The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology
How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry
Tesla “autopilot” development effort needs to be stopped and people held accountable
The NTSB frets over human Guinea pigs then chastises and punts to the even more reckless NHTSA
My name is Michael DeKort. I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.
Industry Participation — Air and Ground
- Founder SAE On-Road Autonomous Driving Simulation Task Force
- Member SAE ORAD Verification and Validation Task Force
- Member UNECE WP.29 SG2 Virtual Testing
- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)
- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation
- Member CIVATAglobal — Civic Air Transport Association
- Stakeholder for UL4600 — Creating AV Safety Guidelines
- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee
- Presented with the IEEE Barus Ethics Award for Post-9/11 DoD/DHS Whistleblowing Efforts