Tesla admits loss of radar degrades the system: even Brad Templeton understands ditching it is foolish
Electrek article — Tesla announces transition to ‘Tesla Vision’ without radar, warns of limitations at first
Brad Templeton’s article — Teslas With LIDARs And No Radars — Is It The Chip Shortage?
Well, I guess Elon wasn’t bluffing. They are actually ditching the radar.
The Electrek article points out that Tesla has to degrade the system, mostly by slowing it down, until it supposedly makes up the gap.
- Autosteer will be limited to a maximum speed of 75 mph and a longer minimum following distance.
- Smart Summon (if equipped) and Emergency Lane Departure Avoidance may be disabled at delivery.
Electrek goes on to support the move based solely on what Elon and Tesla say, completely independent of the technology and common sense, repeating the nonsense that cameras alone are best because vision is the sensor humans use. Clearly the camera system and AP/FSD brain are as good as a human. And humans are also blinded by stationary/crossing objects and routinely ignore police cars, fire trucks, passenger cars, tow trucks, barriers and humans. (Camera systems struggle with direct light, bad weather, and 2D and complex objects. And processing successive frames to discern speed or build tracks is another issue, especially processing-wise, which Brad Templeton mentions as well.)
As for Brad Templeton: when a Tesla/AV industry groupie like him criticizes Tesla, someone should pay attention. First, I have to credit him for leaving the first part of the article in, because it shows his total ignorance of the tech in this industry. He goes from discussing the poor radar tech Tesla and others use to actually mentioning the Arbe radar and suggesting that ditching radar is wrong. (Templeton clearly thinks this industry's poor radars represent all radar technology; DoD has had far superior radar for decades. He also believes the stationary/crossing issue and the poor fidelity are due to noise. Nope. They are due to having a low-density array. These radars have only a couple of transmitters and receivers, which means the beam width is wide in both directions. Picture a LiDAR with 3 "points" in the point cloud. How much real estate is that going to cover? This causes the system to merge objects and makes it unable to tell whether a blip is a car in the lane or a bridge pylon next to the road. To avoid false braking, the system ignores the blip and the associated object(s).)
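To make the beam-width point concrete, here is a rough back-of-the-envelope sketch (my own illustrative numbers, not Tesla's or Arbe's actual specs) of how the angular resolution of a uniform linear array scales with element count, and what that means at highway ranges:

```python
import math

def beamwidth_rad(n_elements, spacing_wavelengths=0.5):
    """Approximate Rayleigh angular resolution of a uniform linear
    array: theta ~ lambda / (N * d), with element spacing d given in
    wavelengths. Fewer elements -> wider beam."""
    return 1.0 / (n_elements * spacing_wavelengths)

def cross_range_m(n_elements, range_m):
    """Cross-range cell size at a given range; everything inside one
    beamwidth merges into a single 'blip'."""
    return range_m * beamwidth_rad(n_elements)

# Sparse automotive-style array (a few effective elements) vs. a
# dense 48-element imaging array, both looking 100 m down the road:
for n in (3, 48):
    print(f"{n:2d} elements -> {cross_range_m(n, 100.0):5.1f} m cell at 100 m")
print(f"3-element beamwidth ~ {math.degrees(beamwidth_rad(3)):.0f} degrees")
```

With a handful of elements the resolution cell at 100 m is tens of meters wide, so a car in the lane and a pylon beside the road fall into the same blip; a dense array shrinks the cell to a few meters and can separate them.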
Templeton also posits that the reason for ditching the radar is a lack of chips. I think that's ridiculous and that it's something far worse. (A chip shortage won't last forever. And the shortage does not single out radars; there would be other impacts as well.) I believe there is a hardware issue on the main board that precludes proper fusion. First, these radars preprocess their data, unlike most cameras and LiDARs. They send tracks, which is a far lighter processor load. Second, Tesla supposedly evaluated the Arbe radar. Assuming it does as advertised, its 48×48 array not only solves the stationary/crossing-object issue, which the cameras cannot handle nearly as well as a radar, especially in bad weather, direct lighting, or with 2D or complex objects, it also macro-classifies objects and provides other things cameras struggle with, like object speed. (I would love to see cameras produce tracks a fraction as good as a dense-array radar's.) This whole thing tells me that the Tesla system could not properly utilize the tracks from a far better radar than the one they were using before. I also find it hard to believe it's a fusion software issue no human can resolve. So, what is left? The hardware can't fuse multiple sources well. That means it's likely a bus, I/O or system-loading issue.
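The tracks-versus-raw-data point can be illustrated with a toy bandwidth comparison (all numbers below are made up for illustration, not actual Tesla or Arbe figures): a track is a tiny preprocessed summary of an object, while a raw camera stream is orders of magnitude more data the host processor must chew through itself.

```python
# Illustrative (invented) numbers comparing per-second data volume
# a fusion computer must ingest from each source.

BYTES_PER_TRACK = 32        # e.g. id, x, y, vx, vy, confidence
TRACKS_PER_SCAN = 64        # objects the radar reports per scan
RADAR_SCANS_PER_SEC = 20

CAMERA_W, CAMERA_H = 1280, 960
BYTES_PER_PIXEL = 1         # raw monochrome frame
CAMERA_FPS = 36

radar_bps = BYTES_PER_TRACK * TRACKS_PER_SCAN * RADAR_SCANS_PER_SEC
camera_bps = CAMERA_W * CAMERA_H * BYTES_PER_PIXEL * CAMERA_FPS

print(f"radar tracks : {radar_bps / 1e6:8.2f} MB/s")
print(f"raw camera   : {camera_bps / 1e6:8.2f} MB/s")
print(f"ratio        : {camera_bps // radar_bps}x")
```

Even with these rough numbers, one raw camera is roughly a thousand times the data of a full radar track list, which is why a board-level bus, I/O or loading constraint would bite the fusion of raw streams long before it bit preprocessed tracks.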
(In a previous article I discussed Elon’s statement that “full self-driving” release is coming with no radar and safety is “confirmed”. I challenged that as well. See my article below for more.)
Now, having written all this, there is something else that bugs me. If cameras are so good, why didn't Tesla just alter the fusion/Kalman filter to defer to the cameras? You don't have to remove any sensor sources to do that. If cameras alone suffice, why would Tesla need to ditch the radar to get more out of them? Once again, I come back to a hardware flaw: the system has to remove radar data for the cameras to work better.
Now two issues remain. Can the hardware give them what they need? And, in the end, it doesn't matter. A camera-only system is ridiculous and reckless, for the reasons I state above, and will kill many, many, many more people. That system will still never get near L4.
Which brings me to my final supposition. I think Elon gets this. I think he wants NHTSA and others to shut him down. That way he can finally turn the regulation boogeyman into a manufactured reality, giving him a way to save face and to avoid giving almost a million people their $10k back.
Tesla told NHTSA its AEB and FCW systems are temporarily degraded due to lack of radar. However, they told their customers the opposite on their site. (These systems were already a mess with regard to stationary and crossing objects.)
More detail here
Elon says new “full self-driving” release is coming with no radar and safety is “confirmed”
Tesla “autopilot” development effort needs to be stopped and people held accountable
Tesla ditching radar and Elon’s explanation show us how bad and how deadly this system is
Elon Musk is now telling us a legitimate “Autopilot” and “Full Self-Driving” will never exist
Tesla Director of Autopilot Software says Elon’s statement about “Autopilot” capabilities does not match engineering reality
The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology
USDOT introduces VOICES Proof of Concept for Autonomous Vehicle Industry-A Paradigm Shift?
My name is Michael DeKort. I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.
Industry Participation — Air and Ground
- Founder SAE On-Road Autonomous Driving Simulation Task Force
- Member SAE ORAD Verification and Validation Task Force
- Member UNECE WP.29 SG2 Virtual Testing
- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)
- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation
- Member CIVATAglobal — Civic Air Transport Association
- Stakeholder for UL4600 — Creating AV Safety Guidelines
- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee
- Presented with the IEEE Barus Ethics Award for Post-9/11 DoD/DHS Whistleblowing Efforts