Brad Templeton gives “Full Self Driving” an ‘F’ — But what about the rest?
His article, “A Robocar Specialist Gives Tesla ‘Full Self-Drive’ An ‘F’” — https://www.forbes.com/sites/bradtempleton/2022/01/13/a-robocar-specialist-gives-tesla-full-self-drive-an-f/
Finally, Brad Templeton, a Tesla fan and advisor to driverless vehicle makers like Waymo, Zoox, and Cruise, has had enough incremental epiphanies to see that Tesla is a dangerous, untenable mess.
“I have great respect and admiration for Elon Musk, so sorry to say this but … it’s terrible. I mean really bad. After all those videos I didn’t expect a lot, but I expected more than this. My first drive home after activating it was frightening. You’re going to see the second loop I did, one around Apple’s Headquarters in Cupertino California. I’ve now driven this loop a dozen times with the system on, and each drive is different, with a different pattern of errors, several of them serious.”
About 5 minutes into the video you will see commentary on various situations where it had problems, including:
- Yielding too long at a 3-way stop, even though it was clearly there first
- Veering towards a trailer on the side of a quiet street
- Being very slow turning onto an arterial and getting honked at
- Pointlessly changing lanes for a very short time
- Failing in many ways at a right turn onto a major street that has its own protected lane, almost always freezing and not knowing what to do
- Jerky accelerations and turns
- Stalling for long times at right turns on red lights
- Suddenly veering off-course into a left turn that’s not on the route, then trying to take that turn even though the light is red!
- Finding itself in a “must turn left” lane and driving straight out of it, or veering left into oncoming traffic
- Handling a basic right turn with great uncertainty, parking itself in the bike lane for a long period to judge oncoming traffic
- Taking an unprotected left with a narrow margin, and doing it so slowly that the oncoming driver has to brake hard.
All of this in a simple 3.5-mile loop in a suburban residential/commercial area. (They didn’t all happen on one drive, but most drives had several of them, and each drive had a different pattern of errors.)
It’s very unfortunate that the “experts” in the industry are so imprisoned by their own egos, wallets (or, likely, crypto coins and NFTs), and, in many cases, lack of actual experience that they won’t let even common sense sink in. No, what they need is for the proof to be so horrendous and tragic that they have no choice. This will come with Waymo, Cruise and the rest. Just slower, because they can milk their slightly better competency, sensor systems, micro-ODDs, and scant release of data for longer before they have no choice but to either harm people as well to learn many crash scenarios, shut down, or, wait for it, change their approach. (Odds are that when Tesla kills that first small child or family, the entire industry will use that to self-impose the rigor that gives them the face- and company-saving CYA they need to make a shift.)
(Templeton gets most of the root causes of Tesla being horrendous correct, except he doesn’t mention imaging radar. LiDAR has too many lighting, weather, and update-rate issues to do the job, and most folks do not use it to classify or create tracks right now. He also believes Waymo, Cruise, etc. have the right mix of sensor systems, real-world development, and gaming-sim tech approach to get to done. They don’t. As a matter of fact, since their capabilities are, or appear, better, this will create more false confidence and make the “safety driver’s” already impossible, reckless, and needless task that much worse.)
More on my POV here, including how to do this right.
Tesla “autopilot” development effort needs to be stopped and people held accountable
The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology
How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry
The Autonomous Vehicle industry is figuring out gaming simulation technology isn’t adequate-The problem now is they don’t understand the cause is the core architecture and modeling approach
Nostradamus? SAE Autonomous Vehicle Magazine declares I am “Prescient”
USDOT introduces VOICES Proof of Concept for Autonomous Vehicle Industry-A Paradigm Shift?
My name is Michael DeKort — I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.
Industry Participation — Air and Ground
- Founder SAE On-Road Autonomous Driving Simulation Task Force
- Member SAE ORAD Verification and Validation Task Force
- Member UNECE WP.29 SG2 Virtual Testing
- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)
- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation
- Member CIVATAglobal — Civic Air Transport Association
- Stakeholder for UL4600 — Creating AV Safety Guidelines
- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee
- Presented with the IEEE Barus Ethics Award for Post-9/11 DoD/DHS Whistleblowing Efforts