Brad Templeton gives “Full Self Driving” an ‘F’ — But what about the rest?

Michael DeKort
4 min read · Jan 14, 2022

His article, A Robocar Specialist Gives Tesla ‘Full Self-Drive’ An ‘F’: https://www.forbes.com/sites/bradtempleton/2022/01/13/a-robocar-specialist-gives-tesla-full-self-drive-an-f/

Finally, Brad Templeton, a Tesla fan and advisor to driverless vehicle makers like Waymo, Zoox, and Cruise, has had enough incremental epiphanies to see that Tesla’s system is a dangerous, untenable mess.

“I have great respect and admiration for Elon Musk, so sorry to say this but … it’s terrible. I mean really bad. After all those videos I didn’t expect a lot, but I expected more than this. My first drive home after activating it was frightening. You’re going to see the second loop I did, one around Apple’s Headquarters in Cupertino California. I’ve now driven this loop a dozen times with the system on, and each drive is different, with a different pattern of errors, several of them serious.”

And

About 5 minutes into the video you will see commentary on various situations where it had problems, including:

  1. Yielding too long at a 3 way stop, even though it was clearly there first
  2. Veering towards a trailer on the side of a quiet street
  3. Being very slow turning onto an arterial and getting honked at
  4. Pointlessly changing lanes for a very short time
  5. Failing in many ways at a right turn to a major street that has its own protected lane, almost always freezing and not knowing what to do
  6. Jerky accelerations and turns
  7. Stalling for long times at right turns on red lights
  8. Suddenly veering off-course into a left turn that’s not on the route, then trying to take that turn even though the light is red!
  9. Finding itself in a “must turn left” lane and driving straight out of it, or veering left into oncoming traffic
  10. Handling a basic right turn with great uncertainty, parking itself in the bike lane for a long period to judge oncoming traffic
  11. Taking an unprotected left with a narrow margin, and doing it so slowly that the oncoming driver has to brake hard.

All of these in a simple 3.5 mile loop in a suburban residential/commercial area. (They didn’t all happen on one drive, but most drives had several of them, and each drive had a different pattern of errors.)

It’s very unfortunate that the “experts” in the industry are so imprisoned by their own egos, wallets (or likely crypto coins or NFTs), and, in many cases, lack of actual experience that even common sense can’t sink in. No, what they need is for the proof to be so horrendous and tragic that they have no choice. That will come for Waymo, Cruise and the rest too, just more slowly, because they can milk their slightly better competency, sensor systems, micro-ODDs, and scant release of data for longer before they have no choice but to either harm people as well to learn many crash scenarios, shut down or, wait for it, change their approach. (Odds are that when Tesla kills that first small child or family, the entire industry uses that to self-impose rigor that will give them the face and company saving CYA they need to make a shift.)

(Templeton gets most of the root causes for Tesla being horrendous right, except he doesn’t mention imaging radar. LiDAR has too many lighting, weather, and update-rate issues to do the job, and most folks do not use it to classify objects or create tracks right now. He also believes Waymo, Cruise, etc. have the right mix of sensor systems, real-world development, and gaming-simulation technology to get to done. They don’t. As a matter of fact, since their capabilities are, or appear, better, this will create more false confidence and make the “safety driver’s” already impossible, reckless and needless task that much worse.)

More on my POV, including how to do this right, can be found in the articles below.

Tesla “autopilot” development effort needs to be stopped and people held accountable

· https://medium.com/@imispgh/tesla-autopilot-development-effort-needs-to-be-stopped-and-people-arrested-f280229d2284

The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology

· https://medium.com/@imispgh/the-autonomous-vehicle-industry-can-be-saved-by-doing-the-opposite-of-what-is-being-done-now-b4e5c6ae9237

How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry

· https://imispgh.medium.com/how-the-failed-iranian-hostage-rescue-in-1980-can-save-the-autonomous-vehicle-industry-be76238dea36

The Autonomous Vehicle industry is figuring out gaming simulation technology isn’t adequate. The problem now is they don’t understand the cause is the core architecture and modeling approach

· https://imispgh.medium.com/the-autonomous-vehicle-industry-is-figuring-out-gaming-simulation-technology-isnt-adequate-the-f6ab0a0d11b3

Nostradamus? SAE Autonomous Vehicle Magazine declares I am “Prescient”

· https://imispgh.medium.com/nostradamus-sae-autonomous-vehicle-magazine-declares-i-am-prescient-99325d4c0385

USDOT introduces VOICES Proof of Concept for Autonomous Vehicle Industry: A Paradigm Shift?

· https://imispgh.medium.com/usdot-introduces-voices-proof-of-concept-for-autonomous-vehicle-industry-a-paradigm-shift-87a12aa1bc3a

My name is Michael DeKort — I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.

Industry Participation — Air and Ground

- Founder SAE On-Road Autonomous Driving Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Member UNECE WP.29 SG2 Virtual Testing

- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)

- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation

- Member CIVATAglobal — Civic Air Transport Association

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee

- Presented with the IEEE Barus Ethics Award for Post-9/11 DoD/DHS Whistleblowing Efforts
