Ohmio’s Mahmood Hikmet: Tesla “Autopilot” and “Full Self Driving” Critic, and Wolf in Sheep’s Clothing

Michael DeKort
5 min read · May 9, 2022


The YouTube video — https://www.youtube.com/watch?v=sHyOL_vDQMQ

This is very well done. And Tesla is a nightmare. However, Mahmood is severely compromised here. He is trying to split hairs so that Tesla is guilty/incompetent and everyone else is innocent/competent, when the truth is they ALL use the same basic approach, which is untenable from a safety, time, and cost POV. Tesla is simply the least competent and most reckless. (It is ironic that he includes Phil Koopman, another wolf in sheep’s clothing.)

(Mahmood is the Head of R&D for the autonomous shuttle maker Ohmio in New Zealand, which obviously creates a conflict of interest. While Tesla’s and Ohmio’s systems and operational design domains are largely different, their core development and testing approach is not. As I state below, Ohmio may very well be more competent, but only in relative terms.)

  • First, Mahmood posited reasons why people haven’t died in the Tesla “Full Self Driving” beta. A dozen people have died due to Tesla “Autopilot”. Given the ridiculous number of system modes, this is largely hair splitting. And it’s only a matter of time.
  • Mahmood asks whether Tesla’s systems and its development and testing process can ever be safe. None of this can actually be safe, for several reasons. First, machine and deep learning is inherently developed and tested by experiencing, failing, and re-experiencing scenarios over and over in the public domain (versus proper simulation, where most of this should be done). Given the nascent state of machine and deep learning, which cannot infer, this is especially dangerous — particularly because many crash scenarios must follow this process. Think about that one. The other issue is that “safety driving” cannot be made safe in many time-critical scenarios, many of which are crash scenarios. There is simply not enough time to restore situational awareness. So . . . why haven’t Waymo and Cruise killed anyone? They are smart and competent enough to play the odds. They have far better sensor and AEB systems than Tesla. They have tight ODDs, do not report real-world near-crashes or any simulation crash data, and have riders sign NDAs. At some point there will be crash scenarios they can no longer avoid. This leads to his errant claim that an unsuccessful disengagement is always the driver’s fault. That is false. It assumes a human can always regain proper situational awareness.
  • Regarding Mahmood’s statements on driver monitoring and why Tesla’s is so poor: Tesla doesn’t want it to be effective, because it needs crashes to occur in order to learn from them. Hence the 20-second alarm delay. (At some point the more competent systems will preclude the critical crash scenario development and testing, thus ending these folks’ development processes even earlier than the regular failed path would. Quite ironic.)
  • Mahmood’s statement that “hopefully the vehicle never drops below human level performance” is extremely incorrect and reckless. It shows his desperate attempt to excuse himself and the rest of the industry from their own untenable and reckless development and testing processes. The comment assumes one can control ALL the scenarios in an ODD, especially crash scenarios, and that time to regain proper situational awareness is always provided. As ALKS in the UK shows, that’s impossible. To provide their 10-second takeover delay, they need a 10-second gap to other objects at 37 km/h. That is well outside sensor range, and a gap so large that the mode will rarely engage unless the road has few other users. AND this assumes there are no objects popping in from the sides, like animals.
  • I also noticed he avoided saying Waymo, Cruise, etc. are lying about having “fully driverless” L4 systems, by stating he was only discussing systems available to consumers.
  • As for the V-chart and actual best engineering practices, the automotive world is much better at this than Silicon Valley, IT, etc. But not as good as aerospace/DoD when they operate at CMMI Level 5.
  • The final missing part involves that last “10%” (which is more like 90% in this industry). If the system and the development/test approach cannot get you through the most complex and hardest scenarios, the rest will likely be useless. Know anyone working top-down or end-state-backwards at the same time they work Agile bottom-up? Why not?
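To make the ALKS point above concrete, here is a minimal sketch of the arithmetic. The 10-second delay and 37 km/h figures come from the discussion above; the conversion itself is plain arithmetic, not an official ALKS calculation, and the function name is my own.

```python
def gap_distance_m(speed_kph: float, gap_s: float) -> float:
    """Distance in metres covered at speed_kph over a gap_s-second gap."""
    return speed_kph * 1000 / 3600 * gap_s

if __name__ == "__main__":
    # A 10-second gap at 37 km/h works out to roughly 103 metres.
    d = gap_distance_m(37, 10)
    print(f"10 s gap at 37 km/h spans about {d:.0f} m")
```

In other words, the vehicle would need a clear bubble of roughly 100 metres to other objects just to buy the handover time — which is why the mode would rarely engage on a busy road.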

More on my POV below, including how to do this right.

SAE Autonomous Vehicle Engineering magazine editor calling me “prescient” regarding my position on Tesla and the overall driverless vehicle industry’s untenable development and testing approach — (Page 2) https://assets.techbriefs.com/EML/2021/digital_editions/ave/AVE-202109.pdf

Tesla “autopilot” development effort needs to be stopped and people held accountable

· https://medium.com/@imispgh/tesla-autopilot-development-effort-needs-to-be-stopped-and-people-arrested-f280229d2284

Philip Koopman on Autonocast–Ethical, Moral and Professional Chameleon

· https://medium.com/@imispgh/philip-koopman-on-autonocast-ethical-moral-and-professional-chameleon-487f9f2caefc

The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology

· https://medium.com/@imispgh/the-autonomous-vehicle-industry-can-be-saved-by-doing-the-opposite-of-what-is-being-done-now-b4e5c6ae9237

How the failed Iranian hostage rescue in 1980 can save the Autonomous Vehicle industry

· https://imispgh.medium.com/how-the-failed-iranian-hostage-rescue-in-1980-can-save-the-autonomous-vehicle-industry-be76238dea36

My name is Michael DeKort. I am a Navy veteran (ASW-C4ISR) and a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft/constructive DoD/aerospace/FAA simulation, was the software engineering manager for all of NORAD, a software project manager on an Aegis Weapon System baseline, and a C4ISR systems engineer for DoD/DHS and the US State Department (counterterrorism), as well as a Senior Advisory Technical Project Manager for FTI to the Army AI Task Force at CMU NREC (National Robotics Engineering Center).

Autonomous Industry Participation — Air and Ground

- Founder SAE On-Road Autonomous Driving Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Member UNECE WP.29 SG2 Virtual Testing

- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)

- Member SAE G-35, Modeling, Simulation, Training for Emerging AV Tech

- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation

- Member Teleoperation Consortium

- Member CIVATAglobal — Civic Air Transport Association

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee

Presented the IEEE Barus Ethics Award for Post 9/11 DoD/DHS Whistleblowing Efforts
