Consumer Reports: Tesla "Autopilot" Less Competent Than a Human Driver, and Unsafe

Michael DeKort
3 min read · May 22, 2019


From the article — Tesla’s Navigate on Autopilot Requires Significant Driver Intervention

https://www.consumerreports.org/autonomous-driving/tesla-navigate-on-autopilot-automatic-lane-change-requires-significant-driver-intervention/

Consumer Reports just tested Tesla's latest "Autopilot" feature and found it to be incompetent and unsafe. This confirms my concern that Tesla has gone from reckless to negligent in trying to meet Elon Musk's impossible goal of reaching L4 this year and L5 next year. To do that, they would have to learn complex and dangerous scenarios this year. That means running thousands of accident scenarios thousands of times each.

“In practice, we found that Navigate on Autopilot lagged far behind a human driver’s skill set: The feature cut off cars without leaving enough space and even passed other cars in ways that violate state laws, according to several law enforcement representatives CR interviewed for this report. As a result, the driver often had to prevent the system from making poor decisions.

“The system’s role should be to help the driver, but the way this technology is deployed, it’s the other way around,” says Jake Fisher, Consumer Reports’ senior director of auto testing. “It’s incredibly nearsighted. It doesn’t appear to react to brake lights or turn signals, it can’t anticipate what other drivers will do, and as a result, you constantly have to be one step ahead of it.”

It is encouraging to see CR ask for proof of competency before these systems are made available for public use, especially through simulation. The next question is whether they are aware that 99.9% of the required development and testing can be conducted using aerospace/DoD simulation technology and systems/safety engineering practices. The reason is that it is impossible to drive the one trillion miles, or spend over $300B, needed to stumble and restumble on all the scenarios necessary to complete the effort. Many of those are accident scenarios no one will want to run once, let alone thousands of times. And handover cannot be made safe for most complex scenarios by any monitoring and notification system, because no such system can provide the time needed to regain proper situational awareness and do the right thing the right way.
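To put the scale of that claim in perspective, here is a rough back-of-envelope calculation. It is a minimal sketch only: the one-trillion-mile figure is from this article, while the per-mile cost and fleet assumptions are mine, chosen to be consistent with the $300B figure, not numbers from the article.

```python
# Back-of-envelope scale check for public shadow driving.
# MILES_NEEDED comes from the article; the cost and fleet
# figures below are illustrative assumptions only.

MILES_NEEDED = 1_000_000_000_000        # ~1 trillion miles (article's figure)
COST_PER_MILE = 0.30                    # assumed operating cost per mile (USD)
FLEET_SIZE = 100_000                    # assumed test fleet size
MILES_PER_VEHICLE_PER_YEAR = 100_000    # assumed heavy utilization per vehicle

total_cost = MILES_NEEDED * COST_PER_MILE
years_needed = MILES_NEEDED / (FLEET_SIZE * MILES_PER_VEHICLE_PER_YEAR)

print(f"Estimated cost: ${total_cost / 1e9:.0f}B")  # -> Estimated cost: $300B
print(f"Years at fleet scale: {years_needed:.0f}")  # -> Years at fleet scale: 100
```

Under even these generous assumptions, a 100,000-vehicle fleet driving 100,000 miles per vehicle per year would still need a century to accumulate the miles, which is why stumbling on scenarios in the real world cannot complete the effort.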

(Note: gaming-architecture-based simulation systems have significant real-time, model-fidelity, and scaling issues that will lead to poorly trained systems, false confidence, and real-world tragedies.)

More details here:

Tesla is exposing Autonomous Vehicle Industry’s avoidably Dangerous and Impossible Engineering Approach

· https://medium.com/@imispgh/tesla-is-exposing-autonomous-vehicle-industrys-avoidably-dangerous-and-impossible-engineering-13c0450bb6e8

Using the Real World is better than Proper Simulation for Autonomous Vehicle Development — NONSENSE

· https://medium.com/@imispgh/using-the-real-world-is-better-than-proper-simulation-for-autonomous-vehicle-development-nonsense-90cde4ccc0ce

The Autonomous Vehicle Podcast — Featured Guest

SAE Autonomous Vehicle Engineering Magazine: End Public Shadow Driving

Common Misconceptions about Aerospace/DoD/FAA Simulation for Autonomous Vehicles

The Hype of Geofencing for Autonomous Vehicles

How Driverless Vehicle Makers Should Prove their Technology Works

Written by Michael DeKort

Non-Tribal Truth Seeker. IEEE Barus Ethics Award. 9/11 Whistleblower. Aerospace/DoD Systems Engineer. Member, SAE Autonomy and eVTOL Development, V&V and Simulation.
