Robots need to take a Driver’s Test

The impetus for this article is a story about calls to prosecute autonomous vehicle developers if their systems are shown not to perform properly.

https://www.theregister.co.uk/2017/11/16/prosecute_driverlesscar_devs_cycling_uk/

The Dream

· AV makers make really cool things. The companies they work for are incredibly successful.

· The public, governments, automakers and others are amazed by the apps and games they see, especially those involving vehicles, like Grand Theft Auto. They believe the developers of these products can make anything, and they trust these AV makers to use public roads to engineer and test their vehicles.

· Computing power has gotten to the point where it can handle machine learning.

· We have this belief that industry, when left to its own devices and unencumbered by government, will find the best possible products and practices.

· AV makers tout the incredible progress they have made and tell us we will have fully autonomous vehicles very soon, even by the end of this year.

· To promote this technology, states and local communities lower regulatory bars to entice the AV makers to build their products in their backyards.

The Reality

· While most of the developers making AVs are incredibly smart and hardworking, they largely come from commercial IT, where they usually work on products nowhere near as complex and massive as what is required here. They do not deal with most of the specific engineering or technology involved, especially around exception handling, and their engineering practices are woefully inadequate for the task. Largely unaware of all of this, and brimming with a dangerous combination of arrogance, a sense of invincibility and not knowing what they don’t know, these folks jumped right in.

· AV makers hype their current capabilities and how soon they will have a fully autonomous vehicle.

· The public, governments, automakers and others buy into the hype. They have no problem believing these AV developers can do this in their sleep.

· These developers lean on AI to compensate for their lack of experience, and because, where it does work, it is faster than the usual engineering and coding methods.

· AI has significant flaws. It can be fooled by noise, it unexpectedly goes off into left field, and it takes a massive amount of repetition to learn.

· The AV makers use public streets to train and test the AI because they believe simulation isn’t up to it and that one trillion miles, at an expense of over $300B, is required to complete the necessary engineering and testing, which is impossible to do (a rough calculation of that scale follows this list). They also don’t tell you that handover (L2/L3) and public shadow driving for AI cannot be made reliably safe, and that at some point they would have to drive thousands of dangerous and crash scenarios, thousands of times each, to learn those scenarios. This will cause casualties, all for a process that will literally never lead to an autonomous vehicle.

· All of this will lead to several very avoidable tragedies: 1) thousands of avoidable casualties, including the avoidable first death of a child or family; 2) the collapse of the industry and the bankruptcy of most of the AV makers; and 3) the public, governments, insurance companies, press and others realizing they were duped by an industry that misled them about its capabilities and what it really takes to create these vehicles. That will lead to mistrust and much more litigation, regulation and delay than the industry would have faced if it had self-policed.
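
To make the scale of that public-road approach concrete, here is a rough back-of-the-envelope calculation. The one-trillion-mile and $300B figures are the ones cited above; the cost per mile is simply implied by them, and the fleet size and daily mileage used to estimate a timeline are purely illustrative assumptions, not industry data.

```python
# Rough scale of the public shadow driving approach.
# Total miles and total cost are the figures cited above; the fleet size
# and miles per vehicle per day are illustrative assumptions only.

TOTAL_MILES = 1_000_000_000_000   # one trillion miles
TOTAL_COST_USD = 300_000_000_000  # over $300B

cost_per_mile = TOTAL_COST_USD / TOTAL_MILES  # ~$0.30/mile, implied by the two figures

# Hypothetical fleet: 10,000 test vehicles averaging 500 miles per day each.
FLEET_SIZE = 10_000
MILES_PER_VEHICLE_PER_DAY = 500

days_needed = TOTAL_MILES / (FLEET_SIZE * MILES_PER_VEHICLE_PER_DAY)
years_needed = days_needed / 365

print(f"Implied cost per mile: ${cost_per_mile:.2f}")
print(f"Years to drive one trillion miles with this fleet: {years_needed:,.0f}")
```

Even with a generously sized assumed fleet, the timeline runs to centuries, which is why the mileage requirement above is impossible to meet on public roads.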

The Solution

· Handover (L2/L3) and public shadow driving should not be used.

· Utilize aerospace engineering practices.

· Simulation — Aerospace-level simulation can be used to train and test 99% of what is required here. Where the simulation needs to be validated or cannot handle certain scenarios, test tracks or controlled public shadow driving would be used.

· Scenario Matrix/Taxonomy — A minimal standards set, or Scenario Matrix/Taxonomy, should be created to ensure the AV has the right capabilities for L4 and L5, including geofenced situations so the technology can be rolled out incrementally.

· Robot Driver’s Test — These systems should be tested to ensure they meet the minimal capabilities, specifically to ensure they actually are as good as, or various levels better than, a human driver. A sketch of what such a scenario-based check could look like follows this list.
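
To illustrate the last two points, here is a minimal sketch of what one row of a Scenario Matrix/Taxonomy and a pass/fail Robot Driver’s Test check against a human baseline could look like. The field names, scenario, metrics and thresholds are hypothetical illustrations, not an actual standard; the same scenario records could drive both the simulation runs and the final test.

```python
from dataclasses import dataclass

# Hypothetical row of a Scenario Matrix/Taxonomy.
# Field names and values are illustrative only, not from any real standard.
@dataclass
class Scenario:
    name: str                 # e.g. "unprotected left turn, heavy rain"
    operational_domain: str   # geofenced area the scenario applies to
    human_crash_rate: float   # crashes per million encounters (human baseline)
    required_margin: float    # how many times better than a human the AV must be

# Hypothetical result reported from simulation or test-track runs.
@dataclass
class TestResult:
    scenario: Scenario
    av_crash_rate: float      # crashes per million tested encounters

def passes_robot_drivers_test(result: TestResult) -> bool:
    """Pass only if the AV beats the human baseline by the required margin."""
    threshold = result.scenario.human_crash_rate / result.scenario.required_margin
    return result.av_crash_rate <= threshold

# Example with made-up numbers.
left_turn = Scenario(
    name="unprotected left turn, heavy rain",
    operational_domain="downtown geofence",
    human_crash_rate=50.0,   # illustrative
    required_margin=10.0,    # AV must be 10x better than a human
)
print(passes_robot_drivers_test(TestResult(scenario=left_turn, av_crash_rate=3.0)))  # True
```

Each geofenced operational domain would get its own set of scenario rows, which is what allows the technology to be rolled out and approved incrementally.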

For much more detail on this, please see several of my other articles:

Autonomous Levels 4 and 5 will never be reached without Simulation vs Public Shadow Driving for AI

Who will get to Autonomous Level 5 First and Why
