Jeopardy of Negligence Growing for Autonomous Vehicle Industry

While it is clear that autonomous vehicle technology will have benefits, the path to getting there must be achievable, ethical, and responsible. Unfortunately, most of the industry is using dangerous, extremely counterproductive practices to create this technology, including in products already sold to the public. Those practices are handover, which manifests itself in L2+/L3 vehicles, and public shadow driving for AI development, engineering, and testing. (Some call it safety driving; I will not use that term.) Not only will this process cause avoidable injuries, which will grow as scenarios migrate to more complex and dangerous ones, but most AV makers are exaggerating their systems' capabilities. This creates false confidence among the public, governments, and entities such as insurance companies. And there is no Scenario Matrix that determines the minimal capabilities required for each autonomous level and geofenced condition. This led one insurer to tell me they were "blind" and had no way of assessing risk.

Regarding the creation of this technology: public shadow driving for AI development, engineering, and testing is not only dangerous, as it will lead to thousands of avoidable accidents as the AI scenarios move to the dangerous and complex, it will never actually produce an autonomous vehicle. That is because no one can drive the one trillion miles, or spend the more than $300B, required to stumble and re-stumble on all the scenarios that need to be learned. And regarding those dangerous scenarios: how long do you think the public, governments, insurance companies, and the press will put up with this process after the first child or family casualty? Especially once they realize this has to be done for thousands of scenarios, thousands of times each, for every AV maker and unique vehicle?
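To make the scale concrete, here is a rough back-of-the-envelope sketch of that claim. The trillion-mile figure is from this article; the per-mile cost, fleet size, and annual mileage below are illustrative assumptions chosen to make the arithmetic easy to follow, not sourced figures:

```python
# Back-of-the-envelope check of the public shadow-driving cost claim.
# The mileage figure (one trillion miles) is from the article; the per-mile
# cost and fleet assumptions below are illustrative, not sourced.

TOTAL_MILES = 1e12               # miles claimed necessary to encounter all scenarios
COST_PER_MILE = 0.30             # assumed all-in cost per test mile (driver, vehicle, data), USD
FLEET_SIZE = 1_000_000           # assumed number of test vehicles running continuously
MILES_PER_VEHICLE_YEAR = 40_000  # assumed annual mileage per test vehicle

total_cost = TOTAL_MILES * COST_PER_MILE
years_needed = TOTAL_MILES / (FLEET_SIZE * MILES_PER_VEHICLE_YEAR)

print(f"Total cost: ${total_cost / 1e9:.0f}B")      # Total cost: $300B
print(f"Fleet-years required: {years_needed:.0f}")  # Fleet-years required: 25
```

Even with a million-vehicle fleet, these assumptions imply decades of continuous driving. The point is that no plausible set of assumptions makes the brute-force approach tractable.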

This brings us to the negligence part, even reckless disregard and gross negligence. Every study I have seen, and there are dozens, that examined the time it takes to regain situational awareness after being distracted has determined that, regardless of the driver monitoring and notification system used, handover cannot be made reliably and consistently safe. (NHTSA performed a 2015 study that came to the opposite conclusion, but only because it defined taking back control as the act of grabbing the wheel. It did not look at the quality of the actions taken, the situational-awareness time required to do the right thing, or whether monitoring and notification systems were effective. If someone knows of a study that researches this correctly and came to a different conclusion, please let me know.) In addition to those studies, Toyota and Waymo have declared that these practices should not be used because they cannot be made safe. All of this leads to growing civil and criminal jeopardy for anyone involved in producing, insuring, using, or approving these vehicles and the associated dangerous technology. If you are sued or charged, how will you defeat the arguments made using those studies, and entities like Toyota and Waymo, as examples? What due diligence did you do to ensure you were doing the right thing? What might you have known and ignored? Or had the responsibility to know?
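To illustrate why the takeover window matters, here is a simple calculation of how far a vehicle travels while a distracted driver regains situational awareness. The takeover times used are illustrative values in the range handover studies commonly report, not figures from any one study cited here:

```python
# Distance traveled during a handover at highway speed, for a range of
# illustrative takeover times. Situational-awareness recovery times on the
# order of several seconds or more are commonly reported in handover studies;
# the exact values below are assumptions for illustration.

SPEED_MPH = 65
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

feet_per_second = SPEED_MPH * FEET_PER_MILE / SECONDS_PER_HOUR  # ~95 ft/s

for takeover_seconds in (3, 10, 25):
    distance = feet_per_second * takeover_seconds
    print(f"{takeover_seconds:>2} s takeover at {SPEED_MPH} mph: ~{distance:,.0f} ft")

# Output:
#  3 s takeover at 65 mph: ~286 ft
# 10 s takeover at 65 mph: ~953 ft
# 25 s takeover at 65 mph: ~2,383 ft
```

Even a short lapse means hundreds of feet traveled before the driver can act correctly, which is why a notification system that merely prompts the driver to grab the wheel does not make handover safe.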

There is, of course, a solution to this. It has three parts. The first is to not use L2+ (steering) or L3. The second is to use aerospace-level simulation to replace most public shadow driving for AI development, engineering, and testing. The third is to create a Scenario Matrix that delineates the minimal acceptable capability by autonomous level and geofenced environment, and is then used to verify those capabilities exist. (Toyota and Waymo use simulation, rather than public shadow driving, as their main path for AI and engineering.)
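Since no Scenario Matrix exists today, here is a minimal sketch of what one might look like as a data structure. The levels, environments, scenarios, and required capabilities below are hypothetical placeholders chosen for illustration; an actual matrix would have to be built by the industry and its regulators:

```python
# A minimal sketch of a Scenario Matrix: minimal required capabilities keyed
# by autonomy level and geofenced environment. All entries are hypothetical
# placeholders for illustration.

from dataclasses import dataclass

@dataclass(frozen=True)
class ScenarioRequirement:
    scenario: str             # e.g., "unprotected left turn"
    required_capability: str  # the minimum demonstrable behavior
    verified: bool = False    # set True once proven (ideally in simulation)

SCENARIO_MATRIX = {
    ("L4", "urban, geofenced, dry daylight"): [
        ScenarioRequirement("unprotected left turn", "yield until gap >= safe threshold"),
        ScenarioRequirement("pedestrian mid-block crossing", "stop within available distance"),
    ],
    ("L4", "highway, geofenced, rain"): [
        ScenarioRequirement("stopped vehicle in lane", "detect and avoid at full speed"),
    ],
}

def unverified(matrix):
    """List every (level, environment, scenario) not yet verified."""
    return [
        (level, env, req.scenario)
        for (level, env), reqs in matrix.items()
        for req in reqs
        if not req.verified
    ]

print(unverified(SCENARIO_MATRIX))
```

An insurer could query such a matrix to assess risk per level and geofence, which is exactly what the insurer quoted above said they could not do today.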

For much more information on this, including links to the information I cited, please see my article –

Letter to Congress — Handling of minimum standards for Autonomous industry

My name is Michael DeKort. I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, on the Aegis Weapon System, at NORAD, and on C4ISR for DHS. I also worked in commercial IT, including cybersecurity. I received the IEEE Barus Ethics Award for whistleblowing regarding the DHS/USCG Deepwater program post-9/11: http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=4468728

I am leading an effort to pull together a group of autonomy and mobility professionals who wish to ensure that the approach to gaining this technology can actually be achieved, and is done in a responsible and safe manner.

Professionals for Responsible Mobility

Please let me know if you would like to join our group.
