Shahin Farshchi from Lux Exemplifies the Industry's Extremely Counter-Productive and Needlessly Dangerous Approach to Developing Autonomous Vehicles

Michael DeKort
3 min read · Apr 3, 2019


Please refer to this article in Business Insider — It’s ‘inevitable’ that a self-driving car will kill someone. Here’s why a VC thinks we should be investing in them anyway.

This article clearly shows the unfortunate thought process most of this industry has regarding how autonomous vehicles should be developed and tested. The path they are on can never get close to creating a legitimate autonomous vehicle. Worst of all, it will needlessly take thousands more lives as the industry tries and fails.

I will explain the issues and their resolution by responding to several quotes from the article.

“All the individual components of this problem have been dealt with.”

· Which sensors, by themselves or even in unison, can handle all extreme weather conditions and combinations of them? Where will the redundancy come from?

· Which system has proven it can drive itself in bad weather and with significant loss of traction?

· How will you get through all the scenarios that need to be learned using public shadow/safety driving? How will you overcome the trillion miles, the $300B per AV maker, and the thousands of deaths that will occur when thousands of accident scenarios are run thousands of times each? (See next section)

Deaths that result from failures in autonomous systems are tragedies, Farshchi, a partner at Lux, said in an interview with Business Insider on Monday. But accidents like those involving Uber and Boeing “would not dissuade us or scare us or make us less interested in investing in these kinds of technologies.”

· This assumes that public shadow and safety driving is the best or only way to develop and test driverless vehicles, and that the lives it takes are lost for the greater good. This could not be more wrong, counter-productive or reckless.

· It is impossible to drive the one trillion miles, or spend over $300B per AV maker, needed to stumble and re-stumble on all the scenarios necessary to complete the effort (a rough back-of-the-envelope calculation of those figures follows this list). Many of those are accident scenarios no one will want you to run once, let alone thousands of times. Also, handover cannot be made safe for most complex scenarios by any monitoring and notification system, because they cannot provide the time needed to regain proper situational awareness and do the right thing the right way.

· The Boeing issue is nothing at all like what is happening here. That is a case of people skirting proper process. Had standard process been followed, there would have been simulator training and checklist data. Public shadow and safety driving is a fruitless and needlessly dangerous process that should be replaced by mostly simulation.

· The solution is to use aerospace/DoD-level simulation, informed and validated by the real world and test tracks, to develop and test 99.9% of this. (Note: the reason aerospace/DoD simulation technology is needed is that the vast majority of products being used or created in this space have significant real-time and model fidelity issues that will lead to flawed outcomes, most of which will be hidden until an analogous real-world tragedy occurs. A toy illustration of how such fidelity errors stay hidden follows below, after the cost sketch. I would be glad to demonstrate this.)
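To make the miles-and-cost claim above concrete, here is a minimal back-of-the-envelope sketch. The per-mile cost and the fleet assumptions are my own illustrative figures (they are not from the article); only the one-trillion-mile and $300B numbers come from the argument above.

```python
# Back-of-the-envelope sketch of the shadow/safety-driving cost claim.
# The cost-per-mile and fleet figures are illustrative assumptions, not measured values.

MILES_NEEDED = 1_000_000_000_000   # ~1 trillion miles to encounter/re-encounter the needed scenarios
COST_PER_MILE_USD = 0.30           # assumed all-in cost per shadow/safety-driven mile

total_cost = MILES_NEEDED * COST_PER_MILE_USD
print(f"Estimated cost per AV maker: ${total_cost / 1e9:.0f}B")  # -> $300B

# Assume a (generous) fleet of 100,000 test vehicles averaging 40 mph around the clock:
fleet_size = 100_000
miles_per_vehicle_per_year = 40 * 24 * 365
years = MILES_NEEDED / (fleet_size * miles_per_vehicle_per_year)
print(f"Years of continuous driving required: {years:.0f}")  # -> roughly 29 years
```

Even with those deliberately generous assumptions, the timeline and cost are implausible, which is the point of the argument above.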
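On the simulation fidelity point, here is a deliberately simple toy illustration (entirely my own example, not the author's analysis or any specific product): the same emergency stop integrated with a coarse time step versus a fine one yields visibly different stopping distances, the kind of error that looks negligible in benign runs but matters in a near-miss scenario.

```python
# Toy illustration of time-step/model fidelity: the same emergency stop,
# integrated coarsely vs. finely, gives different stopping distances.
# Numbers and the integrator choice are illustrative assumptions only.

def stopping_distance(v0_mps: float, decel_mps2: float, dt: float) -> float:
    """Forward-Euler integration of a constant-deceleration stop."""
    v, x = v0_mps, 0.0
    while v > 0.0:
        x += v * dt           # advance position using the current speed
        v -= decel_mps2 * dt  # then reduce speed for the next step
    return x

v0, a = 30.0, 8.0  # ~67 mph initial speed, hard braking
coarse = stopping_distance(v0, a, dt=0.1)    # low-fidelity time step
fine = stopping_distance(v0, a, dt=0.001)    # higher-fidelity time step
exact = v0**2 / (2 * a)                      # closed-form answer: 56.25 m

print(f"coarse dt=0.1s : {coarse:.2f} m")   # ~57.76 m
print(f"fine   dt=1ms  : {fine:.2f} m")     # ~56.27 m
print(f"exact          : {exact:.2f} m")
# The ~1.5 m gap between the coarse result and the exact answer is the kind of
# error that is invisible in routine runs but decisive in a near-miss scenario.
```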

Please find more detail in my articles below:

SAE Autonomous Vehicle Engineering Magazine — End Public Shadow Driving

The Hype of Geofencing for Autonomous Vehicles

Common Misconceptions about Aerospace/DoD/FAA Simulation for Autonomous Vehicles

Remote Control for Autonomous Vehicles — A far worse idea than the use of Public Shadow “Safety” Driving




Written by Michael DeKort

Non-Tribal Truth Seeker-IEEE Barus Ethics Award/9–11 Whistleblower-Aerospace/DoD Systems Engineer/Member SAE Autonomy and eVTOL development V&V & Simulation
