Michael DeKort
3 min read · Oct 26, 2017


Colin,

I appreciate your taking the time to research and write this article. There are a couple of items I would like to discuss.

· First, I want it to be known that I want to see this technology available as soon as possible. However, it needs to be developed in a safe and efficient manner that will actually succeed. Tesla and many others are using an engineering method that will never result in an autonomous vehicle and will lead to thousands of avoidable casualties, taking lives needlessly while saving few.

· The title’s premise, that the NTSB did not explain how to keep the system from being misused, is not correct. They made two suggestions. And you seem to believe it is the NTSB’s job to do the engineering.

· A vehicle’s geofenced AP capabilities should be “appropriate” for a given area, and the system should not be permitted to engage where they are not. The NTSB suggested doing this by GPS coordinates. You state the first part of that in your article. The questions you ask beyond this are not the NTSB’s job, nor actually the DOT’s, to solve; they are the engineer’s job. The problem here is that there is no minimal verifiable scenario matrix by level and geofenced area. And Tesla, like most AV makers, exaggerates its capabilities, which misleads those downstream. (A minimal sketch of this kind of GPS-based engagement gating follows this list.)

· The other recommendation is that an L2+/L3 monitoring and notification system would help the driver know when to regain control (which you also mention). That system did not exist in the Brown car but does now to some degree. While you again allude to the NTSB having to figure out how to engineer this, in this case the capability should not be used at all. The problem with that solution is that the NTSB, like NHTSA, got its determination that handover/L2+/L3 can be made safe wrong. A plethora of studies, NASA, and even recently Waymo have confirmed this. This goes to the untenable, dangerous, and reckless engineering method I mentioned above. Tesla, like most others (except Toyota and now Waymo), uses public shadow driving for AI, engineering, and test. As I said above, that method will never result in an autonomous vehicle, and it will cause thousands of casualties when these companies move into complex and dangerous scenarios. The answer is to use aerospace-level simulation. (NHTSA made this worse by conducting a fatally flawed study in 2015. It determined that control of steering was regained when the driver grabbed the wheel after being distracted. Unlike many others, NHTSA never looked at the quality of the action taken, the time it takes to gain the proper level of situational awareness to do the right thing, or whether that is even possible.) A sketch of the handover logic at issue also follows this list.

· There is no data to support the claimed 40% improvement in Tesla’s accident rate. Several organizations have asked for it and submitted FOIA requests to NHTSA, and have been denied. Given AAA’s decision to raise rates due to accident issues, I would offer that there is more data to the contrary.
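To make the geofencing point concrete, here is a minimal sketch of what GPS-based engagement gating could look like: the system refuses to engage unless the current GPS fix falls inside an area verified for the requested automation level. This is only an illustration of the NTSB’s suggestion, not Tesla’s or anyone’s actual implementation; the names (ApprovedZone, can_engage) and the polygon test are my own hypothetical choices.

```python
# Hypothetical sketch of GPS-based engagement gating, not any vendor's code.
from dataclasses import dataclass
from typing import List, Tuple

LatLon = Tuple[float, float]  # (latitude, longitude) in decimal degrees

@dataclass
class ApprovedZone:
    """A geofenced area where a given automation level has been verified."""
    name: str
    max_level: int          # highest SAE level verified for this zone
    polygon: List[LatLon]   # closed boundary of the zone

def point_in_polygon(point: LatLon, polygon: List[LatLon]) -> bool:
    """Standard ray-casting test: does the point fall inside the polygon?"""
    lat, lon = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        lat_i, lon_i = polygon[i]
        lat_j, lon_j = polygon[j]
        # Count crossings of the edge by a ray extending from the point.
        if (lat_i > lat) != (lat_j > lat):
            crossing_lon = (lon_j - lon_i) * (lat - lat_i) / (lat_j - lat_i) + lon_i
            if lon < crossing_lon:
                inside = not inside
        j = i
    return inside

def can_engage(position: LatLon, requested_level: int,
               zones: List[ApprovedZone]) -> bool:
    """Refuse engagement unless the GPS fix is inside a zone verified
    for at least the requested automation level."""
    return any(z.max_level >= requested_level and
               point_in_polygon(position, z.polygon)
               for z in zones)
```

Anything realistic would also have to account for GPS accuracy, map freshness, and hysteresis at zone boundaries, but the gating principle, verified area or no engagement, is the same.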
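And here is a similarly hedged sketch of the monitoring-and-notification logic at issue. It encodes exactly what the 2015 NHTSA study missed: hands on the wheel is not the same as regained situational awareness, so a handover must be refused when there is not enough time to rebuild that awareness. The names and the 10-second recovery figure are hypothetical placeholders, not measured values.

```python
# Hypothetical sketch of L2+/L3 handover logic, not any vendor's system.
import enum

class DriverState(enum.Enum):
    ATTENTIVE = "attentive"
    HANDS_ON_ONLY = "hands_on_only"  # gripping the wheel, awareness unknown
    DISTRACTED = "distracted"

# Placeholder: seconds a distracted driver needs to rebuild situational
# awareness. Studies argue it can be far longer, or unattainable at speed.
MIN_AWARENESS_RECOVERY_S = 10.0

def handover_decision(driver: DriverState,
                      seconds_to_system_limit: float) -> str:
    """What should the automation do as it approaches a scenario it cannot
    handle, seconds_to_system_limit seconds away?"""
    if driver is DriverState.ATTENTIVE:
        return "issue takeover request"
    # Hands on the wheel alone does not prove regained awareness: this is
    # the error in the 2015 NHTSA study discussed above.
    if seconds_to_system_limit < MIN_AWARENESS_RECOVERY_S:
        return "no handover: execute minimal-risk maneuver (slow and stop)"
    return "alert now and verify awareness before handing control back"
```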

For much more detail on the points I made, including links to the supporting information I cite, please see my article:

Letter to Congress — Handling of minimum standards for Autonomous industry

https://medium.com/@imispgh/letter-to-congress-handling-of-minimum-standards-for-autonomous-industry-699bbf2ce1ea

Michael DeKort

Professionals for Responsible Mobility

