If the new NHTSA investigation of Tesla were legitimate, the industry itself would be under investigation
The story — Thirty Tesla crashes linked to assisted driving system under investigation in US — https://www.theguardian.com/technology/2021/jun/18/thirty-tesla-crashes-linked-to-assisted-driving-system-under-investigation-in-us
Right off the bat, it appears we do not know exactly which parts of the “advanced driver assist” system are being investigated. Is it Automatic Emergency Braking (AEB), is it “Autopilot” (AP) and “Full Self-Driving” (FSD), or all of it? And is this limited to production systems, or does it include systems in development?
While Tesla is by far the most egregious actor in this industry, it is not alone. Many in the industry have problems with AEB and stationary and crossing objects. And virtually everyone uses the same untenable fundamental autonomous system development method as Tesla. If the investigation were thorough, the entire industry would be under investigation, not just the most egregious actor.
Automatic Emergency Braking
In most of the crashes the industry ignores the AEB issues. Investigators go right for whether AP was engaged, and if it wasn’t, they move on, rarely asking why AEB failed regardless of whether the human or the machine was driving. (This includes the regulators and the press.) In most Tesla crashes, except where speeds over 80 mph are involved (where AEB does not function), it failed. I do not believe this is some form of mass ignorance by the industry, especially the press and regulators. It is by design. (Props to Sam Abuelsamid for mentioning this issue, but only with regard to Tesla.)
Why does AEB fail? Because Tesla and many in the industry use radar with poor fidelity. (Yes, Tesla negligently dropped radar. However, that was recent, after most of these crashes.) The poor fidelity is due to the use of just a couple of transmitters and receivers. (Imagine a LiDAR with a couple of massive light beams and the unit not spinning.) The fewer you use, the wider the beam pattern is. That wider beam pattern intersects a far larger area and far more objects at one time. (These radars do not physically rotate or scan, so no help there.) When the beam captures an object or objects in the right lane and to the right of that lane, like a bridge pylon, the system does not know exactly where the object is. To avoid false braking, it ignores them.
In most vehicles these radars are only used in systems where the human driver does not cede steering control. That is why those vehicles are not having the accidents, yet. While the technology to do this right has existed in DoD and elsewhere for decades, the industry has finally started scaling it down and providing it at a price point where AV makers can “afford” it. They should adopt it immediately. Tesla, on the other hand, won’t be using it because it ditched radar entirely, and that was after evaluating the Arbe dense-array radar and not using LiDAR.
(On LiDAR being a panacea: most LiDARs are not FMCW. That means they operate frame to frame and do not detect an object’s speed or build tracks the way radar does, so they are more useful for navigation than for object position and speed. Having said that, AV makers could process this part themselves. That, however, is a chore because the data set is massive, and at the roughly 40 Hz they should be running, it is quite a processing load.)
At the very least, NHTSA should mandate that the stationary/crossing-object issue be resolved wherever drivers are permitted to cede steering to the vehicle. Keep in mind, NHTSA and the NTSB have ignored all of this across the entire industry to date.
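To make the fidelity point concrete, here is a rough back-of-envelope sketch using the standard beamwidth ≈ wavelength/aperture approximation for a uniform antenna array. The element counts are illustrative assumptions, not the specs of any particular radar; the point is simply how quickly a sparse array loses the ability to tell an in-lane object from a bridge pylon beside the road.

```python
import math

def half_power_beamwidth_deg(n_elements: int, spacing_wavelengths: float = 0.5) -> float:
    """Approximate azimuth half-power beamwidth of a uniform linear array,
    using the classic ~0.886 * wavelength / aperture rule of thumb."""
    aperture_wavelengths = n_elements * spacing_wavelengths
    return math.degrees(0.886 / aperture_wavelengths)

def beam_footprint_m(range_m: float, beamwidth_deg: float) -> float:
    """Approximate lateral width the beam covers at a given range."""
    return 2.0 * range_m * math.tan(math.radians(beamwidth_deg) / 2.0)

# Illustrative element counts: a sparse legacy automotive radar vs. a dense imaging array.
for n in (3, 48):
    bw = half_power_beamwidth_deg(n)
    print(f"{n:2d} elements: beamwidth ~{bw:4.1f} deg, "
          f"footprint at 100 m ~{beam_footprint_m(100.0, bw):5.1f} m")
```

With only a few elements the beam footprint at highway braking distances is tens of meters wide, so a stationary pylon a few meters to the right of the lane is angularly indistinguishable from something in the lane. That is exactly why these systems suppress stationary returns rather than brake for them.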
“Autopilot” and “Full Self-Driving”
In this area the industry has a massive blind spot (pun intended) because pretty much everyone uses the same untenable development process. That means they have to find some way of making Tesla guilty without the world figuring out they are reckless hypocrites. Of course, Tesla makes this easy because it has the most accidents and deaths. Why? Because its ODD (Operational Design Domain) is the entire USA, it has a worse sensor system than most others, it uses its customers rather than employees as guinea pigs, and it has about one million vehicles on the road versus under 2,000 for all of the rest combined. At some point, though, every AV maker that wants to produce driverless vehicles for public roads will have to widen its ODD and literally sacrifice its human test subjects so the many scenarios can be learned. While that bottom-up approach and a better sensor system will cut down on the harm to humans, they will in no way eliminate it.
What is the current development approach, and why is it untenable from a time, cost, and safety POV?
The current approach relies on the real world for the vast majority of development and testing. Why isn’t that mostly simulation? Because the perfect Silicon Valley storm thinks gaming technology is the be-all and end-all of simulation technology. Since gaming technology has significant real-time and physics-modeling fidelity gaps, we go right back to the real world.
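As one small, hedged illustration of the physics-fidelity point (the integrator, stiffness, damping, and step sizes below are illustrative assumptions, not a claim about any specific game engine), consider integrating a stiff unsprung-mass/tire mode with a frame-rate-sized explicit step versus a small dedicated physics step:

```python
import math

# Illustrative stiff vertical mode: ~40 kg unsprung mass on a ~200 kN/m tire spring.
MASS, STIFFNESS, DAMPING = 40.0, 200_000.0, 1_700.0   # kg, N/m, N*s/m (assumed values)

def simulate(dt: float, duration: float = 1.0) -> float:
    """Explicit-Euler integration of m*x'' + c*x' + k*x = 0 starting from a 1 cm bump.
    Returns the largest displacement magnitude seen over the run."""
    x, v, peak = 0.01, 0.0, 0.01
    for _ in range(int(duration / dt)):
        a = -(STIFFNESS * x + DAMPING * v) / MASS
        x, v = x + v * dt, v + a * dt
        peak = max(peak, abs(x))
    return peak

for rate in (60.0, 1000.0):   # rendering-style step vs. dedicated physics step
    print(f"{rate:6.0f} Hz step: peak |x| over 1 s = {simulate(1.0 / rate):.3e} m")
```

At the 60 Hz step the response blows up numerically, while at 1 kHz it decays as the physical system would. The point is not that any given engine does exactly this, only that physics fidelity is tightly coupled to step size and integration scheme, and engines tuned for frame rate rather than model accuracy inherit that gap.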
Why is the process holistically untenable?
It is a myth that public shadow and safety driving can create a legitimate autonomous vehicle, and that the lives the process takes are necessary and for the greater good. It is impossible to drive the trillion miles, or spend the $300B, required to stumble and restumble on all the scenarios necessary to complete the effort. The process also harms people for no reason. The first safety issue is handover: the time to regain proper situational awareness and do the right thing, especially in time-critical scenarios, cannot be provided. Another dangerous area is learning accident scenarios. AV makers will have to run thousands of accident scenarios thousands of times to accomplish this. That will cause thousands of injuries and deaths.
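A rough sanity check of the scale involved, using illustrative fleet and cost figures (the fleet size, annual mileage, and per-mile cost below are my placeholder assumptions, chosen only to show the order of magnitude):

```python
# Back-of-envelope scale check for "driving your way" to autonomy.
# Every figure below is an illustrative assumption, not measured data.
TARGET_MILES = 1_000_000_000_000      # ~1 trillion miles, per the estimate above
FLEET_SIZE = 1_000_000                # vehicles gathering shadow/safety-driving miles
MILES_PER_VEHICLE_PER_YEAR = 15_000   # typical annual mileage
COST_PER_MILE_USD = 0.30              # rough all-in cost of a supervised test mile

miles_per_year = FLEET_SIZE * MILES_PER_VEHICLE_PER_YEAR
years_needed = TARGET_MILES / miles_per_year
total_cost_usd = TARGET_MILES * COST_PER_MILE_USD

print(f"Fleet gathers {miles_per_year / 1e9:.0f}B miles per year")
print(f"Years to reach one trillion miles: {years_needed:.0f}")
print(f"Cost at ${COST_PER_MILE_USD:.2f}/mile: ${total_cost_usd / 1e9:.0f}B")
```

Even with a Tesla-sized fleet the timeline runs to decades and the cost lands in the hundreds of billions, before counting the scenarios that can only be “found” by crashing.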
How is all of this resolved? Flip the paradigm with aerospace/DoD/FAA simulation technology. Use the real world to inform and validate proper simulation.
Development vs Production
Only a handful of companies are using the public/customers as “safety drivers.” (They all use the public around them as guinea pigs.) That means NHTSA and the NTSB may either be legally stuck with regard to development systems or are using that as a cop-out. In addition to Tesla, there are comma.ai and Wayve in the UK that I am aware of. Beyond that, NHTSA has punted on all of this before, including recently when Tesla announced it is getting rid of radar. The NTSB makes this worse by stating, in the Brown and Banner crash investigation reports, that Tesla should not allow the cars to engage AP unless the relevant scenarios have been learned for an ODD. How does Tesla get to that point without exercising machine-learning trial and error and killing a bunch of folks? See the inane and insane dilemma here, and how clueless or ethically challenged the NTSB is?
Where we go from here
NHTSA and the NTSB have been part of the problem all along. The NTSB seems to have more of a conscience but doubles down on trying to improve the current development approach rather than replace it, buying into the hype that the current development method is the right or only one and that humans must be sacrificed now to save more later. (Which, as I explained, is a myth and will never happen.) USDOT VOICES is the only group that gets all of this. They are so small, though, that they don’t have enough clout in USDOT to overcome the others. Beyond that, NHTSA ties the NTSB’s hands. NHTSA does this under the ridiculous argument that any safety regulation, especially something testable, gets in the way of innovation. Of course, that is nonsense, as history has shown the opposite: regulation levels the playing field and reduces the desire to hype and be reckless. Interestingly, Ford and some others are now clamoring for regulation to rein in Tesla. But again, they would like to carry on with the same reckless use of human guinea pigs and simply harm people later.
The solution, as I have said here, is a paradigm flip. Literally do the opposite of everything we are doing now. Is this new round of investigations the beginning of that flip? Possibly, though not likely. If it were, NHTSA would call for a moratorium on the current development efforts while its year-or-longer process drags out. That leaves us with the unfortunate reality that we don’t get where we need to be until the first child or family is killed, backbones are fortified, and the immense wagon-circling and history-rewriting starts.
More detail here
The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology
USDOT introduces VOICES Proof of Concept for Autonomous Vehicle Industry-A Paradigm Shift?
SAE Autonomous Vehicle Engineering Magazine — Simulation’s Next Generation (featuring Dactle) — https://www.sae.org/news/2020/08/new-gen-av-simulation
Tesla “autopilot” development effort needs to be stopped and people held accountable
NHTSA Downplays Tesla Loss of Radar Safety Issues in Expedited Process Filing
The NTSB frets over human Guinea pigs then chastises and punts to the even more reckless NHTSA
Tesla ditching radar and Elon’s explanation show us how bad and how deadly this system is
Elon Musk is now telling us a legitimate “Autopilot” and “Full Self-Driving” will never exist
Tesla Director of Autopilot Software says Elon’s statement about “Autopilot” capabilities does not match engineering reality
My name is Michael DeKort — I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, worked on the Aegis Weapon System, and worked on C4ISR for DHS.
Industry Participation — Air and Ground
- Founder SAE On-Road Autonomous Driving Simulation Task Force
- Member SAE ORAD Verification and Validation Task Force
- Member UNECE WP.29 SG2 Virtual Testing
- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)
- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation
- Member CIVATAglobal — Civic Air Transport Association
- Stakeholder for UL4600 — Creating AV Safety Guidelines
- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee
- Presented with the IEEE Barus Ethics Award for Post-9/11 DoD/DHS Whistleblowing Efforts