UN and UK propose their first driverless standard - it shows how futile the industry's development approach is

The UK government released a call for comments this week regarding its future implementation of an Automated Lane Keeping System (ALKS). The approach is much like the one the UN stated it wanted to pursue several months back.


There are two issues I would like to discuss here. The first is the term being used; the second, and much more important, is the repercussions of implementing what they are suggesting. The latter is so significant it lays bare the futility of relying on public shadow and safety driving for the development and testing of these systems.

Regarding the term “automated lane keeping system”: this system is an automated function, not a simple L1/ADAS driver assist. It drives the vehicle within the confines of the lane, making it an L2/L3 feature. Personally, I think the term is too easily confused with the lane keeping capability that merely assists the driver and will cause unnecessary confusion. But that is the least of my worries.

Now let’s get to the real issue: how these organizations want handover of the system back to the “safety driver” to work. They stipulate that a minimum of 10 seconds is needed for the system to hand control back to the driver when it cannot handle a scenario properly.

“The transition demand will be made obvious to the driver via the user interfaces within the vehicle. The infotainment system (see Part 5), if engaged, will automatically be suspended as soon as a transition demand is issued. No more than four seconds after the start of a transition demand, if no response has been detected from the driver, the system will escalate its warnings with a mixture of auditory and haptic (e.g. vibration of the driver’s seat) inputs. If at this point the driver retakes control (see below) then manual driving is resumed. The driver will have a minimum of 10 seconds to respond to a transition demand.”
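The escalation timeline in that passage can be sketched as a simple state function. This is only an illustrative sketch of the quoted logic; the 4-second escalation point and 10-second minimum response window come from the quote, but the function and state names are my own assumptions, not anything from the standard.

```python
# Illustrative sketch of the quoted ALKS transition-demand escalation logic.
# Timings (4 s escalation, 10 s minimum response window) are from the quoted
# text; all names here are hypothetical.

def transition_demand_state(elapsed_s: float, driver_responded: bool) -> str:
    """State of the system `elapsed_s` seconds after a transition demand is issued."""
    if driver_responded:
        return "manual_driving"      # driver retook control; manual driving resumes
    if elapsed_s < 4.0:
        return "visual_warning"      # initial demand via in-vehicle user interfaces
    if elapsed_s < 10.0:
        return "escalated_warning"   # auditory and haptic warnings (e.g. seat vibration)
    return "post_window"             # the minimum 10 s response window has elapsed

print(transition_demand_state(2.0, False))   # visual_warning
print(transition_demand_state(6.0, False))   # escalated_warning
print(transition_demand_state(6.0, True))    # manual_driving
```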

First, the good part. Finally, someone instrumental to the industry's success is recognizing that handover can only be safe and effective if the human is given time to properly regain situational awareness before taking back control of the vehicle. Most experts stipulate that time frame is 3 to 45 seconds depending on the scenario. (Some strongly suggest it is 6 to 45 seconds.) Ten seconds is clearly on the low end of that scale. But let us set that aside and look at what it means. In press articles, the ODD where this would be utilized is a divided road or highway with speeds up to 70 mph. Others have stated it is 40 mph tops. Ten seconds at 40 mph is ~600 ft. At 70 mph it is ~1000 ft, just shy of a quarter mile. That distance is how far away any vehicle that could interact with the primary vehicle has to be for the system to employ the automated lane keeping capability.

There are two significant issues here: the likelihood of that situation occurring, and whether the AV system is capable of handling it. Regarding the likelihood: 600 ft or more between the primary vehicle and other vehicles that could act upon or with it is a long way, and the odds of this occurring are slim. The most likely time for it to occur is late night and early morning through sunrise, and those times of day bring lighting conditions camera sensors struggle with or flat out do not work in. The larger issue is that 600 ft or more is outside the operational envelope of many radar and camera sensors. Factor in 25, 30, or 40 seconds and/or a higher speed, and LiDAR is out of range.

What the UN and UK are demonstrating here is the utter futility of using the real world for most AV development and testing. And that is just from the safety side. It does not address the fact that no one, not even if every AV maker were combined, can spend the time or money to drive and redrive all the scenarios and associated miles needed to get close to L4.
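The distance figures above are straightforward to verify. Here is a minimal back-of-the-envelope check, converting mph to feet per second over the handover window (the function name is my own, purely illustrative):

```python
# Distance travelled during an N-second handover window at a given speed.
# 1 mph = 5280 ft / 3600 s ≈ 1.467 ft/s.
MPH_TO_FT_PER_S = 5280 / 3600

def handover_distance_ft(speed_mph: float, window_s: float = 10.0) -> float:
    """Feet travelled at `speed_mph` during a `window_s`-second handover window."""
    return speed_mph * MPH_TO_FT_PER_S * window_s

print(round(handover_distance_ft(40)))  # 587  -> the "~600 ft" figure at 40 mph
print(round(handover_distance_ft(70)))  # 1027 -> the "~1000 ft" figure, just shy of 1320 ft (a quarter mile)
```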
The solution, of course, is simulation that is informed and validated by (far less) public shadow and safety driving. The rub there is that the gaming-based systems being used by the industry have technical deficiencies that won't facilitate anything close to the fidelity or legitimate digital twin required to do all that would be needed. (Their approach to modeling active sensors using ray tracing, versus modeling how active sensors actually interact with real-world objects, is another debilitating factor.)

It is interesting, frustrating, and disappointing to see so many smart people, virtually the entire industry, trying so desperately to avoid saying the king is naked and dangerous, something simple math and common sense lay bare. Relying on the real world supported by gaming-based simulation for the development and testing of AVs is a futile waste of time, money, and lives. When the first significant voice demonstrates the intestinal fortitude to state the actual condition of the king, the rest of the echo chamber will follow and reverse course in the blink of an eye. The question is whether that will happen before a total collapse, or before the first child or family is killed needlessly.

More in my articles here

The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now

· https://medium.com/@imispgh/the-autonomous-vehicle-industry-can-be-saved-by-doing-the-opposite-of-what-is-being-done-now-b4e5c6ae9237

Autonomous Vehicle Industry’s Self-Inflicted and Avoidable Collapse — Ongoing Update

· https://medium.com/@imispgh/i-predicted-this-a-year-and-a-half-ago-1b47bf098b03

Proposal for Successfully Creating an Autonomous Ground or Air Vehicle

· https://medium.com/@imispgh/proposal-for-successfully-creating-an-autonomous-ground-or-air-vehicle-539bb10967b1

Simulation can create a Complete Digital Twin of the Real World if DoD/Aerospace Technology is used

Simulation Photorealism is almost Irrelevant for Autonomous Vehicle Development and Testing

· https://medium.com/@imispgh/simulation-photorealism-is-almost-irrelevant-for-autonomous-vehicle-development-and-testing-136871dee440

Autonomous Vehicles Need to Have Accidents to Develop this Technology

Using the Real World is better than Proper Simulation for AV Development — NONSENSE

· https://medium.com/@imispgh/using-the-real-world-is-better-than-proper-simulation-for-autonomous-vehicle-development-nonsense-90cde4ccc0ce

Why are Autonomous Vehicle makers using Deep Learning over Dynamic Sense and Avoid with Dynamic Collision Avoidance? Seems very inefficient and needlessly dangerous?

· https://medium.com/@imispgh/why-are-autonomous-vehicle-makers-using-deep-learning-over-dynamic-sense-and-avoid-with-dynamic-3e386b82495e

The Hype of Geofencing for Autonomous Vehicles

My name is Michael DeKort. I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.

Key Industry Participation

- Founder SAE On-Road Autonomous Driving Simulation Task Force

- Member SAE ORAD Verification and Validation Task Force

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee (AI&ASPC)

- Presented with the IEEE Barus Ethics Award for Post-9/11 Efforts

My company is Dactle

We are building an aerospace/DoD/FAA Level D, full L4/5 simulation-based testing and AI system with an end-state scenario matrix to address several of the critical issues in the AV/OEM industry I mentioned in my articles above. This includes replacing 99.9% of public shadow and safety driving, as well as dealing with the significant real-time, model-fidelity, and loading/scaling issues caused by using gaming engines and other architectures. (These are issues Unity will confirm; we are now working together. We are also working with UAV companies.) If not remedied, these issues will lead to false confidence and to performance differences between what the plan believes will happen and what actually happens. If someone would like to see a demo or discuss this further, please let me know.

Systems Engineer, Engineering/Program Management -- DoD/Aerospace/IT - Autonomous Systems Air & Ground, FAA Simulation, UAM, V2X, C4ISR, Cybersecurity