How “Geofenced” Autonomous Vehicles should prove they are a legitimate L4

Michael DeKort
3 min read · Oct 13, 2019


Geofencing is largely hype. While geofencing can cut down on the work that needs to be done, it is nowhere near the time and money saver people make it out to be. The problem, of course, is that these companies want to send the message that geofencing removes a significant subset of the work. That misleads people and provides false confidence.

There are several key areas of due diligence that still need to be done, especially around perception:

· Relevant accident scenarios, and how to avoid or best handle them, have to be learned for every road pattern in the area.

· Proper detection of moving objects. This is critical because perception systems often struggle with unfamiliar patterns and finishes. Due diligence dictates that you learn every object pattern that could appear where you operate, which means fabric and clothing patterns from around the world need to be learned for pretty much every location.

· Proper detection of fixed objects, especially shadows cast by trees and buildings at various times of day. Shadows can look different to camera systems and are often confused for solid objects, especially where there is no LiDAR. This means a common road pattern, like a 4-way intersection, may appear the same in many places but is not. A minimal sketch of the kind of appearance-coverage check this implies follows this list.
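To make the scale of that coverage problem concrete, here is a minimal sketch in Python. The class names and the required-appearance catalog are entirely hypothetical; the point is only the shape of the check: compare what the perception stack has actually been validated against with what the geofenced area could plausibly contain.

```python
# Minimal sketch (hypothetical names, not a real catalog): checking a perception
# stack's validated object-appearance coverage against what a geofenced area can
# actually contain. The argument above is that this required set is not small.

REQUIRED_APPEARANCES = {
    "pedestrian/reflective-fabric",
    "pedestrian/patterned-fabric",
    "cyclist/low-light",
    "fixed/tree-shadow-hard-edge",
    "fixed/building-shadow-long",
}

def coverage_gaps(validated_appearances: set[str]) -> set[str]:
    """Return the appearance classes the perception stack has not yet been
    validated against for this operational design domain."""
    return REQUIRED_APPEARANCES - validated_appearances

if __name__ == "__main__":
    validated = {"pedestrian/reflective-fabric", "cyclist/low-light"}
    missing = coverage_gaps(validated)
    print(f"Unvalidated appearance classes: {sorted(missing)}")
```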

Given this, proof of proper due diligence should be provided. That includes:

· Scenarios learned, especially accidents

· Disengagement data with associated root cause and scenario data (one possible record format is sketched after this list)

· Objects learned — including degraded versions

· Proof of a full digital twin: simulation, model fidelity and proper real-time for every model type; visual and full physics; ego model, environment, fixed and moving models, sensors, etc.
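None of this evidence is useful unless it is structured and traceable. As an illustration only (the field names below are assumptions, not any standard), a disengagement record that carries its root cause and the scenario it maps to would let an auditor check whether every real-world failure has been fed back into the learned scenario set.

```python
# Illustrative sketch only: a hypothetical schema for due-diligence evidence,
# tying each disengagement to a root cause and a scenario in the test matrix.

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DisengagementRecord:
    timestamp: datetime
    location: str                 # road segment or intersection identifier
    root_cause: str               # e.g. "perception/shadow-misclassified-as-object"
    scenario_id: str              # scenario the event maps to in the test matrix
    resolved_in_simulation: bool  # has the scenario since been learned and re-verified?
    notes: str = ""

@dataclass
class DueDiligenceReport:
    scenarios_learned: list[str] = field(default_factory=list)
    objects_learned: list[str] = field(default_factory=list)   # including degraded versions
    disengagements: list[DisengagementRecord] = field(default_factory=list)

    def unresolved(self) -> list[DisengagementRecord]:
        """Disengagements not yet traced to a learned, re-verified scenario."""
        return [d for d in self.disengagements if not d.resolved_in_simulation]
```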

Supporting information

Autonomous Vehicles Need to Have Accidents to Develop this Technology

Proposal for Successfully Creating an Autonomous Ground or Air Vehicle

Using the Real World is better than Proper Simulation for AV Development — NONSENSE

Simulation can create a Complete Digital Twin of the Real World if DoD/Aerospace Technology is used

My name is Michael DeKort. I am a former systems engineer, engineering manager and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.

Key Industry Participation

- Lead, SAE On-Road Autonomous Driving Model and Simulation Task

- Member SAE ORAD Verification and Validation Task Force

- Member DIN/SAE International Alliance for Mobility Testing & Standardization (IAMTS) Sensor Simulation Specs

- Stakeholder for UL4600 — Creating AV Safety Guidelines

- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee (AI&ASPC)

- Recipient of the IEEE Barus Ethics Award for post-9/11 efforts

My company is Dactle

We are building an aerospace/DoD/FAA Level D, full L4/L5 simulation-based testing and AI system with an end-state scenario matrix to address several of the critical issues in the AV/OEM industry I mention in the articles above. This includes replacing 99.9% of public shadow and safety driving, as well as dealing with the significant real-time, model fidelity and loading/scaling issues caused by using gaming engines and other architectures. (These are issues Unity will confirm; we are now working together. We are also working with UAV companies.) If not remedied, these issues will lead to false confidence and to performance differences between what the plan assumes will happen and what actually happens. If you would like to see a demo or discuss this further, please let me know.
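For readers who want a concrete handle on the "proper real-time" issue mentioned above, here is a rough sketch of one basic check: run the simulation loop under load and measure how often a step overruns its fixed physics timestep. The numbers and function names are illustrative assumptions, not measurements of any particular engine.

```python
# Rough sketch of a real-time check: does the simulation hold a fixed timestep
# without overruns under load? All values here are illustrative.

import time

def check_realtime(step_fn, dt: float = 0.01, steps: int = 1000) -> float:
    """Run `step_fn` for `steps` iterations at a target timestep `dt` (seconds)
    and return the fraction of steps that overran their real-time budget."""
    overruns = 0
    for _ in range(steps):
        start = time.perf_counter()
        step_fn(dt)                      # advance ego, environment and sensor models
        elapsed = time.perf_counter() - start
        if elapsed > dt:
            overruns += 1
    return overruns / steps

if __name__ == "__main__":
    # Stand-in workload; a real check would drive the full digital twin.
    fraction = check_realtime(lambda dt: sum(i * i for i in range(20000)))
    print(f"{fraction:.1%} of steps missed the real-time deadline")
```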
