I am agnostic to the technology used, as long as it works. Having an aerospace background, I know the value of designing to worst-case conditions and having proper exception handling or redundancy. I also know the value of 3D radar. Why isn't 3D radar being used here? I assume it's a cost/size scaling issue? I see some companies claiming they are doing it, but with no video or other proof, especially in bad conditions.

Let's talk about LiDAR. There are reasons DoD and aerospace don't use it for primary navigation, namely its trouble handling bad weather and certain object surface compositions. I have yet to see anyone provide highly detailed video showing it working well in a blizzard with fog, at night and in daytime, and with the sun or some other bright light coming right at you. Have you done that?

Regarding proper redundancy: the military and aerospace use various combinations of sensors, and multiple instances of each sensor, with priority and probability filters to ensure the highest possible data fidelity and accuracy at all times, especially when the environment is poor and Murphy hits. You have to assume that at any time one sensor or several sensors are out or degraded. What sensor capability requirements do you have, how do you handle double- or triple-checking the associated data, and what is the redundancy hierarchy? If you cannot use 3D radar and can actually make LiDAR handle extreme situations, you will need to make that entire LiDAR system redundant, correct? Since no other sensor, or combination of other sensors, can fill in if it is out or degraded, correct?
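To make the question concrete, here is a minimal sketch of the kind of priority-and-probability redundancy hierarchy described above: multiple sensor instances, each reporting a self-diagnosed confidence, cross-checked before fusion so that a degraded sensor is excluded and a lone surviving sensor is never trusted on its own. All names, numbers, and the weighting scheme are purely illustrative assumptions, not anyone's actual implementation.

```python
# Illustrative sketch only: a confidence-weighted fusion with a
# redundancy check. Sensor names, priorities, and thresholds are
# hypothetical stand-ins for a real probability-filter pipeline.

from dataclasses import dataclass
from typing import Optional, List

@dataclass
class SensorReading:
    name: str          # e.g. "radar_1", "lidar_1", "camera_2"
    priority: int      # lower number = higher in the redundancy hierarchy
    value: float       # distance to nearest obstacle, metres (illustrative)
    confidence: float  # 0.0 (failed) .. 1.0 (nominal), from self-diagnostics

def fuse(readings: List[SensorReading],
         min_confidence: float = 0.3,
         min_agreeing: int = 2) -> Optional[float]:
    """Cross-check healthy sensors; return a fused estimate or None.

    Returning None signals 'insufficient redundant data', so the
    caller can fall back to a safe state rather than trusting a
    single, unverified sensor.
    """
    healthy = [r for r in readings if r.confidence >= min_confidence]
    if len(healthy) < min_agreeing:
        return None  # not enough independent sources to double-check
    # Weight each reading by confidence, with higher-priority sensors
    # counting more (a crude stand-in for a real probability filter).
    weights = [r.confidence / (1 + r.priority) for r in healthy]
    total = sum(weights)
    return sum(r.value * w for r, w in zip(healthy, weights)) / total

readings = [
    SensorReading("radar_1",  0, 42.0, 0.9),
    SensorReading("lidar_1",  1, 41.5, 0.1),  # degraded: blizzard
    SensorReading("camera_1", 2, 43.0, 0.8),
]
estimate = fuse(readings)
print(estimate)  # degraded lidar excluded; radar and camera fused
```

The point of the sketch is the failure path: if LiDAR is the only sensor capable of a given function and it degrades, `fuse` has nothing left to cross-check it against and must return None, which is exactly the single-point-of-failure concern raised above.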

Systems Engineer, Engineering/Program Management -- DoD/Aerospace/IT - Autonomous Systems Air & Ground, FAA Simulation, UAM, V2X, C4ISR, Cybersecurity