I'd love to see someone prove LiDAR can actually work in the worst conditions. Better yet, I'd like to see a fused system with genuine redundancy operate in the worst conditions, then see what happens as each sensor is removed in turn. I see folks working on 3D radar, but no proof it can scale in size or cost. And even then, are there gaps in color recognition and fine detail? And what about FLIR? There are military aircraft that navigate on instruments alone in very bad conditions, but do they have, or even need, the same granularity? I think there's a very real possibility there is no solution here. Not technically, but one that can scale at an acceptable cost.

Regarding fusion: I hope folks are using filtering that looks at all inputs and, based on probability and priority, selects the best merged or single source of data. I'm concerned there is little or none of that going on, that different sensors are simply assigned to different tasks, with no actual full redundancy or fusion.
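To make concrete what "merging inputs based on probability" means, here is a minimal sketch of one standard approach: inverse-variance weighted fusion, which is essentially the static Kalman measurement update. This is not any production stack's code, and the sensor readings and variances below are invented for illustration; the point is that noisier sensors automatically count for less, and removing a sensor just shrinks the input list, so you can probe redundancy directly.

```python
def fuse(measurements):
    """Fuse (value, variance) pairs from available sensors into one estimate.

    Uses inverse-variance weighting: each reading is weighted by 1/variance,
    so more trustworthy (lower-variance) sensors dominate the merged value.
    """
    if not measurements:
        raise ValueError("no sensors available")
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    variance = 1.0 / total  # fused estimate is tighter than any single input
    return value, variance

# Hypothetical range readings (metres) for one obstacle:
lidar = (49.8, 0.04)   # precise in clear weather, degrades in rain/fog
radar = (50.5, 0.25)   # coarser, but weather-robust
camera = (48.9, 1.0)   # noisy depth estimate from vision

full = fuse([lidar, radar, camera])
degraded = fuse([radar, camera])  # "remove" LiDAR and see what survives
```

With all three sensors, the fused variance (1/30) is lower than any single sensor's; drop LiDAR and the estimate both shifts and loosens, which is exactly the kind of sensor-removal experiment I'd like to see published.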
