Yes, that’s right — safety drivers now have to contend with the car stopping at green lights. And not just that: if you watch the video in The Drive article linked below, the system also stops randomly.
Newest Tesla Autopilot Update Makes the Car Stop at Green Lights
First, this behavior is supposedly going to be fixed in an upcoming update. That is how I realized I completely missed this in April. I remember seeing the news and even wrote an article about the system having to learn intersections. But I did not catch that it stops for GREEN LIGHTS. I think I could not bring myself to believe that Tesla, Musk and Karpathy could be this selfish, incompetent and negligent. How is this not illegal and grossly negligent? And please tell me why this can’t be done on a track or in shadow mode. Or, if public testing really is necessary, how many scenarios does the system need to experience, over and over, to handle this well? If it is deep learning, does it need to learn every light on the planet? In every lighting, positional, object and weather condition?
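To give a rough sense of the scale that question implies, here is a minimal sketch of how independent condition dimensions multiply into a scenario count. All of the category counts below are invented for illustration; the real dimensions and their sizes would be far larger.

```python
from math import prod

# Hypothetical, illustrative dimension sizes -- not real data.
scenario_dimensions = {
    "light_state":    4,   # red, yellow, green, flashing
    "light_position": 10,  # overhead, side pole, angled, partially occluded, ...
    "lighting":       6,   # noon, dusk, night, sun glare, overcast, ...
    "weather":        8,   # clear, rain, snow, fog, ...
    "nearby_objects": 20,  # lead vehicles, pedestrians, cross traffic, ...
}

# Each new dimension multiplies the number of distinct combinations,
# which is why "learn it by driving past it" scales so badly.
total_scenarios = prod(scenario_dimensions.values())
print(total_scenarios)  # 4 * 10 * 6 * 8 * 20 = 38,400 combinations
```

Even with these toy numbers, tens of thousands of combinations fall out of five dimensions; add degradation of sensors, regional signal designs and rare edge cases, and stumbling onto enough of them with public safety drivers becomes implausible, while a simulation can enumerate them deliberately.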
More of my related articles
Forget Tesla’s “autopilot” their Automatic Emergency Braking is a Debacle
Tesla is now using Customer Guinea pigs to learn how to handle Crosswalks in addition to Stop Signs and Lights — Beyond Reckless and Avoidable
The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology
Proposal for Successfully Creating an Autonomous Ground or Air Vehicle
Simulation can create a Complete Digital Twin of the Real World if DoD/Aerospace Technology is used
- https://medium.com/@imispgh/simulation-can-create-a-complete-digital-twin-of-the-real-world-if-dod-aerospace-technology-is-used-c79a64551647
Autonomous Vehicles Need to Have Accidents to Develop this Technology
Using the Real World is better than Proper Simulation for AV Development — NONSENSE
- https://medium.com/@imispgh/using-the-real-world-is-better-than-proper-simulation-for-autonomous-vehicle-development-nonsense-90cde4ccc0ce
My name is Michael DeKort — I am a former systems engineer, engineering manager and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.
Key Industry Participation
- Lead — SAE On-Road Autonomous Driving Model and Simulation Task Force
- Member — SAE ORAD Verification and Validation Task Force
- Stakeholder for UL4600 — Creating AV Safety Guidelines
- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee (AI&ASPC)
- Presented with the IEEE Barus Ethics Award for post-9/11 efforts
My company is Dactle
We are building an aerospace/DoD/FAA Level D, full L4/L5 simulation-based testing and AI system with an end-state scenario matrix to address several of the critical AV/OEM industry issues I describe in the articles above. This includes replacing 99.9% of public shadow and safety driving, as well as resolving the significant real-time, model-fidelity and loading/scaling issues caused by using gaming engines and other architectures. (These are issues Unity will confirm — we are now working together. We are also working with UAV companies.) If not remedied, these issues will lead to false confidence and to performance differences between what the plan believes will happen and what actually happens. If you would like to see a demo or discuss this further, please let me know.