Customers Can Change the Driverless Software: comma.ai's Negligence is a Terrorist's Dream
From the openpilot article on Wikipedia — https://en.wikipedia.org/wiki/Openpilot
[Image captions: community car specification editor; a user annotating a drive]
Development is supported by an open-source community using Discord and GitHub.
comma.ai has released tools and guides to help developers port their cars. In addition, they released tools to let users review their drives.
comma.ai maintains the openpilot codebase and releases, and there is a growing community that maintains various forks of openpilot. These forks include experimental features such as stop-light detection.
Pre-Autopilot Tesla models have been retrofitted with openpilot through a community fork. Chrysler and Jeep models have also gained support through community contributions.
There are over 4,200 forks of the openpilot GitHub repository.
The approach already relies on human guinea pigs for testing, a process that is untenable, reckless, and should not exist in the public domain. But here we see the gross negligence George Hotz displays and NHTSA permits: their customers can change the software. Should I have to make a case for why that is insane? Might this also be a terrorist's dream? (Beyond this, comma.ai uses a smartphone camera and the vehicle's radar. No LiDAR, just a smartphone camera. And keep in mind that many of these radars are cheap, poorly scanning continuous-wave (CW) units that struggle with stationary and crossing objects.)
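To illustrate why Doppler-centric CW automotive radars struggle with stationary and crossing objects, here is a minimal sketch. The numbers (77 GHz carrier, 30 m/s ego speed) are my illustrative assumptions, not a description of comma.ai's actual sensor stack: a stopped car ahead returns the same Doppler shift as roadside clutter, and a crossing vehicle has near-zero radial velocity, so naive Doppler clutter filtering discards both.

```python
# Illustrative sketch: why Doppler-only CW radar processing struggles
# with stationary and crossing targets. All numbers are hypothetical.

C = 3.0e8          # speed of light, m/s
F_CARRIER = 77e9   # typical automotive radar carrier frequency, Hz

def doppler_shift_hz(radial_velocity_mps: float) -> float:
    """Two-way Doppler shift for a target closing at the given radial speed."""
    wavelength = C / F_CARRIER
    return 2.0 * radial_velocity_mps / wavelength

ego_speed = 30.0  # m/s, roughly highway speed

# A stopped car ahead closes at exactly the ego speed...
stopped_car = doppler_shift_hz(ego_speed)
# ...and so does stationary roadside clutter (guardrails, signs),
# so a clutter filter tuned to the ego speed rejects both alike.
guardrail = doppler_shift_hz(ego_speed)

# A vehicle crossing perpendicular to our path has ~zero radial
# velocity, so its return sits in the near-DC clutter notch.
crossing_car = doppler_shift_hz(0.0)

print(stopped_car == guardrail)  # indistinguishable by Doppler alone
print(crossing_car)              # ~0 Hz: filtered out with static clutter
```

This is why such units are typically paired with tracking logic that ignores stationary returns, which is exactly the failure mode in collisions with stopped vehicles.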
(It is a myth that public shadow and safety driving can create a legitimate autonomous vehicle, and that the lives the process takes are necessary and for the greater good. It is impossible to drive the trillion miles, or spend the $300B, required to stumble and restumble on all the scenarios necessary to complete the effort. The process also harms people for no reason. The first safety issue is handover: the time to regain proper situational awareness and do the right thing, especially in time-critical scenarios, cannot be provided. Another dangerous area is learning accident scenarios. AV makers will have to run thousands of accident scenarios thousands of times to accomplish this. That will cause thousands of injuries and deaths.)
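As a rough back-of-envelope check on the trillion-mile and $300B figures above (the per-mile cost and fleet assumptions here are mine, chosen for illustration): at roughly $0.30 per safety-driven test mile, a trillion miles costs about $300B, and even an implausibly large fleet would need a decade to cover them.

```python
# Back-of-envelope sketch of the public-road testing math.
# All inputs are illustrative assumptions, not measured figures.

MILES_NEEDED = 1.0e12   # the "trillion miles" cited in the text
COST_PER_MILE = 0.30    # assumed all-in cost of a safety-driven test mile, $

total_cost = MILES_NEEDED * COST_PER_MILE
print(f"${total_cost / 1e9:.0f}B")  # -> $300B

FLEET_SIZE = 1_000_000            # assumed number of test vehicles
MILES_PER_VEHICLE_YEAR = 100_000  # assumed annual mileage per vehicle

years = MILES_NEEDED / (FLEET_SIZE * MILES_PER_VEHICLE_YEAR)
print(f"{years:.0f} years")  # -> 10 years
```

And that only covers raw mileage; it does not account for re-driving the same rare scenarios thousands of times, which is the harder part of the problem.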
More in my articles here:
The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology
SAE Autonomous Vehicle Engineering Magazine — Simulation’s Next Generation (featuring Dactle)
Simulation can create a Complete Digital Twin of the Real World if DoD/Aerospace Technology is used
- https://medium.com/@imispgh/simulation-can-create-a-complete-digital-twin-of-the-real-world-if-dod-aerospace-technology-is-used-c79a64551647
Using the Real World is better than Proper Simulation for AV Development — NONSENSE
Autonomous Vehicle Industry’s Self-Inflicted and Avoidable Collapse — Ongoing Update
Autonomous Vehicle makers should take the Consumer Reports Challenge
My name is Michael DeKort. I am a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.
Key Industry Participation
- Founder SAE On-Road Autonomous Driving Simulation Task Force
- Member SAE ORAD Verification and Validation Task Force
- Stakeholder for UL4600 — Creating AV Safety Guidelines
- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee (AI&ASPC)
- Presented with the IEEE Barus Ethics Award for post-9/11 efforts
My company is Dactle
We are building an aerospace/DoD/FAA Level D, full L4/5 simulation-based testing and AI system with an end-state scenario matrix to address several of the critical issues in the AV/OEM industry that I mentioned in my articles above. This includes replacing 99.9% of public shadow and safety driving, as well as dealing with the significant real-time, model-fidelity, and loading/scaling issues caused by using gaming engines and other architectures. (These are issues Unity will confirm; we are now working together. We are also working with UAV companies.) If not remedied, these issues will lead to false confidence and to performance differences between what the plan believes will happen and what actually happens. If someone would like to see a demo or discuss this further, please let me know.