First Tesla Needless Human Guinea Pig Criminally Charged in Canada
Article — ‘A legal first’: B.C. man accused of dangerous driving for sleeping in self-driving, speeding Tesla — https://globalnews.ca/news/7534323/a-legal-first-b-c-man-accused-of-dangerous-driving-for-sleeping-in-self-driving-speeding-tesla/
A 20-year-old named Learn Cai was criminally charged in Vancouver, Canada, for falling asleep at the wheel of a Tesla in “autopilot” mode. He was also charged with speeding at 150 km/h.
· Why is a Tesla in “autopilot” going 150 km/h, especially with no driver input?
· I thought Teslas were supposed to slow down and stop if the system does not detect a driver driving?
· Were there alarms that were ignored?
· Tesla’s driver monitoring system is a reckless joke. There is no camera monitoring the driver. It relies on steering-wheel torque, which can be rigged with a water bottle etc.
· The system delays alarms for 10, even 30, seconds or more. Do the math on how far the car goes at that speed.
· The system is supposed to stop when it does not detect a driver. There have been several cases where someone fell asleep and this did not happen, including one in Japan where Yoshihiro Umeda was killed by a Tesla Autopilot driver who fell asleep. (The driver actually said he thought he could sleep.)
· Elon and Tesla mislead drivers and the public in many ways about the capabilities of the system and about letting go of the wheel. In addition to calling a system that is not an autopilot “Autopilot,” they constantly send mixed messages about keeping hands on the wheel. Elon is seen in many videos and interviews with his hands off the wheel, and earlier in the year Tesla held a massive internet demo day during which the safety driver never touched the wheel once.
· The entire process is unnecessary. It is a myth that public shadow and safety driving can create a legitimate autonomous vehicle, or that the lives the process takes are necessary and for the greater good. It is impossible to drive the trillion miles, or spend the $300B, needed to stumble and re-stumble on all the scenarios necessary to complete the effort. The process also harms people for no reason. The first safety issue is handover: the time needed to regain proper situational awareness and do the right thing, especially in time-critical scenarios, cannot be provided. Another dangerous area is learning accident scenarios. AV makers will have to run thousands of accident scenarios thousands of times to accomplish this, which will cause thousands of injuries and deaths. The solution is to use DoD simulation technology and shift most of the autonomous system development and testing over to it. (Not the gaming-based simulation Tesla and most of the industry use.)
· Other issues with Tesla include a massive, fatal sensor design flaw that results in the system not properly detecting stationary or crossing objects. This has led to several deaths and to cars hitting police cars, tow trucks, trailers, street sweepers, barriers etc. (See more below.)
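To make the “do the math” point above concrete, here is a minimal sketch of the arithmetic. The 150 km/h speed and the 10- and 30-second alarm delays come from the bullets above; everything else is plain unit conversion:

```python
# Distance a car covers while driver-alert alarms are delayed.
# Speed and delay figures are taken from the bullets above
# (150 km/h; alarms delayed 10 to 30 seconds or more).

def distance_m(speed_kmh: float, delay_s: float) -> float:
    """Distance travelled in metres at a constant speed over a delay."""
    return speed_kmh * 1000 / 3600 * delay_s

for delay in (10, 30):
    print(f"At 150 km/h, a {delay} s alarm delay covers ~{distance_m(150, delay):.0f} m")
```

At 150 km/h the car moves roughly 42 metres every second, so a 10-second delay covers about 417 m and a 30-second delay about 1,250 m — well over a kilometre with a sleeping driver.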
Tesla “autopilot” development effort needs to be stopped and people arrested
Tesla “autopilot” development includes Stopping at Green Lights
Forget Tesla’s “autopilot” their Automatic Emergency Braking is a Debacle
The Autonomous Vehicle Industry can be Saved by doing the Opposite of what is being done now to create this technology
SAE Autonomous Vehicle Engineering Magazine — Simulation’s Next Generation (featuring Dactle)
Simulation can create a Complete Digital Twin of the Real World if DoD/Aerospace Technology is used
- https://medium.com/@imispgh/simulation-can-create-a-complete-digital-twin-of-the-real-world-if-dod-aerospace-technology-is-used-c79a64551647
Autonomous Vehicles Need to Have Accidents to Develop this Technology
Using the Real World is better than Proper Simulation for AV Development — NONSENSE
- https://medium.com/@imispgh/using-the-real-world-is-better-than-proper-simulation-for-autonomous-vehicle-development-nonsense-90cde4ccc0ce
NTSB’s Tragically Incompetent Tesla-Banner Investigation Report
NHTSA is Enabling the Crash of the Driverless Vehicle Industry and More Needless Human Test Subject Deaths
My name is Michael DeKort — I am a former systems engineer, engineering manager and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.
Key Industry Participation
- Founder SAE On-Road Autonomous Driving Simulation Task Force
- Member SAE ORAD Verification and Validation Task Force
- Stakeholder for UL4600 — Creating AV Safety Guidelines
- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee (AI&ASPC)
- Presented with the IEEE Barus Ethics Award for post-9/11 efforts
My company is Dactle
We are building an aerospace/DoD/FAA Level D, full L4/5 simulation-based testing and AI system with an end-state scenario matrix to address several of the critical issues in the AV/OEM industry I mention in my articles above. This includes replacing 99.9% of public shadow and safety driving, as well as dealing with the significant real-time, model-fidelity and loading/scaling issues caused by using gaming engines and other architectures. (These are issues Unity will confirm; we are now working together. We are also working with UAV companies.) If not remedied, these issues will lead to false confidence and to performance differences between what the plan predicts will happen and what actually happens. If someone would like to see a demo or discuss this further, please let me know.