The report — https://data.consumerreports.org/wp-content/uploads/2020/11/consumer-reports-active-driving-assistance-systems-november-16-2020.pdf

After doing the right thing and failing Tesla's "Autopilot" and "Full Self-Driving," Consumer Reports has now forfeited any ethical and professional standing it had in understanding and reviewing autonomous vehicle technology. Because of who they are and their associated reputation, this report will lead directly to false consumer confidence, injuries, and deaths.

Consumer Reports Tesla article — Tesla’s ‘Full Self-Driving Capability’ Falls Short of Its Name — https://www.consumerreports.org/autonomous-driving/tesla-full-self-driving-capability-review-falls-short-of-its-name/

Consumer Reports commits several tragic errors in its multi-OEM review.

· They don’t weight autonomous system performance highly enough — When a system allows people to let go of the steering wheel, they lose situational awareness, which is unavoidable regardless of monitoring. That should be weighted far higher than the other categories (a toy weighting example follows below). …
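To see why the weighting matters, consider a toy example. The category names, scores, and weights below are mine, purely for illustration — they are not Consumer Reports' actual methodology. The point is that a system strong on convenience but weak on the safety-critical category can still win overall if that category is underweighted:

```python
# Toy illustration of how category weights change a ranking.
# Categories, scores, and weights are hypothetical, not CR's.
scores = {
    "SystemA": {"capability": 9, "ease_of_use": 9, "driver_monitoring": 3},
    "SystemB": {"capability": 6, "ease_of_use": 5, "driver_monitoring": 9},
}

def overall(s: dict, weights: dict) -> float:
    """Weighted average of category scores."""
    total = sum(weights.values())
    return sum(s[k] * w for k, w in weights.items()) / total

equal  = {"capability": 1, "ease_of_use": 1, "driver_monitoring": 1}
safety = {"capability": 1, "ease_of_use": 1, "driver_monitoring": 4}

for name, s in scores.items():
    print(name, round(overall(s, equal), 2), round(overall(s, safety), 2))
# Equal weights rank SystemA first (7.0 vs 6.67); weighting the
# safety-critical category heavily flips the ranking (5.0 vs 7.83).
```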


The announcement and link — https://www.nhtsa.gov/press-releases/public-comment-automated-driving-system-safety-principles

I am going to start off by making my first positive comment ever about NHTSA in this space. It is saying SOME of the right words with regard to validating safety.

Examples

· “NHTSA envisions that a framework approach to safety for ADS developers would use performance-oriented approaches and metrics that would accommodate the design flexibility needed to ensure that manufacturers can pursue safety innovations and novel designs in these new technologies.”

· At this stage, NHTSA believes there are four primary functions of the ADS that should be the focus of the Agency’s attention. First, how the ADS receives information about its environment through sensors (“sensing”). Second, how the ADS detects and categorizes other road users (vehicles, motorcyclists, pedestrians, etc.), infrastructure (traffic signs, signals, etc.), and conditions (weather events, road construction, etc.) (“perception”). Third, how the ADS analyzes the situation, plans the route it will take on the way to its intended destination, and makes decisions on how to respond appropriately to the road users, infrastructure, and conditions detected and categorized (“planning”). Fourth, how the ADS executes the driving functions necessary to carry out that plan (“control”) through interaction with other parts of the vehicle. While other elements of ADS safety are discussed throughout this Notice, these four primary functions serve as the core elements NHTSA is considering.
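Taken together, those four functions are the classic sense → perceive → plan → control pipeline. Here is a minimal skeleton in Python, purely to make the decomposition concrete — the structure and names are mine, not NHTSA's or any manufacturer's:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    kind: str          # "vehicle", "pedestrian", "traffic_signal", ...
    distance_m: float
    bearing_deg: float

class ADSPipeline:
    """Skeleton of NHTSA's four primary ADS functions.
    Illustrative only; real systems are vastly more complex."""

    def sense(self) -> dict:
        # "Sensing": gather raw data from the vehicle's sensors.
        return {"camera": ..., "radar": ..., "lidar": ...}

    def perceive(self, raw: dict) -> List[Detection]:
        # "Perception": detect and categorize road users,
        # infrastructure, and conditions from raw sensor data.
        return []

    def plan(self, world: List[Detection]) -> dict:
        # "Planning": analyze the situation, plan the route, and
        # decide how to respond to what perception reported.
        return {"target_speed_mps": 0.0, "target_heading_deg": 0.0}

    def control(self, decision: dict) -> None:
        # "Control": execute the plan through steering, throttle,
        # and braking actuators.
        pass

    def step(self) -> None:
        self.control(self.plan(self.perceive(self.sense())))

ADSPipeline().step()  # one tick of the loop
```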


Link to the report — https://www.rand.org/pubs/research_reports/RRA569-1.html

Overall, this report is very disappointing and a missed opportunity. It clearly shows the Wild West that is this industry, which would all be fine if these companies were not using humans, both in the vehicles and around them, as guinea pigs to test these systems while they figure things out.

RAND is treating these folks, who have very few relevant systems, test, and safety engineers, like experts, as if most of this is new and has never been done before. While this is a massive and hard task, that is no excuse for the bar being so low. …


Ghost just announced they will have a kit available that makes some vehicles Level 3 on highways in 2021. The kicker is how far their pitch goes.

From the article — Self-driving startup Ghost claims Level 3 autonomy is coming in 2021 for US$99 per month — https://techau.com.au/self-driving-startup-ghost-claims-level-3-autonomy-is-coming-in-2021-for-us99-per-month/amp/

Here’s the Hype Pulitzer Prize List

· Real Collision Avoidance

· Superhuman Reflexes

· No Oversight Needed

· No Bugs, No Glitches

Wow . . . you gotta give it to John Hayes. He blows Elon Musk and George Hotz away in the reckless hype department. NO OVERSIGHT, BUGS, OR GLITCHES!!!

Let’s set aside that it is not remotely possible to create a system like this by relying on public shadow and safety driving (explained in my first article below); these folks are creating a situation that will harm people for no reason, primarily by inducing a grossly negligent level of false confidence in drivers. These false promises go way beyond Elon Musk’s hype list, which includes calling the system “Autopilot” and “Full Self-Driving” when it is not, releasing a “beta” when it’s not close to that level, and saying it is “capable of zero intervention”. (In fairness to Elon and Tesla, they have also recently said, “Full Self-Driving is in early limited access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. Do not become complacent.” and “Faults will never be zero, but at some point, the probability of a fault will be far lower than that of the average human.”) …


Brad Templeton’s Article — Waymo Data Shows Superhuman Safety Record. They Should Deploy Today — https://www.forbes.com/sites/bradtempleton/2020/10/30/waymo-data-shows-incredible-safety-record--they-should-deploy-today/?utm_source=dlvr.it&utm_medium=twitter&sh=35516adf3829

Templeton’s Findings

(This section is a direct quote)

The report shows an incredible, superhuman safety record and suggests it is past time for them to deploy a service at scale, at least in simpler low-urban/suburban zones like Chandler, Arizona.

The report is notable for several reasons:


The video — https://www.youtube.com/watch?v=JZ2bs4g9CLQ&feature=youtu.be

Noted Issues — All of this took place in 7 uninterrupted minutes

· 0:50 — Thinks the road is two-way

· 1:43 — Missed left turn

· 2:05 — Was about to crash while making a late left turn

· 3:30 — Drivers mention they are getting sick from the drifting

· 3:30 — Would have hit the curb on the right side while making a right-hand turn

· 3:50 — Thinks the road is two-way again

· 4:55 — Drifts too far to the right

· 6:35 — Did not detect crossing car

Elon said it would be “capable of zero intervention.” Well . . . I guess he’s correct, because there were several seconds in a row where it drove correctly without a disengagement. …


Article reference — https://electrek.co/2020/10/22/tesla-4d-radar-twice-range-self-driving/

With this move Elon and Tesla are admitting they NEED a much better radar. This means they are admitting that not having LiDAR was a mistake and that their camera-based sensor and perception system is a joke. It also means they are finally admitting to a very fatal design flaw that has killed at least six people and has caused several accidents, including with police cars and firetrucks, all due to the system not being able to detect stationary and crossing objects properly. This is caused by the camera system’s inability to determine object distances, or whether objects are actual things versus photos, etc. To make matters worse, they recklessly chose not to use LiDAR to make up for this, and instead chose a poorly scanning radar to try to pick up the slack. It is not able to do that because it lacks enough array transmitters and antennas to properly discern the lateral position of objects. Since they do not want false positives, those vague returns are ignored. (The ultrasonics seem useless unless the vehicle is going very slowly, and even then, it seems the objects must be large. I have seen videos where Teslas run over small objects right in front of them.) I hope the lawyers of all these people hurt or killed pay attention to this. …
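To make the lateral-position point concrete, here is a rough back-of-the-envelope sketch. The channel counts and spacing are hypothetical, not any actual Tesla radar specification; it simply applies the textbook rule that a uniform linear array's angular resolution scales inversely with its aperture, so a radar with only a few antennas cannot tell a stopped car in-lane from a sign or overpass just off to the side:

```python
import math

def angular_resolution_deg(num_rx: int, spacing_wavelengths: float = 0.5) -> float:
    """Approximate beamwidth of a uniform linear array:
    theta ~ lambda / (N * d), with element spacing d in wavelengths.
    Textbook approximation, not any specific radar's spec."""
    return math.degrees(1.0 / (num_rx * spacing_wavelengths))

def lateral_blur_m(range_m: float, res_deg: float) -> float:
    """Lateral uncertainty at a given range (small-angle approximation)."""
    return range_m * math.radians(res_deg)

# Hypothetical channel counts: a sparse array vs. a modern imaging radar.
for n_rx in (3, 4, 192):
    res = angular_resolution_deg(n_rx)
    print(f"{n_rx:3d} RX channels: ~{res:5.1f} deg -> "
          f"~{lateral_blur_m(100.0, res):5.1f} m lateral blur at 100 m")
```

With only a handful of channels the blur at 100 m spans multiple lanes, which is consistent with why such returns get discarded as ambiguous rather than braked for.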


Well, now we have it. The “beta” Full Self-Driving (FSD) from Tesla is out. Of course, the hedges Elon Musk applies are off the charts. (Something he made far worse than he needed to when he declared the system would be SAE Level 5 this year. An insane thing to do. More on that in my articles below.)

· “Full Self-Driving is in early limited access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. …


From the openpilot Wikipedia article — https://en.wikipedia.org/wiki/Openpilot

Community

Development is supported by an open-source community using Discord[14] and GitHub.

comma.ai has released tools and guides to help developers port their cars.[15] In addition, they released tools to let users review their drives.[16]

Forks

comma.ai maintains the openpilot codebase and releases, and a growing community maintains various forks of openpilot. These forks contain experimental features such as stop light detection.

Pre-Autopilot Tesla models have been retrofitted with openpilot through a community fork.[17] …


Waymo just announced it is “fully autonomous” in areas of Phoenix. I asked them if there was a local or remote “safety driver” and they said no. (As I understand it, they do have people who are available to support.) The issue here is that zero proof has been provided. How do we know engineering due diligence has been done? How were accident scenarios that cannot be avoided and need to be best handled learned? How were all relevant accident scenarios that can be avoided learned? What about rain? Loss-of-traction scenarios? It would be nice if Waymo released proof of due diligence: tested scenario, disengagement, and root cause data. Heck, it would be great to see Waymo let Consumer Reports run the same benign scenario tests Tesla recently failed. (Several people have reminded me about the trove of data Waymo has been releasing. That is sensor data, not learned scenario, disengagement, and root cause data. It does not even mean perception is right. …
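To make that distinction concrete: raw sensor logs are point clouds and camera frames, while the due-diligence evidence I'm asking for would look more like structured scenario and disengagement records. A hypothetical shape for such a record (the field names are mine, not Waymo's data model):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DisengagementRecord:
    """Hypothetical schema for due-diligence evidence.
    Field names are illustrative only, not Waymo's."""
    scenario_id: str        # e.g. "unprotected left, rain, cross traffic"
    environment: str        # "simulation" | "track" | "public road"
    disengaged_by: str      # "system" | "safety driver" | "remote operator"
    root_cause: str         # e.g. "perception", "planning", "control"
    corrective_action: str  # what was changed in response
    retest_passed: Optional[bool] = None  # was the fix verified?
```

Sensor data alone tells you what the vehicle saw, not whether each scenario was learned, what caused each disengagement, or whether the fix was verified — which is the proof being asked for above.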

Michael DeKort
