New FAA Certification Chief’s Words and Intentions Send Up Serious Red Flags
Reference article — FAA’s Aircraft Certification Chief Says It Must Flexibly Close Regulatory Gaps for New Aircraft
Unfortunately, the FAA appears to be oblivious to, or worse yet ignoring, many key indicators that this industry is on the wrong path. Said more succinctly, they are buying an extreme amount of hype that will kill people needlessly and doom the industry. And keep in mind they are already suffering from lower public trust due to the Boeing 737 MAX debacle, which was largely caused by greed, not a lack of engineering rigor. Here we have both. This manifests itself in the apparent adoption of and support for Silicon Valley/IT buzzwords, engineering and testing processes, and tools, versus those that aerospace has used extremely successfully for over 50 years. The overall problem, of course, is that those Silicon Valley processes and tools are wholly inadequate if any measure of safety is to be maintained. The world and the FAA are buying into hype, especially given the systems or products involved are rarely similar in any meaningful way. These are aircraft, not social apps, internet search tools, ride or office-space sharing, or games. (And before someone raises SpaceX as a counterexample: in 2011 NASA failed their code for being poorly tested and not handling safety exceptions. The then head of software, who came from gaming, said they needed to hire NASA folks to fix that.)
The Red Flag Terms = Flexibility, Risk-based incremental approach, lessons learned from aircraft flying, and an “agile” certification system
Reminder — The aircraft industry does not get anything close to the same number of get-out-of-jail-free cards other industries get. What happens to one likely happens to all.
Issues
· Most of the folks creating these new aircraft have little domain or relevant engineering experience.
· Many of the tools and simulations they are using are not adequate for aircraft or autonomy design, development, and testing, nor for human or machine pilot training or testing. That is leading to real-world flights and testing that should not be occurring yet. (Gaming-based systems like X-Plane, for example.)
· The FAA is unaware that the autonomous development and testing processes being used by many of the developers are untenable from a safety, cost, and time POV. Those being the same deficient simulation and tool issues, and the use of AI, machine learning (ML), and deep learning (DL) to train and test these systems. That process requires scenarios to be experienced over and over for the system to learn. That means many crash scenarios must be experienced as well. Think through what that means if you rely on the real world or on inferior simulation. (For those touting remote operations or a “safety pilot” who can disengage: the former will not work in time-critical scenarios due to latency; see the timing sketch after this list. That latency also makes using a motion system impossible, as people will get sick, which in turn negates critical motion cues. Regarding handover, many of those time-critical scenarios negate the human’s ability to regain proper situational awareness in time to do the right thing the right way.)
· They are looking to an “agile”, “risk-based incremental approach” to gather “lessons learned from aircraft flying”. The only term they didn’t mention was “move fast and break things”. This forgoes most historical best practices of systems engineering.
· UAS, Part 107, and waivers were mentioned as an example of flexibility. Does the Chief really think that is an apples-to-apples safety comparison? The overall concern here is that waivers and “special conditions” are used to lower safety standards that are already too low given the areas I mentioned above, particularly to make up a significant amount of lost time.
· The certification environment is unnecessarily chaotic and fractured. Far too many regulatory bodies and standards organizations are muddying up and slowing the works. I would bet most of the eVTOL makers can’t even staff up enough to attend the plethora of redundant meetings.
· I would like to note here that I don’t think many of the eVTOL makers are actually in a hurry to have meaningful certifications and regulations produced, despite their complaints stating otherwise. The reason for this is they know, or are concerned, that they would not pass them. That would stop the funding inflow and put them out of business. This is why I believe neither the industry nor the eVTOL companies have put up their own certification drafts. If the drafts are too weak, they expose their issues. If they produce something meaningful, they will fail it.
· Clearly, there is a safety, time and cost balance that needs to be met. And we live in a world of international commerce.
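To make the remote-operation latency point above concrete, here is a minimal back-of-the-envelope timing sketch. Every number in it is an illustrative assumption on my part (link latencies, codec overhead, the operator’s situational-awareness recovery time, and the scenario time budget), not a measured or certified figure.

```python
# Illustrative timing budget for a remote "safety operator" intervening in a
# time-critical failure. All values are assumptions for illustration only,
# not measured or certified figures.

# Assumed one-way network latencies (seconds) plus video encode/decode overhead.
LINK_LATENCY_DOWN = 0.15   # aircraft -> operator (telemetry/video)
LINK_LATENCY_UP = 0.15     # operator -> aircraft (control commands)
CODEC_OVERHEAD = 0.10      # video encode/decode + display

# Assumed human-factors term: time to notice the anomaly, regain situational
# awareness, decide, and act. Handover research puts this in the seconds to
# tens-of-seconds range; 3.0 s is a deliberately generous low-end assumption.
HUMAN_REGAIN_SA_AND_ACT = 3.0

# Assumed time budget of a time-critical scenario (e.g., an effector failure
# or obstacle at low altitude).
SCENARIO_TIME_BUDGET = 2.0

def total_intervention_time() -> float:
    """Sum the end-to-end delay from event onset to corrective control input."""
    return (LINK_LATENCY_DOWN + CODEC_OVERHEAD
            + HUMAN_REGAIN_SA_AND_ACT + LINK_LATENCY_UP)

if __name__ == "__main__":
    t = total_intervention_time()
    print(f"End-to-end intervention time: {t:.2f} s")
    print(f"Scenario time budget:         {SCENARIO_TIME_BUDGET:.2f} s")
    print("Remote intervention feasible?", t <= SCENARIO_TIME_BUDGET)
```

Even with these generous assumptions, the human situational-awareness term alone blows the scenario budget, which is exactly the handover problem described above.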
Solution
· Form a single oversight and coordination body that creates an international core framework and certification process for developing, testing, and certifying the aircraft, the human or machine pilots, the development tools and simulation, and the engineering and testing processes. (Possibly the UN?) And choose a Program Manager and Chief Engineer to lead that effort who have the proven professional and ethical fortitude not to bow to cost and schedule pressures that meaningfully degrade well-established safety goals and expectations.
· Create a progressive due diligence and V&V process that assures the proper processes, development tools, simulation, and testing rigor have been utilized prior to real-world testing, especially where humans are involved. Where real-world flight is needed to inform and validate that process, ensure it is warranted and accomplished in as safe a manner as possible. Where that involves remote control or some form of AI (to include a rules-based system), ensure that has been properly vetted.
· Regarding the V&V of autonomous systems: due to the propensity for these systems to produce errors or flawed results, largely because of the lack of inference, the process must ensure the right or expected result is not arrived at by coincidence, chance, or in error. That means ensuring all parts of the autonomous “stack” are doing as they should, including the Perception, Planning, and Execution subsystems (a minimal sketch follows below).
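To illustrate the “not right by coincidence” principle, here is a minimal, hypothetical sketch of per-subsystem V&V checking. The scenario fields, failure labels, and thresholds are assumptions for illustration only; a real stack carries far richer state than this.

```python
from dataclasses import dataclass

# Hypothetical, highly simplified record of one simulated scenario: ground
# truth plus the stack's intermediate outputs and final outcome.

@dataclass
class ScenarioResult:
    obstacle_present: bool    # ground truth: an obstacle exists
    obstacle_detected: bool   # Perception output
    planned_avoidance: bool   # Planning output: avoidance maneuver commanded
    tracking_error_m: float   # Execution: flown-vs-planned trajectory error (m)
    collision: bool           # end-to-end outcome

def validate(result: ScenarioResult, max_tracking_error_m: float = 2.0) -> list[str]:
    """Return V&V failures, checking each subsystem rather than only the outcome.

    A run that merely avoids collision still fails if Perception missed the
    obstacle or Planning never commanded avoidance; i.e., the "right" outcome
    was reached by coincidence.
    """
    failures = []
    if result.obstacle_present and not result.obstacle_detected:
        failures.append("perception: obstacle missed")
    if result.obstacle_detected and not result.planned_avoidance:
        failures.append("planning: no avoidance maneuver commanded")
    if result.tracking_error_m > max_tracking_error_m:
        failures.append("execution: trajectory tracking error out of bounds")
    if result.collision:
        failures.append("outcome: collision")
    return failures

# Example: no collision occurred, but Perception never saw the obstacle.
# The safe outcome was luck, and this run must still fail V&V.
lucky_run = ScenarioResult(obstacle_present=True, obstacle_detected=False,
                           planned_avoidance=False, tracking_error_m=0.5,
                           collision=False)
print(validate(lucky_run))  # ['perception: obstacle missed']
```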
My name is Michael DeKort — I am a Navy veteran (ASW C4ISR) and a former systems engineer, engineering manager, and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, a software project manager on an Aegis Weapon System baseline, a C4ISR systems engineer for DoD/DHS and the US State Department (counterterrorism), and a Senior Advisory Technical Project Manager for FTI to the Army AI Task Force at CMU NREC (National Robotics Engineering Center).
Autonomous Industry Participation — Air and Ground
- Founder SAE On-Road Autonomous Driving Simulation Task Force
- Member SAE ORAD Verification and Validation Task Force
- Member UNECE WP.29 SG2 Virtual Testing
- Stakeholder USDOT VOICES (Virtual Open Innovation Collaborative Environment for Safety)
- Member SAE G-35, Modeling, Simulation, Training for Emerging AV Tech
- Member SAE G-34 / EUROCAE WG-114 Artificial Intelligence in Aviation
- Member Teleoperation Consortium
- Member CIVATAglobal — Civic Air Transport Association
- Stakeholder for UL4600 — Creating AV Safety Guidelines
- Member of the IEEE Artificial Intelligence & Autonomous Systems Policy Committee
The editor of SAE’s Autonomous Vehicle Engineering magazine called me “prescient” regarding my position on Tesla and the overall driverless vehicle industry’s untenable development and testing approach — (Page 2) https://assets.techbriefs.com/EML/2021/digital_editions/ave/AVE-202109.pdf
Presented with the IEEE Barus Ethics Award for Post-9/11 DoD/DHS Whistleblowing Efforts