How eVTOL aircraft are reshaping automation in aviation

By Elan Head | December 21, 2020

On Oct. 29, 2018, the Boeing 737 Max 8 operated as Lion Air Flight 610 plunged into the Java Sea shortly after take-off from Jakarta, Indonesia, killing all 189 people on board. Four-and-a-half months later, another 737 Max 8 departing from Addis Ababa, Ethiopia — Ethiopian Airlines Flight 302 — also crashed shortly after take-off, killing 157 people.

The investigation into both crashes implicated a hidden piece of software called the Maneuvering Characteristics Augmentation System (MCAS), which was introduced during Max development to automatically counter the tendency for the aircraft’s nose to pitch up in certain flight regimes. The system as originally certified was triggered by only one of the aircraft’s two angle-of-attack (AOA) vanes; on both the Lion Air and Ethiopian Airlines flights, this vane failed, leading MCAS to repeatedly push the nose of the airplane down until the descent was unrecoverable.
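
To make the design flaw concrete, the sketch below (illustrative only; it is not Boeing's code, and the thresholds are placeholders) contrasts trigger logic that trusts a single AOA vane with logic that cross-checks both vanes and stands down when they disagree, which is broadly the approach taken in the redesigned system.

```python
# Illustrative sketch only -- not Boeing code. It contrasts a trigger that
# trusts a single angle-of-attack (AOA) sensor with one that cross-checks
# both sensors before acting. Thresholds are placeholders.

AOA_ACTIVATION_THRESHOLD_DEG = 15.0   # hypothetical activation threshold
AOA_DISAGREE_LIMIT_DEG = 5.5          # hypothetical allowed disagreement between vanes

def should_activate_single(aoa_left_deg: float) -> bool:
    """Original-style logic: one failed vane can trigger repeated nose-down trim."""
    return aoa_left_deg > AOA_ACTIVATION_THRESHOLD_DEG

def should_activate_crosschecked(aoa_left_deg: float, aoa_right_deg: float) -> bool:
    """Cross-checked logic: if the two vanes disagree, the function stands down
    rather than acting on possibly erroneous data."""
    if abs(aoa_left_deg - aoa_right_deg) > AOA_DISAGREE_LIMIT_DEG:
        return False  # sensor disagreement: inhibit automatic trim
    return max(aoa_left_deg, aoa_right_deg) > AOA_ACTIVATION_THRESHOLD_DEG

# A vane stuck at an implausibly high reading triggers the first version
# but is caught by the cross-check in the second.
assert should_activate_single(74.0)
assert not should_activate_crosschecked(74.0, 5.0)
```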

The Lion Air Boeing 737 Max, registered as PK-LPQ, crashed just 13 minutes after takeoff, taking the lives of all 189 people on board. flickr/CanAmJetz Photo

The accidents led to a 20-month grounding of the 737 Max fleet, during which MCAS was substantially redesigned before the U.S. Federal Aviation Administration (FAA) finally cleared the Max to return to service on Nov. 18, 2020. Most of the world’s scrutiny during this period fell on Boeing — which was revealed to have ignored internal warnings about the Max and pressured its engineers to limit safety testing — and the FAA, which, as the original certifying agency, had delegated many of its certification tasks to Boeing. But there were also some people who chose to blame the accident pilots, who had failed to follow the runaway stabilizer checklist that would have disabled MCAS.

In a long article for the New York Times Magazine, the aviation writer and pilot William Langewiesche derided the pilots of Lion Air Flight 610 and Ethiopian Airlines Flight 302 as “incompetent” and described both accidents as “a textbook failure of airmanship.” Langewiesche’s characterization was condemned by many experts including “Miracle on the Hudson” pilot Sully Sullenberger, who in a letter to the editor described MCAS as a “death trap” that “should never have been approved.”

Nevertheless, Langewiesche found an appreciative audience, especially among some older pilots who are concerned about the decline in manual flying skills and, perhaps, the transformation of a profession that has long been based on exclusion and mastery. Langewiesche described a visit to a Lion Air flight training academy in Indonesia where he “stood in front of a class of buzz-cut clean-shaven young recruits in white uniform shirts and narrow black neckties — the new checklist children of global aviation.” He concluded: “The situation is evidently grave.”

As one segment of the aviation industry decries a flight training model that puts inexperienced pilots into the right seats of airliners, another is busy pushing it to its logical conclusion. The emerging urban air mobility (UAM) industry, which aspires to see fleets of electric air taxis whisking passengers around crowded metro areas, would ultimately like to do away with pilots altogether. In the meantime, it is working toward a paradigm in which the industry can deploy pilots with minimal training, enabling it to recruit pilots more widely and at lower expense. To UAM proponents, pilots whose experience with flying is “scripted, bounded by checklists and cockpit mandates and dependent on autopilots” are not to be despised, as Langewiesche would have it, but instead welcomed as cab drivers of the city skies.

As the 737 Max accidents illustrate, however, this approach is problematic in aircraft designs that rely on human pilots to compensate for complex system failures — which is pretty much every aircraft certified to date. So as the UAM industry moves toward a new model for pilot training, it’s simultaneously pursuing a new approach to vehicle design, one that shifts some of the ultimate responsibility for safety of flight from human pilots onto automated systems. The name given to this combined vision is simplified vehicle operations, or SVO.

Automation pitfalls

According to the General Aviation Manufacturers Association (GAMA), “SVO is the use of automation coupled with human factors best practices to reduce the quantity of trained skills and knowledge that the pilot or operator of an aircraft must acquire to operate the system at the required level of operational safety.”

In one sense, SVO is the extension of a trend — making aircraft easier to fly — that has spanned the history of aviation. Frank Delsing, technical lead for the U.S. Air Force’s Agility Prime project to accelerate development of the commercial eVTOL industry, highlighted this in a recent Agility Prime webinar on SVO. He drew a line from the 1903 Wright Flyer, which “took a lot of work on the pilot just to keep the thing flying straight and level,” to the Lockheed Martin F-35B fighter with unified flight control, which even non-pilot journalists have been able to pick up fairly quickly in the sim. SVO takes this a step further, Delsing said, “really to the point where you start to [relax] some of the requirements for pilot qualification and currency and things like that.”

That’s because making an aircraft easier to fly does not, on its own, reduce the quantity of trained skills and knowledge that a pilot needs in order to operate it safely. The automation systems on today’s aircraft have generally been certified with the assumption that, if they fail, the pilot can take over to fly the plane manually. That requires pilots to be as skillful as ever, but also more knowledgeable, able to instantly comprehend the nature of an automation failure and respond appropriately.

The problem is that humans as a species aren’t particularly good at managing automation. A 2019 MITRE Corporation technical report on automation-related accidents in aviation identified recurring problems including “poor vigilance, skill degradation, trust in automation, and complacency.” According to the report, most human factors experts agree that pilots are “perceptually and cognitively ill-suited” to monitor automation for long periods of time, and monitoring errors are likely. Moreover, when automated systems experience technical failures, rarely do they make it clear to the pilot exactly what has gone wrong.

In the 737 Max crashes, there was no message to tell pilots that their AOA vane had malfunctioned and MCAS had activated. Due to a Boeing software error, the warning light that would have alerted both crews to a disagreement between the aircraft’s two AOA vanes wasn’t working; it was functional only for operators who had paid to include the AOA reading on the primary flight display. And Boeing didn’t even tell 737 Max operators that MCAS existed until two weeks after the Lion Air crash.

According to a safety recommendation report by the U.S. National Transportation Safety Board (NTSB), the pilots in both accidents were presented with an array of confusing signals, including stick shaker activation and airspeed and altitude disagree alerts, all while MCAS was repeatedly pushing the aircraft’s nose down. “Neither Boeing’s system safety assessment nor its simulator tests evaluated how the combined effect of alerts and indications might impact pilots’ recognition of which procedure(s) to prioritize in responding to an unintended MCAS operation caused by an erroneous AOA input,” the NTSB said.

So, the accident pilots had a puzzle to solve at the exact moment they were dealing with a misbehaving airplane, and as it happens, humans aren’t very good at thinking under stress, either. A 2005 NASA paper cited by the NTSB noted that “in high workload situations, crew errors and less-than-optimal responses often can be linked directly to inherent limitations in human cognitive processes. These are limitations all humans experience when faced with threat, are under stress, or are overloaded with essential tasks. Even simple things can be easily overlooked in non-normal situations.”

The Boeing 737 Max was certified on the assumption that its pilots would respond correctly within four seconds to any failure of its MCAS. Two fatal accidents revealed that assumption to be deeply flawed. Matthew Thompson/Boeing Photo

In his letter to the editor of the New York Times Magazine, Sully Sullenberger pointed out that he knows “a thing or two about overcoming an unimagined crisis.” He cited his own experience replicating the 737 Max accident flights in a Level D full motion simulator to affirm that the emergencies did not present as a classic runaway stabilizer problem, as Langewiesche had argued, but initially as ambiguous unreliable airspeed and altitude situations that masked the role of MCAS. “I know firsthand the challenges the pilots on the doomed accident flights faced,” he wrote, “and how wrong it is to blame them for not being able to compensate for such a pernicious and deadly design.”

Sharing responsibility

Could a “better airman,” in Langewiesche’s words, have recovered the doomed Max flights? Perhaps — but counting on exceptional airmanship is not an approach that has served the aviation industry reliably at any point in its history. And it’s certainly not a viable one for the urban air mobility industry, which is explicitly trying to get away from expensive and frequent flight training. Although simplified vehicle operations are based on automation, it is clear that the automation paradigm in the 737 Max won’t suffice.

“In many ways, this is flipping the script on how we have traditionally approached autonomy and high levels of automation,” explained Anna Mracek Dietrich, a UAM leader and co-founder of the Community Air Mobility Initiative, during the Agility Prime SVO webinar. “Instead of just saying, ‘Oh well, it will be OK, we’ll just dump it on the human,’ how do we avoid that? How do we create a system where you [don’t have] a tired, overwhelmed pilot that now all of a sudden has a failure condition to worry about?”

The approach being taken so far, according to GAMA, is “to deconstruct the functions that pilots are trained to accomplish today, and to recognize that some of these functions may be more efficiently/reliably executed by an automated or autonomous system while other functions may be extremely difficult or impractical to automate in all desired operating situations.” GAMA has identified 13 pilot skill categories, ranging from aircraft handling and navigation — where automated systems are demonstrably superior — to decision making, where humans still have an edge.

Current regulations enshrine the concept of a “pilot in command” (PIC) who “is directly responsible for, and is the final authority as to, the operation of [an] aircraft.” Today, a PIC must be proficient in all 13 pilot skill categories, but in SVO, pilot proficiency will be required in only some portion of them — the aircraft will be responsible for the rest.

“If the human is responsible, s/he must be given the appropriate training; if the automation is responsible for the safe execution of the function, it must be certified to achieve a level of reliability that is better than the comparable human PIC performing the function,” GAMA explains.
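
One way to picture GAMA's framework is as a function-by-function allocation table. The sketch below is a loose illustration; the category names, reliability targets, and data structures are placeholders rather than anything GAMA has published.

```python
# Illustrative sketch of SVO function allocation -- category names and numbers
# are placeholders, not GAMA's actual taxonomy or certification targets.
from dataclasses import dataclass
from enum import Enum

class Responsible(Enum):
    HUMAN = "human"            # pilot must be trained and proficient in this function
    AUTOMATION = "automation"  # system must be certified more reliable than a human PIC

@dataclass
class PilotFunction:
    name: str
    responsible: Responsible
    # For automated functions: the demonstrated failure rate per flight hour that
    # certification would have to show beats comparable human performance.
    required_failure_rate: float | None = None

# A hypothetical allocation for an SVO aircraft:
allocation = [
    PilotFunction("aircraft handling", Responsible.AUTOMATION, 1e-9),
    PilotFunction("navigation", Responsible.AUTOMATION, 1e-7),
    PilotFunction("communication", Responsible.HUMAN),
    PilotFunction("decision making", Responsible.HUMAN),
]

# Training scope for the SVO pilot is just the human-allocated functions.
training_syllabus = [f.name for f in allocation if f.responsible is Responsible.HUMAN]
print(training_syllabus)  # ['communication', 'decision making']
```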

One way to achieve that level of reliability is through the design of redundant systems. “This is not new stuff — this is just old-school, solid systems engineering stuff,” said Carl Dietrich, the co-founder of eVTOL developer Jump Aero and chair of GAMA’s SVO subcommittee. “But the issue is it’s expensive, right? You don’t get super reliable systems that fail only once per billion flight hours cheaply.”

Historically, he explained, airframe makers have opted for automated systems that fail back onto the pilot because they’re simpler and cheaper. As he put it: “We could either add two more layers of sophistication, or just say, ‘Hey, I detected there’s a problem, I’m going to switch off automatically and the pilot will take over.’ . . . The pilots have to be qualified in order to fly the plane anyway, so we’ll just do that.” Automation suitable for SVO will necessarily require more engineering investment upfront.
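
The arithmetic behind that trade-off is simple, which is part of why the fallback-to-pilot shortcut has been so tempting. The back-of-the-envelope sketch below uses made-up per-channel failure rates to show how independent redundant channels, rather than any single gold-plated component, are what move a system toward the one-in-a-billion-flight-hours target.

```python
# Back-of-the-envelope redundancy math -- the per-channel failure rate is
# illustrative, not real certification data. With truly independent channels,
# the probabilities of simultaneous failure multiply, which is how a system
# built from modestly reliable parts can approach a 1e-9 per-hour target.

single_channel_rate = 1e-3  # assumed probability of one channel failing in a flight hour

def simultaneous_failure_probability(per_channel: float, channels: int) -> float:
    """Probability that all independent channels fail within the same flight hour."""
    return per_channel ** channels

for n in (1, 2, 3):
    p = simultaneous_failure_probability(single_channel_rate, n)
    print(f"{n} channel(s): ~{p:.0e} per flight hour")
# 1 channel: ~1e-03, 2 channels: ~1e-06, 3 channels: ~1e-09 -- assuming no
# common-mode failures, which in practice is the hard (and expensive) part
# Dietrich is pointing at.
```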

As more sophisticated, nondeterministic autonomy systems are introduced onto aircraft, they could be tied to safety monitors that would revert the aircraft to a more deterministic mode of operation in the event of a system failure. In a recent opinion piece for eVTOL.com, Anna Dietrich and Erin Rivera, an aviation lawyer at Fox Rothschild, gave the example of a computer vision system failing on an eVTOL aircraft, making it unable to verify the safety of an unimproved landing area. Instead of suddenly asking the human pilot to perform a complex and risky off-airport landing — the model for automation failures today — the aircraft would use a backup system to land at a known helipad or runway using instrument landing procedures and navigation aids.
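
This is essentially a run-time assurance pattern: a simple, certifiable monitor wraps the more capable but less predictable function and degrades to a deterministic fallback when the primary cannot be trusted. The minimal sketch below illustrates the Dietrich and Rivera example; the class names, health check, and fallback procedure are hypothetical, not any vendor's API.

```python
# Minimal run-time assurance sketch of the landing-site example -- the classes,
# health check, and fallback procedure are hypothetical.
from dataclasses import dataclass

@dataclass
class LandingPlan:
    site: str
    procedure: str

class VisionLandingSystem:
    """Primary, non-deterministic function: clears an unimproved site by camera."""
    def healthy(self) -> bool:
        return False  # e.g. camera blinded or perception confidence too low
    def plan(self) -> LandingPlan:
        return LandingPlan(site="unimproved field", procedure="visual vertical landing")

class InstrumentFallback:
    """Deterministic fallback: divert to a surveyed helipad or runway."""
    def plan(self) -> LandingPlan:
        return LandingPlan(site="known helipad", procedure="instrument approach")

def select_landing_plan(primary: VisionLandingSystem, fallback: InstrumentFallback) -> LandingPlan:
    # The safety monitor never hands the failure to the pilot; it degrades to a
    # simpler, fully deterministic mode instead.
    return primary.plan() if primary.healthy() else fallback.plan()

print(select_landing_plan(VisionLandingSystem(), InstrumentFallback()))
# LandingPlan(site='known helipad', procedure='instrument approach')
```

The appeal of this pattern is that the certification burden concentrates on the monitor and the fallback, which are simple enough to analyze exhaustively, rather than on the perception system itself.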

Because SVO will change the very definition of a pilot, it will change how pilots are trained, too. This was affirmed during the Agility Prime webinar by FAA test pilot David Sizoo, who described an “inextricable link” between aircraft and pilot certification. “We have to change our rules in aircraft cert, and flight standards needs to change the way they certify airmen,” he said. “If the aircraft has certain functions that normally the pilot would do, but now the system is taking care of that, you don’t need to, you shouldn’t have to demonstrate those same functions in the PTS [practical test standards].”

To Anna Dietrich, this ability to connect pilot training and operational requirements with vehicle certification presents a unique opportunity. “Because we can work with the FAA to adjust both pilot training requirements, operational requirements, and aircraft or system certification requirements, we have the ability to trade risk back and forth between these different pieces,” she said. “And I think that’s really powerful and . . . what’s going to enable us to really realize some of the potential benefits of these systems going forward.”

Orders of magnitude

Broadly speaking, the potential benefits of SVO fall into two buckets: economics and safety.

Unlike conventional aircraft, most eVTOL aircraft have multiple distributed electric motors and propellers — far too many for a human pilot to control without the aid of a fly-by-wire flight control computer. Volocopter Photo

On the economics side, pilots of any kind are not optimal for the urban air mobility business case; the consulting firm McKinsey and Company estimates that the cost per passenger-seat-kilometer of a piloted UAM flight could be up to twice the cost of an autonomous one. But McKinsey also points out that fully autonomous air taxis are probably a decade or more in the future due to technology issues, regulatory concerns, and the need to gain public acceptance. In the meantime, the industry will need pilots.

According to McKinsey partner Robin Riedel, SVO has the potential to reduce costs associated with pilot salaries as well as pilot training. “Today, we source pilots by telling them, ‘You have to go through a commercial pilot course, it’s probably going to take you two years, it’s probably going to cost you $80,000 to $150,000,’” he said in the Agility Prime webinar. “As we use SVO and we make it less of a specialized profession that needs two years of training up front that people invest in, we can probably . . . lower pilot [salaries] and open this up to other demographics of people.”

Riedel predicted that making it easier to source pilots will become even more important later in the decade, when growth in UAM will coincide with other factors likely to exacerbate a pilot shortage. “When we’re starting to see scale in [UAM], we will probably be back in a growth rate for airlines and business aviation, so there’s going to be a bit of a [pilot] shortage just from that,” Riedel said. Meanwhile, attrition associated with the COVID-19 pandemic will further challenge the broader aviation industry’s ability to fill pilot seats as business recovers, he added.

That economic case is the driver that will justify the engineering investment in SVO, but slashing pilot salaries and training costs is not a strategy that has tended to yield safety benefits in the past. Nevertheless, SVO proponents believe that not only will their approach save lives, it may be the only approach that can deliver large and lasting improvements in aviation safety.

“It’s the same in the rotorcraft and in the fixed-wing world that the leading cause of fatal accidents is simply loss of control, and we fail at that task at about once every 100,000 hours,” Carl Dietrich said. “That’s about our failure rate, it’s pretty well established. And we haven’t managed to get better in a meaningful way.”

He continued: “In essence, part of what we’re doing with SVO is recognizing we’re not going to get an order of magnitude better if we continue to rely on humans to do the stick-and-rudder control skills. We’re not going to address this loss of control accident problem that we have today in any meaningful way. If we want to meaningfully address that problem, we need to have the computers start to really fly the aircraft and use envelope protection routines as part of our SVO scheme so that the human being literally can’t put the aircraft into an uncontrolled situation.”
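
At its simplest, envelope protection is a hard clamp between what the pilot commands and what the flight control computer will actually fly. The toy sketch below invents its own limits and attitude-command interface for illustration; real implementations also bound angle of attack, load factor, airspeed, and energy state, with far more care.

```python
# Toy envelope-protection sketch -- limits and the attitude-command interface
# are invented for illustration, not taken from any certified control law.
PITCH_LIMIT_DEG = 25.0
BANK_LIMIT_DEG = 60.0

def clamp(value: float, low: float, high: float) -> float:
    return max(low, min(high, value))

def protect(pitch_cmd_deg: float, bank_cmd_deg: float) -> tuple[float, float]:
    """Whatever the pilot asks for, the commanded attitude never leaves the
    protected envelope, so the aircraft cannot be upset through this channel."""
    return (
        clamp(pitch_cmd_deg, -PITCH_LIMIT_DEG, PITCH_LIMIT_DEG),
        clamp(bank_cmd_deg, -BANK_LIMIT_DEG, BANK_LIMIT_DEG),
    )

print(protect(80.0, -170.0))  # (25.0, -60.0): an aggressive input is limited
```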

Of course, there is SVO in theory, and there is SVO in practice, and it remains to be seen how well it will deliver on its safety promise. Given the FAA’s much-maligned role in certifying MCAS in the 737 Max, it’s also fair to question the agency’s SVO certification process. Can we be sure that it won’t overlook similarly “pernicious and deadly” designs?

According to Carl Dietrich, we can’t — but that’s not a knockout argument against the concept. “The design and development of these systems can be fraught with error,” he acknowledged. “I would never say that this will eliminate all accidents. There probably will be some instances early on where there are failures, where the system failed and it made it through the cert process. . . . There’s nothing that we can do that makes it impossible.”

The difference, he said, is that any such failures can be fixed in ways that prevent them from ever happening again, unlike solutions that rely on recurrent training of human pilots.

“The biggest potential benefit that can come from this shift is that we are developing systems that will never make the same mistake twice” — in contrast to humans, who are prone to making the same mistakes repeatedly. “There are definitely more proficient pilots and there are less proficient pilots, but we know what that spectrum looks like. And we need to do better,” Carl Dietrich said.

Tomorrow’s pilots

Whatever the hurdles to realizing this vision, “we wouldn’t be even talking about automation if we didn’t have already the tools [that] we need to fully protect the pilot: from running themselves into the ground, into a building, from running out of state of charge,” Joby Aviation chief test pilot Justin Paines said at the Vertical Flight Society’s Transformative Vertical Flight Conference earlier this year. A former Royal Air Force Harrier pilot who helped develop the unified flight control strategy on the F-35, and who is now leading flight testing of Joby’s five-seat eVTOL air taxi, Paines is very much in favor of fully protecting the pilot from their own error, including in failure conditions.

“There was some talk after the 737 Max. They said, ‘Oh, we need to give more back to the pilot, we need to limit automation,’” Paines remarked. “Absolutely the wrong thing to do. We just need to avoid partial automation. If we can avoid partial automation, we can fully protect the pilot — we have the technology to do it today.”

Like systems and components, human pilots can be thought of as having a “failure rate.” For some tasks, automated systems have a lower failure rate than their human counterparts. iStock/MatusDuda Photo

In fact, SVO is not only possible for eVTOL aircraft, it is necessary. In his New York Times Magazine article, Langewiesche declared: “Airplanes are living things. The best pilots do not sit in cockpits so much as strap them on.” That may be true, but eVTOLs are a different beast. The core innovation that enables them, distributed electric propulsion — multiple electric motors synchronized electronically rather than mechanically — also makes most of them impossible to control manually. Only a fly-by-wire flight computer can manage the dozens of tiny adjustments per second necessary to keep them behaving, particularly in the vertical mode and during transitions to and from it. Any pilot who straps into an eVTOL aircraft will already be ceding a large portion of his or her authority to automation.

“You’ve got eVTOL aircraft that have 18 motors,” noted Carl Dietrich, pointing out the impracticality of creating direct flight control laws for that many effectors. “It would be basically impossible for a human being to fly it, so what’s the point of having a direct law, when we all know that the computer is actually the thing that’s flying the aircraft?”
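
What the computer is actually doing in that architecture is control allocation: turning a handful of high-level commands into individual thrust commands for each motor, recomputed many times per second. The heavily simplified sketch below assumes a hypothetical ring of 18 motors; the geometry, gains, and command interface are placeholders, not any real eVTOL's control law.

```python
# Heavily simplified control-allocation sketch -- motor geometry, gains, and
# the command interface are placeholders, not any real eVTOL's control law.
import math

N_MOTORS = 18  # hypothetical ring of 18 lift motors

def allocate(thrust_cmd: float, roll_cmd: float, pitch_cmd: float) -> list[float]:
    """Map three high-level commands to individual motor thrust commands.
    Motors around the ring are trimmed up or down to produce the requested
    rolling and pitching moments on top of the collective thrust."""
    commands = []
    for i in range(N_MOTORS):
        angle = 2.0 * math.pi * i / N_MOTORS   # motor position around the ring
        cmd = (
            thrust_cmd
            + roll_cmd * math.cos(angle)   # differential thrust left/right
            + pitch_cmd * math.sin(angle)  # differential thrust fore/aft
        )
        commands.append(max(0.0, cmd))     # motors can't produce negative thrust
    return commands

# Recomputed continuously by the fly-by-wire computer; no human could
# coordinate 18 separate thrust levers fast enough to do this by hand.
motor_cmds = allocate(thrust_cmd=0.6, roll_cmd=0.05, pitch_cmd=-0.02)
print([round(c, 3) for c in motor_cmds])
```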

Langewiesche’s article can be read as a lament for an era that, culturally and technologically, is already slipping away: one in which the pilot’s status as master of his aircraft was so secure that it was only natural to blame him when things went wrong. The compensation for this weighty responsibility was an enviable level of pay or, at least, prestige, and the unmatched thrill of physically commanding a flying machine thousands of feet over the ground. These are things that many professional pilots will feel the loss of if and when the industry transitions to SVO, even if they fully appreciate the business and safety case behind it.

For whatever is lost, though, there will also be much that is gained if SVO succeeds in enabling more people to fly, more safely than ever before. In an interview with eVTOL.com last year, Paines observed that “there’s something that grabs the imagination” about vertical flight in particular, which he suggested might account for the explosive growth of the UAM industry in recent years.

“There’s something special about taking off like a butterfly and alighting like a bumblebee, and soaring over all the traffic stuck below,” he said. “So there’s an emotional attraction to this industry which I think will keep pilots flying Joby airplanes, even when they might have other options.”
