Les Abend recently retired after 34 years as a Boeing 777 captain for American Airlines. He is a CNN aviation analyst and senior contributor to Flying magazine.
For those who remember the original 1973 movie “Westworld,” the possibility of machines going rogue can seem close to reality. Yul Brynner played the robot Gunslinger, who started shooting humans after an unknown computer virus caused a program breakdown.
Could a version of such human vs. machine science fiction happen with today’s highly technological, automated airplanes?
No doubt, the continuing saga of Boeing’s 737 Max tragedies has raised such a concern, creating angst for anyone about to climb aboard a commercial jetliner. After two crashes in five months claimed the lives of 346 people, Boeing acknowledged the role of the 737 Max’s MCAS (Maneuvering Characteristics Augmentation System), which automatically lowers the nose to prevent an aerodynamic stall. Boeing has grounded the 737 Max and is cutting the plane’s production rate as the company works to return it to flight.
But the nagging question remains: Can I trust that the airplane I am about to fly is not going to override its pilots?
Indeed, in some ways the 737 Max seems to have experienced a Westworld-like situation — although the flaw that overtook the planes was not a computer virus. According to the preliminary accident investigation report, it appears a design flaw contributed to the Ethiopian Airlines crash in March.
Let me explain. The larger and more powerful engines of the 737 Max had to be mounted farther forward on the wings. As a result, at lower speeds the airplane has a greater tendency to pitch the nose up when pilots step on the gas. A higher nose pitch could lead to an aerodynamic stall, meaning the airplane would no longer have enough lift to fly.
To prevent the airplane from stalling, Boeing designed the MCAS, which pushes the nose down automatically without input from pilots. This system is only active when pilots are hand-flying the airplane.
But if MCAS receives a bad signal from a faulty sensor that the airplane is stalling when it’s not, the nose keeps pitching toward the ground, overriding the pilot’s input to do otherwise. Unfortunately, it only takes one faulty sensor to activate the system, and that’s a major problem.
This is unprecedented for Boeing. I have flown this manufacturer’s airplanes for over three decades; its systems (electrical, hydraulic, etc.) always have at least one additional backup in case of failure. And none of those systems takes complete control away from pilots. We call it redundancy in the airline world.
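To see why redundancy matters here, consider a simple illustration. The sketch below is not Boeing’s actual MCAS logic and the names and thresholds are hypothetical; it only shows the general idea of cross-checking two sensors so that a single faulty reading cannot, by itself, trigger an automatic nose-down command.

```python
# Illustrative sketch only -- NOT Boeing's actual MCAS implementation.
# Demonstrates the redundancy principle: cross-check two angle-of-attack
# (AOA) sensors before allowing an automatic response.

STALL_THRESHOLD_DEG = 15.0    # hypothetical angle-of-attack limit
MAX_DISAGREEMENT_DEG = 5.5    # hypothetical cross-check tolerance

def should_command_nose_down(aoa_left, aoa_right):
    """Return True only if both sensors agree the angle of attack is high."""
    if abs(aoa_left - aoa_right) > MAX_DISAGREEMENT_DEG:
        # Sensors disagree: distrust the data and leave control
        # with the pilots rather than act on a possibly bad signal.
        return False
    return min(aoa_left, aoa_right) > STALL_THRESHOLD_DEG
```

With this kind of cross-check, one sensor falsely reading a stall (say, 20 degrees against a healthy sensor’s 2 degrees) would be ignored instead of pushing the nose toward the ground.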
Now it appears that before the Ethiopian Airlines crash, the pilots tried to rectify the erroneous activation of MCAS with a checklist procedure that Boeing had recommended to all airlines after investigators found that similar circumstances caused the Lion Air crash back in October. But the checklist procedure failed to save the airplane, a contention I made early in the field investigation.
In my opinion, fixing this catastrophic flaw will require not only a software update to revise how MCAS reacts, but also a hardware update to correct the lack of redundancy when only one faulty sensor can cause such tragedy.
Getting back to the original question of whether pilots will henceforth have to battle modern-day airplanes and their automated systems for control: The answer should be a resounding no. Although manufacturers have different philosophies, airplanes are designed to work with pilots, not against them. That said, the systems have to be designed properly: If they fail, pilots must always have the capability to land the plane safely without the automation.
For the past couple of decades, my colleagues have often discussed whether automation degrades basic pilot skills. To be sure, automation has become increasingly reliable, and pilots have grown increasingly dependent on automation features because they are incorporated into standard airline operations.
After a long red-eye, for example, many older and more experienced pilots find it safer to use the autopilot for the approach rather than hand-flying the airplane all the way to touchdown on the runway.
Additionally, aspiring airline pilots are training with flight schools at unprecedented rates, and students may be funneled quickly to major airlines.
The training is intense and usually specific to modern-day, automated cockpit technology. It’s a necessary part of the learning process, but it affords only a relatively short period to focus on basic flying skills. Once a pilot trainee reaches the minimum qualifications required by the airline, she or he can work as a co-pilot on a regional jet, becoming more immersed in the world of cockpit automation.
It should be incumbent upon the airline to design training that allows pilots the opportunity to maintain proficiency in hands-on flying. Some airlines have already incorporated such training into their recurrent curricula. For example, before I retired from my airline last year, my last three recurrent training sessions in the 777 simulator involved a scenario of taking off, flying a visual traffic pattern, and landing back on the runway with no automation aids, no auto throttles and no autopilot, similar to the practice I performed as a student pilot in small airplanes.
Although seemingly simple, the scenario involves basic flying skills that every pilot should practice.
Regardless, we have to maintain the highest standards in safety while accepting the fact that humans design these machines. And humans make mistakes, whether they are pilots or aerospace engineers. Let’s allow the Federal Aviation Administration and Boeing time to ensure that the 737 Max can be flown safely.
In the meantime, rest assured that your pilots are not in a perpetual struggle of man versus machine.