Several pilot errors this year have called aviation safety in the United States into question. Two of the most high-profile were the near collision of a Southwest 737 and a FedEx plane as the two prepared to take off from and land on the same runway at the same time, and the American Airlines Boeing 777 that crossed in front of a Delta 737 that was taking off. Both were near misses.
While we need improvements in our air traffic control system to avert such disasters, do we also need better technology? Do pilots have too much control over the aircraft?
There were design flaws in the Boeing 737 MAX's automation (MCAS) that contributed to two crashes, but those flaws were ultimately about an inability to compensate for handling mistakes by cockpit crews and maintenance mistakes by the airlines. Automation isn't inherently bad, and Airbus continues to push the envelope of automation to the point that there are real discussions about a future for autonomous flight: zero-pilot cockpits.
There’s been real progress in automation that enhances safety. I’ve written about Garmin’s Autoland technology, which is certified by the FAA and – with a single button press by any passenger on board – will navigate to the nearest suitable airport, land the plane, and bring it to a stop autonomously, all while communicating with air traffic control. While 50% of men think they could land a plane if their pilots became incapacitated, there’s now technology that means they wouldn’t have to.
Transportation policy analyst Robert Poole suggests that Airbus efforts could “succeed in enabling increased automation for cargo and passenger airlines” and reduce the number of pilots necessary in the cockpit for commercial airline operations, “perhaps with a remote pilot on the ground monitoring some number of single-pilot flights.”
Airbus ran a two-year project called Autonomous Taxi, Take-off, and Landing (ATTOL), which finished in mid-2020. It demonstrated fully automated operations using onboard image recognition in an A350 wide-body airliner. The follow-on project, called DragonFly, aims to demonstrate automated operations under any environmental conditions. It has added the capability to select an emergency diversion airport, taking into account weather, flight zones, terrain, etc.
The European Union Aviation Safety Agency (EASA) has issued a concept paper on the use of artificial intelligence for autonomous flight operations. Already in both the EASA and FAA certification processes is an AI-based visual landing system (VLS) developed by Swiss startup Daedalean…
Zero-pilot operations are, of course, planned by eVTOL developer Wisk from the get-go, and are in the longer-range plans of many of the other eVTOL startups such as Archer, Joby, and Lilium. But much closer to being operational are projects being developed for conventional take-off and landing cargo aircraft.
There will be a time when AI is safer than a human co-pilot. It’s not clear whether that’s 2025, 2030, or later, but achieving it is inevitable.
As Bob Poole points out, commercial airliners used to have as many as five crewmembers in the cockpit (captain, first officer, flight engineer, navigator, and radio operator), and this was reduced to three by the early 1960s. Eastern Airlines flight engineers went on strike in response to new aircraft that no longer required them.
Pilot unions will fight against this, and as for replacing pilots today, there’s probably no one willing to push that button. But technology will arrive that makes travel safer by replacing a human, at least as co-pilot, even if full automation is farther off. And at that point pilot unions will be lobbying against safety.
After the Germanwings pilot-suicide incident, absolutely not. I’m more comfortable with two individuals in the cockpit.
@Court – that crash was caused by the co-pilot!
No Way.
As someone who pioneered AI within the nonprofit tech space, there is no way AI should be allowed to be a co-pilot, even with someone else on the ground.
The reason? Because AI can’t react to brand new situations it hasn’t been trained for. Humans can.
You have to have multiple redundancies. Having said that, I have no problem testing with both pilots still there *AND* an AI/person on the ground.
But folks, we have a basic stat problem. Since the # of errors/catastrophe rate per 10 million flights is so low, it’s going to take millions or more flights- to get to even a 95% confidence interval, let alone 99% or more that is likely needed to say flying with a pilot and AI/person on the ground is safer than two+ pilots.
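The sample-size problem above can be sketched with the statisticians’ “rule of three”: if zero accidents are observed in n independent flights, the one-sided 95% upper confidence bound on the accident rate is roughly 3/n. A minimal illustration in Python, where the target rate of 1 per 10 million flights comes from the comment above and treating flights as independent trials is a simplifying assumption:

```python
import math

# Rule of three: observing 0 accidents in n flights gives an approximate
# one-sided 95% upper confidence bound of 3/n on the accident rate.
target_rate = 1 / 10_000_000           # 1 accident per 10 million flights

# Accident-free flights needed to bound the rate at target_rate:
n_95 = 3 / target_rate                 # 95% confidence (3 ≈ -ln(0.05))
n_99 = -math.log(0.01) / target_rate   # 99% confidence, exact form -ln(alpha)/p

print(f"95%: {n_95:,.0f} flights, 99%: {n_99:,.0f} flights")
```

That works out to roughly 30 to 46 million accident-free flights just to match today’s demonstrated safety level, which supports the point that proving a new crewing model statistically safer would take an enormous amount of flying.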
Having witnessed (with my own son!) a mid-air emergency where the US Airways “on-call” doctor was supposed to be reachable in an emergency, and was not, I am very skeptical (even with LEO (Low Earth Orbit) satellites that promise better connections).
-Jon
Otto the Autopilot, or is it HAL? There are possibilities but also issues. You’ll need relief flight deck members for long-distance travel, and on shorter hauls there is always the chance that one person will become incapacitated. Sure, the system could probably take off, fly, and land an aircraft without a human touching the controls… until there is a solar flare, a malicious hack, or an onboard multiple-system failure and ground contact is lost. At that point, even with one person flying, the workload could become extremely high, especially if there is bad weather involved.
Eventually we will move in this direction, though I can’t imagine where we’ll get enough trained people. Even cutting the license requirement down from the arbitrary 1,500 hours, it still takes a lot of investment, and not everybody can find work as a flight instructor just to build up time, nor is the military likely to supply enough bodies either.
Don’t worry, pilots will put themselves out of a job asking for the rates they want these days.
Germanwings and Malaysia show that you always need a second pilot in the cockpit during flight and a second person present when either pilot leaves the cockpit. While you might take issue with my inclusion of Malaysia, it’s extremely probable that the pilot crashed that plane intentionally.
I’m aware that there are a number of pilots who read this column, so they will have a far better view than I. That said, as mentioned already, the issue of incapacitation (typically a medical emergency) makes the idea of AI as an assist to a single pilot dead on arrival. It seems to me that we would need two things: 1) AI to offload a lot of tasks from the single pilot; and 2) A proven system in place where remote pilots could step in instantly, and fly (and land) the plane remotely. The military does this successfully with drones, so it’s not out of the realm of reason. Bottom line: I’m unable to conceive of any other way to accept the idea of a single pilot flying the aircraft, given the potential for medical emergencies.
Private pilot and professional software engineer who has worked as such at an aerodynamics testing facility here. We are decades from autonomous flight being at the level of safety of real pilots in the flight deck with automation helping them out.
And the FedEx/Southwest incident was primarily controller error, though it is odd that neither crew rejected the instructions.
Also, Airbus has had their share of serious automation problems, it’s just that theirs didn’t manage to kill anyone primarily because of pilot skill and judgment combined with luck. An A340 over the North Atlantic entered an uncommanded extreme climb that would have taken it right through the A330 1,000 feet above it had the pilots not chosen to fly with a lateral offset from their assigned track. The A340 blew through the A330’s altitude so quickly that the A330’s TCAS was still telling them to climb for a bit after the A340 was already above them. In another incident, a Qantas A330 entered two different extreme uncommanded dives, the first of which threw people in the cabin against the ceiling, injuring several of them. Had this happened at the same altitudes as the 737 MAX accidents instead of at cruise altitude, it very well could have had the same end.
It is not by accident that mainline U.S. airlines have had a grand total of 1 passenger death since 2002 due to an aviation-related incident (as opposed to a medical emergency) and that one didn’t have anything to do with pilot error, but rather a mechanical failure. And, even when including the regionals, there have been only 2 since 2010.
Automation is great, but it is great at assisting pilots (and controllers), not at replacing them. As for the “push a button to land at nearest suitable” feature, we’ve had the technology to do that for something like 40-50 years now. Autoland has been around for quite a long time, and FMSs with sufficient information to find and route to the nearest suitable airport have been around for a similar length of time. Automation can land a plane just fine – and better than a human – when everything is going as expected. It’s when things are not going as expected that highly-trained pilots earn their pay. Computers are very good at dealing with completely (or at least mostly) predictable situations that we can program them to handle according to strictly-defined conditions. They are much worse at dealing with unpredictable situations, especially with no oversight from trained humans.
And two pilots are much, much safer than one. Pilot incapacitation events (such as the recent one in the single-pilot jet over NYC/DC/Virginia) are certainly one reason for that, but the bigger reason is CRM. When flying in challenging conditions – especially in situations where things aren’t going as expected – you want one pilot to be able to focus completely on flying while the other runs checklists, communicates with ATC, and otherwise works whatever problems may be going on. And, of course, people make mistakes… which the second pilot is very likely to catch and correct. While a remote second pilot would be better than none at all, it wouldn’t work nearly as well as a two-pilot crew working together in the flight deck.
Additionally, as for the remote (either secondary or primary) pilot, any system that can be controlled remotely… can be controlled remotely. Hackers driving airplanes full of passengers into buildings from their basement on the other side of the planet is not ideal. Or rogue states capturing an airliner and holding the occupants hostage. This is not a theoretical problem. We’ve already had at least one known incident of Iran hijacking a remotely-controlled military aircraft and forcing it to land at an airport of their choosing. And that aircraft was undoubtedly secured much better than civilian aircraft would be.
I’m not a fan of unions (in any industry), but having two pilots in the flight deck assisted by automation is much safer than automation alone (or than one pilot plus automation) regardless of what one may think about unions. And this is unlikely to change for at least decades. Just because some language model can rip off some text from websites where experts have written about various topics and try to associate it with a question someone asks (often unsuccessfully) does not mean that we suddenly have general AI that can problem-solve as well as a human.
Wrong question. For the cruising phase of flight, AI has already replaced both the pilot and the co-pilot. As AI improves, it will continue to replace both for more phases of flight, until it replaces the pilot entirely, leaving a human co-pilot to do the checklist before flight and deal with any weirdness, like passengers in the back acting up.
@Gary – why do you hate pilots so much? Did you dream of being a pilot and have the plan crash due to medical issues? Does it pain you that pilots are doing well nowadays? Why didn’t you post anything about this 3-4 years ago when pilots were not such “hot shit”?
Oh, and GERMANWINGS was NOT caused by the co-pilot; it was ACTUALLY caused by a mentally ill person, which makes this more inevitable if there’s a SINGLE-pilot operation.
Sorry Gary. I feel bad for you for posting these endless pilot-bashing posts. Kickbacks from airlines?
Airlines would love to do away with a second pilot, both to save money and to weaken union bargaining power. As a passenger I would be much more comfortable with a copilot on the flight deck. Add remote assistance to enhance safety, but leave the second pilot in place.
@ Gary… it is immaterial which pilot became suicidal; when there is no one else on the flight deck, one deranged pilot leaves no margin of safety.
@vbscript2 Love your perspective! My only fear is that since I programmed in vbscript, you are an actual AI language model programmed to respond to me! 😉 😉
Seriously though, that’s why I use my real name in these posts. Somehow, I really still enjoy flying and everything it means.
-Jon
From what I see of automated driving, automated flying is still a ways off. The skill level needed for flying passengers is way higher than that needed to drive automobiles. I have no major objection to it when it is finally ready for prime time. It will be at least as good as undertrained pilots and copilots. How will it do with shoddily repaired airplanes? Remote flying of drones has changed warfare, but a drone going down is regrettable, not a tragedy.
I’ll never fly with a single commercial pilot and automation.
Did you ever try to make a phone call but couldn’t complete the call due to some technical issue? Access and connection issues happen every day with most technologies.
I’ll be sticking with the 2 pilot model.
@Gary,
That tragedy was caused by one of the 2 pilots, the fact that he was the First Officer (you should know better than to call him the “co-pilot”, about as meaningless a term in the industry as using the term “tarmac”) is inconsequential to the question you pose.
Until technology makes it impossible for a single human to intentionally crash a plane, that’s a hard no.
Gary: this comment is idiotic even for you: “@Court – that crash was caused by the co-pilot!”
And if that pilot had been the pilot in a single-pilot operation? Guess what: a crash.
Or is it your contention that only “co-pilots” (whatever that is) have risk of mental illness?
No way single-pilot commercial operations are getting approval anytime soon. Nor should they.
Do you really expect anyone to take the dreck you post here seriously?
If Mr. Leff could see what airline pilots see every day he would understand how ridiculous the idea of single pilot or pilotless airliners really is. Maybe 100+ years from now, but for the foreseeable future large airports are way too dynamic and busy for AI to cope with.
We’re more likely to see remotely piloted flights by pilots working from home than AI pilots.
AI in theory could prevent another Germanwings, Air France 447, or Malaysia 370. If a pilot tries to deviate from an ATC clearance or operate the controls in an unsafe manner, AI could first alert the crew and then override the pilot’s actions if corrective action is not taken.
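The alert-then-override escalation described above can be sketched as a simple policy. This is purely illustrative: the function name, grace period, and return labels are invented here, and reliably detecting a “deviation” is the genuinely hard, unsolved part that is assumed away.

```python
def guard_action(deviating: bool, seconds_deviating: int, grace_period: int = 10) -> str:
    """Hypothetical escalation policy for an AI flight monitor.

    `deviating` would come from comparing pilot inputs against the ATC
    clearance and the safe flight envelope (detection is assumed, not shown).
    """
    if not deviating:
        return "normal"
    if seconds_deviating < grace_period:
        return "alert crew"          # first step: warn the crew, don't act
    return "override pilot inputs"   # no corrective action taken in time

print(guard_action(False, 0))   # normal
print(guard_action(True, 3))    # alert crew
print(guard_action(True, 15))   # override pilot inputs
```

The design choice worth noting is that the system never jumps straight to overriding the pilot; it always warns first, which matters both for safety culture and for certification.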
As much as we don’t want to admit it, the current system requires a lot of human labor, which contributes to the shortage of pilots. Long-haul flights usually require 3 or 4 pilots. AI use at cruise could allow for longer 2-pilot operations.
I also think AI is coming sooner than we might realize because of the generous contracts airlines are giving pilots. The airline business is historically low-yielding with lots of downward swings. Pilots won’t likely ever agree to pay cuts, but if the first officer position is done away with, you can keep paying captains and still get your cost savings. I doubt airline managers are giving out these raises without having a plan to cut costs elsewhere in a highly competitive, price-sensitive business.
Whoever vbscript2 is took the words right out of my mouth! It would be fantastic to have AI assist working pilots; NEVER REPLACE THEM!
I have so many things I want to unpack here, and as a veteran flight attendant, that’s a lot!
But I’ll stick with two:
Jon Biedermann is on point. His comment that AI cannot react to things it’s never been taught is exactly why self-driving cars, despite decades of research, still aren’t a viable mode of transportation. Thank you for sharing your knowledge & expertise, Jon.
And to those of you calling out @Gary for bashing pilots and incorrectly saying AI would’ve stopped the Germanwings catastrophe, thank you.
There are multiple reasons why a commercial pilot is not allowed to be alone in the flight deck during flight. One being the Germanwings incident.
Gary, you’re not an airline employee, just a frequent traveler with a lot of good tips and travel information. Just as I, a flight attendant who travels internationally every week, don’t pretend to know what it’s like to be a paying passenger hunting deals & discounts.
Stick to things you know about and stop trying to “educate” the public on things you don’t know first hand.
No system is foolproof, but here in the USA we have the best damn pilots in the world. Gary may be a good man, may be a smart man, but my perception is that he is an absolute idiot when you read some of the stuff he posts. The LAST thing you want, Mr. Gary, is no pilot up front when you’re deviating around thunderstorms over South America late at night. You’re saying someone will be in a room communicating with ATC, manually steering around massive buildups that radar doesn’t always paint? What about instant gusts of crosswind? At the controls I can respond in an instant. How dare you bring up those two incidents and use them as an argument when we are in the safest period in aviation history thanks to not one but both pilots up front. Automation is ridiculously flawed, and new automation is getting over-engineered and actually worse.
As an airline pilot, I find it absolutely inexcusable to replace us with a machine. We can’t even get self-driving cars to work; what makes you think it’ll work well enough for air travel?
It won’t be in our lifetime.
What will happen is improvements to safety methods and technology.
Anyone who is for this concept is 100% for the idea of killing off countless jobs of innocent working people for the sole purpose of saving on costs.
That’s the real killer.
@steve. No one cares about our jobs except us. Unfortunately, people lose sight of what they set out to accomplish in the first place. A company is supposed to be a place where profits can be made while providing jobs and services. Now they just want to skip the jobs part, and with service going down too, it looks like they just want to make the $$.
I read this article and I think about how sad it is that someone can use two examples of pilots making mistakes, thankfully not resulting in injury or death, while being too small-minded to think about the tens of thousands of times pilots have done small things that automation would have failed at. Yes, we are human. But if we ever go fully automated, or even close to it, you’ll be hearing of crashes weekly, not once every few decades.
Some food for thought.
What happens if something is hacked?
How many times does ATC ground-stop us due to system issues? Well, what happens when the entire automation redundancy fails? It happens in tried-and-true systems today.
The human brain can think outside the box. What happens when the answer to the problem is not pre-written into the AI?
Our ATC system is currently 20-30 years behind (this is not an exaggeration). How will this system be updated to be 20-30 years ahead?
Going to a single pilot for a scheduled commercial airline flight makes absolutely no sense at all.
Better training, better systems in planes and at airports, however, makes a whole lot of sense.
A lot of airlines are also starting to push the diversity nonsense for pilots.
Going to need more safety features as a result of unqualified people that will be hired.
@Steve, you said that pilotless airplanes won’t happen in our lifetime? I beg to differ, sir. 🙂
They absolutely will, simply because the chances of disaster are likely on the order of 10,000 times less than for an automated car, simply because of 3D space and the lack of objects in that space compared to the ground. I.e., no pesky pedestrians!
However, this will happen at a smaller scale first: flying taxis and cars. I for one will be happy to be one of the first to try it out. If something bad happens, the aircraft is small/light enough that parachutes will be mandatory, and while not ideal, I will take that chance all day long. The odds of accidentally parachuting into electric wires and dying are so remote, as in ten million to one, that I will take that risk, just like I take the risk of driving my own car and flying commercial today.
It’s acceptable and beautiful. 🙂
-Jon
Open the cockpit door HAL.
A problem with remote control of aircraft is the delay introduced by the speed of light. The longer the distance radio communication needs to travel round trip, the longer the delay in responding to changes. This delay becomes very large if satellites in geosynchronous orbit are used.
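The latency concern is easy to quantify. A minimal sketch, assuming a relay through a single geostationary satellite (about 35,786 km up, directly overhead as a best case) and counting four signal legs (ground to satellite to aircraft for the command, and the same path back for the response), while ignoring ground-network and processing delays:

```python
C = 299_792_458          # speed of light in vacuum, m/s
GEO_ALT = 35_786_000     # geostationary altitude above the equator, m

one_leg = GEO_ALT / C            # one ground-to-satellite (or satellite-to-aircraft) hop
round_trip = 4 * one_leg         # command up + down, response up + down

print(f"{round_trip * 1000:.0f} ms")  # roughly 477 ms of pure light delay
```

Nearly half a second of unavoidable delay before any processing or queuing, which is why LEO constellations (a few hundred kilometers up, cutting each leg by more than a factor of 60) are the more plausible path for remote piloting.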
Missed in this discussion so far is the huge number of advanced military drones that crash every year. They work until they don’t. The military builds the crashes into planning as a cost of doing business.
https://dronewars.net/2019/06/09/accidents-will-happen-a-dataset-of-military-drone-crashes/
It amazes me how ignorant people are to think that piloting a commercial airliner is something that will be replaced by AI soon. When autonomous cars and trucks clog the roadways, we will still be decades away from autonomous aircraft. Even then, insurance companies will refuse to touch them, and passengers will likely choose the autonomous bus/car/wagon/etc. before hopping in the back. Either way, I’d probably prefer a fully automated aircraft over one with a single pilot and absolutely no one else there to challenge his or her decision-making process. That’s what the “co-pilot” is there for, and this level of intuition will never be replaced by a computer in our lifetime. In response to the ignorant prediction that an autopilot will replace a human: thank you for the belly laugh!! It’s quite apparent that you have zero idea how an autopilot works.
As an aside, you post some truly ignorant articles fairly routinely that prove you’re totally clueless regarding anything in this industry other than what they’re serving in first class on AA/UAL/DAL and how they periodically fail to bow down to you. You’re not a pilot and you’re not an expert. You are a regular passenger who writes a third tier travel blog. Stop making yourself look like a complete buffoon.
Gary, I’ve seen the future, and AI has been WAAAAy oversold. It will never be true AI, and it will never replace humans. Why? Because of the amount of computing power and electricity it will require.
Just look at how much computing power and electricity it takes to mine Bitcoin. And that is just a complicated math equation. What happens when you multiply that and put it on a plane? AI will require a lot more computing to resolve all of the equations necessary to do the tasks. So fully independent AI, making decisions on board a plane, will never happen. The computing power and the electricity to run it would be too much demand.
That means you would need a ground-based system that flies the plane remotely. So now you have a security problem. Dallas just got hacked and their systems have been shut down for weeks. Fort Worth also. Utilities are getting hacked. Just imagine what would happen if someone were to hack an aircraft, or several aircraft full of people, and threatened those flights. The only way to avoid hacking is to have a closed-loop system, which is doable, but still vulnerable in its own way. Just think about how people don’t trust electronic voting systems. Yeah.
But the plain truth is we have had the ability to have planes fly autonomously for decades. The problem is getting passengers to accept the concept. So long as a pilot has their butt in the seat, there is a sense of safety in the fact that the pilot has their own butt at risk just like the passengers. That pilot will go to extremes to save not only the plane and passengers, but himself. So AI-controlled autonomous aircraft? That is feasible and a reality today, right now. But public acceptance is still a long way off.
Gary, have you seen the safety record of the airlines over the last 20 years? Why risk that? To save a buck? Look where that took Boeing and other companies. When you cut corners on safety, people die. That is why we call it safety.
You seem to be blaming pilots when other things were at fault. The Southwest/FedEx issue was not pilot error. You can’t blame a Southwest pilot for rolling slowly in poor visibility. That controller put both planes at risk, and the pilots averted catastrophe. I don’t see how the pilots were at fault. And if you are saying AI or autonomy could have maintained separation better, that FedEx pilot questioned the clearance of the SW flight right away. In a sterile cockpit, those things are noticed.
The JFK issue: maybe, just maybe, a computer would never have put those two aircraft at risk. But there is better technology to keep that from happening that keeps humans in the loop. The 737 MAX debacle was simply the fact that Boeing not only failed to communicate how the system worked but refused to tell everyone, so it wasn’t included in the training. That was a computer malfunction! And you want to entrust the public to computers? In those two crashes, the computer had bad information, the system was trying to override the pilot inputs, and the pilots didn’t know why the plane was doing what it was doing. That is a computer and training issue, not a pilot issue. Nice try.
As I commented above, AI will never be capable of flying a plane autonomously. That is a pipe dream sold by marketing departments at a lot of companies. I’ve seen a lot of software overpromise and underdeliver in the last few years.