After experiencing what it was like to fly on the Gulfstream G650, it was time to explore the engineering advancements Honeywell was developing at their Deer Valley lab.
What I was shown would leave aviators who finished their careers even fifteen years ago in absolute awe.
Honeywell has a four-step approach to designing cockpit avionics:
- Give the pilot what they need
- Give the pilot only what they need
- Give the pilot the information only when they need it
- Give the pilot the information in a way that is intuitive, unambiguous, and easy to understand
The Primus Epic system on the G650 was designed with those four principles in mind. The response from flight crews has been overwhelmingly positive; clearly, real-world use is matching up with the testing. That response has allowed Honeywell to go even further in exploring the pilot-aircraft interface.
Full Synthetic Vision integration is coming, no question, but in the interim Honeywell is developing other uses for Synthetic Vision Systems. When I visited their lab in Redmond, we discussed the possibility of using such a display for airport navigation. Here, I was given a demonstration. I should be clear: this is not an either/or scenario. Honeywell will eventually be able to legally integrate Synthetic Vision into all phases of aircraft operation, keeping the pilot immersed, focused, and always flying the aircraft.
In a dark, windowless room deep inside the flight deck of the future laboratory, we were presented with a laptop with speakers and a headset microphone akin to what pilots wear. Clearly not a very photogenic space, but the content was more important.
One of Honeywell’s avionics futurists loaded a map of Frankfurt Airport; this is not an easy airport to navigate. Rather than taking his hands off the (imaginary) controls, he moved the map and navigated to specific points using only his voice. Honeywell has been developing an advanced Natural Language Processing system that can discern commands even over the constant noise associated with flying.
Honeywell does not expect a pilot to ever fly a plane using words alone. Instead, this is being designed as a way of keeping pilots fully focused on flying the aircraft, rather than having to divert either their gaze or their hands to manipulate a mundane tool during a critical phase of flight.
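For the technically curious, here is a minimal sketch of how a command grammar for hands-free map navigation might look. To be clear, this is my own illustration, not Honeywell’s software: it assumes the hard part – turning noisy cockpit audio into text – has already been done by a speech recognizer, and the phrases and intents are invented.

```python
import re

# Hypothetical command grammar for hands-free airport-map navigation.
# Assumes a speech recognizer has already converted audio to text.
COMMANDS = [
    (r"(?:center|centre) on (runway [\w ]+)", "CENTER_ON"),
    (r"show (?:me )?(taxiway \S+)", "HIGHLIGHT"),
    (r"zoom in", "ZOOM_IN"),
    (r"zoom out", "ZOOM_OUT"),
]

def parse_command(transcript: str):
    """Match a transcribed phrase against the grammar and return an intent."""
    text = transcript.lower().strip()
    for pattern, intent in COMMANDS:
        match = re.fullmatch(pattern, text)
        if match:
            return intent, match.groups()
    return "UNRECOGNIZED", ()

# The kinds of phrases the futurist spoke during the demo (my examples).
for phrase in ["Center on runway two five left",
               "Show me taxiway november",
               "Zoom in"]:
    print(phrase, "->", parse_command(phrase))
```

The real system obviously has to cope with accents, radio chatter, and engine noise – which is exactly the hard part Honeywell is investing in.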
Much the same can be said of the gesture control, and even eye-movement control, systems also being tested. By gesture control, I do mean hand-movement tracking in free space; this goes beyond swiping on a touchpad or screen. Again, the goal is intuitive, integrated controls that give the pilot access to more information without breaking their concentration on actually flying the aircraft.
If that were not enough, Honeywell is also working on giving flight controls – like throttles – a way to provide tactile feedback to the pilot. They are not quite sure of the practical applications of this yet, but numerous safety applications could stem from such a capability.
It will be a while before we see many of these in a flight deck – the above concepts are probably at least five years from production. When things get closer to real-world testing, they are moved to a laboratory across the hall, which contains a flight simulator with full-motion capability. It is not an ICAO-certified Level 6 flight training device, but rather a generic cockpit that allows Honeywell to rapidly change whatever instrumentation they wish to test. Running entirely on green hydraulic fluid, the system can replicate any flight control input.
The more exciting part, to me at least, was that the system can also replicate turbulence. As I got to see, Honeywell flies their Convair 580 through turbulence every summer in Florida to help test their advanced weather radars (like the RDR-4000). They have also developed a method of recording the turbulence they fly through and storing it for later playback.
This allows Honeywell to throw real-world conditions at their test pilots in the simulator while they are, for example, touching icons on a newly-designed flight deck interface. With real-life turbulence, the design engineers can pinpoint exactly how large icons need to be, how much spacing is needed between them, and even how much contrast the screen needs so that the interface remains usable in all flight environments.
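To give a feel for the idea, here is a rough sketch of how a recorded turbulence trace could be swept against candidate icon sizes to estimate touch accuracy. Everything here is my own invention – the synthetic trace, the finger-slip factor, and the sizes are made-up stand-ins, not Honeywell data or methodology.

```python
import random

random.seed(42)

# Stand-in for a recorded vertical-acceleration trace in g (in reality,
# this would be replayed from the Convair 580 flight recordings).
turbulence_trace = [random.gauss(0.0, 0.3) for _ in range(2000)]

FINGER_SLIP_MM_PER_G = 6.0  # assumed hand displacement per g of jolt

def hit_rate(icon_radius_mm: float) -> float:
    """Fraction of touches that still land on the icon under the trace."""
    hits = 0
    for accel_g in turbulence_trace:
        # Displace the intended touch point by a random offset,
        # scaled by the instantaneous jolt.
        slip = abs(accel_g) * FINGER_SLIP_MM_PER_G
        offset = random.uniform(-slip, slip)
        if abs(offset) <= icon_radius_mm:
            hits += 1
    return hits / len(turbulence_trace)

# Sweep candidate icon sizes to find the smallest one that stays usable.
for radius in (2.0, 4.0, 6.0, 8.0):
    print(f"icon radius {radius:.0f} mm -> hit rate {hit_rate(radius):.0%}")
```

Sweeping sizes against recorded motion like this is what lets engineers argue from data, rather than taste, about how big a button has to be.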
Like all great user interface designers, the Honeywell engineers want their products to be discoverable and intuitive. They work hard to develop systems that pilots will think of as “common sense.”
They have even gone as far as to explore neural interfaces. Imagine a future pilot moving something on a screen simply by thinking about turning a page. That kind of sensitivity and signal interpretation is decades away, but Honeywell is working to bring the technology out of its infancy.
All of this talk of the flight deck of the future brings up the question of pilot residency. Will there be a point where we have a single resident captain, with the first officer staying on the ground and monitoring several flights? I am aware it seems like a leap, but if you can develop avionics that present highly detailed yet non-complex information, it begins to stop mattering where the flight crew is.
Bandwidth to modern aircraft – via satellite and internet links – is increasing rapidly. Honeywell can imagine a point in the future where a pilot is responsible for several aircraft at a time, with no real first officer.
NASA is also working on this. In Honeywell’s opinion, a pilot does not have to be on a plane to be a pilot, but we are still decades away from that day.