Designs for avionics and synthetic vision link pilot with environment

[From Military & Aerospace Electronics; also see an article in Flying magazine in which a pilot describes his experience using Honeywell’s SmartView system]

Designs for avionics and synthetic vision rely heavily on human factors research

Jan 4, 2011
By John McHale

People interact with machines in different ways — with their eyes, touch, voices, and even their brain waves. These human factors are important when designing cars, home theaters, and especially commercial and military aircraft cockpits.

Telepathic flight control still resides in fictional realms such as the 1982 Clint Eastwood movie Firefox — in which a pilot stole a Soviet jet fighter that was programmed to respond to human thoughts. The Eastwood character controlled the fictional high-performance jet by thinking in Russian. Today, however, avionics designers are exploring touch screens, virtual worlds, 3-D moving maps, and even voice control in all types of cockpits.

“We are passionate about the user experience on the flight deck, going beyond human factors issues so pilots can do what they need to do,” says Sarah Barber, systems engineer and human factors expert for Rockwell Collins in Cedar Rapids, Iowa. “There is nothing worse than having frustrated pilots on the flight deck. We focus on the sensation and perception pilots get from the flight deck display.”

Getting in a rental car and looking at the dashboard controls can be confusing when one is accustomed to a different make and model, says Bob Witwer, vice president of aerospace advanced technology at Honeywell Aerospace in Phoenix. It is the same for pilots; they need to interact with cockpit controls intuitively without thinking about where the controls are, he explains. Improving the pilot’s situational awareness means enabling him always to be one step ahead, Barber says.

Honeywell “has a human factors mantra with four primary points guiding top-level design: giving the pilot what he needs in terms of data; giving the pilot only the stuff he needs with no clutter; giving the pilot the data only when he needs it, vs. providing information when he doesn’t need it, creating a distraction; and providing the data in a way that is intuitive, unambiguous, and easy to understand,” Witwer says.

Primary flight displays can display a lot of information, so designers make sure the pilot receives the proper data with no ambiguity, Witwer says. “We try to keep workload as low as possible for takeoff and final approach when things get very busy,” Barber adds. “With a reconfigurable flight deck you don’t want the pilots spending time setting up different formats and increasing their workload.”

The basics — displays and controls

Human factors work from the bottom up covers the design of displays and controls, Barber says. One of the lower-level human factors involves the representation and use of color, graphics, and text in avionics displays, she says. Within commercial aviation, display colors are stipulated by regulation, Barber says.

Font size and style of text may differ, as some aircraft manufacturers take different approaches on upper-case vs. lower-case letters, Barber continues. Regulations even govern viewing distance and fonts — how far the eye can be from characters of a given size, she adds. “We look at lower-level things like that,” Barber says. “We also look at ergonomic issues such as the type and placement of controls in the flight deck.” The task is to define what controls you need, she adds.
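The geometry behind such viewing-distance rules is simple visual-angle arithmetic. Here is a minimal sketch in Python; the 24-arc-minute minimum character size is this example's assumption for illustration, not a figure from the article or from the regulations Barber mentions.

import math

def min_char_height_mm(viewing_distance_mm, min_arc_minutes=24.0):
    # h = 2 * d * tan(theta / 2), where theta is the required visual angle.
    # The 24-arc-minute default is an illustrative assumption, not a value
    # from the article or the display regulations.
    theta = math.radians(min_arc_minutes / 60.0)   # arc minutes -> radians
    return 2.0 * viewing_distance_mm * math.tan(theta / 2.0)

# A pilot seated roughly 750 mm from the display:
print(f"{min_char_height_mm(750.0):.1f} mm")   # -> 5.2 mm

Doubling the viewing distance doubles the required character height, which is why placement of a display in the flight deck and the fonts on it have to be designed together.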

An example of ergonomic human factors is where to put the trackball within the flight controls, Barber continues. Avionics designers ideally put the ball on the pedestal so it will not strain the pilot’s wrist. The pilot also must be able to grip the trackball with the small finger and thumb to hold it steady during flight, she adds. “We have to control location, shape, and style,” Barber says.

Another area where touch comes into play involves knobs on the flight panel, Barber says. Each knob has a different size and shape to enable pilots to locate it by feel and muscle memory, she adds. “This is especially important when there is smoke in the cockpit.”

Tactile-feel concerns also fit into a philosophical approach to the flight deck calling for a “quiet, dark cockpit, where there are no lights or sounds,” Barber says. When there is a low-oil alert or other warning, the pilot can see it easily in the dark, she continues.

High-level human factors

High-level requirements look to create flexibility in how information is provided on future flight decks through a standard user interface involving on-screen menus and a multifunction keypad, Barber explains. Operators and aircraft manufacturers are open to tailorability in the navigation displays where it makes sense and where it can improve situational awareness, Witwer says.

Air transport requires commonality in displays and control panels, and allows flexibility in applications such as detailed airport surface moving maps, she says. Avionics designers for business aviation, on the other hand, use their own branding and styling in the user interface.

Some tailoring will be available in applications enabled by Automatic Dependent Surveillance-Broadcast (ADS-B), a surveillance technology based on satellite navigation, Barber says. Electronic flight bags (EFBs) will be a key enabler for ADS-B information in the cockpit, says Dan Wade, vice president of business development at Astronautics Corporation of America in Milwaukee. “We spent a lot of time on how to present this information in an unambiguous way.”

Information congestion

All this new information could overwhelm pilots with data, Witwer says. “We have to ask: do we want a guy to focus on details about other airplanes, or just focus on flying his own plane?” he continues. “There are times where providing less data is better — if data does not help accomplish the mission, then it is not necessary for the pilot to see it,” he adds.
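The filtering idea Witwer describes can be pictured as a simple declutter rule. A minimal sketch in Python, with the caveat that the range and altitude thresholds below are this example's assumptions, not Honeywell's actual logic:

from dataclasses import dataclass

@dataclass
class Traffic:
    ident: str          # flight ID of the other aircraft
    range_nm: float     # distance from ownship, nautical miles
    rel_alt_ft: float   # altitude relative to ownship, feet

def declutter(traffic, max_range_nm=40.0, max_rel_alt_ft=2700.0):
    # Keep only the traffic close enough to matter to the flying task.
    # Thresholds are illustrative assumptions, not values from the article.
    return [t for t in traffic
            if t.range_nm <= max_range_nm and abs(t.rel_alt_ft) <= max_rel_alt_ft]

targets = [Traffic("UAL12", 8.5, 900.0), Traffic("DAL88", 120.0, 15000.0)]
print([t.ident for t in declutter(targets)])   # -> ['UAL12']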

Upgrades are a balancing act: designers must mimic the operation of the legacy cockpit as much as possible while adding the new capabilities, to cut down on pilot training for the transition to the retrofitted cockpit, Wade says.

One new information tool for cockpits is synthetic vision technology, Witwer says. Human factors research and engineering have been the driving force behind synthetic vision technology, which will improve pilot awareness, decision making, efficiency, and safety.

Synthetic vision systems are computer-generated depictions of the terrain and obstacles presented on the primary flight display, created from terrain/obstacle databases, attitude information, and navigation solutions. “It is clearly not a photo, but contains vital information,” Witwer says.
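In outline, each synthetic vision frame is a function of exactly those three inputs. A minimal Python sketch of that data flow; the names and structures here are hypothetical illustrations, not Honeywell's software:

from dataclasses import dataclass

@dataclass
class Attitude:                 # from the inertial/air-data system
    pitch_deg: float
    roll_deg: float
    heading_deg: float

@dataclass
class NavSolution:              # from GPS/FMS
    lat_deg: float
    lon_deg: float
    alt_ft: float

def render_synthetic_frame(terrain_db, obstacle_db, att, nav):
    # Place the virtual camera at the navigation solution, orient it by
    # the attitude, and draw whatever terrain and obstacles fall nearby.
    # Actual rendering is elided; this only shows the data flow.
    key = (round(nav.lat_deg), round(nav.lon_deg))   # 1-degree tile key
    return {
        "camera_pos": (nav.lat_deg, nav.lon_deg, nav.alt_ft),
        "camera_orient": (att.pitch_deg, att.roll_deg, att.heading_deg),
        "terrain": terrain_db.get(key, []),
        "obstacles": obstacle_db.get(key, []),
    }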

“Even on a clear bright day synthetic vision has benefits,” Barber says. A dome- or igloo-shaped graphic symbol can be seen over the airport in synthetic vision from as far as 100 miles out — giving pilots instant situational awareness.

When regulating synthetic scene textures, “we even measure the virtual angle of the sun to control the way the display is lighted,” Barber says. Every graphical element of the design is created to improve the pilot’s situational awareness, she adds.
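Lighting a scene by a virtual sun angle is, at its core, standard diffuse shading. A minimal sketch assuming a Lambertian terrain surface; the article does not describe Honeywell's actual shading model:

import math

def diffuse_brightness(surface_normal, sun_elevation_deg, sun_azimuth_deg):
    # Lambertian brightness of a terrain facet under a virtual sun.
    # surface_normal is a unit (east, north, up) vector; the sun direction
    # is built from the chosen virtual elevation/azimuth angles.
    el = math.radians(sun_elevation_deg)
    az = math.radians(sun_azimuth_deg)
    sun = (math.cos(el) * math.sin(az),   # unit vector toward the sun
           math.cos(el) * math.cos(az),
           math.sin(el))
    dot = sum(n * s for n, s in zip(surface_normal, sun))
    return max(0.0, dot)   # facets facing away from the sun render dark

# Flat terrain (normal straight up) under a 45-degree virtual sun:
print(round(diffuse_brightness((0.0, 0.0, 1.0), 45.0, 120.0), 3))   # -> 0.707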

“The nice thing is to be able to use flight data to establish most of the requirements for synthetic vision applications,” Barber says. “We bring the results of real flight data analysis to the product line to build synthetic vision applications.”

“We are now transitioning onto head-up displays (HUDs) and have demonstrated the HUD synthetic vision to several manufacturers, and they are pleased at how well the image overlays the real world,” Barber continues. “If you are flying in the clouds the HUD makes it seem like you have x-ray vision — when you look side to side all you see is clouds,” but when viewing through the HUD you see the ground.

Enhanced vision

Enhanced vision is the next step after synthetic vision, and promises pilots unprecedented situational awareness by overlaying actual real-time, real-world flight information over the synthetic vision display without ambiguity, Witwer says.

For example, on a runway approach with a Honeywell enhanced vision system, pilots will see critical real-time imagery placed in the center of the flight path while synthetic data gracefully moves to the side. Real-world data is always represented in the flight path in enhanced vision systems, he adds.
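One way to picture that compositing rule is a per-pixel blend in which real sensor data dominates near the flight-path center and synthetic imagery takes over toward the periphery. A sketch under that assumption — the weighting scheme and falloff value are this example's, not Honeywell's published algorithm:

def blend_pixel(sensor_px, synthetic_px, dist_from_flightpath, falloff=0.25):
    # Blend one display pixel: real sensor data dominates in the flight
    # path; synthetic imagery returns toward the periphery.
    # dist_from_flightpath is normalized 0 (center) to 1 (edge); both the
    # linear weighting and the falloff value are illustrative assumptions.
    w = max(0.0, 1.0 - dist_from_flightpath / falloff)   # sensor weight
    return tuple(w * s + (1.0 - w) * y for s, y in zip(sensor_px, synthetic_px))

# Dead center: pure sensor imagery. Past the falloff: pure synthetic.
print(blend_pixel((200, 200, 200), (80, 120, 80), 0.0))   # -> (200.0, 200.0, 200.0)
print(blend_pixel((200, 200, 200), (80, 120, 80), 0.5))   # -> (80.0, 120.0, 80.0)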

Human factors and future cockpits

Touch-screen capability is next for the flight deck, Witwer says; touch screens are intuitive to the young pilots coming up today who use iPhones and iPads. “We’re always looking into the next generation of user interface devices in the flight deck, such as touch screens,” Barber says. As of today many pilots still interact with the flight management system (FMS) on planes with a trackball, she adds. Pilots want to know when they are going to move toward touch screens because it is a “more natural way to navigate around the FMS,” Barber says.

“In the next five years I’m not sure that the flight deck will change drastically, but new applications will change the way airplanes operate,” Barber says. However, looking further out, maybe 15 to 20 years, the concept of a single-pilot flight deck will be developed, “which will require major adjustments to how the flight deck will be laid out.”

“I think voice will have a place in the cockpit when it is really helpful and only used when it is 100 percent certain that the results will be good” — without interference from background noise, Witwer says.
