Enhanced Situational Awareness
An advanced avionics aircraft offers increased safety with enhanced situational awareness. Although aircraft flight manuals (AFM) explicitly prohibit using the moving map, topography, terrain awareness, traffic, and weather datalink displays as the primary data source, these tools nonetheless give the pilot unprecedented information for enhanced situational awareness. Without a well-planned information management strategy, these tools also make it easy for an unwary pilot to slide into the complacent role of passenger in command.
Consider the pilot whose navigational information management strategy consists solely of following the magenta line on the moving map. Such a pilot can easily fly into geographic or regulatory disaster if the straight-line GPS course passes through high terrain or prohibited airspace, or if the moving map display fails.
A good information management strategy for maintaining situational awareness should include practices that help ensure that awareness is enhanced, not diminished, by the use of automation. Two basic procedures are always double-checking the system and making verbal callouts. At a minimum, ensure the presentation makes sense. Was the correct destination fed into the navigation system? Callouts—even for single-pilot operations—are an excellent way to maintain situational awareness, as well as manage information.
Other ways to maintain situational awareness include:
- Perform a verification check of all programming. Before departure, check all information programmed while on the ground.
- Check the flight routing. Before departure, ensure all routing matches the planned flight route. Enter the planned route and legs, including headings and leg lengths, on a paper log. Use this log to evaluate what has been programmed. If the two do not match, do not assume the computer data is correct; double-check the computer entry.
- Verify waypoints.
- Make use of all onboard navigation equipment. For example, use VOR to back up GPS and vice versa.
- Match the use of the automated system with pilot proficiency. Stay within personal limitations.
- Plan a realistic flight route to maintain situational awareness. For example, although the onboard equipment allows a direct flight from Denver, Colorado, to Destin, Florida, the likelihood of rerouting around Eglin Air Force Base’s airspace is high.
- Be ready to verify computer data entries. For example, incorrect keystrokes could lead to loss of situational awareness because the pilot may not recognize errors made during a high workload period.
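The pre-departure cross-check described above—comparing the paper navigation log against what was actually programmed—can be sketched in code. This is only an illustrative sketch; the leg identifiers, headings, and distances below are hypothetical, not drawn from any real route or database.

```python
# Hypothetical pre-departure cross-check: compare the paper navigation log
# against the legs programmed into the navigation system. All values below
# are made up for illustration.
planned_legs = [
    # (from, to, heading_deg, length_nm)
    ("KDEN", "WPT01", 132, 98),
    ("WPT01", "WPT02", 141, 122),
]
programmed_legs = [
    ("KDEN", "WPT01", 132, 98),
    ("WPT01", "WPT02", 114, 122),  # transposed heading: 114 entered for 141
]

def find_mismatches(planned, programmed):
    """Return pairs of legs where the programmed entry differs from the log."""
    return [(p, q) for p, q in zip(planned, programmed) if p != q]

for paper, computer in find_mismatches(planned_legs, programmed_legs):
    print(f"Mismatch: log shows {paper}, system shows {computer}")
```

A mismatch does not tell the pilot which entry is wrong—only that the paper log and the computer disagree and both must be rechecked before departure.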
Advanced avionics offer multiple levels of automation, from strictly manual flight to highly automated flight. No one level of automation is appropriate for all flight situations, but in order to avoid potentially dangerous distractions when flying with advanced avionics, the pilot must know how to manage the course deviation indicator (CDI), the navigation source, and the autopilot. It is important for a pilot to know the peculiarities of the particular automated system being used. This ensures the pilot knows what to expect, how to monitor for proper operation, and how to take prompt, appropriate action if the system does not perform as expected.
For example, at the most basic level, managing the autopilot means knowing at all times which modes are engaged and which modes are armed to engage. The pilot needs to verify that armed functions (e.g., navigation tracking or altitude capture) engage at the appropriate time. Automation management is another good place to practice the callout technique, especially after arming the system to make a change in course or altitude.
In advanced avionics aircraft, proper automation management also requires a thorough understanding of how the autopilot interacts with the other systems. For example, with some autopilots, changing the navigation source on the e-HSI from GPS to LOC or VOR while the autopilot is engaged in NAV (course tracking mode) causes the autopilot’s NAV mode to disengage. The autopilot’s lateral control will default to ROL (wing level) until the pilot takes action to reengage the NAV mode to track the desired navigation source.
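The mode-reversion behavior described above can be modeled as simple state logic. The sketch below is a hypothetical illustration of that behavior, not the logic of any particular autopilot; mode names follow the text (NAV, ROL).

```python
# Hypothetical sketch of the mode-reversion behavior described above: with
# some autopilots, changing the navigation source while NAV is engaged drops
# the lateral mode back to ROL (wings level) until the pilot re-engages NAV.
class Autopilot:
    def __init__(self):
        self.nav_source = "GPS"
        self.lateral_mode = "ROL"  # default lateral mode: wings level

    def engage_nav(self):
        # Track the currently selected navigation source.
        self.lateral_mode = "NAV"

    def set_nav_source(self, source):
        # Switching the source (e.g., GPS -> LOC) disengages NAV tracking;
        # the pilot must take action to re-engage NAV for the new source.
        if source != self.nav_source and self.lateral_mode == "NAV":
            self.lateral_mode = "ROL"
        self.nav_source = source

ap = Autopilot()
ap.engage_nav()
ap.set_nav_source("LOC")
print(ap.lateral_mode)  # ROL until the pilot re-engages NAV
```

The point of the model is the monitoring task it implies: after any navigation source change, the pilot must verify which lateral mode is actually engaged rather than assume the autopilot is still tracking a course.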
Risk management is the last of the three flight management skills needed for mastery of the glass flight deck aircraft. The enhanced situational awareness and automation capabilities offered by a glass flight deck airplane vastly expand its safety and utility, especially for personal transportation use. At the same time, there is some risk that lighter workloads could lead to complacency.
Humans are characteristically poor monitors of automated systems. When asked to passively monitor an automated system for faults, abnormalities, or other infrequent events, humans perform poorly. The more reliable the system, the poorer the human performance. For example, the pilot may end up monitoring only a backup alert system, rather than the situation that the alert system is designed to safeguard. It is a paradox of automation that technically advanced avionics can both increase and decrease pilot awareness.
It is important to remember that EFDs do not replace basic flight knowledge and skills. They are a tool for improving flight safety. Risk increases when the pilot believes the gadgets compensate for lack of skill and knowledge. It is especially important to recognize there are limits to what the electronic systems in any light GA aircraft can do. Being PIC requires sound ADM, which sometimes means saying “no” to a flight.
Risk is also increased when the pilot fails to monitor the systems. By failing to monitor the systems and failing to check the results of the processes, the pilot becomes detached from the aircraft operation and slides into the complacent role of passenger in command. Complacency led to tragedy in a 1999 aircraft accident.
In Colombia, a multi-engine aircraft crewed with two pilots struck the face of the Andes Mountains. Examination of the FMS revealed that a waypoint had been entered incorrectly by one degree, resulting in a flight path that took them to a point 60 NM off their intended course. The pilots were equipped with the proper charts, their route was posted on the charts, and they had a paper navigation log indicating the direction of each leg. They had all the tools to manage and monitor their flight, but instead allowed the automation to fly and manage itself. The system did exactly what it was programmed to do; it flew on a programmed course into a mountain, resulting in multiple deaths. The pilots simply failed to manage the system and thereby created their own hazard. Although this hazard was self-induced, what is notable is the risk the pilots created through their own inattention. By failing to evaluate each turn made at the direction of automation, the pilots maximized risk instead of minimizing it. In this case, a totally avoidable accident became a tragedy through simple pilot error and complacency.
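The arithmetic behind the 60 NM figure is worth spelling out: by definition, one minute of latitude is approximately one nautical mile, so a one-degree entry error corresponds to roughly 60 NM of displacement.

```python
# Why a one-degree waypoint entry error puts a flight path about 60 NM off
# course: one degree is 60 minutes of arc, and one minute of latitude is
# approximately one nautical mile.
MINUTES_PER_DEGREE = 60
NM_PER_MINUTE_OF_LATITUDE = 1  # approximation; exact only along a meridian
error_degrees = 1
error_nm = error_degrees * MINUTES_PER_DEGREE * NM_PER_MINUTE_OF_LATITUDE
print(error_nm)  # 60
```

This is also why a paper navigation log with leg headings and lengths is an effective cross-check: a 60 NM displacement changes leg geometry enough to be obvious on comparison.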
For the GA pilot transitioning to automated systems, it is helpful to note that all human activity involving technical devices entails some element of risk. Knowledge, experience, and mission requirements tilt the odds in favor of safe and successful flights. The advanced avionics aircraft offers many new capabilities and simplifies the basic flying tasks, but only if the pilot is properly trained and all the equipment is working as advertised.