Preventing runway incursions is a top safety priority for the National Transportation Safety Board and a growing problem in commercial air traffic at controlled airfields. The problem becomes more difficult still when weather and airfield conditions are severely degraded. Such is the case in this Air Force Research Laboratory (AFRL) work, which focused on the decision-making process of aircrew landing in near zero-zero weather at an unimproved airfield. This research is part of a larger demonstration effort using sensor technology to land in near zero-zero weather at airfields that offer no or unreliable approach guidance. Using various head-up (HUD) and head-down (HDD) display combinations that included the sensor technology, pilot participants worked through the decision of whether the airfield was safe to land on or required a go-around. The runway was considered unsafe only if its boundary was broken by an obstacle causing an incursion. A correct decision was one that allowed the aircrew to land on a safe runway and to go around when an incursion was present. While going around is usually considered a safe decision, in this case a false positive could have a negative mission impact by preventing subsequent landing attempts. In this study we found a combination of display formats that provided the best performance without requiring significant changes to an existing avionics suite.
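The land/go-around decision described above can be framed as a binary classification task, with an unnecessary go-around as a false positive and a landing with an incursion present as a miss. A minimal sketch using hypothetical trial data (not from the study) shows how such decisions might be scored:

```python
# Hypothetical scoring of land/go-around decisions as binary classification.
# "incursion" is ground truth; "went_around" is the aircrew's decision.
trials = [
    {"incursion": True,  "went_around": True},   # correct go-around (hit)
    {"incursion": False, "went_around": False},  # correct landing
    {"incursion": False, "went_around": True},   # false positive: needless go-around
    {"incursion": True,  "went_around": False},  # miss: landed despite incursion
]

hits = sum(t["incursion"] and t["went_around"] for t in trials)
false_positives = sum(not t["incursion"] and t["went_around"] for t in trials)
correct = sum(t["incursion"] == t["went_around"] for t in trials)

print(correct / len(trials))  # → 0.5 (fraction of correct decisions)
```

Framing the task this way makes explicit why a conservative "always go around" strategy is not free: it drives the false-positive count up even as misses go to zero.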
When flying an airplane, landing is arguably the most difficult task a pilot performs. This applies to pilots of all skill levels, particularly as the complexity of both the aircraft and the environment increases. Current navigational aids, such as an instrument landing system (ILS), do a good job of providing safe guidance for an approach to an airfield. These aids provide data to primary flight reference (PFR) displays on board the aircraft, depicting through symbology what the pilot's eyes should be seeing. Piloting an approach under visual meteorological conditions (VMC) is relatively easy compared to the various complex instrument approaches under instrument meteorological conditions (IMC), which may include flying in zero-zero weather. Perhaps the most critical point in the approach is the transition to landing, where the rate of closure between the wheels and the runway is critical to a smooth, accurate landing. Very few PFRs provide this flare cue information. In this study we evaluate examples of flare cueing symbology for use in landing an aircraft in the most difficult conditions. This research is part of a larger demonstration effort using sensor technology to land in zero-zero weather at airfields that offer no or unreliable approach guidance. Several problems exist when landing without visual reference to the outside world: one is touching down with a greater force than desired, and another is landing at a point on the runway other than the one desired. We compare different flare cueing systems to one another and against a baseline for completing this complex approach task.
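The closure rate the abstract describes is essentially the sink rate of the wheels toward the runway surface. A minimal sketch, assuming a flare cue derived from successive radar-altimeter samples (a hypothetical illustration, not the symbology evaluated in the study):

```python
# Hypothetical sketch: estimating wheel-to-runway closure rate (sink rate)
# from two successive radar-altimeter samples, as a flare cue might.
def closure_rate(alt_prev_ft: float, alt_curr_ft: float, dt_s: float) -> float:
    """Return descent rate in ft/s (positive = descending)."""
    return (alt_prev_ft - alt_curr_ft) / dt_s

# Example: 50 ft dropping to 48 ft over 0.5 s gives a 4 ft/s sink rate.
rate = closure_rate(50.0, 48.0, 0.5)
print(rate)  # → 4.0
```

A real flare cue would filter noisy altimeter samples and compare the estimate against a target sink-rate profile; the point here is only that the cue is driven by the altitude-change rate rather than altitude alone.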
KEYWORDS: 3D displays, Visualization, 3D acquisition, Video, Head, Situational awareness sensors, Target detection, 3D volumetric displays, Signal processing, Sensors
Several laboratory studies and flight demonstrations have indicated the potential benefits to operators and pilots of combined audio/visual displays (McKinley and Ericson, 1997). The primary focus of these previous studies was cockpit applications, but significant laboratory and field work was accomplished in command and control applications. However, most audio and visual displays and their associated symbologies have been developed independently and therefore were not integrated in a human factors sense. Potential benefits of developing integrated audio/visual displays and symbologies include reduced operator response time, improved situation awareness, reduced search excursions, improved visual target detection ranges, improved target discrimination, and reduced workload. To realize these and other potential benefits, research is needed to truly integrate aural and visual displays and symbologies. The purpose of this paper is to present the results from previous studies and describe a plan to improve the integration of audio/visual displays and symbologies.
The term synthetic vision is used to describe combinations of sensor-based imagery (e.g., forward-looking infrared, millimeter-wave radar, light amplification or night vision systems) and imagery based on databases (e.g., digital terrain elevation data, obstacle and obstruction data, approach path data). While sensor-based imagery (often referred to as enhanced vision) has been available in military cockpits for several years, imagery based on databases (often referred to as artificial vision) has not. This paper discusses the display requirements needed for combinations of enhanced and artificial vision in military cockpits. We briefly survey current efforts to achieve synthetic vision displays in both military and civilian cockpits and the costs and benefits of these efforts. The relative advantages and disadvantages of enhanced and artificial vision are discussed within the context of current and future display capabilities, focusing on the human factors of these displays. A sampling of synthetic vision formats envisioned for use in military and civilian cockpits is presented to illustrate what might be required of head-down, head-up, and helmet-mounted displays in terms of resolution, luminance, and color. Further discussion is given to how these display requirements might be altered by aircraft mission, type, and the need to compensate for varying visibility and laser threat conditions.
Conference Committee Involvement (3)
Enhanced and Synthetic Vision 2008
19 March 2008 | Orlando, Florida, United States
Enhanced and Synthetic Vision 2007
9 April 2007 | Orlando, Florida, United States
Enhanced and Synthetic Vision 2006
17 April 2006 | Orlando (Kissimmee), Florida, United States