Flying blind: Exploring the visual cues used by helicopter pilots in degraded visual environments
Open Access
Article
Conference Proceedings
Authors: Brandon Dreslin, Molly C Mersinger, Shivani Patel, Alex Chaparro
Abstract: Helicopter pilots rely on visual cues from the environment and from instrument displays to land safely, particularly during the critical final approach and landing phases of flight. However, the specific visual cues pilots rely on, and how they integrate those cues to make anticipatory inceptor inputs or corrections, are not well understood. Importantly, those cues may be degraded at night and in brownout/whiteout conditions, where the downwash of a helicopter's rotors projects loose dirt or snow into the air and obscures the pilot's view outside the aircraft. The lack of visual cues in these conditions means that pilots are often 'flying blind' and must transition from using visual scene-based and motion-based cues to alphanumeric or pictographic information on displays. This in-flight transition involves changes in perceptual and cognitive processing at a time of increased cognitive and physical workload. Pilots must shift their visual and attentional focus from the external scene to head-down displays, and cognitive processing shifts from the pickup of natural visual cues to the detection of, and response to, displayed information. This shift is not instantaneous, and delays in recognizing and understanding the alphanumeric information increase the risk of spatial disorientation. It is therefore imperative to identify the cues pilots rely on in order to inform the design of displays that are more effective under degraded viewing conditions. To address this issue, we reviewed the literature on the visual cues used to perceive forward motion (i.e., speed, heading), altitude, and position in space, and to detect impending collision (specifically during the landing flare). Our analyses conclude that (a) optical flow supports awareness of linear motion, (b) splay and depression angles support altitude regulation, (c) accretion and deletion of environmental features outside the aircraft allow roll, yaw, and heave to be detected, (d) motion parallax is crucial for motion detection when an aircraft is hovering, and (e) a successful landing flare may rely on a combination of time-to-contact and time-to-passage cues. These results suggest that such visual cues can be incorporated into an artificial visual environment. Providing information on the visual cues processed during landing can help designers and developers create a synthetic display that facilitates spatial awareness.
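As an illustrative aside (not part of the published abstract), the time-to-contact cue referenced in finding (e) is commonly formalized as the optical variable tau: the ratio of the optical angle subtended by an approaching surface to its rate of dilation, which approximates the time remaining until contact without requiring explicit knowledge of distance or speed. A minimal sketch, assuming a small optical angle and a constant closing speed:

\[
\tau(t) \;=\; \frac{\theta(t)}{\dot{\theta}(t)} \;\approx\; \frac{Z(t)}{v(t)}, \qquad v(t) = -\dot{Z}(t),
\]

where \(\theta\) is the optical angle of the surface (e.g., the landing zone), \(\dot{\theta}\) its rate of expansion, \(Z\) the distance to the surface, and \(v\) the closing speed. Under these assumptions, initiating the flare at a fixed tau threshold yields similar timing across different approach speeds, which may be one reason tau-like cues are candidates for synthetic display symbology.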
Keywords: Degraded visual environments, depression, helicopter brownout, optical flow, spatial disorientation, splay, visual cues
DOI: 10.54941/ahfe1003851