Fake Aircraft, Real Threats: Training Air Traffic Controllers for Cyberattacks
Open Access
Article
Conference Proceedings
Authors: Maria Hagl, Supathida Boonsong, Tim H. Stelkens-Kobsch, Tim Ruediger, Andrei Gurtov, Gurjot Singh Gaba
Abstract: As part of the modernisation of Air Traffic Management (ATM), an increasing number of technologies are being deployed to improve the provision of Air Traffic Services. For example, the large-scale implementation of Automatic Dependent Surveillance–Broadcast (ADS-B) helps controllers gain better situational awareness of aircraft in the airspace and manage traffic with greater safety and efficiency. Digitalisation in aviation, however, is accompanied by cybersecurity risks, and recent statistics indicate a substantial increase in cyberattacks within the aviation sector. The sources of vulnerability are multifaceted: increased connectivity and human factors both present exploitable entry points for attackers. The cyber threats here are not typical IT incidents such as phishing, but critical operational threats such as malicious aircraft injection. Training operators is therefore crucial; air traffic controllers (ATCOs) must be able to distinguish real traffic from false information in order to react appropriately. This study introduces an initial training concept tested in a half-day session using the Attack Simulator developed by Linköping University. It features six cyber threats exploiting ADS-B system vulnerabilities that ATCOs may face on duty. The participants in the study were eight experienced Swedish ATCOs. None of them had received prior training specific to cyberattacks in ATM, nor had they encountered the Attack Simulator used in this study. The study followed a repeated-measures design, with each participant completing three simulation runs of 25 minutes each. The first run was a familiarisation session that allowed participants to get used to the Attack Simulator; no data from this phase were included in the analysis. The second and third runs were measured simulations containing cyberattacks. Before the second run (Run 2), participants received a briefing on the types of cyberattacks that could occur during the simulation.
Prior to the third run (Run 3), participants were shown a video-animated knowledge transfer session designed to improve their understanding of how cyberattacks manifest within the simulation environment. In both Run 2 and Run 3, six types of attacks could occur, with a 30% probability for pre-spawned aircraft and a 40% probability for newly spawned aircraft. This experimental design was intentionally chosen to mimic real-world cyberattacks and to minimise the chance that participants could anticipate all potential stimuli and adjust their responses accordingly. During these simulation runs, we recorded performance scores, which were weighted by the detection rate of each attack type and the total number of events per scenario. For any omission or incorrect guess, the points deducted were multiplied by the detection rate of that event type; the points awarded for a correct identification were multiplied by one minus the detection rate. Thus, events that were more difficult to identify correctly were penalised less than those that were easier to identify. The score obtained at the end of a scenario was divided by the total number of attacks that occurred during the simulation run, ensuring that participants' reactions were evaluated in the context of the events they actually experienced. Note that air traffic control (ATC) tasks were excluded from the analysis, as evaluating operator performance on ATC tasks was outside the scope of this study. After each experimental run, participants completed post-simulation questionnaires assessing situational awareness (SASHA), perceived workload (NASA-TLX), stress (SSSQ), and tailored questions regarding their confidence in correctly identifying and labelling the attacked aircraft. Results show improved ATCO performance scores, confirmed by participant feedback.
The significant increase between Run 2 and Run 3 likely results from a learning effect due to repeated practice, video-based knowledge transfer, or both. Future studies with larger samples should include varied conditions to better isolate the effects of each variable and their interaction. Subjective assessments of situation awareness, workload, and stress were on average acceptable across both runs. Individually, some variation appeared in global scores and assessments, likely due to the probabilistic event generation of the Attack Simulator, which produced different scenarios for each participant. Answer patterns indicated that self-reported awareness, workload, and stress are linked to the volume of traffic and the frequency of attack occurrences. With more resources and larger sample sizes, future studies should use identical scenarios across participants.
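The weighted scoring scheme described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the study's actual implementation: the event-type names, detection rates, and base point values are hypothetical, and only the weighting logic (correct identifications weighted by one minus the detection rate, omissions and incorrect guesses by the detection rate, normalised by the number of attacks) follows the abstract.

```python
# Hypothetical sketch of the weighted scoring scheme. Event types,
# detection rates, and base point values are illustrative assumptions.

def score_run(events, detection_rate, base_points=1.0, base_penalty=1.0):
    """Score one simulation run.

    events: list of (event_type, outcome) pairs, where outcome is
            "correct", "incorrect", or "omission".
    detection_rate: dict mapping event_type -> detection rate in [0, 1].
    """
    total = 0.0
    for event_type, outcome in events:
        rate = detection_rate[event_type]
        if outcome == "correct":
            # Correct identifications are weighted by (1 - detection rate):
            # harder-to-detect events earn more points.
            total += base_points * (1.0 - rate)
        else:
            # Omissions and incorrect guesses are weighted by the detection
            # rate: harder-to-detect events are penalised less.
            total -= base_penalty * rate
    # Normalise by the total number of attacks in the run.
    return total / len(events)

# Hypothetical run with two attack types:
run = [("ghost_aircraft", "correct"),
       ("ghost_aircraft", "omission"),
       ("position_spoof", "correct")]
rates = {"ghost_aircraft": 0.8, "position_spoof": 0.3}
print(round(score_run(run, rates), 3))
```

In this toy run, the easy-to-detect ghost aircraft contributes little when identified (+0.2) and costs much when missed (-0.8), while the hard-to-detect spoof rewards a correct call (+0.7), matching the asymmetry the abstract describes.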
Keywords: Cybersecurity, Risk, Training, Human Factors, Air Traffic Management, Digitalisation
DOI: 10.54941/ahfe1007074


AHFE Open Access