Third person drone – gamification motivated new approach for controlling UAV

Open Access
Conference Proceedings
Authors: Thomas Hofmann, Dennis Linder, Philip Lensing

Abstract: Over the last years, drones have become an increasingly popular solution for inspection and survey tasks. Controlling these drones, especially in tight spaces, using a 'line of sight' or a 'first person' view from the perspective of the drone can be a difficult task. Users often experience increased difficulty that can be traced back to their limited situational overview. To investigate whether a different form of visualization and interaction might result in a higher level of usability, an experimental workspace was set up with the goal of exploring a 'third person view' metaphor, like the one used in video games. To further allow users to experience their environment, virtual reality was employed to stream the follower's perspective directly to the user's headset. This allowed the user to fly inside a simulated environment, providing a controllable and repeatable testing ground for the software. The workspace consisted of a simulation in which a 'proof of concept' was developed. In this simulation, a drone used a conventional GPS sensor to follow a human-controlled drone, offering its view from a static camera as a third-person perspective to the controller via a virtual reality headset. Within the framework of the project, two aspects in particular were investigated: the performance of the technical system, and the basic user experience and ergonomics of this form of interaction. To evaluate the performance of the follower system, the GPS position as well as execution times and latencies were recorded. The user experience was evaluated based on personal interviews. The results show that the developed system can in fact follow a drone based on the GPS position alone, as well as calculate the desired positions in a timely manner.
Yet, the delay in movement induced by the controller execution, as well as the drone's own inertia, did not allow for continuous camera tracking of the drone using a static camera. This introduced several tracking issues and impacted the user experience, but still showed that such a metaphor could in principle be implemented and further refined. The personal interviews showed that users would try to track the drone by moving their head, as they are used to in virtual reality games. Ultimately, it was deduced that introducing vector-based drone movement, an additional range-detection sensor, and a movable camera controlled via head movement would be the next steps to improve the overall system. Since the prototype created in this work contained only a bare-bones user interface and experience, a usability study was forgone in favor of a more stable software solution. This leaves further research into this topic the possibility of evaluating spatial user interfaces that could improve user immersion.
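The abstract states that the follower drone computes its desired position from the leader's GPS position alone. The paper does not give the formula, but a minimal sketch of one plausible approach is to place the follower a fixed distance behind the leader along its heading, converting the metric offset to latitude/longitude with a small-angle (equirectangular) approximation. All names and the offset scheme here are illustrative assumptions, not the authors' implementation:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, adequate for metre-scale offsets

def follower_target(leader_lat, leader_lon, leader_heading_deg, distance_m):
    """Place the follower `distance_m` behind the leader along its heading.

    Hypothetical sketch: uses an equirectangular approximation to turn a
    metric north/east offset into degrees of latitude/longitude, which is
    sufficient for the few-metre camera distances involved here.
    """
    back = math.radians(leader_heading_deg + 180.0)  # direction behind the leader
    d_north = distance_m * math.cos(back)            # metres northward
    d_east = distance_m * math.sin(back)             # metres eastward
    d_lat = math.degrees(d_north / EARTH_RADIUS_M)
    d_lon = math.degrees(
        d_east / (EARTH_RADIUS_M * math.cos(math.radians(leader_lat)))
    )
    return leader_lat + d_lat, leader_lon + d_lon

# Leader at 52°N 8°E heading due north: the camera drone sits 5 m to the south.
lat, lon = follower_target(52.0, 8.0, 0.0, 5.0)
```

In a real system the result would then be fed to the flight controller as a position setpoint each control cycle; the tracking lag the paper reports would stem from that loop's execution time plus the follower's inertia.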

Keywords: Drone control, gamification, experimental ergonomics, computer sciences, third-person HMI

DOI: 10.54941/ahfe1003742
