Augmented Reality Application for HoloLens Dedicated to the Accuracy Test: Evolution and Results

Open Access
Article
Conference Proceedings
Authors: Julien Barbier, Franck Gechter, Sylvain Grosdemouge

Abstract: Augmented Reality (AR) proposes new ways to visualize and interact with virtual objects. Depending on the target interaction modality and the application requirements, different types of devices can be chosen. While AR on smartphones can offer a Graphical User Interface (GUI) without hurting immersion, an AR headset provides a more immersive experience, with interaction relying mainly on hand gesture control, even though various interaction modalities have been explored in the literature. One of the most widespread headsets is the Microsoft HoloLens, which provides documentation on setting up interactions between users and virtual entities. However, the proposed hand gestures need to be learnt, are not intuitive for most people, and may not fit every type of application.

The goal of this paper is to assess, from a medical application perspective, the ergonomics of different types of human-machine interface in AR, the impact of changes driven by user feedback, and the usability of the final human-machine interface. An application dedicated to testing the accuracy of the headset was developed and evaluated with users who had no previous experience with AR headsets. The virtual object used in the application is a simple cube, chosen to keep the interaction with the virtual entity as simple as possible. A user feedback protocol was then proposed and used to drive proposals for changing the interaction modalities in the application. This feedback is based on the estimated ease of placing the virtual entity relative to elements of the real world, the estimated ease of orienting it, and the estimated quality of the visualization. At the end of the protocol, the final human-machine interface is tested and the proposed interaction modalities are compared.

Among the proposed solutions, the one without any graphical user interface artifacts (i.e., using only hand tracking to interact with the cube) results in poor comprehension and manipulation, which can prevent the use of the application. One explanation is the lack of precise hand tracking, which can yield an incorrect hand pose. The second solution, based on the addition of a 3D plane GUI, leads to a more precise appropriation of the AR context. However, the GUI plane must be positioned manually by the user to obtain better results. Moreover, the results show that the cube should be rendered with boxes delimiting its edges, helping the user bring the cube closer to his or her perceptual expectations.

These experiments showed that a world-anchored graphical user interface is needed for high-accuracy applications: it provides a better understanding for newcomers and can be considered an intuitive way to use the application. While hand interaction may be sufficient for most entertainment applications, hand tracking is not yet accurate enough to allow high-precision positioning of virtual entities for medical applications.
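To make the notion of an "accuracy test" concrete, the sketch below shows one common way such a placement test can be scored: the pose of the user-placed virtual cube (position plus unit quaternion) is compared against a reference pose registered to a real-world element, yielding a translation error and a rotation error. This is a minimal illustration of the general idea, not the authors' implementation; the pose representation, coordinate frame, and example values are assumptions.

```python
# Minimal sketch (not the authors' code) of scoring a placement-accuracy trial:
# compare the user-placed cube pose against a target pose in the headset's
# world frame. Positions are in metres; quaternions are unit, (x, y, z, w) order.
import numpy as np

def position_error(p_user: np.ndarray, p_target: np.ndarray) -> float:
    """Euclidean distance (metres) between placed and target positions."""
    return float(np.linalg.norm(p_user - p_target))

def angular_error_deg(q_user: np.ndarray, q_target: np.ndarray) -> float:
    """Smallest rotation angle (degrees) between two unit quaternions."""
    dot = abs(float(np.dot(q_user, q_target)))  # |cos(theta/2)|, sign-invariant
    dot = min(1.0, dot)                         # guard against rounding noise
    return float(np.degrees(2.0 * np.arccos(dot)))

if __name__ == "__main__":
    # Hypothetical trial values: target pose of the real-world reference vs.
    # the cube pose reported by the headset after the user's manipulation.
    p_target = np.array([0.40, 1.20, 0.80])
    q_target = np.array([0.0, 0.0, 0.0, 1.0])               # identity rotation
    p_user = np.array([0.41, 1.21, 0.79])
    q_user = np.array([0.0, 0.0871557, 0.0, 0.9961947])     # ~10 deg about Y

    print(f"position error : {position_error(p_user, p_target) * 1000:.1f} mm")
    print(f"rotation error : {angular_error_deg(q_user, q_target):.1f} deg")
```

Per-trial errors of this kind can then be aggregated per interaction modality (hand tracking only, 3D plane GUI, world-anchored GUI) and set alongside the subjective ease-of-placement, ease-of-orientation, and visualization-quality ratings collected in the feedback protocol.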

Keywords: AR, MR, Ergonomic, GUI, High Accuracy Application

DOI: 10.54941/ahfe1002097
