Multiphase pointing motion model based on hand-eye bimodal cooperative behavior

Open Access Article · Conference Proceedings
Authors: Chenglong Zong, Xiaozhou Zhou, Jichen Han, Haiyan Wang

Abstract: Pointing, the most common interaction behavior in 3D interaction, has become a foundation and focal point of natural human-computer interaction research. In this paper, hand and eye movement data from multiple participants performing a typical pointing task were collected in a virtual reality experiment, and we further clarified the spatial and temporal properties of hand and eye movements and their coordination over the course of the task. Our results show that the movements of both the hand and the eye in a pointing task can be divided into three stages according to their speed properties: a preparation stage, a ballistic stage, and a correction stage. After verifying this phase division of hand and eye movements, we further specify the division criteria and the relationship between the durations of corresponding hand and eye phases. Our research is significant for further mining natural human pointing behavior and achieving more reliable and accurate recognition of human-computer interaction intent.
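
The abstract does not state the exact speed criteria used for the three-stage division, so the following is only a minimal sketch of how a speed-based segmentation of a recorded hand (or gaze) trajectory might look. The function name `segment_phases` and the thresholds `onset_frac` and `offset_frac` are hypothetical illustrations, not the paper's actual standard.

```python
import numpy as np

def segment_phases(t, pos, onset_frac=0.1, offset_frac=0.1):
    """Split one pointing movement into preparation / ballistic / correction
    phases from its speed profile.

    t   : (N,) timestamps in seconds
    pos : (N, 3) hand (or gaze) positions

    Hypothetical rule: the ballistic phase starts when speed first exceeds
    onset_frac * peak speed and ends when, after the peak, speed falls back
    below offset_frac * peak speed; everything before is preparation and
    everything after is correction.
    """
    vel = np.gradient(pos, t, axis=0)      # finite-difference velocity
    speed = np.linalg.norm(vel, axis=1)    # scalar speed per sample
    peak = speed.max()
    peak_idx = int(np.argmax(speed))

    onset = int(np.argmax(speed >= onset_frac * peak))   # first fast sample

    below_after_peak = np.where(speed[peak_idx:] < offset_frac * peak)[0]
    offset = peak_idx + int(below_after_peak[0]) if below_after_peak.size else len(speed) - 1

    return {
        "preparation": (0, onset),
        "ballistic": (onset, offset),
        "correction": (offset, len(speed) - 1),
    }
```

Applied separately to hand and eye trajectories of the same trial, such a segmentation yields per-phase index ranges whose durations can then be compared pairwise, in the spirit of the duration relationships the abstract mentions.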

Keywords: 3D User Interface, Pointing, Human motor behavior

DOI: 10.54941/ahfe1002844

