Smartphone based accurate touch operations on an AR desktop

Open Access
Conference Proceedings Article
Authors: Masahiro Miki, Munehiro Takimoto, Yasushi Kambayashi

Abstract: Touch-based operation has recently become the dominant interface for smartphones and tablet PCs: the user interacts with the device by touching the physical surface of its screen. This manner is simple and intuitive, but it is not always convenient when several users operate a single device cooperatively. In some cases, manipulating a screen without direct touch is also desirable, because touchless operation prevents viruses from spreading among users; this is especially true amid the COVID-19 pandemic. To implement such touchless interfaces, researchers have proposed augmented reality (AR) based desktops and keyboards. Most of these AR-based interfaces, however, require interaction with virtual objects in mid-air, which is inaccurate, so they are rarely used in practice. In this paper, we propose a new operation manner for an AR-based desktop. In our manner, the user performs operations on his or her smartphone, which appears as an AR object in the virtual space displayed on the screen. The smartphone carries its own AR marker, which specifies the phone's coordinates and provides the reference point for displaying the cursor. The cursor is always displayed at the center of the smartphone and represents the home position of the fingers. Thus, even if the virtual screen is as large as that of a desktop PC, the user moves his or her fingers, along with the smartphone, to the location of a specific icon on the screen, and can then perform any single- or double-touch operation, such as tapping, flicking, pinching, or swiping. Note that the user physically operates the smartphone although he or she feels as if operating the virtual screen by touching it directly. The physical operations on the smartphone make the manipulation intuitive, provide tactile feedback, and help the user operate accurately. To demonstrate the effectiveness of our manipulation manner, we have implemented a prototype system and conducted numerical experiments on it. The prototype recognizes the AR marker attached to each user's smartphone with external web cameras and determines its location, so that the user can apply fingertip operations to the AR objects on the virtual screen. In the experiment, the screen is displayed in front of each user, who can see the cursor move in synchrony with his or her smartphone. Once the cursor reaches the target position, the user can apply one of the following six operations: tap to launch apps, double-tap and drag to move app icons, pinch-out and pinch-in to zoom in and out, and flick to create or switch screens. We focused on how intuitively the users could grasp the touch operations and whether they could manipulate the system as intended. As a result, we confirmed that users feel as if they are touching an actual screen in the virtual space and can manipulate the prototype system as intended.
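
The abstract does not give implementation details beyond marker recognition by external web cameras, but the core loop it describes (detect the phone's AR marker, place the cursor at the phone's center, map it onto the virtual screen) can be sketched. The following is a minimal Python sketch assuming OpenCV's ArUco module; the screen resolution, marker dictionary, and the naive linear camera-to-screen mapping are illustrative assumptions, not the authors' implementation.

    import cv2

    # Hypothetical virtual-screen resolution; the paper does not specify one.
    SCREEN_W, SCREEN_H = 1920, 1080

    # Marker detection with OpenCV's ArUco module (4.7+ API). The paper only
    # says "AR marker", so this dictionary choice is an assumption.
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

    def cursor_from_frame(frame):
        """Map the smartphone's marker center to a virtual-screen cursor.

        Mirrors the paper's rule that the cursor always sits at the center
        of the smartphone; returns None when no marker is visible.
        """
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _ = detector.detectMarkers(gray)
        if ids is None:
            return None
        cx, cy = corners[0][0].mean(axis=0)   # marker center in camera pixels
        h, w = frame.shape[:2]
        # Naive linear camera-to-screen mapping; a real system would
        # calibrate this (e.g. with a homography) instead.
        return int(cx / w * SCREEN_W), int(cy / h * SCREEN_H)

    cap = cv2.VideoCapture(0)                 # one external web camera
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        pos = cursor_from_frame(frame)
        if pos is not None:
            print("cursor at", pos)           # stand-in for drawing in the AR scene

Touch events received on the phone itself (tap, double-tap, pinch, flick) would then be dispatched as operations at the current cursor position; because the gestures are performed on a physical touch surface rather than in mid-air, they remain as accurate as ordinary on-screen touches.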

Keywords: Augmented Reality, User Interface, Touch Operations

DOI: 10.54941/ahfe100940
