Collaborative Learning through XR: A study of eye- and hand-based XR interactions to support collaborative learning in the chemistry classroom
Open Access
Article
Conference Proceedings
Authors: Boribun Wisanukorn, Frank Heidmann
Abstract: In this paper, we explore how extended reality (XR) and virtual reality (VR) can facilitate shared experiences that transcend individual boundaries. Using an example from the chemistry classroom, we illustrate how these technologies can enhance education and learning by fostering collaboration and shared understanding.

Despite the vast potential of XR and VR, many of their applications remain isolated. Currently, sharing an experience often means streaming it to a screen or explaining it verbally, which can leave people feeling excluded if they are not directly involved. Even when multiple users wear VR or XR headsets, shared experiences are often limited to specific contexts such as multiplayer VR games; in domains such as education, work, and collaborative learning, these technologies remain underutilized.

To bridge this gap, our study developed a prototype that leverages the eye-tracking and hand-tracking capabilities of the HoloLens 2. The prototype enables collaborative interaction with 3D elements in an XR environment. Because each participant's gaze within the XR scene is tracked, users can easily identify which element each of them is currently viewing. Through eye control and pinch gestures, users can manipulate digital content on the XR plane. When working within the same XR environment, users seamlessly share interactions and information, fostering a unified and collaborative experience with holograms.

The current prototype is tailored to exploring the molecular details of sugar molecules within a collaborative XR environment. Teams of users can collectively examine molecules such as fructose, glucose, galactose, and mannose, gaining deeper insight into their unique characteristics. Through eye tracking and pinch gestures, users interact directly with holographic 3D representations of these molecules.
The prototype's user interface intuitively indicates which user is interacting with or observing each 3D object within the XR environment. Instead of conventional controllers, the prototype uses hand tracking, enabling users to navigate the XR interface with natural hand movements. This approach delivers an immersive and dynamic learning experience that goes beyond traditional teaching methods, fostering heightened engagement and deeper understanding among learners.

To assess the functionality and cognitive impact of the XR interface prototype, 19 participants were recruited for an evaluation. Participants were selected who had no prior knowledge of, or involvement in, chemistry or related fields concerning sugar molecules. Following a brief introduction to the prototype's interaction patterns and an eye calibration, participants worked through a set of learning scenarios. Initial survey findings suggest that participants could quickly distinguish monosaccharides such as glucose, fructose, galactose, and mannose by their three-dimensional structures. Results for the gaze-interaction evaluation, however, are inconclusive and require further experimentation.
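The gaze-to-object mapping described above (determining which molecule a user is currently looking at, so it can be highlighted for all collaborators) can be illustrated with a minimal sketch. This is plain Python, not the HoloLens 2 prototype's code: the `Molecule` type, the `gaze_target` function, and the bounding-sphere geometry are illustrative assumptions, standing in for the ray casting that an XR engine would normally perform against scene colliders.

```python
import math
from dataclasses import dataclass

@dataclass
class Molecule:
    name: str
    center: tuple   # (x, y, z) position in the shared XR scene
    radius: float   # bounding-sphere radius around the hologram

def gaze_target(origin, direction, molecules):
    """Return the molecule hit by the gaze ray origin + t*direction, or None.

    Uses a simple ray/bounding-sphere intersection; the nearest hit in
    front of the user wins. `direction` is assumed to be normalized.
    """
    best, best_t = None, math.inf
    for m in molecules:
        oc = tuple(o - c for o, c in zip(origin, m.center))
        b = sum(d * v for d, v in zip(direction, oc))
        c = sum(v * v for v in oc) - m.radius ** 2
        disc = b * b - c
        if disc < 0:
            continue  # gaze ray misses this molecule's bounding sphere
        t = -b - math.sqrt(disc)
        if 0 <= t < best_t:
            best, best_t = m, t
    return best

# Example: two holograms in front of the user; gaze straight ahead.
mols = [Molecule("glucose", (0, 0, 2), 0.3),
        Molecule("fructose", (1, 0, 2), 0.3)]
looked_at = gaze_target((0, 0, 0), (0, 0, 1), mols)
# looked_at.name is "glucose"; broadcasting this name (plus the user's id)
# to the other headsets is what lets everyone see who views what.
```

In a shared session, each headset would evaluate this locally from its own eye-tracking data and send only the result (user id, molecule name) to its peers, which is enough to render per-user gaze indicators on the holograms.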
Keywords: Collaborative Learning, Eye-based Interaction, Multi-user XR
DOI: 10.54941/ahfe1005489