Enabling and Advancing Adaptive HMI Ecosystems for Highly Configurable Multimodal Smartphone Control
Open Access
Article
Conference Proceedings
Authors: Michael Jenkins, Daniel Ferris, Sean Kelly
Abstract: In modern professional and operational environments, the increasing reliance on smartphone-based applications for mission-critical tasks has heightened the need for adaptive and efficient human-machine interfaces (HMIs). Many current smartphone interfaces, however, offer little flexibility to accommodate input modalities beyond the native touchscreen or to adapt effectively to dynamic, high-stakes contexts. These limitations become especially evident when operators must switch seamlessly between multiple control mechanisms (such as gestures, voice commands, or touch inputs) without sacrificing responsiveness or accuracy. Furthermore, most commercially available HMI solutions are designed around standardized user interactions, which can hinder usability for individuals with specific accessibility needs or for users working in constrained environments. This lack of adaptability can compromise an operator's ability to interact quickly and accurately with necessary applications, potentially affecting job or task outcomes. The need for a more flexible, multimodal HMI ecosystem has never been greater, especially as smartphone use expands across professional, industrial, and defense sectors where varied input mechanisms could be leveraged to enhance productivity and effectiveness.

To address these limitations in smartphone HMI, our team is developing "THALAMUS". The THALAMUS solution introduces a highly adaptable framework for flexible, multimodal interaction with Android-based applications, allowing users to operate their devices through various input mechanisms without software-level integration with each application. THALAMUS leverages a customizable, invisible grid system on the touchscreen, enabling users to map inputs from diverse HMI peripherals (such as gesture gloves, voice commands, and body-worn controllers) onto specific areas of the screen. Combined with the ability to detect the active app and its state, this allows THALAMUS to transition seamlessly between different grid configurations, tailoring its response to specific user and app requirements in real time. By enabling device-wide interaction customization, THALAMUS creates an "HMI amplifier" effect: operators can integrate multiple control methods across any application without modifying it, making the system adaptable to a wide range of operational needs.

The versatility of THALAMUS is particularly significant for professional and defense applications where hands-free, rapid, and reliable smartphone interaction is essential. In military environments, for instance, THALAMUS could enable operators to interact with situational awareness applications or communication systems using only voice or gesture controls, minimizing heads-down time and improving situational awareness. In other high-stakes contexts, such as healthcare or logistics, THALAMUS can enhance accessibility by allowing professionals to configure interaction methods based on immediate situational needs or individual ergonomic constraints. For accessibility-focused applications, THALAMUS's customizability lets users establish personalized controls, supporting those with unique physical or cognitive needs and enabling greater digital inclusivity.

THALAMUS promises a foundational platform for multimodal HMI ecosystems, allowing diverse input devices to coalesce within a unified smartphone interaction framework. As an adaptable layer for managing multiple input modes, it facilitates highly configurable control environments that can evolve with emerging HMI peripherals, supporting future innovation in control modalities. THALAMUS thus represents an essential step toward more intelligent, adaptive HMI solutions, allowing device manufacturers and software developers to create tailored, mission-aligned user interfaces for varied operational communities. Ultimately, THALAMUS not only enhances immediate usability but also establishes a scalable and flexible framework for future multimodal HMI ecosystem development across professional and operational domains.
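The abstract does not describe THALAMUS's underlying implementation, so the following is only a minimal sketch of how an app-agnostic grid layer of this kind could be realized on Android. It assumes an AccessibilityService-based approach (one plausible mechanism for detecting the foreground app and injecting synthetic taps without per-app integration), and all names here (GridCell, GridProfile, GridControlService, onPeripheralInput, the example package name) are hypothetical and not drawn from the paper.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path
import android.view.accessibility.AccessibilityEvent

/** A named screen region expressed as fractions of screen width/height. */
data class GridCell(val id: String, val left: Float, val top: Float, val right: Float, val bottom: Float)

/** A per-app grid profile: which cells exist and which abstract input token triggers each one. */
data class GridProfile(val packageName: String, val bindings: Map<String, GridCell>)

class GridControlService : AccessibilityService() {

    // Example profiles; in a real system these would be user-configured.
    private val profiles = listOf(
        GridProfile(
            packageName = "com.example.mapapp",          // hypothetical target app
            bindings = mapOf(
                "GESTURE_SWIPE_UP" to GridCell("zoomIn", 0.80f, 0.70f, 0.95f, 0.80f),
                "VOICE_MARK"       to GridCell("dropPin", 0.40f, 0.40f, 0.60f, 0.60f)
            )
        )
    )

    private var activeProfile: GridProfile? = null

    // Detect the foreground app and switch grid configurations accordingly.
    override fun onAccessibilityEvent(event: AccessibilityEvent?) {
        event ?: return
        if (event.eventType == AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED) {
            val pkg = event.packageName?.toString() ?: return
            activeProfile = profiles.firstOrNull { it.packageName == pkg }
        }
    }

    override fun onInterrupt() {}

    /** Called by a peripheral bridge (gesture glove, voice recognizer, etc.) with an abstract input token. */
    fun onPeripheralInput(token: String) {
        val cell = activeProfile?.bindings?.get(token) ?: return
        val metrics = resources.displayMetrics
        val x = (cell.left + cell.right) / 2f * metrics.widthPixels
        val y = (cell.top + cell.bottom) / 2f * metrics.heightPixels

        // Inject a short tap at the center of the mapped grid cell (API 24+).
        val path = Path().apply { moveTo(x, y) }
        val gesture = GestureDescription.Builder()
            .addStroke(GestureDescription.StrokeDescription(path, 0, 50))
            .build()
        dispatchGesture(gesture, null, null)
    }
}
```

In this sketch, grid cells are stored as screen-fraction rectangles so a profile remains valid across devices with different resolutions, and peripherals are decoupled from the grid by reducing every input to an abstract token; how THALAMUS actually represents grids, profiles, or peripheral events is not stated in the abstract.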
Keywords: Human Machine Interaction (HMI), Human Computer Interface (HCI), Adaptive HMI, Multi-Modal Control, Smartphone and Mobile HMI
DOI: 10.54941/ahfe1006240