Lightweight Transformer for Robust Human Activity Recognition Using Smartphone IMU Data
Open Access
Article
Conference Proceedings
Authors: Hossein Shahverdi, Seyed Ali Ghorashi
Abstract: Human-activity recognition (HAR) underpins a wide range of m-health, smart-home, and context-aware services, yet conventional approaches frequently struggle with overfitting, class imbalance, and limited capacity to capture long-range temporal dependencies. In this study we introduce a lightweight, end-to-end Transformer pipeline that learns directly from raw smartphone inertial signals, eliminating the need for manually engineered features. We evaluate the approach on MotionDetection, a 12-channel dataset collected from 24 volunteers who performed a scripted series of everyday movements while carrying a Samsung Galaxy Note 20 Ultra. After windowing and minimal preprocessing, the Transformer attains 98% validation accuracy with no discernible overfitting. Relative to a strong CNN-BiLSTM baseline, it improves the macro F1-score by 3.6 percentage points while using a smaller parameter budget, underscoring its computational efficiency. These findings indicate that the Transformer architecture can provide a robust, scalable foundation for real-world HAR on commodity mobile devices, paving the way for battery-friendly, on-device activity monitoring in health and ambient-assisted-living applications.
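To make the kind of model the abstract describes concrete, the following is a minimal PyTorch sketch of a lightweight Transformer encoder classifying fixed-length windows of 12-channel smartphone IMU data. The window length (128 samples), number of activity classes (6), and layer sizes are illustrative assumptions, not the configuration reported in the paper.

```python
import torch
import torch.nn as nn

class IMUTransformer(nn.Module):
    """Illustrative lightweight Transformer classifier for windowed IMU data.

    Hyperparameters (window length, class count, model width) are assumed
    values for the sketch, not the authors' reported configuration.
    """
    def __init__(self, n_channels=12, n_classes=6, d_model=64,
                 n_heads=4, n_layers=2, window_len=128):
        super().__init__()
        # Project each 12-channel time step into the model dimension.
        self.input_proj = nn.Linear(n_channels, d_model)
        # Learned positional embedding over the window length.
        self.pos_emb = nn.Parameter(torch.zeros(1, window_len, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=128,
            dropout=0.1, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, window_len, n_channels) raw inertial samples
        h = self.input_proj(x) + self.pos_emb
        h = self.encoder(h)
        # Mean-pool over time, then classify the window.
        return self.classifier(h.mean(dim=1))

model = IMUTransformer()
dummy = torch.randn(8, 128, 12)   # batch of 8 windows, 128 samples x 12 channels
logits = model(dummy)             # shape (8, 6): one score per activity class
```

The end-to-end structure (raw windows in, class logits out, no hand-crafted features) mirrors the pipeline the abstract outlines; the exact attention, pooling, and regularization choices in the paper may differ.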
Keywords: Human Activity Recognition, Transformers
DOI: 10.54941/ahfe1005972