Improvement of safety in collaborative robotics through prompt detection of abrupt movements
Open Access
Article
Conference Proceedings
Authors: Greta Di Vincenzo, Michele Polito, Elisa Digo, Laura Gastaldi, Stefano Pastorelli
Abstract: The advent of Industry 5.0 has re-established the central role of humans within production processes, giving robotics a new perspective. Humans and robots can cooperate in a shared workspace, enhancing their respective strengths. An effective production process is guaranteed because robots execute repetitive tasks with high precision and speed, while humans contribute their essential decision-making capabilities. To ensure an effective and safe collaboration, the robotic system must constantly recognize human gestures and react promptly, especially to unexpected states. Indeed, among the repetitive standard gestures typical of manipulation tasks at a workshop station, an operator may occasionally perform abrupt movements caused by inattention or by external circumstances unrelated to the work task. Given the still limited attention devoted to abrupt gestures in the scientific community, the aim of this study was to recognize human movements in real time by integrating wearable magneto-inertial measurement units (MIMUs) with deep learning techniques. The final outcome consisted in using data from a MIMU fixed on the human forearm to distinguish between standard and abrupt gestures during a typical industrial task. Specifically, a Long Short-Term Memory (LSTM) neural network was first trained on a previously collected dataset including MIMU signals from sixty subjects during a traditional pick-and-place task interspersed with randomly induced abrupt gestures. The LSTM training was conducted on the linear accelerations of the dataset, which were processed by removing the gravity component, computing the norm, and segmenting the resulting signals into overlapping sliding windows of 0.5 s. Various overlap percentages (50%, 75%, 90%, 95%, 99%) were tested to optimize the network performance for real-time applications. Subsequently, the trained network was tested online on data recorded from five new participants.
The same task was executed while streaming MIMU signals in real time and exploiting the fine-tuned system to classify each gesture as standard or abrupt. During the training phase, 90% was identified as the optimal overlap for network performance, with an average classification time per window of 0.84 ms. In the testing phase, classification performance was evaluated by comparing the actual gesture sequence of each participant with the one identified by the detection system. Based on the resulting confusion matrix, the system demonstrated excellent classification performance, achieving a balanced accuracy of 90.96%, a macro F1-score of 90.11%, a specificity of 96.92%, and a recall of 85%. In particular, the system guarantees a very low number of abrupt gestures classified as standard (less than 2%). Apart from the MIMU streaming delay, the pre-processing time for MIMU accelerations (approximately 9 ms) and the classification time (approximately 260 ms on the actual hardware setup) required to distinguish between standard and abrupt gestures are sufficiently low to consider the system near real-time. Overall, the results demonstrate the effectiveness of an LSTM network trained on forearm-MIMU accelerations in detecting abrupt movements during a typical industrial task under conditions approaching real time. Current efforts are devoted to improving the signal acquisition and processing procedure in order to minimize streaming delays and hence detect abrupt movements at their earliest stage.
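The reported metrics all derive from a binary confusion matrix with the abrupt gesture as the positive class. The sketch below shows how these figures relate to raw counts; the counts themselves are not given in the abstract, so the function is a generic illustration rather than a reconstruction of the paper's results.

```python
def binary_metrics(tp, fn, fp, tn):
    """Metrics reported in the paper, computed from a binary confusion
    matrix (positive class = abrupt gesture, negative = standard)."""
    recall = tp / (tp + fn)                  # sensitivity to abrupt gestures
    specificity = tn / (tn + fp)             # correct rate on standard gestures
    balanced_acc = (recall + specificity) / 2
    precision = tp / (tp + fp)
    f1_pos = 2 * precision * recall / (precision + recall)
    npv = tn / (tn + fn)                     # precision of the negative class
    f1_neg = 2 * npv * specificity / (npv + specificity)
    macro_f1 = (f1_pos + f1_neg) / 2         # unweighted mean of per-class F1
    return {"recall": recall, "specificity": specificity,
            "balanced_accuracy": balanced_acc, "macro_f1": macro_f1}
```

Balanced accuracy and macro F1 average over both classes, which matters here because standard gestures far outnumber abrupt ones in a pick-and-place task, so plain accuracy would overstate performance.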
Keywords: human gesture, human-machine interaction, abrupt movements
DOI: 10.54941/ahfe1006357