A Method for Human-Robot Collaborative Assembly Action Recognition Based on Skeleton Data and Transfer Learning

Open Access Article (Conference Proceedings)
Authors: Shangsi Wu, Haonan Fang, Peng Wang, Xiaonan Yang, Yaoguang Hu, Jingfei Wang, Chengshun Li

Abstract: Human-robot collaborative assembly (HRCA) has become a vital technology in intelligent manufacturing. To ensure that the HRCA process is efficient and safe, robots must recognize human assembly actions rapidly and accurately. However, the complexity and variability of human states make such recognition challenging. Furthermore, in the absence of a large-scale assembly action dataset, a model trained only on data from a single assembly scenario shows limited robustness when applied to other situations. To achieve rapid and cost-effective action recognition, this paper proposes a human action recognition method based on skeleton data and transfer learning. First, we screen action samples similar to assembly actions from the NTU-RGB+D dataset to build the source dataset and reduce the dimensionality of its skeleton data; a Long Short-Term Memory (LSTM) network is then used to learn general features from this source dataset. Second, we use a Microsoft Kinect sensor to collect skeleton data of human assembly actions as the initial target dataset and expand it with a sliding time window. After aligning the two datasets, a gradient freezing strategy is adopted during transfer learning to transfer the features learned from the source dataset to the recognition of HRCA actions. Third, the transferred model is validated on a small-scale reducer assembly task. The experimental results demonstrate that the proposed method can recognize assembly actions rapidly and cost-effectively while maintaining a reasonable level of accuracy.
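
The workflow summarized in the abstract (an LSTM pre-trained on screened NTU-RGB+D skeleton sequences, then adapted to a small Kinect-captured assembly dataset via gradient freezing and sliding-window expansion) can be illustrated with a short sketch. The code below is not the authors' implementation; it is a minimal PyTorch-style illustration in which the network size, joint count, class counts, window length, and training data are placeholder assumptions.

# Minimal sketch (not the paper's released code) of the transfer-learning idea:
# pre-train an LSTM action classifier on source skeleton sequences, then freeze
# its recurrent layers ("gradient freezing") and fine-tune only the classification
# head on a small target assembly-action dataset. All sizes are illustrative.
import torch
import torch.nn as nn

class SkeletonLSTM(nn.Module):
    def __init__(self, num_joints=25, coords=3, hidden=128, num_classes=10):
        super().__init__()
        self.lstm = nn.LSTM(num_joints * coords, hidden, num_layers=2,
                            batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x):              # x: (batch, frames, joints*coords)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])   # classify from the last time step

def sliding_windows(sequence, window=60, stride=10):
    """Expand one long skeleton sequence into overlapping fixed-length clips."""
    return [sequence[s:s + window]
            for s in range(0, len(sequence) - window + 1, stride)]

# 1. Pre-train on the screened source dataset (e.g. an NTU-RGB+D subset).
model = SkeletonLSTM(num_classes=20)           # assumed number of source classes
src_x = torch.randn(64, 60, 75)                # placeholder source batch
src_y = torch.randint(0, 20, (64,))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for _ in range(5):                             # toy pre-training loop
    opt.zero_grad()
    loss_fn(model(src_x), src_y).backward()
    opt.step()

# 2. Gradient freezing: keep the LSTM weights fixed, retrain only the head
#    on the (sliding-window expanded) Kinect target data.
for p in model.lstm.parameters():
    p.requires_grad = False
model.head = nn.Linear(128, 6)                 # e.g. 6 target assembly actions
opt = torch.optim.Adam(model.head.parameters(), lr=1e-3)

tgt_x = torch.randn(16, 60, 75)                # placeholder Kinect target batch
tgt_y = torch.randint(0, 6, (16,))
for _ in range(5):                             # toy fine-tuning loop
    opt.zero_grad()
    loss_fn(model(tgt_x), tgt_y).backward()
    opt.step()

In this sketch only the final linear layer receives gradient updates during fine-tuning, which mirrors the low-cost adaptation the abstract describes; the paper's actual architecture, screening criteria, and alignment procedure may differ.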

Keywords: Action Recognition, Human-Robot Collaboration, Transfer Learning, Skeleton Data

DOI: 10.54941/ahfe1005009
