Automated Procedural Error Detection in Human-Robot Collaborative Assembly Using Vision-Based Template Matching
Open Access
Article
Conference Proceedings
Authors: Isaiah Nassiuma, Ella-mae Hubbard, Mey Goh
Abstract: In collaborative human-robot assembly, robust error identification is paramount for ensuring process integrity and safety, particularly in the post-task phase, where a comprehensive analysis provides an opportunity to identify subtle and cumulative errors that may have been missed in real time. Traditional manual verification is often tedious and prone to human error, including oversight and fatigue, which can compromise quality. This paper evaluates the efficacy of an automated, vision-based error detection system using OpenCV template matching as a more reliable alternative. Our method identifies procedural errors, such as missed components or out-of-sequence operations, by comparing real-time images of the assembly state against a library of reference templates that depict correctly completed procedural steps. Visual dissimilarity metrics are used to automatically flag deviations from the expected sequence. Experimental results demonstrate that the automated system significantly outperforms manual verification in the consistent and rapid identification of both missing and mis-sequenced assembly steps. Whilst its performance can be influenced by challenges such as variable lighting and low-contrast features, the vision-based approach proved substantially more dependable than human inspection, especially for structured and well-defined tasks where objects have consistent and predictable visual features. We conclude that template matching provides a robust and scalable solution for quality control in collaborative assembly tasks. This automated approach enhances operational efficiency and safety, though further tuning may be required to optimise performance in visually complex environments.
Keywords: Human-Robot Collaboration, Template Matching, Error Detection, Computer Vision
DOI: 10.54941/ahfe1006808