Instrumenting the Human into Safety 4.0
Authors: Saed Amer, Dana Alhashmi, Ravindra Goonetilleke, Ahmad Mayyas
Abstract: Managing workers’ health and safety faces many challenges because it depends on human interaction, especially for monitoring workers and detecting nonconformance. Conventionally, the input to HSE decision making is collected from the workers themselves or by an HSE officer, making it largely subjective and hard to communicate. The team proposes a continuous approach that objectively monitors workers using machine vision capabilities together with smart decision-making tools to detect, recognize, and classify human behaviors. The system’s input is coherent and effective, while its output is unbiased, quantifiable, and communicable: the ingredients needed to integrate the human worker into Industry 4.0. The scope of this work focuses on the worker’s health and safety, setting another building block in the Safety 4.0 vision. The proposed system consists of multiple integrated components, including continuous video streaming devices, machine vision components, computer logic for decision making, communication schemes, and locally implemented effectors. The system was tested in a simulated environment using a human factors simulation platform and then validated in real environments, with workers performing different tasks while acting out HSE nonconformance. The results show the system’s ability to recognize a worker’s posture, speed, and throughput and compare them against HSE guidelines. The results also show that the system provided fast responses by issuing warnings, reporting incidents to management, or shutting the process down when an injury was recognized. Finally, the system generates data and reports that are ready to be transmitted to the Internet of Things.
Keywords: Posture, Health, Safety, Supervision, Safety 4.0, Computer vision, Autonomous, Human Factors, Ergonomics.
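The escalating responses described in the abstract (warning, incident report, process shutdown) can be sketched as a simple decision rule over the machine-vision observations. This is a minimal illustrative sketch, not the authors' implementation; the risk scale, threshold values, and all names (`Observation`, `decide`, `posture_risk`) are assumptions introduced for illustration.

```python
# Hypothetical sketch of the decision logic: a detected posture risk score
# and task speed are compared against assumed HSE thresholds and mapped to
# one of the escalating responses (warn, report, shut down).
from dataclasses import dataclass

@dataclass
class Observation:
    posture_risk: float   # 0.0 (safe) .. 1.0 (injury detected); assumed scale
    speed: float          # task speed relative to guideline (1.0 = nominal)

def decide(obs: Observation) -> str:
    """Map one observation to an HSE response (threshold values are illustrative)."""
    if obs.posture_risk >= 0.9:
        return "shutdown"           # injury recognized: stop the process
    if obs.posture_risk >= 0.6:
        return "report_incident"    # report the incident to management
    if obs.posture_risk >= 0.3 or obs.speed > 1.5:
        return "warn_worker"        # local effector issues a warning
    return "ok"

print(decide(Observation(posture_risk=0.95, speed=1.0)))  # shutdown
print(decide(Observation(posture_risk=0.1, speed=2.0)))   # warn_worker
```

In a full system the `Observation` fields would be produced continuously by the video-streaming and machine-vision components, and each non-`ok` decision would also be logged for transmission to the Internet of Things.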