Leveraging Human Data for Avatar Action Recognition in Virtual Environments

Open Access
Article
Conference Proceedings
Authors: Somaya Eltanbouly, Osama Halabi

Abstract: In the dynamic metaverse landscape, avatars serve as digital embodiments of users and are central to representing their interactions. This research investigates the dynamics of avatars within the metaverse, focusing on their movements and actions in virtual environments. Real-world pose estimation techniques, notably OpenPose, are applied to establish a correspondence between human and avatar movements. A Multilayer Perceptron model trained on the NTU RGB+D dataset classifies avatar actions with an accuracy exceeding 95%. The study also examines how different avatar types affect action recognition, revealing notable performance differences: a half-body avatar comprising only the upper body performs comparably to a full-body avatar, whereas the absence of head joints reduces accuracy. Additionally, the research assesses the generalizability of human data for avatar recognition, showing that models trained on human data outperform those trained exclusively on avatar data. The findings underscore the adaptability of these techniques to diverse avatar configurations and offer promising advances in action recognition within the evolving metaverse. This research contributes valuable insights into integrating real-world action recognition techniques to comprehensively understand avatar behaviors in virtual spaces.
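The pipeline described above, mapping pose-estimation keypoints to an action class via a Multilayer Perceptron, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the abstract does not specify the network architecture or feature encoding, so the 25 2-D keypoints (OpenPose's BODY_25 format), the single 128-unit hidden layer, the 60 action classes (the class count of NTU RGB+D), and the random weights are all assumptions for demonstration.

```python
import numpy as np

# Assumed dimensions (not given in the abstract):
NUM_JOINTS = 25    # OpenPose BODY_25 keypoints
NUM_ACTIONS = 60   # NTU RGB+D action classes

# Untrained, randomly initialized weights; a real model would be
# fitted on NTU RGB+D skeleton data as the paper describes.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(NUM_JOINTS * 2, 128))
b1 = np.zeros(128)
W2 = rng.normal(scale=0.1, size=(128, NUM_ACTIONS))
b2 = np.zeros(NUM_ACTIONS)

def classify_pose(keypoints: np.ndarray) -> np.ndarray:
    """Map (25, 2) pose keypoints to a probability distribution over actions."""
    x = keypoints.reshape(-1)             # flatten to a 50-d feature vector
    h = np.maximum(0.0, x @ W1 + b1)      # ReLU hidden layer
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max())     # numerically stable softmax
    return e / e.sum()

# Example: classify one (synthetic) detected pose.
probs = classify_pose(rng.normal(size=(NUM_JOINTS, 2)))
```

The half-body-avatar experiment in the abstract would correspond to zeroing or dropping the lower-body joint coordinates from the input vector before classification.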

Keywords: Metaverse, Avatar, Action Recognition, Pose Estimation, Virtual Environments

DOI: 10.54941/ahfe1004622
