Generating a Multimodal Dataset Using a Feature Extraction Toolkit for Wearable and Machine Learning: A pilot study

Open Access
Conference Proceedings
Authors: Edwin Marte Zorrilla, Idalis Villanueva, Jenefer Husman, Matthew Graham

Abstract: Studies of stress and student performance using multimodal sensor measurements have been a recent topic of discussion among education researchers. With advances in computational hardware and the use of machine learning strategies, scholars can now handle high-dimensional data and predict new estimates for future research designs. In this paper, the process of generating and obtaining a multimodal dataset that includes physiological measurements (e.g., electrodermal activity, EDA) from wearable devices is presented. Through the use of a Feature Generation Toolkit for Wearable Data (FLIRT), the time to extract, clean, and generate the data was reduced. A machine learning model was developed from an openly available multimodal dataset, and its results were compared against previous studies to evaluate the utility of these approaches and tools.

Keywords: Engineering Education, Physiological Sensing, Student Performance, Machine Learning, Multimodal, FLIRT, WESAD
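The abstract describes extracting windowed features from wearable EDA signals with a feature-generation toolkit before model training. The sketch below does not reproduce FLIRT's actual API; it is a minimal illustration of the kind of per-window summary statistics such a toolkit computes, using a synthetic EDA trace. The function name, window sizes, and feature set are assumptions chosen for illustration.

```python
import numpy as np
import pandas as pd

def eda_window_features(eda: pd.Series, fs: int = 4,
                        window_s: int = 60, step_s: int = 30) -> pd.DataFrame:
    """Summary statistics over sliding windows of an EDA series sampled at fs Hz.

    Hypothetical helper: window length and step mimic common choices for
    wearable EDA, not a specific toolkit default.
    """
    win, step = window_s * fs, step_s * fs
    rows = []
    for start in range(0, len(eda) - win + 1, step):
        seg = eda.iloc[start:start + win]
        rows.append({
            "eda_mean": seg.mean(),
            "eda_std": seg.std(),
            "eda_min": seg.min(),
            "eda_max": seg.max(),
            # mean first difference, scaled to per-second units,
            # as a rough slope of the window
            "eda_slope": np.diff(seg.values).mean() * fs,
        })
    return pd.DataFrame(rows)

# Synthetic 10-minute EDA trace at 4 Hz (the Empatica E4's EDA sampling rate)
rng = np.random.default_rng(0)
t = np.arange(10 * 60 * 4) / 4.0
eda = pd.Series(2.0 + 0.5 * np.sin(t / 60) + 0.05 * rng.standard_normal(t.size))

features = eda_window_features(eda)
print(features.shape)  # one row per 60 s window, stepped every 30 s
```

The resulting feature table (one row per window) is the form typically passed on to a classifier, e.g. for the stress/non-stress labels of the WESAD dataset mentioned in the keywords.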

Keywords: Human side of engineering, Engineering education, Physiological sensing, Student performance

DOI: 10.54941/ahfe1001448
