IMU-Based Energy Expenditure Estimation for Various Walking Conditions Using TransEE
Abdelkarim Mamen; Teodoro Montanaro; Ilaria Sergi; Enrico Junior Schioppa; Luigi Patrono
2025-01-01
Abstract
Accurate estimation of energy expenditure (EE) plays a fundamental role in health monitoring, clinical research, and fitness tracking, as well as in fatigue management to prevent overtraining. By monitoring EE, practitioners and individuals can refine exercise programs, modify rehabilitation protocols, and track training loads to mitigate the risks of excessive strain. Traditional EE prediction methods often struggle to generalize across diverse activities and individuals because they rely on simplified statistical models or handcrafted features. In this study, we introduce TransEE, a subject-specific and context-aware Transformer model designed for EE estimation under varied walking and running conditions. Our approach differs from traditional time-series-only approaches in that it uses contextual data such as speed, activity (walking or running), and personal characteristics (e.g., age, body weight) alongside sequential IMU readings. We evaluate TransEE on an existing dataset gathered from 3 participants and achieve a normalized root mean squared error (NRMSE) of 0.047 and a mean absolute percentage error (MAPE) of 11.0%. The results show that TransEE outperforms a baseline Transformer-based model and a CNN+LSTM model, demonstrating the power of transformers for precise and flexible EE estimation. In addition, we investigate the impact of sensor placement on estimation accuracy. Our experiments reveal that combining wrist and thigh IMU data yields the most accurate EE predictions, outperforming individual sensor placements. The proposed model offers a valuable tool for clinical monitoring, fitness tracking, and broader clinical studies. Additionally, by leveraging IoT-enabled wearable devices for real-time IMU data collection, TransEE enhances the practicality of EE estimation in dynamic environments, supporting continuous monitoring in both clinical and everyday fitness applications.
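As an illustrative aid, the sketch below shows one way a context-aware Transformer of this kind could be organized in PyTorch, together with the two reported evaluation metrics. It is a minimal reconstruction, not the authors' TransEE implementation: the layer sizes, the fusion strategy (temporal pooling of encoded IMU features concatenated with a context embedding), and the NRMSE normalization convention are all assumptions made for illustration.

import torch
import torch.nn as nn

class ContextAwareTransformerEE(nn.Module):
    """Minimal sketch of a context-aware Transformer regressor for EE.

    NOTE: hypothetical reconstruction; hyperparameters and fusion
    strategy are assumptions, not taken from the paper.
    """

    def __init__(self, imu_channels=6, ctx_features=4, d_model=64,
                 n_heads=4, n_layers=2):
        super().__init__()
        # Project raw IMU samples (e.g., 3-axis accel + 3-axis gyro) to d_model.
        self.imu_proj = nn.Linear(imu_channels, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Embed contextual inputs (speed, activity, age, body weight, ...).
        self.ctx_proj = nn.Sequential(nn.Linear(ctx_features, d_model), nn.ReLU())
        # Regression head over pooled sequence features plus context embedding.
        self.head = nn.Sequential(nn.Linear(2 * d_model, d_model),
                                  nn.ReLU(), nn.Linear(d_model, 1))

    def forward(self, imu_seq, ctx):
        # imu_seq: (batch, time, imu_channels); ctx: (batch, ctx_features)
        h = self.encoder(self.imu_proj(imu_seq))    # (batch, time, d_model)
        pooled = h.mean(dim=1)                      # temporal average pooling
        fused = torch.cat([pooled, self.ctx_proj(ctx)], dim=-1)
        return self.head(fused).squeeze(-1)         # predicted EE per window

def nrmse(y_true, y_pred):
    # Normalized RMSE; here normalized by the range of the ground truth
    # (one common convention -- the paper's exact normalization may differ).
    rmse = torch.sqrt(torch.mean((y_true - y_pred) ** 2))
    return rmse / (y_true.max() - y_true.min())

def mape(y_true, y_pred):
    # Mean absolute percentage error, in percent.
    return 100.0 * torch.mean(torch.abs((y_true - y_pred) / y_true))

In such a setup, each training example would pair a windowed IMU sequence from one or more sensor placements (e.g., wrist and/or thigh) with a per-subject context vector, and the model would be evaluated with nrmse and mape against reference EE values.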


