Scopus Indexed Publications

Paper Details


Title
Painthenticate: Feature Engineering on Multimodal Physiological Signals

Author
Sajeeb Datta

Abstract

Pain recognition using physiological signals is a critical step toward advancing automatic pain assessment, particularly in clinical and experimental settings. In this study, we present a machine learning-based approach to address the AI4PAIN 2025 challenge, which involves classifying pain intensity into three levels (No Pain, Low Pain, and High Pain) based on multimodal time-series signals including electrodermal activity (EDA), blood volume pulse (BVP), respiration (RESP), and peripheral oxygen saturation (SpO2). After extracting relevant statistical and frequency-domain features from each modality, we evaluated a series of classical machine learning models: RandomForestClassifier, LGBMClassifier, HistGradientBoostingClassifier, XGBClassifier, and CatBoostClassifier. Among these, the RandomForestClassifier emerged as the most robust model, achieving the highest test accuracy of 62.07%, indicating better generalization than the others. Although the XGBClassifier attained the highest validation accuracy of 68%, its performance degraded more noticeably on the test set, suggesting potential overfitting. Both the LGBMClassifier and the HistGradientBoostingClassifier reached similar validation accuracies (around 64%), with the latter showing better class-wise balance under label imbalance. However, the RandomForestClassifier demonstrated the most consistent performance across both validation and test evaluations, making it the most reliable model in this study. The entire preprocessing, feature engineering, and classification pipeline developed for this task is encapsulated in our proposed system, Painthenticate, designed to support reproducible and interpretable pain classification. These findings highlight the value of selecting models not only on aggregate accuracy but also on consistency across class distributions in physiological pain classification tasks.
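The feature-engineering-plus-classifier pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact configuration: the specific feature set (window statistics plus Welch-spectrum summaries), the sampling rate, and the synthetic signal windows are all assumptions made for the example.

```python
import numpy as np
from scipy import signal
from sklearn.ensemble import RandomForestClassifier

def extract_features(x, fs=100.0):
    """Statistical and frequency-domain features for one signal window.

    The particular features chosen here are illustrative only."""
    stats = [x.mean(), x.std(), x.min(), x.max(), np.median(x)]
    freqs, psd = signal.welch(x, fs=fs, nperseg=min(256, len(x)))
    spectral = [psd.sum(), freqs[np.argmax(psd)]]  # total power, dominant frequency
    return stats + spectral

rng = np.random.default_rng(0)
# Synthetic stand-in for windowed EDA, BVP, RESP, and SpO2 recordings:
# 90 samples, 4 modalities per sample, 500 time steps per modality.
y = rng.integers(0, 3, size=90)  # 0 = No Pain, 1 = Low Pain, 2 = High Pain
X = np.array([
    np.concatenate([extract_features(rng.normal(loc=lbl, size=500)) for _ in range(4)])
    for lbl in y
])
clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X, y)
```

Each window yields seven features per modality (five statistics plus two spectral summaries), which are concatenated across the four modalities into a single 28-dimensional feature vector before classification.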


Journal or Conference Name
ICMI 2025 - Companion Publication of the 27th International Conference on Multimodal Interaction

Publication Year
2025

Indexing
Scopus