Abstract
Children with autism spectrum disorder (ASD) often display atypical emotional expressions and physiological responses, which makes emotion recognition challenging. This study proposes a multimodal recognition model employing a late fusion framework that combines facial expressions with physiological measures: electrodermal activity (EDA), temperature (TEMP), and heart rate (HR). Emotional states are annotated using two complementary schemes derived from a shared set of labels: three annotators each assign one categorical Ekman emotion to every timestamp. From these annotations, a majority-vote label identifies the dominant emotion, while a proportional distribution reflects the likelihood of each emotion based on the relative frequency of the annotators’ selections. Separate machine learning models are trained for each modality and each annotation scheme, and their outputs are integrated through decision-level fusion; a distinct fusion model is constructed for each annotation scheme so that both the categorical and the likelihood-based representations are combined optimally. Experiments on the EMBOA dataset, collected within the project “Affective loop in Socially Assistive Robotics as an intervention tool for children with autism”, show that the late fusion model achieves higher accuracy and robustness than unimodal baselines. The system attains an accuracy of 68% for categorical emotion classification and 78% under the likelihood-estimation scheme. Although these results are lower than those reported in other studies, they suggest that further research into emotion recognition in autistic children using other fusion strategies is warranted, even for datasets with many missing values and low sample representation for certain emotions.
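To make the two annotation schemes and the fusion step concrete, the minimal Python sketch below derives both label types from three annotators' per-timestamp choices and combines per-modality class probabilities by weighted averaging. It is illustrative only: the emotion list ordering, the random probability vectors, and the averaging combiner are assumptions, since the abstract does not specify the exact fusion rule used in the study.

```python
from collections import Counter
import numpy as np

# Assumed ordering of the six basic Ekman emotions (not given in the abstract).
EKMAN = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def majority_vote(annotations):
    """Categorical label: the emotion chosen most often by the three
    annotators at one timestamp (ties broken by first occurrence)."""
    return Counter(annotations).most_common(1)[0][0]

def proportional_distribution(annotations):
    """Likelihood label: relative frequency of each emotion among the
    annotators' selections, e.g. ['fear','fear','anger'] -> fear 2/3."""
    counts = Counter(annotations)
    return np.array([counts[e] / len(annotations) for e in EKMAN])

def decision_level_fusion(modality_probs, weights=None):
    """Late fusion sketch: combine per-modality class-probability vectors
    (face, EDA, TEMP, HR) by a weighted average; argmax of the result
    gives the fused categorical prediction."""
    probs = np.stack(modality_probs)          # shape (n_modalities, n_classes)
    w = np.ones(len(probs)) if weights is None else np.asarray(weights, float)
    return (w[:, None] * probs).sum(axis=0) / w.sum()

# Usage: three annotators at one timestamp, then fusing four modalities.
labels = ["fear", "fear", "anger"]
print(majority_vote(labels))                  # -> 'fear'
print(proportional_distribution(labels))      # -> [1/3, 0, 2/3, 0, 0, 0]
face, eda, temp, hr = (np.random.dirichlet(np.ones(6)) for _ in range(4))
fused = decision_level_fusion([face, eda, temp, hr])
print(EKMAN[int(np.argmax(fused))])
```

One such fusion model would be fit per annotation scheme, matching the paper's setup in which the categorical and likelihood-based label streams are combined by separate decision-level models.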
Publication Info
- Year: 2025
- Type: Article
- Volume: 25
- Issue: 24
- Article Number: 7485
- Citations: 0
- Access: Closed
Identifiers
- DOI: 10.3390/s25247485