Emotional State Recognition Performance Improvement on a Handwriting and Drawing Task
Author
Publication date
2021-02-21
DOI
10.1109/ACCESS.2021.3058443
Abstract
In this work, we combine time, spectral, and cepstral features of the signals captured on a tablet to recognize the depression, anxiety, and stress emotional states in the EMOTHAW database. EMOTHAW represents the emotional states of users through signals captured by sensors on the tablet and pen while each user performs three handwriting and four drawing tasks; the users are categorized as depressed, anxious, stressed, or typical according to the Depression, Anxiety and Stress Scale (DASS). Each user is characterized by six time-domain features, while the number of spectral-domain and cepstral-domain features, extracted from the horizontal and vertical displacement of the pen, the pressure on the paper, and the time spent on-air and off-air, depends on the configuration of the filterbank. Next, we select the best features using the Fast Correlation-Based Filter (FCBF) method. Because the dataset contains only 129 users, we then augment the training data by randomly selecting a percentage of the training samples and adding small random Gaussian noise to their extracted features. Finally, we train a radial basis function (RBF) SVM model using the Leave-One-Out (LOO) methodology. The experimental results show an average classification accuracy improvement of 15%, and accuracy improvements ranging from 4% to 34% over the state-of-the-art baseline for the individual depression, anxiety, stress, and typical emotional states.
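As an illustration of the augmentation step described in the abstract, the following Python sketch adds Gaussian noise to a randomly selected fraction of the training feature vectors. The function name and the fraction and sigma parameters are illustrative assumptions, not the authors' exact implementation.

import numpy as np

def augment_with_gaussian_noise(X, y, fraction=0.5, sigma=0.01, seed=0):
    """Append noisy copies of a random subset of the training data.

    X : (n_samples, n_features) extracted feature matrix
    y : (n_samples,) emotional-state labels
    fraction, sigma : placeholder values; the paper selects a
    percentage of the training data and adds small Gaussian noise.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Pick a random subset of training samples (without replacement).
    idx = rng.choice(n, size=int(fraction * n), replace=False)
    # Perturb the selected feature vectors with zero-mean Gaussian noise.
    X_noisy = X[idx] + rng.normal(0.0, sigma, size=X[idx].shape)
    # Append the noisy copies, keeping the original labels.
    return np.vstack([X, X_noisy]), np.concatenate([y, y[idx]])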
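Likewise, the RBF-SVM classifier evaluated with Leave-One-Out could look like the sketch below, reusing the augmentation helper above. It uses scikit-learn; the C and gamma hyperparameters are placeholders, and augmentation is applied only inside each training fold so that noisy copies of the held-out user never leak into training.

from sklearn.model_selection import LeaveOneOut
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def loo_rbf_svm_accuracy(X, y, C=1.0, gamma="scale"):
    """Leave-One-Out evaluation of an RBF-kernel SVM."""
    correct = 0
    for train_idx, test_idx in LeaveOneOut().split(X):
        # Augment only the training fold of the current split.
        X_tr, y_tr = augment_with_gaussian_noise(X[train_idx], y[train_idx])
        scaler = StandardScaler().fit(X_tr)
        clf = SVC(kernel="rbf", C=C, gamma=gamma)
        clf.fit(scaler.transform(X_tr), y_tr)
        # Score the single held-out sample.
        pred = clf.predict(scaler.transform(X[test_idx]))[0]
        correct += int(pred == y[test_idx][0])
    return correct / X.shape[0]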
Document Type
Article
Document version
Published version
Language
English
Keywords
Data augmentation, emotional state recognition, emotional states, feature extraction, SVM
Pages
9 p.
Publisher
IEEE Access
License
Except where otherwise noted, this item's license is described as http://creativecommons.org/licenses/by-nc-nd/4.0/