| Summary: | Internationally, the recent pandemic caused severe
social changes, forcing people to adopt new practices in their
daily lives. One of these changes requires people to wear
masks in public spaces to mitigate the spread of viral diseases.
Affective state assessment (ASA) systems that rely on facial
expression analysis become impaired and less effective due to the
presence of visual occlusions caused by wearing masks. Therefore,
ASA systems need to be future-proofed and equipped with
adaptive technologies to be able to analyze and assess occluded
facial expressions, particularly in the presence of masks. This
paper presents an adaptive approach for classifying occluded
facial expressions when human faces are partially covered with
masks. We deployed an unsupervised, cosine similarity-based
clustering approach exploiting the continuous nature of the
extended Cohn-Kanade (CK+) dataset. The cosine similarity-based
clustering resulted in twenty-one micro-expression clusters
that describe minor variations of human facial expressions.
Linear discriminant analysis was used to project all clusters
onto lower-dimensional discriminant feature spaces, allowing for
binary occlusion classification and the dynamic assessment of
affective states. During the validation stage, we observed 100%
accuracy in occlusion detection, i.e., when classifying faces
using features extracted from the lower part of the face. We
observed 76.11% facial expression classification accuracy when
features were gathered from uncovered full faces, and 73.63%
accuracy when classifying upper-facial expressions, which
applies when the lower part of the face is occluded. The presented
system promises to improve visual inspection systems
through an adaptive occlusion detection and facial expression
classification framework.
|
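The cosine similarity-based clustering step named in the summary can be sketched as a simple greedy assignment: each feature vector joins the first cluster whose centroid is sufficiently similar, otherwise it seeds a new cluster. This is a minimal NumPy illustration of the general technique; the function names, the running-mean centroid update, and the 0.95 threshold are illustrative assumptions, not details from the paper.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def greedy_cosine_clusters(features, threshold=0.95):
    """Assign each feature vector to the first cluster whose centroid
    has cosine similarity >= threshold; otherwise start a new cluster.
    Returns (labels, centroids). Threshold and update rule are
    hypothetical choices for illustration."""
    centroids, labels = [], []
    for x in features:
        x = np.asarray(x, dtype=float)
        for i, c in enumerate(centroids):
            if cosine_similarity(x, c) >= threshold:
                labels.append(i)
                centroids[i] = (c + x) / 2.0  # running-mean centroid update
                break
        else:
            centroids.append(x)
            labels.append(len(centroids) - 1)
    return labels, centroids

# Two near-parallel vectors share a cluster; an orthogonal one starts a new one.
labels, centroids = greedy_cosine_clusters(
    [[1.0, 0.0], [0.99, 0.01], [0.0, 1.0]], threshold=0.9)
```

The resulting clusters could then be projected onto a lower-dimensional discriminant space (e.g., with a standard LDA implementation) for the binary occlusion and expression classification stages described above.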