Affective computing researchers adopt a variety of methods for analysing or synthesizing aspects of human behaviour. The choice of method depends on which behavioural cues are considered salient or straightforward to capture and interpret, as well as on the overall context of the interaction. Each approach therefore focuses on modelling certain information and results in dedicated representations. In practice, however, analysis and synthesis usually follow label-based representations, which typically map directly to a feature vector. The goal of the presented work is to introduce an interim representational mechanism that associates low-level gesture expressivity parameters with a high-level dimensional representation of affect. More specifically, it introduces a novel methodology for associating easily extracted, low-level gesture data with the affective dimensions of activation and evaluation. For this purpose, a user perception test was carried out to annotate a dataset: participants assessed each gesture in terms of its perceived activation (active/passive) and evaluation (positive/negative) levels. In affective behaviour modelling, the contribution of the proposed association methodology is twofold. On the one hand, when analysing affective behaviour, it enables the fusion of expressivity parameters with any other modalities coded in higher-level affective representations, thereby supporting scalable multimodal analysis. On the other hand, it supports the synthesis of composite human behaviour (e.g. facial expressions, gestures and body posture), since it allows dimensional values of affect to be translated into synthesized expressive gestures.
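To make the idea of such an interim mechanism concrete, the sketch below maps a handful of gesture expressivity parameters to activation/evaluation coordinates. The parameter names (temporal extent, power, repetition, fluidity, spatial extent) follow common expressivity vocabularies, but the weights are invented for illustration; they are not the association learned in the paper, which was derived from the perception-test annotations.

```python
def associate_expressivity(params):
    """Map gesture expressivity parameters (each normalised to [0, 1])
    to illustrative (activation, evaluation) values in [-1, 1].

    NOTE: the weights below are hypothetical, chosen only to show the
    direction of the mapping, not the paper's empirical association.
    """
    # Quicker, stronger, more repetitive movement reads as more active.
    activation = (2.0 * params["temporal_extent"]
                  + params["power"]
                  + params["repetition"]) / 2.0 - 1.0
    # Smoother, wider movement reads as more positive; raw power less so.
    evaluation = (2.0 * params["fluidity"]
                  + params["spatial_extent"]
                  - params["power"]) / 2.0

    # Clamp both dimensions to the [-1, 1] affect space.
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(activation), clamp(evaluation)


# A fast, forceful, fluid, expansive gesture lands in the
# active/positive quadrant of the activation-evaluation space.
a, e = associate_expressivity({"temporal_extent": 1.0, "power": 1.0,
                               "repetition": 1.0, "fluidity": 1.0,
                               "spatial_extent": 1.0})
```

Once gestures are expressed as points in this dimensional space, they can be fused with other modalities (e.g. facial expression analysis) that report in the same coordinates, which is the scalability argument made above.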
Malatesta, L., Asteriadis, S., Caridakis, G., Vasalou, A., & Karpouzis, K. (2016). Associating gesture expressivity with affective representations. Engineering Applications of Artificial Intelligence, 51, 124–135. https://doi.org/10.1016/j.engappai.2016.01.010