
Virtual EMG via Facial Video Analysis

Abstract: In this note, we address the problem of simulating electromyographic (EMG) signals arising from the muscles involved in facial expressions, notably those conveying affective information, relying solely on facial landmarks detected in video sequences. We propose a method based on Gaussian Process regression to predict the facial EMG signal from videos in which people display non-posed affective expressions. To this end, experiments have been conducted on the OPEN EmoRec II multimodal corpus.
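As a rough illustration of the regression setup described in the abstract (a sketch, not the authors' implementation), the snippet below assumes per-frame facial landmark coordinates as inputs and a single synchronized EMG channel as the regression target, and uses scikit-learn's Gaussian Process regressor with an RBF-plus-noise kernel; all array names and shapes are hypothetical placeholders.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical data: T video frames, 68 2-D facial landmarks flattened into a
# per-frame feature vector, and one EMG channel resampled to the frame rate.
T = 500
landmarks = np.random.rand(T, 68 * 2)   # placeholder landmark features
emg = np.random.rand(T)                 # placeholder target EMG signal

# GP regression with an RBF kernel plus an additive observation-noise term.
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)

# Fit on the first part of the sequence, then predict the "virtual EMG"
# (with predictive uncertainty) on the held-out remainder.
split = int(0.8 * T)
gp.fit(landmarks[:split], emg[:split])
emg_pred, emg_std = gp.predict(landmarks[split:], return_std=True)
```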


Citation:

Boccignone, G.; Cuculo, V.; Grossi, G.; Lanzarotti, R.; Migliaccio, R. "Virtual EMG via Facial Video Analysis", Image Analysis and Processing : ICIAP 2017, vol. 10484, Catania, pp. 197-207, 2017. DOI: 10.1007/978-3-319-68560-1_18
