Paper
10 January 2014
Realistic facial animation generation based on facial expression mapping
Hui Yu, Oliver Garrod, Rachael Jack, Philippe Schyns
Proceedings Volume 9069, Fifth International Conference on Graphic and Image Processing (ICGIP 2013); 906903 (2014) https://doi.org/10.1117/12.2049921
Event: Fifth International Conference on Graphic and Image Processing, 2013, Hong Kong, China
Abstract
Facial expressions reflect a character's internal emotional state or arise in response to social communication. Although much effort has been devoted to generating realistic facial expressions, the problem remains challenging because humans are highly sensitive to subtle facial movements. In this paper, we present a method for facial animation generation that reproduces true facial muscle movements with high fidelity. An intermediate model space is introduced to transfer captured static action unit (AU) peak frames, defined according to the Facial Action Coding System (FACS), to the conformed target face. Dynamic parameters derived using a psychophysics method are then integrated to generate the facial animation, which is assumed to represent the natural correlation of multiple AUs. Finally, the animation sequence in the intermediate model space is mapped to the target face to produce the final animation.
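As a rough illustration of this kind of pipeline (a minimal sketch, not the authors' implementation), the code below assumes a linear blendshape model: each AU is represented by its peak vertex displacement on an intermediate mesh, per-AU activation curves over time (here treated as given; the paper derives them psychophysically) blend these displacements into an animation, and each frame is transferred to a conformed target mesh through a precomputed vertex correspondence. All names (animate, map_to_target, au_peaks, correspondence, etc.) are hypothetical.

import numpy as np

def animate(neutral, au_peaks, activations):
    # neutral:     (V, 3) neutral-pose vertices of the intermediate model
    # au_peaks:    dict AU name -> (V, 3) vertex offsets at that AU's peak frame
    # activations: dict AU name -> (T,) activation curve in [0, 1] over T frames
    # returns:     (T, V, 3) animated vertex positions (linear blendshape sum)
    T = len(next(iter(activations.values())))
    frames = np.repeat(neutral[None, :, :], T, axis=0).astype(float)
    for au, offsets in au_peaks.items():
        # each AU contributes its peak displacement scaled by its activation level
        frames += activations[au][:, None, None] * offsets[None, :, :]
    return frames

def map_to_target(frames, neutral_src, neutral_tgt, correspondence):
    # Transfer per-frame displacements from the intermediate model to the
    # conformed target face via an integer vertex-index correspondence map
    # (a simplification; the paper's mapping may differ).
    disp = frames - neutral_src[None, :, :]          # (T, V_src, 3)
    return neutral_tgt[None, :, :] + disp[:, correspondence, :]

In use, the activation curves would encode the temporal dynamics of each AU (onset, peak, and offset timing), so that simultaneously active AUs combine into a correlated, natural-looking expression before the sequence is mapped onto the target face.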
© (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Hui Yu, Oliver Garrod, Rachael Jack, and Philippe Schyns "Realistic facial animation generation based on facial expression mapping", Proc. SPIE 9069, Fifth International Conference on Graphic and Image Processing (ICGIP 2013), 906903 (10 January 2014); https://doi.org/10.1117/12.2049921
CITATIONS
Cited by 5 scholarly publications.
KEYWORDS
Solid modeling, 3D modeling, Data modeling, Motion models, Associative arrays, 3D image processing, Clouds
