Abstract
Human facial animation is an interesting yet difficult problem in computer graphics. In this paper, a novel B-spline (NURBS) muscle system is proposed to simulate 3D facial expressions and talking animation. The system extracts lip-shape parameters from video capturing a real person's lip movements and uses them to control the appropriate muscles to form different phonemes. The muscles are constructed from non-uniform rational B-spline curves, laid out according to anatomical knowledge. By varying the number of control points on each muscle, more detailed facial expressions and mouth shapes can be simulated. We demonstrate the flexibility of our model by simulating different emotions and by lip-syncing a talking head to video using the automatically extracted lip parameters.
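The NURBS curves the abstract refers to are standard rational B-splines: a point on the curve is a weighted average of control points, C(u) = Σ N_{i,p}(u) w_i P_i / Σ N_{i,p}(u) w_i. The sketch below evaluates such a curve via the Cox-de Boor recursion; the degree, knot vector, weights, and control points are illustrative assumptions, not values from the paper.

```python
def basis(i, p, u, knots):
    """Cox-de Boor recursion for the i-th B-spline basis function of degree p."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * basis(i, p - 1, u, knots)
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * basis(i + 1, p - 1, u, knots))
    return left + right

def nurbs_point(u, ctrl, weights, knots, p):
    """Evaluate C(u) = sum_i N_{i,p}(u) w_i P_i / sum_i N_{i,p}(u) w_i."""
    num = [0.0] * len(ctrl[0])
    den = 0.0
    for i, (pt, w) in enumerate(zip(ctrl, weights)):
        b = basis(i, p, u, knots) * w
        den += b
        for d in range(len(pt)):
            num[d] += b * pt[d]
    return [c / den for c in num]

# Hypothetical degree-2 "muscle" curve: 4 control points, clamped knot vector.
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
weights = [1.0, 1.0, 1.0, 1.0]
knots = [0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0]
print(nurbs_point(0.0, ctrl, weights, knots, 2))  # → [0.0, 0.0], the first control point
```

Moving a control point (or adjusting its weight) deforms only the local span of the curve, which is what makes adding control points a natural way to gain finer control over a muscle's shape.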