Animating lip-syncing with Adobe Character Animator
You can use Adobe Sensei AI technology in Character Animator to assign mouth shapes, or visemes, to mouth sounds, or phonemes. There are two ways to do this:
- Control an animated character, known in Character Animator as a puppet, with your own movements via a camera.
- Upload prerecorded audio and sync it with an existing puppet, while adding gestures and other movements manually.
In either case, you can customise specific mouth shapes in Character Animator or manually assign certain mouth shapes to the puppet at particular times in the recording. You can tweak how long mouth shapes last and how exaggerated they are, which goes a long way toward showing what a character is thinking or feeling.
Apart from the neutral, smiling and surprised expressions, which are driven by the camera, everything that drives mouth shapes in Character Animator comes from the audio.
Mouth shapes are Adobe Photoshop or Illustrator documents, so you can borrow a mouth from one character and use it on another, or create entirely new mouth sets.
Discover more about Character Animator.