Cartoon mouths depicting the letters being said, as a reference guide for animating speech in characters


Animated lip-syncing powered by Adobe AI.

Give your characters a voice. Both Adobe Character Animator and Adobe Animate help you streamline animating speech.


Streamline animation with Adobe AI technology.

Animating a character’s mouth as it speaks has traditionally been one of the most time- and labor-intensive aspects of animation. Adobe Character Animator and Adobe Animate both draw on the power of Adobe Sensei AI technology to radically reduce the time and effort it takes to create lip-sync animation.

Animating lip-syncing with Adobe Character Animator.

You can use Adobe Sensei AI technology in Character Animator to assign mouth shapes, or visemes, to mouth sounds, or phonemes. There are two ways to do this.


  • Control an animated character in Character Animator, known as a puppet, with your own movements via a camera.

  • Upload prerecorded audio and sync it with an existing puppet, while adding gestures and other movements manually.


In either case, you can customize and alter specific mouth shapes in Character Animator or manually assign certain mouth shapes to the puppet at particular times in the recording. You can tweak how long mouth shapes last and how exaggerated they are, which goes a long way toward showing what a character is thinking or feeling.
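Character Animator handles the phoneme-to-viseme matching described above automatically via Adobe Sensei, but the underlying idea is a lookup from speech sounds to mouth shapes. The sketch below is purely illustrative: the phoneme labels and viseme names are hypothetical examples, not Adobe's actual categories.

```python
# Illustrative only: a lip-sync engine conceptually maps phonemes
# (speech sounds) to visemes (mouth shapes). These labels are
# hypothetical examples, not Character Animator's real set.
PHONEME_TO_VISEME = {
    "AA": "Aa",   # as in "father" -> open mouth
    "EE": "Ee",   # as in "see"    -> wide smile
    "OH": "Oh",   # as in "go"     -> rounded lips
    "M":  "M",    # as in "map"    -> closed lips
    "F":  "F",    # as in "fun"    -> teeth on lower lip
    "L":  "L",    # as in "lip"    -> tongue behind teeth
}

def visemes_for(phonemes):
    """Return the mouth shape to show for each phoneme,
    falling back to a neutral mouth for unknown sounds."""
    return [PHONEME_TO_VISEME.get(p, "Neutral") for p in phonemes]

print(visemes_for(["M", "AA", "P"]))  # the word "map"
```

In the real apps, this mapping is driven by audio analysis and can be overridden by hand, exactly as described above.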


Except for the neutral, smiling, and surprised expressions, which come from the camera, everything that drives mouth shapes in Character Animator comes from the audio.


Mouth shapes are Adobe Photoshop and Illustrator documents. You can borrow a mouth for one character and use it on another or make new mouth sets entirely.

Discover more about Character Animator.

Cartoon character being animated in Adobe Character Animator

Create mouth shapes.

Find out how mouth movements work in Character Animator and how to customize existing mouth shapes or create your own.

Learn more

A control puppet in Adobe Character Animator

Control puppets using behaviors.

Behaviors in Character Animator include how a character moves, how it speaks, and phenomena like motion lines. Find out how to customize lip-syncing when you’re deciding how a character talks.

Learn more

A cartoon character controlled in Adobe Character Animator for a live stream

Tips and tricks for streaming.

Stream as a cartoon character. Learn about real-time animation and performing as a puppet for a live audience.

Learn more

Skeleton cartoon character being animated in Adobe Character Animator

Keyboard shortcuts for Character Animator.

Make a fast animation application even faster. Learn the shortcuts and get animating in record time.

Learn more

Animating lip-syncing with Animate.

Adobe Animate supports AI-powered lip-syncing, while also giving you total control over animation. Animate draws on the capabilities of Adobe Sensei AI technology to match visemes to phonemes.


Animate also supports traditional frame-by-frame lip-sync animation. Animators can work with an audio track and mouth chart to place mouth shapes manually, precisely matching images to given keyframes within the timeline.


Work seamlessly with other Adobe Creative Cloud applications, and go from initial illustrations and concept art to edited, finished video in a single, unified system.


  • Create your own library of mouth shapes in Animate, and even import assets from Adobe Photoshop or Illustrator.
  • Record dialogue and other audio in Adobe Audition.
  • Bring characters to life in Animate with movement, speech, and facial expressions.
  • Add graphics and flourishes in Adobe After Effects.
  • Edit everything together in Adobe Premiere Pro.
Character getting lip-sync matching added in Adobe Animate

Discover more about Animate.

Create and work with symbol instances in Animate.

Map specific sounds to specific visuals. Discover how to create symbols, such as mouth shapes, and make each symbol appear at certain points on a timeline.

Learn more

Mapping sounds to visuals in Adobe Animate

Auto lip-syncing.

Find out how to make a library of mouth shapes for a character, and then assign them to speech with Adobe Sensei AI technology in Animate.

Learn more

How to lip-sync with Auto Lip-Sync.

Watch a video series that outlines the step-by-step process of assigning mouth shapes to speech sounds.

Learn more

Take your animation even further.

Build on your lip-syncing skills by creating animated explainer videos for your audiences.

Automated lip-syncing is only one of the ways Character Animator and Animate enhance the animation process. See for yourself how Adobe animation apps can help you animate just about anything.


Try Character Animator


Explore Animate
