Adding Lipsync to RunwayML Videos

January 07, 2024

Runway ML is one of the leading tools for generating AI videos from text or images. It lets you control camera movement and extend videos beyond 4 seconds, up to 12 seconds long. Today, we'll show you how to add lip movement to your Runway ML videos to make your content even more dynamic.

 

Step 1. Generating a Video from RunwayML

Create an account on Runway ML and generate a video from text. For the purpose of this tutorial, let's use the prompt "A medieval king sitting on a horse with an army behind him".

[Screenshot: RunwayML text-to-video generation screen]

The video should look something like the one you see below.

 

Step 2. Upload the Video on Polymorf

Log in to https://polymorf.me. You should see a library of your current videos or, if you have none, a "Create Video" button. Click it, choose "Lipsync Video", and you'll be taken to the editor.

[Screenshot: Polymorf video library]

Once you're in the editor, open the Videos tab and click the "Click to Upload" button in the video library on the left drawer to upload the video you just created.

[Screenshot: Polymorf editor with the video upload drawer]
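If you'd like to double-check the clip's length before uploading it, you can also do that locally. Below is a minimal Python sketch that reads the duration with ffprobe; it assumes FFmpeg is installed, and the filename runway_king.mp4 is just a stand-in for your downloaded RunwayML clip.

import subprocess

def video_duration_seconds(path: str) -> float:
    """Return the duration of a video file in seconds via ffprobe."""
    result = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-show_entries", "format=duration",
            "-of", "default=noprint_wrappers=1:nokey=1",
            path,
        ],
        capture_output=True, text=True, check=True,
    )
    return float(result.stdout.strip())

# Example: a 4-second RunwayML clip downloaded as runway_king.mp4
print(video_duration_seconds("runway_king.mp4"))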

 

Step 3. Enter a script/sentence

You can enter a script to lipsync the video. Most RunwayML videos are 4 to 12 seconds long, so make sure your text doesn't exceed the video duration. We can verify this by clicking the "Render" button and checking the duration field, which shows the text "We will not let you pass" as having a duration of 2 seconds.

[Screenshot: entering a script in the Polymorf editor]
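As a rough sanity check before you render, you can also estimate how long a script takes to speak. The sketch below uses a simple words-per-second heuristic in Python; the 2.5 words-per-second rate is an assumption for conversational speech, not a value Polymorf uses.

def estimated_speech_seconds(script: str, words_per_second: float = 2.5) -> float:
    """Rough estimate of how long a script takes to speak aloud."""
    return len(script.split()) / words_per_second

script = "We will not let you pass"
video_length = 4.0  # seconds, the length of the RunwayML clip

estimate = estimated_speech_seconds(script)
print(f"Estimated speech duration: {estimate:.1f}s")  # ~2.4s for this script
if estimate > video_length:
    print("Script is likely too long for this clip; trim it or extend the video.")

For this example, the estimate lines up roughly with the 2 seconds reported by the duration field above.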

 

Step 4. Render the video

Once that's done, click "Render" and, provided you have enough credits, a video with mouth movement will be generated. See the results below.

 

Conclusion

As you can see, the process is simple and straightforward, and you can do it either on the website or on a mobile device. If you want to add mouth movement and make your Runway ML videos talk, try Polymorf AI for free at https://polymorf.me/lipsync-video.