Dynalips, a cutting-edge lipsync solution, animates MetaHumans!

12 September 2022

The Dynalips technology precisely synchronizes the lip movements of 3D characters with the words they are supposed to pronounce. Based on the research work of Slim Ouni, associate professor in computer science and member of the Multispeech team (CNRS, Inria, University of Lorraine), it offers a fast solution for automatically and accurately matching a 3D character's lip movements to speech.

Dynalips, an automatic and realistic lipsync solution

Built on the team’s expertise in speech modeling, Dynalips draws on pioneering research in audiovisual speech synthesis and articulatory speech production. This cutting-edge technology gives professionals a solution to accurately and automatically synchronize the lip movements of a 3D virtual character with speech.

The technology relies on artificial intelligence algorithms trained on real data recorded from human speakers, yielding models that are faithful to the anatomy and dynamic behavior of the speaker’s face. These models take into account coarticulation, a phenomenon widely studied in the field of spoken communication, and guarantee a more fluid and natural synchronization between speech and the characters’ lip movements. Dynalips can animate almost any 3D model realistically and near-instantaneously.
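To give a sense of what coarticulation modeling involves (Dynalips’s own models are not public), here is a minimal sketch of a classic approach from the research literature, the Cohen–Massaro dominance-function model: each phoneme exerts an exponentially decaying influence on a lip parameter around its peak time, and the trajectory at any instant is the dominance-weighted average of the phoneme targets. All phoneme timings and target values below are illustrative, not Dynalips data.

```python
import numpy as np

def dominance(t, t_peak, alpha=1.0, theta=8.0, c=1.0):
    """Exponentially decaying influence of one phoneme around its peak time."""
    return alpha * np.exp(-theta * np.abs(t - t_peak) ** c)

def lip_trajectory(times, segments):
    """Blend per-phoneme lip targets into a smooth, coarticulated trajectory.

    segments: list of (t_peak_seconds, target_value) pairs, e.g. the degree
    of lip opening each phoneme aims for (illustrative values only).
    """
    num = np.zeros_like(times)
    den = np.zeros_like(times)
    for t_peak, target in segments:
        d = dominance(times, t_peak)
        num += d * target   # targets weighted by each phoneme's dominance
        den += d
    return num / np.maximum(den, 1e-9)

# Hypothetical sequence: /m/ (lips closed), /a/ (open), /u/ (rounded, nearly closed).
# Neighboring phonemes pull on each other, so transitions overlap smoothly
# instead of snapping from one mouth shape to the next.
t = np.linspace(0.0, 0.6, 200)
opening = lip_trajectory(t, [(0.1, 0.0), (0.3, 1.0), (0.5, 0.2)])
```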

From video games to foreign-language learning systems and assistance for the hearing impaired, Dynalips covers a wide spectrum of applications and is open to many professionals. “In the field of animated films in particular, developers can save a considerable amount of time: our technology models a character’s speech in a few seconds, compared with a day in the studio when this process is done manually,” explains Slim Ouni, creator of Dynalips. “Our lipsync tool also paves the way for the development of virtual assistants for the hearing impaired, where the precision of lip movements is very important for the spoken message to be intelligible.”

Towards dynamic and visual realism thanks to MetaHuman from Epic Games

Very recently, Slim Ouni won an Epic MegaGrant, which opens new horizons for Dynalips. The goal is to integrate this innovative lipsync technology into Epic Games’ Unreal Engine to animate MetaHumans, the highly realistic 3D human models created by Epic Games. Unreal Engine is a comprehensive suite of 3D authoring tools for game development, visualization, content creation for film and television, and other real-time applications.
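Dynalips’s actual Unreal Engine integration is not described in detail here, but as a rough illustration of what driving a MetaHuman from lipsync output could involve, the sketch below packages per-frame lip-curve weights as keyframes per named facial curve. The curve names follow the ARKit blendshape convention (e.g. jawOpen, mouthPucker) that MetaHuman face rigs can consume; the JSON layout and the export_face_curves helper are hypothetical, standing in for whatever importer or Live Link source a real pipeline would use.

```python
import json

# Hypothetical per-frame lipsync output: a timestamp in seconds plus
# weights (0..1) for ARKit-style blendshape curves.
frames = [
    {"time": 0.00, "jawOpen": 0.0, "mouthPucker": 0.0},
    {"time": 0.10, "jawOpen": 0.1, "mouthPucker": 0.0},
    {"time": 0.30, "jawOpen": 0.9, "mouthPucker": 0.0},
    {"time": 0.50, "jawOpen": 0.2, "mouthPucker": 0.7},
]

def export_face_curves(frames, path):
    """Write frames as JSON keyed by curve name -> list of (time, value) keys.

    The file layout is invented for this sketch; a real pipeline would feed
    the curves into Unreal through its own import or streaming mechanism.
    """
    curves = {}
    for frame in frames:
        t = frame["time"]
        for name, value in frame.items():
            if name == "time":
                continue
            curves.setdefault(name, []).append((t, value))
    with open(path, "w") as f:
        json.dump(curves, f, indent=2)

export_face_curves(frames, "lipsync_curves.json")
```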

The first promising results of integrating Dynalips technology into Unreal Engine to animate MetaHuman models can already be seen online.

The combination of Dynalips and MetaHuman within Unreal Engine opens new horizons for 3D animation and an ever more immersive user experience.