Epic Games has unveiled MetaHuman Animator, a powerful new tool that lets developers capture realistic facial performances using just an iPhone and a PC. It delivers high-fidelity facial animation in a matter of minutes, making it a game-changer for digital human creation.
MetaHuman Animator simplifies the process of capturing an actor's performance by utilizing an iPhone or a stereo head-mounted camera system (HMC). The captured performance can then be seamlessly applied as high-fidelity facial animation to any MetaHuman character, eliminating the need for manual intervention.
Every subtle expression, look, and emotion is accurately captured and faithfully replicated on digital humans, resulting in astonishingly lifelike animations. The best part is that achieving remarkable results is now accessible to anyone, regardless of their experience level.
For newcomers to performance capture, MetaHuman Animator offers a convenient way to incorporate facial animation based on real-world performances into MetaHumans. And for those already familiar with performance capture, this feature set significantly enhances existing workflows, reducing time and effort while providing greater creative control. By combining MetaHuman Animator with an existing vertical stereo head-mounted camera, visual fidelity can be further elevated.
Previously, faithfully recreating an actor's performance on a digital character would have taken months and required a team of experts. With MetaHuman Animator, the laborious tasks are handled automatically, drastically reducing both time and effort.
The new feature set leverages a 4D solver that combines video and depth data with a MetaHuman representation of the performer. Processing runs locally on GPU hardware, so the final animation is available within minutes.
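Epic has not published the solver's internals, but a rough mental model is that each captured frame is a fit of facial rig control values against the observed video and depth geometry. The minimal Python sketch below illustrates that idea with a simple linear blendshape stand-in; the function names, array shapes, and the model itself are hypothetical simplifications, not Epic's implementation.

```python
# Minimal sketch of a per-frame "4D solve": fit facial rig control values so that a
# simple linear blendshape model best matches geometry observed from video + depth.
# Hypothetical simplification -- the real MetaHuman Animator solver is not public.
import numpy as np

def solve_frame(neutral, blendshapes, observed):
    """
    neutral:     (V, 3) neutral-face vertices of the performer's identity
    blendshapes: (K, V, 3) per-control vertex deltas, one per rig control
    observed:    (V, 3) per-frame geometry reconstructed from video and depth
    Returns K control values in [0, 1] that best explain the observation.
    """
    K = blendshapes.shape[0]
    A = blendshapes.reshape(K, -1).T           # (3V, K) design matrix of control deltas
    b = (observed - neutral).reshape(-1)       # (3V,) observed offset from the neutral face
    weights, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.clip(weights, 0.0, 1.0)          # keep controls in a valid range

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    neutral = rng.normal(size=(500, 3))
    shapes = rng.normal(scale=0.1, size=(8, 500, 3))
    true_w = np.array([0.2, 0.0, 0.7, 0.1, 0.0, 0.5, 0.3, 0.0])
    observed = neutral + np.tensordot(true_w, shapes, axes=1)
    print(solve_frame(neutral, shapes, observed).round(2))   # recovers true_w
```

Solving every captured frame independently would yield one value per rig control per frame; in practice a temporal pass over those curves would also be needed to keep the motion smooth.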
Using MetaHuman Animator is a straightforward process: developers simply point the camera at the actor and press record. The tool accurately captures and reproduces the actor's individuality and nuances onto any MetaHuman character.
Furthermore, the animation data generated by MetaHuman Animator is semantically correct, meaning it is expressed on the appropriate facial rig controls, and temporally consistent, so control values transition smoothly from frame to frame. This makes it easy to adjust and fine-tune the animation artistically.
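As a loose illustration of why that matters, the sketch below stores animation as keyframe curves on named controls and applies a light temporal filter; the control names and curve format are invented for the example and are not Epic's rig definitions.

```python
# Sketch: animation as per-control keyframe curves keyed by (hypothetical) rig control
# names. Because the data lives on named controls rather than raw vertices, an animator
# can adjust any single curve after the solve; a light filter keeps transitions smooth.
def smooth(curve, window=3):
    """Moving average over a list of (frame, value) keys."""
    values = [v for _, v in curve]
    half = window // 2
    out = []
    for i, (frame, _) in enumerate(curve):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append((frame, sum(values[lo:hi]) / (hi - lo)))
    return out

# Illustrative control names; the real MetaHuman facial rig exposes its own control set.
animation = {
    "jaw_open":     [(0, 0.00), (1, 0.35), (2, 0.80), (3, 0.40), (4, 0.05)],
    "brow_raise_L": [(0, 0.10), (1, 0.12), (2, 0.60), (3, 0.55), (4, 0.20)],
}

smoothed = {name: smooth(curve) for name, curve in animation.items()}
# An artistic tweak is just a targeted edit to one control's keys, e.g. exaggerating the brow:
smoothed["brow_raise_L"] = [(f, min(1.0, v * 1.3)) for f, v in smoothed["brow_raise_L"]]
```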
The impressive level of fidelity achieved with MetaHuman Animator is showcased in "Blue Dot," a short film created by Epic Games' 3Lateral team in collaboration with local Serbian artists. Actor Radivoje Bukvić delivers a monologue based on a poem by Mika Antic, and his performance is transformed into animation on a digital human. The film demonstrates the potential of using MetaHuman Animator alongside stereo head-mounted cameras and traditional filmmaking techniques.
With just a few clicks, the facial animation captured using MetaHuman Animator can be applied to any MetaHuman character or any character adopting the new MetaHuman facial description standard. This flexibility empowers developers to design characters freely, knowing that the facial animation seamlessly integrates with their creations.
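As a rough sketch of that idea, the snippet below drives two hypothetical character rigs with the same per-frame control values; the classes and character names are stand-ins, not the MetaHuman facial description standard itself.

```python
# Sketch: if every conforming character exposes the same facial control set, applying a
# captured performance is conceptually just driving another character's controls with the
# same data. The FacialRig class and character names here are hypothetical stand-ins.
class FacialRig:
    def __init__(self, character_name):
        self.character_name = character_name
        self.controls = {}                       # control name -> current value

    def apply(self, control_values):
        """Drive this character's rig with one frame of {control_name: value} data."""
        self.controls.update(control_values)

frame_sample = {"jaw_open": 0.8, "brow_raise_L": 0.6}    # one frame of captured animation
for rig in (FacialRig("StockMetaHuman"), FacialRig("CustomStylizedHero")):
    rig.apply(frame_sample)                      # same animation data, different character
```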
From a technical standpoint, Mesh to MetaHuman can now create a MetaHuman Identity of the performer from just three frames of video and depth data, captured with an iPhone or reconstructed from a vertical stereo head-mounted camera. That personalized identity is what allows MetaHuman Animator to generate animation that works effectively on any MetaHuman character, and it can even produce convincing tongue animation using audio input.
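The identity solve itself is proprietary, but as a hedged illustration of the kind of raw material it starts from, the sketch below lifts a few depth frames to 3D points with a pinhole camera model; the intrinsics and frames are entirely made up.

```python
# Sketch: lifting a handful of depth frames to 3D points, as a stand-in for the inputs to
# an identity solve. The real Mesh to MetaHuman process is not public; the camera
# intrinsics and synthetic frames below are invented for illustration only.
import numpy as np

def backproject(depth, fx, fy, cx, cy):
    """Turn an (H, W) depth map in meters into an (H*W, 3) point cloud via a pinhole model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Three hypothetical frames (e.g. neutral plus two head turns); here just flat synthetic depth.
frames = [np.full((192, 256), 0.45) for _ in range(3)]
clouds = [backproject(d, fx=210.0, fy=210.0, cx=128.0, cy=96.0) for d in frames]
merged = np.concatenate(clouds)   # a crude combined cloud a real solver would fit a face template to
```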
Epic Games aims to democratize facial performance capture, making it accessible to all creators rather than solely experts with high-end capture systems. MetaHuman Animator can be used with just an iPhone (iPhone 12 or above) and a desktop PC. The Live Link Face iOS app has been updated to capture raw video and depth data, which is then directly processed in Unreal Engine.
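Epic defines the actual take format that Live Link Face writes; purely as an illustration of what flows from the phone to the engine, here is a minimal per-frame record of video, depth, and timecode, with every field name assumed rather than taken from the app.

```python
# Sketch of the kind of per-frame record a capture app could hand to the engine: a raw
# video frame, its matching depth map, and a timecode. Field names and resolutions are
# assumptions; the real Live Link Face take format is defined by Epic.
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class CaptureFrame:
    timecode: str          # e.g. "01:02:03:04" at the take's frame rate
    rgb: np.ndarray        # (H, W, 3) raw video frame
    depth: np.ndarray      # (H, W) depth map from the phone's depth sensor

@dataclass
class Take:
    slate: str
    frame_rate: float
    frames: List[CaptureFrame] = field(default_factory=list)

take = Take(slate="scene01_take03", frame_rate=30.0)
take.frames.append(CaptureFrame(timecode="01:00:00:00",
                                rgb=np.zeros((720, 1280, 3), dtype=np.uint8),
                                depth=np.zeros((360, 640), dtype=np.float32)))
```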
Developers can also utilize MetaHuman Animator with an existing vertical stereo head-mounted camera system to achieve even greater fidelity. Whether using an iPhone or a stereo HMC, MetaHuman Animator enhances the speed and ease of the capture workflow, providing developers with the flexibility to choose hardware based on specific shoot requirements and the desired level of visual fidelity.
The captured animation data supports timecode, allowing for seamless alignment of facial performance animation with body motion capture and audio, resulting in a cohesive and comprehensive character performance.
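Assuming standard non-drop-frame SMPTE timecode, lining tracks up amounts to converting each track's start timecode to a frame count and offsetting from the earliest one; the start values in this sketch are invented for illustration.

```python
# Sketch: aligning facial capture, body mocap, and audio on a shared timecode.
# Non-drop-frame timecode assumed; the track start values below are made up.
def timecode_to_frames(tc: str, fps: int) -> int:
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

FPS = 30
starts = {
    "face_capture": "01:00:05:12",
    "body_mocap":   "01:00:05:00",
    "audio":        "01:00:04:20",
}
earliest = min(timecode_to_frames(tc, FPS) for tc in starts.values())
offsets = {name: timecode_to_frames(tc, FPS) - earliest for name, tc in starts.items()}
print(offsets)   # frames each track must shift by to land on a common timeline
```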
MetaHuman Animator is designed to facilitate creative iteration on set, enabling quick processing and transfer of facial animation to any MetaHuman character. With the ability to review animation data directly in Unreal Engine during the shoot, the quality of the capture can be evaluated in advance, ensuring optimal results before the final character animation begins.
Additionally, the real-time nature of MetaHuman Animator empowers developers to conduct reshoots while the actor is still on stage, eliminating the need for costly and time-consuming post-production adjustments. This streamlined workflow saves both time and resources, allowing for efficient and cost-effective production.
Epic Games' MetaHuman Animator brings facial performance capture within reach of all creators, revolutionizing the process with its simplicity, accessibility, and impressive results. By democratizing high-quality facial animation, Epic Games is empowering developers to unleash their creativity and deliver immersive digital experiences like never before.