ACE Unreal Engine Plugin

NVIDIA - Code Plugins - Jul 10, 2024

Developers can use the NVIDIA ACE for Games Unreal Engine (UE) plugin to build and deploy customized speech, conversation, and animation AI models in their software, including games, across the cloud and PC. The plugin currently supports Audio2Face only.

  • Supported Engine Versions: 5.4

The NVIDIA ACE UE plugin currently supports Audio2Face (A2F); NVIDIA Riva automatic speech recognition (ASR) and text-to-speech (TTS) capabilities are coming soon.


The NVIDIA ACE Unreal plugin allows you to send speech clips to an NVIDIA Audio2Face service and receive synchronized audio and ARKit-compatible facial animations. You can animate any character that has an ARKit-compatible pose mapping. For convenience, the plugin includes a pose mapping asset for animating MetaHuman faces.
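The role a pose mapping plays can be pictured outside the engine: each frame of ARKit-style blendshape weights from Audio2Face is translated onto the character's own animation curves. The sketch below is purely illustrative; the mapping table and function names are hypothetical and are not the plugin's API (inside Unreal this is handled by the pose mapping asset, not hand-written code).

```python
# Illustrative sketch (not the plugin's API): remap one frame of
# ARKit-style blendshape weights onto a character's own curve names,
# which is conceptually what the plugin's pose mapping asset does.

# Hypothetical mapping: ARKit blendshape name -> character curve name.
POSE_MAPPING = {
    "jawOpen": "CTRL_jaw_open",
    "mouthSmileLeft": "CTRL_smile_L",
    "mouthSmileRight": "CTRL_smile_R",
}

def remap_frame(arkit_weights: dict) -> dict:
    """Translate one frame of ARKit weights into character curve values."""
    return {
        POSE_MAPPING[name]: weight
        for name, weight in arkit_weights.items()
        if name in POSE_MAPPING  # blendshapes the character lacks are dropped
    }

frame = {"jawOpen": 0.8, "mouthSmileLeft": 0.2, "browInnerUp": 0.5}
print(remap_frame(frame))  # "browInnerUp" has no mapping and is dropped
```

A character is "ARKit-compatible" in this sense when such a table exists for its face rig; MetaHumans ship with one via the bundled asset.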


Key Features:

  • NVIDIA Audio2Face animates in-game characters' lips to match voice audio.
  • Connect to the A2F backend service with an API key.
  • Adjust A2F model parameters, such as blink strength and tongue offset, or emotion parameters, such as anger and joy.
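The parameter-adjustment idea can be sketched as assembling a request payload in which each strength is kept in a sane range. This is a minimal sketch under stated assumptions: the parameter names, the [0, 1] range, and the payload shape are all hypothetical illustrations, not the plugin's or the A2F service's actual schema.

```python
# Illustrative sketch (hypothetical names, not the plugin's API):
# treat A2F model and emotion parameters as normalized strengths and
# clamp user-supplied values before sending them to the service.

def clamp(value: float, lo: float = 0.0, hi: float = 1.0) -> float:
    """Constrain a strength to the assumed valid range [lo, hi]."""
    return max(lo, min(hi, value))

def build_a2f_params(model: dict, emotions: dict) -> dict:
    """Assemble a parameter payload with every strength clamped."""
    return {
        "model_params": {k: clamp(v) for k, v in model.items()},
        "emotion_params": {k: clamp(v) for k, v in emotions.items()},
    }

params = build_a2f_params(
    model={"blink_strength": 1.4, "tongue_offset": 0.3},  # 1.4 clamps to 1.0
    emotions={"anger": -0.2, "joy": 0.9},                 # -0.2 clamps to 0.0
)
print(params)
```

Clamping on the client side simply keeps out-of-range slider values from producing undefined behavior; consult the plugin documentation for the real parameter names and ranges.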

Technical Details

This plugin requires a connection to an NVIDIA Audio2Face service.
Detailed documentation is included in ACE_Unreal_Plugin_Guide.pdf, installed in the root plugin folder; it guides you through setting up a character to be animated through Audio2Face.


Note: The default MetaHuman face animation blueprint contains a "Mouth Close" animation that interferes with Audio2Face-provided lip movement. It's highly recommended to bypass the Mouth Close animation when animating MetaHumans using Audio2Face. See the plugin documentation for more details.
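The bypass amounts to filtering the conflicting curve out of the incoming animation before it reaches the face. The sketch below is conceptual only: in Unreal the bypass is done inside the MetaHuman face animation blueprint as the plugin documentation describes, and "mouthClose" is used here as the ARKit-style blendshape name for illustration.

```python
# Conceptual sketch of bypassing "Mouth Close": force the conflicting
# curve to zero so Audio2Face's lip weights drive the mouth alone.
# In Unreal this is configured in the MetaHuman face AnimBP, not in code.

BYPASSED_CURVES = {"mouthClose"}  # curve that fights A2F lip movement

def bypass_curves(weights: dict) -> dict:
    """Return a copy of the frame with bypassed curves zeroed out."""
    return {name: (0.0 if name in BYPASSED_CURVES else w)
            for name, w in weights.items()}

frame = {"jawOpen": 0.7, "mouthClose": 0.4, "mouthFunnel": 0.1}
print(bypass_curves(frame))
```

Zeroing rather than deleting the curve keeps the frame's shape intact, so downstream consumers that expect the full curve set still receive it.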


Supported Unreal Engine versions: 5.3, 5.4

Documentation: NVIDIA ACE documentation, NVIDIA ACE 2.1 Plugin documentation

Example Project: https://developer.nvidia.com/ace#game-characters