NVIDIA Makes It Possible To Endow NPCs In Games With AI
NVIDIA CEO Jensen Huang has introduced the company's new ACE (Avatar Cloud Engine) for Games technology, which brings video game characters to life with natural-language conversations.
The technology can also convert speech into matching facial animation.
Huang showed a playable demo in which an NPC named Jin, the owner of an Asian diner, responded to the player's spoken questions with voiced, realistic answers consistent with the character's backstory.
The ACE for Games service is based on NVIDIA Omniverse technology and includes optimized AI models for various aspects of interaction with NPCs, such as:
NVIDIA NeMo – for creating, configuring, and deploying language models using your own data. Large language models can be adapted to the story and characters, and guarded against unwanted or unsafe dialogue using NeMo Guardrails.
NVIDIA Riva – for automatic speech recognition and text-to-speech synthesis, enabling real-time spoken conversation.
NVIDIA Omniverse Audio2Face – for instantly creating expressive facial animation of a game character that matches any audio track. Audio2Face supports Omniverse connectors for Unreal Engine 5, so developers can add facial animation directly to MetaHuman characters.
Developers can integrate the entire suite of NVIDIA ACE for Games solutions or use only the components they need.
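The components above form a single conversational loop: the player's speech is transcribed, a guarded language model answers in character, the reply is synthesized to audio, and that audio drives facial animation. The sketch below illustrates that loop with plain Python stubs; every function and class name here is a hypothetical stand-in for illustration, not an actual NVIDIA API.

```python
# Illustrative sketch of the NPC dialogue loop described in the article:
# ASR (Riva-style) -> guarded LLM (NeMo-style) -> TTS (Riva-style)
# -> facial animation (Audio2Face-style).
# All names are hypothetical stand-ins, not real NVIDIA interfaces.

from dataclasses import dataclass


@dataclass
class NPC:
    name: str
    backstory: str  # lore the language model is conditioned on


def recognize_speech(audio: bytes) -> str:
    """Stand-in for ASR: converts player audio to a text transcript."""
    return audio.decode("utf-8")  # pretend the bytes already hold the transcript


def generate_reply(npc: NPC, question: str) -> str:
    """Stand-in for a guarded LLM: answers in character, deflecting unsafe topics."""
    if "cheat" in question.lower():  # toy guardrail, in the spirit of NeMo Guardrails
        return f"{npc.name} shrugs and changes the subject."
    return f"As {npc.name} ({npc.backstory}): happy to help with '{question}'."


def synthesize_speech(text: str) -> bytes:
    """Stand-in for TTS: converts the reply text to audio."""
    return text.encode("utf-8")


def animate_face(audio: bytes) -> str:
    """Stand-in for Audio2Face: derives lip-sync animation from the audio track."""
    return f"<{len(audio)} frames of lip-sync animation>"


def npc_turn(npc: NPC, player_audio: bytes) -> tuple[bytes, str]:
    """One conversational turn: ASR -> LLM -> TTS -> facial animation."""
    question = recognize_speech(player_audio)
    reply = generate_reply(npc, question)
    audio = synthesize_speech(reply)
    return audio, animate_face(audio)


jin = NPC(name="Jin", backstory="owner of a small Asian diner")
audio, animation = npc_turn(jin, b"What's good on the menu today?")
```

Because each stage only consumes the previous stage's output, a developer could swap any single stand-in for the corresponding real service, which mirrors the article's point that the ACE components can be adopted individually or as a suite.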