At Computex 2023, the information technology exhibition currently taking place in Taipei, NVIDIA presented Avatar Cloud Engine (ACE), a new artificial intelligence system for creating interactive NPCs. ACE combines several tools into a pipeline for "live communication" with non-player characters.

According to NVIDIA, it should work something like this: instead of choosing one of the dialogue options in a game menu, the player simply presses a button and speaks a line. ACE transcribes the speech using NVIDIA Riva, while another tool, NVIDIA NeMo, generates the answer (grounded in the game's lore and the characters' backstories), and a third component, NeMo Guardrails, protects the game from "counterproductive" or dangerous conversations.

Finally, NVIDIA Omniverse Audio2Face instantly creates facial animations to match the generated audio. Audio2Face supports Unreal Engine, so developers can apply such animations directly to MetaHuman characters.
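The pipeline NVIDIA describes can be summarized as a simple per-turn loop. The sketch below is purely illustrative: the function bodies are stand-in stubs, and none of the names correspond to real Riva, NeMo, or Audio2Face APIs.

```python
# Hypothetical sketch of one ACE dialogue turn, as described above.
# All function names and signatures are illustrative stand-ins,
# NOT the real NVIDIA Riva / NeMo / Audio2Face APIs.

def speech_to_text(audio: bytes) -> str:
    """Stand-in for NVIDIA Riva automatic speech recognition."""
    return "Where can I find the hideout?"  # canned transcription

def generate_reply(player_line: str, lore: dict) -> str:
    """Stand-in for a NeMo language model grounded in game lore."""
    return f"{lore['npc_name']} says: head past the noodle stalls, friend."

def passes_guardrails(reply: str, banned_topics: list[str]) -> bool:
    """Stand-in for NeMo Guardrails filtering unsafe or off-topic output."""
    return not any(topic in reply.lower() for topic in banned_topics)

def animate_face(reply_text: str) -> str:
    """Stand-in for Omniverse Audio2Face driving facial animation."""
    return f"<facial animation for: {reply_text!r}>"

def ace_dialogue_turn(audio: bytes, lore: dict, banned_topics: list[str]):
    """Run one player-to-NPC exchange through the four stages."""
    text = speech_to_text(audio)            # 1. speech recognition (Riva)
    reply = generate_reply(text, lore)      # 2. response generation (NeMo)
    if not passes_guardrails(reply, banned_topics):
        return None                         # 3. safety filter (Guardrails)
    return animate_face(reply)              # 4. facial animation (Audio2Face)
```

The point of the sketch is the control flow: each stage feeds the next, and the guardrails step can veto a reply before any animation is produced.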

NVIDIA demonstrated Avatar Cloud Engine with a conversation between a live player and the owner of a ramen shop, during which the player receives a side quest. Honestly, this particular dialogue does not sound much better than response options scripted by the developers themselves, but if in-game AI really can react to players' live speech this way, it may well lead to the creation of entirely new game genres.

For now, we are waiting for a playable demo to evaluate the performance of Avatar Cloud Engine for ourselves.