At the Consumer Electronics Show (CES) on Monday, Nvidia showed off the latest advancements in its generative AI-powered NPCs, showcasing automated conversations between a player and computer-generated characters that could change how games are made. Nvidia’s Avatar Cloud Engine (ACE) tech combines speech-to-text recognition and text-to-speech responses with generative AI facial animation and automated character personas to spit out computer-created character interactions.
Nvidia’s special address at CES 2024 featured Seth Schneider, senior product manager of ACE, demoing the technologies working in tandem. According to Schneider, the demo transcribes a player’s speech into text, which is then fed to a cloud-based large language model to generate an NPC’s response. That response is converted to spoken audio and passed to Omniverse Audio2Face, which generates matching lip-sync facial animation that is then rendered in-game.
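The pipeline Schneider describes can be sketched roughly as four stages chained together. Every function name below is a hypothetical placeholder standing in for the real components (Riva for speech recognition, a cloud LLM, a speech synthesizer, Audio2Face), not Nvidia’s actual APIs:

```python
# Hypothetical sketch of the ACE-style NPC pipeline described in the demo.
# Each stage is stubbed to show the data flow:
# player speech -> text -> LLM reply -> synthesized audio -> facial animation.

def speech_to_text(audio: bytes) -> str:
    # Stage 1: automatic speech recognition (the role Riva plays).
    return "Got any recommendations?"  # stubbed transcription

def generate_npc_reply(player_text: str, persona: str) -> str:
    # Stage 2: a cloud-hosted large language model produces a reply
    # conditioned on the NPC's persona.
    return f"[{persona}] Try the spicy ramen. It's our specialty."

def text_to_speech(reply_text: str) -> bytes:
    # Stage 3: synthesize spoken audio for the reply.
    return reply_text.encode("utf-8")  # stand-in for real audio data

def animate_face(reply_audio: bytes) -> str:
    # Stage 4: lip-sync facial animation driven by the audio
    # (the role Omniverse Audio2Face plays), then rendered in-game.
    return f"animation for {len(reply_audio)} bytes of audio"

def npc_interaction(player_audio: bytes, persona: str = "Jin") -> str:
    text = speech_to_text(player_audio)
    reply = generate_npc_reply(text, persona)
    audio = text_to_speech(reply)
    return animate_face(audio)
```

The point of the chain is that each stage only consumes the previous stage’s output, which is what lets the conversation differ on every playthrough: swap in a different LLM reply and the audio and animation downstream follow automatically.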
The demo updates a previous version of the tech shown at Computex 2023, which featured a character speaking to Jin, the owner of a futuristic ramen shop. Nvidia’s new demo expands on that by letting Jin and another NPC, Nova, hold AI-generated conversations with each other that can play out differently on each playthrough.
The CES 2024 demo also shows off new technology from another company, Convai, that lets AI-powered NPCs do more than converse: they can also interact with objects in their environment. In the new demo, Jin pulls out a bottle of booze when Schneider prompts him to “bring out the good stuff.” According to an asset shared by Convai, environment-aware AI NPCs can interact with objects such as bowls, bottles, lights, and other props in the scene.
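One way to picture the environment-awareness Convai demonstrates is an NPC’s generated intent being resolved against a registry of objects actually present in the scene. This is purely an invented illustration; the action names, intents, and registry below are not Convai’s API:

```python
# Hypothetical sketch of environment-aware NPC actions, loosely modeled on
# the "bring out the good stuff" moment in the demo. All names are invented.

SCENE_OBJECTS = {"bottle", "bowl", "lights"}  # props the NPC can reach

def resolve_action(npc_intent: str) -> str:
    # Map an LLM-generated intent (e.g. derived from the player's request)
    # to an object that actually exists in the current scene.
    intent_to_object = {
        "serve_drink": "bottle",
        "serve_food": "bowl",
        "dim_room": "lights",
    }
    obj = intent_to_object.get(npc_intent)
    if obj in SCENE_OBJECTS:
        return f"pick_up({obj})"
    return "idle"  # unknown intent or missing prop: do nothing
```

Gating the action on the scene registry is what keeps a free-form language model from asking the character to grab a prop that was never placed in the level.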
Nvidia says a number of game developers are already using its ACE production services, which include the Audio2Face generative AI facial animation tool and Riva automatic speech recognition. Schneider named “top digital avatar developers” like Genshin Impact publisher Mihoyo, NetEase Games, Tencent, and Ubisoft as some of the companies building AI-powered NPCs for their products.
It’s not clear yet which games will incorporate these types of AI-generated NPCs, but Nvidia and Convai boast that the tech will integrate “seamlessly” with game engines like Unreal Engine and Unity. It’s also unclear whether the real-world output will be any good or, like Jin and Nova’s conversation, unnervingly uncanny. Both characters sound robotic and strange in their deliveries, despite producing nearly convincing conversations.
One thing’s almost guaranteed to come out of Nvidia’s new demo: increased suspicion that the bad NPC interactions we experience in future games were crafted by AI instead of an actual human being.