Monday, May 29th 2023

NVIDIA ACE for Games Sparks Life Into Virtual Characters With Generative AI

NVIDIA today announced NVIDIA Avatar Cloud Engine (ACE) for Games, a custom AI model foundry service that transforms games by bringing intelligence to non-playable characters (NPCs) through AI-powered natural language interactions. Developers of middleware, tools and games can use ACE for Games to build and deploy customized speech, conversation and animation AI models in their software and games.

"Generative AI has the potential to revolutionize the interactivity players can have with game characters and dramatically increase immersion in games," said John Spitzer, vice president of developer and performance technology at NVIDIA. "Building on our expertise in AI and decades of experience working with game developers, NVIDIA is spearheading the use of generative AI in games."

Pioneering Generative AI in Games
Building on NVIDIA Omniverse, ACE for Games delivers optimized AI foundation models for speech, conversation and character animation, including:

NVIDIA NeMo — for building, customizing and deploying language models, using proprietary data. The large language models can be customized with lore and character backstories, and protected against counterproductive or unsafe conversations via NeMo Guardrails.
NVIDIA Riva — for automatic speech recognition and text-to-speech to enable live speech conversation.
NVIDIA Omniverse Audio2Face — for instantly creating expressive facial animation of a game character to match any speech track. Audio2Face features Omniverse connectors for Unreal Engine 5, so developers can add facial animation directly to MetaHuman characters.
Developers can integrate the entire NVIDIA ACE for Games solution or use only the components they need.
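The three components above chain into a single dialogue turn: speech recognition, language-model reply, speech synthesis, then facial animation driven by the audio. Here is a minimal sketch of that flow in Python; every function and type below is a hypothetical stand-in for illustration, not the actual Riva, NeMo, or Audio2Face API:

```python
from dataclasses import dataclass

# Hypothetical stand-ins for the ACE components described above.
# None of these names come from the real NVIDIA SDKs.

@dataclass
class NPC:
    name: str
    backstory: str  # lore used to ground the language model

def transcribe(audio: bytes) -> str:
    """Stand-in for Riva automatic speech recognition."""
    return audio.decode("utf-8")  # demo shortcut: treat bytes as text

def generate_reply(npc: NPC, player_text: str) -> str:
    """Stand-in for a NeMo language model customized with backstory.
    A NeMo Guardrails layer would also filter unsafe topics here."""
    return f"{npc.name}: ({npc.backstory}) You asked: {player_text}"

def synthesize(text: str) -> bytes:
    """Stand-in for Riva text-to-speech."""
    return text.encode("utf-8")

def animate(speech_audio: bytes) -> list[str]:
    """Stand-in for Audio2Face: derive facial keyframes from audio."""
    return ["viseme"] * max(1, len(speech_audio) // 8)

def dialogue_turn(npc: NPC, mic_audio: bytes) -> tuple[bytes, list[str]]:
    """One full ACE-style turn: ASR -> LLM -> TTS -> facial animation."""
    text = transcribe(mic_audio)
    reply = generate_reply(npc, text)
    audio = synthesize(reply)
    return audio, animate(audio)

jin = NPC(name="Jin", backstory="ramen shop owner")
audio, frames = dialogue_turn(jin, b"What's on the menu?")
```

Because each stage only exchanges text or audio, a developer integrating just one component, as the article notes is possible, would swap a single stand-in for the real module and keep the rest of the pipeline.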

'Kairos' Offers a Peek at the Future of Games
NVIDIA collaborated with Convai, an NVIDIA Inception startup, to showcase how developers will soon be able to use NVIDIA ACE for Games to build NPCs. Convai, which is focused on developing cutting-edge conversational AI for virtual game worlds, integrated ACE modules into its end-to-end real-time avatar platform.

In a demo called Kairos, players interact with Jin, the proprietor of a ramen shop. Although he is an NPC, Jin replies to natural-language queries realistically and consistently with his narrative backstory, all with the help of generative AI. Watch the demo, which is rendered in Unreal Engine 5 using the latest ray-tracing features and NVIDIA DLSS.

"With NVIDIA ACE for Games, Convai's tools can achieve the latency and quality needed to make AI non-playable characters available to nearly every developer in a cost-efficient way," said Purnendu Mukherjee, founder and CEO at Convai.

Deploy NVIDIA ACE for Games Models Locally or in the Cloud
The neural networks enabling NVIDIA ACE for Games are optimized for different capabilities, with various size, performance and quality trade-offs. The ACE for Games foundry service will help developers fine-tune models for their games, then deploy them via NVIDIA DGX Cloud, on GeForce RTX PCs, or on premises for real-time inference.

The models are optimized for latency — a critical requirement for immersive, responsive interactions in games.
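Latency matters because it accumulates: the player hears nothing until recognition, generation, synthesis and animation have all finished. A toy illustration of that end-to-end budget, with per-stage timings that are invented for the example, not measured or published NVIDIA figures:

```python
# Illustrative per-stage latencies in milliseconds; these numbers are
# invented for the example, not measured or published figures.
stage_ms = {
    "speech_recognition": 120,
    "language_model": 350,
    "text_to_speech": 150,
    "facial_animation": 30,
}

TARGET_MS = 1000  # a hypothetical "feels responsive" threshold

total = sum(stage_ms.values())
print(f"turn latency: {total} ms (budget {TARGET_MS} ms)")
for stage, ms in stage_ms.items():
    print(f"  {stage}: {ms} ms ({100 * ms / total:.0f}% of turn)")
```

Under these assumed numbers the language model dominates the turn, which is why the size/quality trade-offs mentioned above, and the choice between cloud and local deployment, are framed around latency.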

Generative AI to Transform the Gaming Experience
Game developers and startups are already using NVIDIA generative AI technologies for their workflows.

GSC Game World, one of Europe's leading game developers, is adopting Audio2Face in its upcoming game, S.T.A.L.K.E.R. 2: Heart of Chernobyl.
Fallen Leaf, an indie game developer, is using Audio2Face for character facial animation in Fort Solis, a third-person sci-fi thriller that takes place on Mars.
Charisma.ai, a company enabling virtual characters through AI, is leveraging Audio2Face to power the animation in its conversation engine.
Learn more about building on NVIDIA Omniverse using NVIDIA ACE and other technology advancements at COMPUTEX.

6 Comments on NVIDIA ACE for Games Sparks Life Into Virtual Characters With Generative AI

#1
evernessince
I watched the video demonstration for this and they asked the most generic questions possible to the AI. It's like the people who put together the presentation are not gamers and had zero enthusiasm for the tech. There are Skyrim mods that do a far better job of demonstrating the potential of AI natural language interactions in games than Nvidia has done here.
#2
Bomby569
Nvidia is now the "gimmick company". Release overpriced shitty gpus, and spend the R&D money on all sorts of crap.
#3
Vayra86
Bomby569: "Nvidia is now the 'gimmick company'. Release overpriced shitty gpus, and spend the R&D money on all sorts of crap."
Yeah the proprietary floodgates have opened, holy crap.
#4
Bwaze
I think rather than push this technology into RPG games, where there are hundreds or thousands of NPCs we have limited interest in and only need a few sentences of information from, they should be focusing on new games that wouldn't even be possible with just scripted conversations.

Escape room game with a single AI partner? Murder mystery with a limited number of suspects, Agatha Christie style? Adventure games with a big focus on interaction? There are tons of more interesting uses for more realistic conversations!
#5
ZoneDymo
so wait, does this need live AI hardware to function? or does it use AI during game development to just fill in the data needed automatically?
#6
JustBenching
Bomby569: "Nvidia is now the 'gimmick company'. Release overpriced shitty gpus, and spend the R&D money on all sorts of crap."
And then you realize the "gimmick" company has the fastest gpu on the planet for raster performance, RT performance, the best upscaling algorithm, the only gpus that can do motion interpolation, the fastest productivity gpu (by far, btw). Yeah, full on gimmick lol