
NVIDIA Digital Human Technologies Bring AI Characters to Life

TheLostSwede

News Editor
NVIDIA announced today that leading AI application developers across a wide range of industries are using NVIDIA digital human technologies to create lifelike avatars for commercial applications and dynamic game characters. The results are on display at GTC, the global AI conference held this week in San Jose, Calif., and can be seen in technology demonstrations from Hippocratic AI, Inworld AI, UneeQ and more.

NVIDIA Avatar Cloud Engine (ACE) for speech and animation, NVIDIA NeMo for language, and NVIDIA RTX for ray-traced rendering are the building blocks that enable developers to create digital humans capable of AI-powered natural language interactions, making conversations more realistic and engaging.

"NVIDIA offers developers a world-class set of AI-powered technologies for digital human creation," said John Spitzer, vice president of developer and performance technologies at NVIDIA. "These technologies may power the complex animations and conversational speech required to make digital interactions feel real."

World-Class Digital Human Technologies
The digital human technologies suite includes language, speech, animation and graphics powered by AI:
  • NVIDIA ACE—technologies that help developers bring digital humans to life with facial animation powered by NVIDIA Audio2Face and speech powered by NVIDIA Riva automatic speech recognition (ASR) and text-to-speech (TTS). ACE microservices are flexible, allowing models to run in the cloud or on the PC depending on local GPU capabilities, helping ensure users receive the best experience.
  • NVIDIA NeMo—an end-to-end platform that enables developers to deliver enterprise-ready generative AI models with precise data curation, cutting-edge customization, retrieval-augmented generation and accelerated performance.
  • NVIDIA RTX—a collection of rendering technologies, such as RTX Global Illumination (RTXGI) and DLSS 3.5, that enable real-time path tracing in games and applications.
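The retrieval-augmented generation (RAG) mentioned in the NeMo bullet above can be illustrated with a generic, toy sketch: retrieve the documents most relevant to a query, then ground the generator's prompt in that context. This is not NeMo's actual API; the bag-of-words "embedding" stands in for a real embedding model, and the document snippets are made up for illustration.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding' -- a stand-in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Ground the generator's prompt in the retrieved context."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Illustrative document store.
docs = [
    "Riva provides automatic speech recognition and text-to-speech.",
    "Audio2Face drives facial animation from an audio track.",
    "RTX Global Illumination computes real-time indirect lighting.",
]
print(build_prompt("What does Riva provide?", docs))
```

In a production pipeline the retrieved context would come from a vector database and the prompt would be sent to an LLM; the structure (embed, retrieve, assemble prompt) is the same.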

Building Blocks for Digital Humans and Virtual Assistants
To showcase the new capabilities of its digital human technologies, NVIDIA worked across industries with leading developers, such as Hippocratic AI, Inworld AI and UneeQ, on a series of new demonstrations.

Hippocratic AI has created a safety-focused, LLM-powered, task-specific healthcare agent. The agent calls patients on the phone, follows up on care coordination tasks, delivers preoperative instructions, performs post-discharge management and much more. For GTC, NVIDIA collaborated with Hippocratic AI to extend its solution with NVIDIA ACE microservices, NVIDIA Audio2Face, NVIDIA Animation Graph and the NVIDIA Omniverse Streamer Client to show the potential of a generative AI healthcare agent avatar.

"Our digital assistants provide helpful, timely and accurate information to patients worldwide," said Munjal Shah, cofounder and CEO of Hippocratic AI. "NVIDIA ACE technologies bring them to life with cutting-edge visuals and realistic animations that help better connect to patients."

UneeQ is an autonomous digital human platform specialized in creating AI-powered avatars for customer service and interactive applications. Its digital humans represent brands online, communicating with customers in real time to give them confidence in their purchases. UneeQ integrated the NVIDIA Audio2Face microservice into its platform and combined it with Synanim ML to create highly realistic avatars for a better customer experience and engagement.

"UneeQ combines NVIDIA animation AI with our own Synanim ML synthetic animation technology to deliver real-time digital human interactions that are emotionally responsive and deliver dynamic experiences powered by conversational AI," said Danny Tomsett, founder and CEO of UneeQ.

Bringing Dynamic Non-Playable Characters to Games
NVIDIA ACE is a suite of technologies designed to bring game characters to life. Covert Protocol is a new technology demonstration, created by Inworld AI in partnership with NVIDIA, that pushes the boundary of what character interactions in games can be. Inworld's AI engine has integrated NVIDIA Riva for accurate speech-to-text and NVIDIA Audio2Face to deliver lifelike facial performances.

Inworld's AI engine takes a multimodal approach to the performance of non-playable characters (NPCs), bringing together cognition, perception and behavior systems for an immersive narrative with stunning RTX-rendered characters set in a beautifully crafted environment.

"The combination of NVIDIA ACE microservices and the Inworld Engine enables developers to create digital characters that can drive dynamic narratives, opening new possibilities for how gamers can decipher, deduce and play," said Kylan Gibbs, CEO of Inworld AI.

Game publishers worldwide are evaluating how NVIDIA ACE can improve the gaming experience.

Developers Across Healthcare, Gaming, Financial Services, Media & Entertainment and Retail Embrace ACE
Top game and digital human developers are pioneering ways ACE and generative AI technologies can be used to transform interactions between players and NPCs in games and applications.

Developers and platforms embracing ACE include Convai, Cyber Agent, Data Monsters, Deloitte, Hippocratic AI, IGOODI, Inworld AI, Media.Monks, miHoYo, NetEase Games, Perfect World, Openstream, OurPalm, Quantiphi, Rakuten Securities, Slalom, SoftServe, Tencent, Top Health Tech, Ubisoft, UneeQ and Unions Avatars.

More information on NVIDIA ACE is available at https://developer.nvidia.com/ace. Platform developers can incorporate the full suite of digital human technologies or individual microservices into their product offerings.

Developers can start their journey on NVIDIA ACE by applying for the early access program to get in-development AI models. To explore available models, developers can evaluate and access NVIDIA NIM, a set of easy-to-use microservices designed to accelerate the deployment of generative AI, for Riva and Audio2Face on ai.nvidia.com today.
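NIM microservices are commonly described as exposing OpenAI-compatible HTTP endpoints. The sketch below only assembles the headers and JSON payload such a request would carry; the base URL and model id are illustrative assumptions, and nothing is sent over the network. Check ai.nvidia.com for the actual endpoints and model names.

```python
import json

# Hypothetical endpoint and model id -- check ai.nvidia.com for real values.
BASE_URL = "https://integrate.api.nvidia.com/v1/chat/completions"

def build_nim_request(api_key: str, model: str, user_text: str) -> tuple[dict, dict]:
    """Assemble headers and an OpenAI-style chat payload for a NIM endpoint.

    Nothing is sent here; pass the result to any HTTP client
    (e.g. requests.post(BASE_URL, headers=headers, json=payload)).
    """
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
        "temperature": 0.2,   # low temperature for consistent assistant replies
        "max_tokens": 256,
    }
    return headers, payload

headers, payload = build_nim_request("nvapi-...", "example/model-id", "Hello!")
print(json.dumps(payload, indent=2))
```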

View at TechPowerUp Main Site | Source
 
I'm skeptical that the quality of the story wouldn't be affected. I think it could be done well, but I also think many won't take the time to do it well.
 
There are no games in my language.
 
It's a nice enough idea to use AI to make games more engaging, realistic and varied, but knowing NVIDIA they'll try to make it proprietary to and for their hardware, and governments will just allow it because they don't care how monopolistic a game development scenario that is. GameWorks all over again.

Also, I already touched on AI being used to make NPC interactions more realistic recently in regard to multi-core CPUs and threading. The GPUs will probably do a lot of the grunt work, but CPU threading will still be pretty important if you involve lots of AI agents with lots of different behavioral algorithm patterns for their interactions. I'm sure there will be cases where either the GPU or the CPU matters a bit more from a latency standpoint to keep frame rates reasonable while still allowing very broad game scene dynamics and interactions.

It's cool though, I like it. I mentioned AI behavior for zombies to the developer of DeadPoly about a year ago: take a bunch of different scripts that can be hybridized in a procedural manner. Honestly, overly predictable set patterns are a bit dated for games. There needs to be a good bit more variance in behavior, though controllable to a reasonable point by a weighting system.

UO had an interesting ecosystem game environment, but it was very simplistic. Imagine something like that meets EverQuest, with variable AI pathing and spawns that can be single spawns or groups, and with variable behavior in how they fight and aggro. Combining all of that together, you'd end up with something rather spectacular, I believe. Mix and hybridize it a bit like Grim Dawn does with classes, but for the procedural aspects of different parts of the AI design, so they blend and combine. You could even weight each part differently, then rationalize it into proximity-zone presets that follow their own biome AI behaviors.
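The weighting idea described above can be sketched very simply: give each biome a weight table over a set of behavior scripts and sample from it per spawn, so behavior stays varied but controllable. The behavior names and weights below are purely illustrative, not from any actual game.

```python
import random

# Hypothetical behavior scripts -- purely illustrative.
BEHAVIORS = ["wander", "ambush", "swarm", "flee"]

# Per-biome weights tune how often each behavior is picked.
BIOME_WEIGHTS = {
    "swamp":  [0.2, 0.4, 0.3, 0.1],  # ambush-heavy
    "plains": [0.6, 0.1, 0.1, 0.2],  # mostly wandering
}

def pick_behavior(biome: str, rng: random.Random) -> str:
    """Sample one behavior; the weights keep variance controllable per biome."""
    return rng.choices(BEHAVIORS, weights=BIOME_WEIGHTS[biome], k=1)[0]

# Spawn a small group whose members each roll their own behavior.
rng = random.Random(42)
spawn_group = [pick_behavior("swamp", rng) for _ in range(5)]
print(spawn_group)
```

Blending "a bit of each" per zone then just means editing the weight vectors rather than writing new scripts.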
 