Tuesday, December 17th 2024

NVIDIA Blackwell RTX and AI Features Leaked by Inno3D

NVIDIA's RTX 5000 series GPU hardware has been leaked repeatedly in the weeks and months leading up to CES 2025, with previous leaks tipping significant updates for the RTX 5070 Ti in the VRAM department. Now, Inno3D is apparently hinting that the RTX 5000 series will also introduce updated machine learning and AI tools to NVIDIA's GPU line-up. An official CES 2025 teaser published by Inno3D, titled "Inno3D At CES 2025, See You In Las Vegas!", mentions potential updates to NVIDIA's AI acceleration suite for both gaming and productivity.

The Inno3D teaser specifically points out "Advanced DLSS Technology," "Enhanced Ray Tracing" with new RT cores, "better integration of AI in gaming and content creation," "AI-Enhanced Power Efficiency," AI-powered upscaling tech for content creators, and optimizations for generative AI tasks. All of this sounds like it builds on previous NVIDIA technology, like RTX Video Super Resolution, although the mention of content creation suggests it will be more capable than previous efforts, which were seemingly mostly consumer-focused. Improved RT cores in the new RTX 5000 GPUs are also expected, although this would seemingly be the first time NVIDIA uses AI to manage power draw, suggesting that the CES announcement will come with new features for the NVIDIA App. The real standout features, though, are "Neural Rendering" and "Advanced DLSS," both of which are new terms. Of course, "Advanced DLSS" may simply be Inno3D marketing copy, but Neural Rendering suggests that NVIDIA will "revolutionize how graphics are processed and displayed," which is about as vague as one could be.
Just based on the information Inno3D has revealed, we can speculate that there will be a new DLSS technology, perhaps DLSS 4. As for Neural Rendering, NVIDIA has a page detailing its research into new methods of AI-generated textures, shading, and lighting, although it's unclear which of these methods (which seem like they will also need developer-side support in games) it will implement. Whatever it is, NVIDIA will likely divulge the details when it reveals its new 5000 series GPUs.
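To make the "AI-generated textures" angle slightly more concrete, below is a deliberately toy sketch of the general idea behind neural texture research: train a small network to reproduce a texture, then evaluate the network at render time instead of sampling a stored image. This is purely illustrative (PyTorch, with invented layer sizes and training loop), not NVIDIA's actual method or anything confirmed for Blackwell.

# Toy "neural texture" sketch: a tiny MLP learns to map UV coordinates to RGB.
# Illustrative only; sizes, the training schedule, and the idea of shipping
# weights instead of a bitmap are assumptions for the example, not NVIDIA's pipeline.
import torch
import torch.nn as nn

class NeuralTexture(nn.Module):
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid(),  # (u, v) in -> RGB out
        )

    def forward(self, uv: torch.Tensor) -> torch.Tensor:
        return self.net(uv)

# Fit the network to a stand-in 256x256 reference texture.
reference = torch.rand(256, 256, 3)
coords = torch.linspace(0, 1, 256)
uv = torch.stack(torch.meshgrid(coords, coords, indexing="ij"), dim=-1).reshape(-1, 2)
target = reference.reshape(-1, 3)

model = NeuralTexture()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):  # short demo training loop
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(uv), target)
    loss.backward()
    optimizer.step()

print("reconstruction MSE after training:", loss.item())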
Sources: HardwareLuxx, NVIDIA

87 Comments on NVIDIA Blackwell RTX and AI Features Leaked by Inno3D

#51
SIGSEGV
TSiAhmatDid you really mean the PS4? (not the PS5)
yes.
PS5 pro
#52
Nater
Any chance developers start using the AI portion to make CPU characters not so dumb? (bots in Call of Duty for example)

PhysX was cool when it came out and ragdoll physics took over, now it's kind of a default working behind the scenes in all our games.
#53
TSiAhmat
NaterAny chance developers start using the AI portion to make CPU characters not so dumb? (bots in Call of Duty for example)
Aren't CPU characters already powered by "AI", and haven't they been since their inception? Yeah, I wouldn't call that AI intelligent, but the same could be argued about today's.

Also, the AI most people know about is LLMs, which are text-based (chatbots, in a way), and that wouldn't really work that well in a 3D movement-type area. At least I assume it wouldn't.
#54
mouacyk
TSiAhmatAren't CPU characters already powered by "AI", and haven't they been since their inception? Yeah, I wouldn't call that AI intelligent, but the same could be argued about today's.

Also, the AI most people know about is LLMs, which are text-based (chatbots, in a way), and that wouldn't really work that well in a 3D movement-type area. At least I assume it wouldn't.
LLM-based AI will only get you enemies that can insult you like a 5th grader. They'll need a model that's trained on the tactical experience of SWAT teams and special forces to get what you want.
#55
TSiAhmat
mouacykLLM-based AI will only get you enemies that can insult you like a 5th grader. They'll need a model that's trained on the tactical experience of SWAT teams and special forces to get what you want.
Not to mention the performance cost per bot in a match; imagine someone leaving your online match and the game starting to lag. Wouldn't be great.
#56
sepheronx
SIGSEGVFor me, I got a 4090 (although it's second-hand, in like-new condition) for 1K, and I still felt like I was getting ripped off. On the other hand, I really need to get this stuff to support my research project.
I am done spending more on GPUs for gaming and will put the spending on a console instead (PS4).
I plan to get a PS4 Pro in the near future (for me, it's way more logical than making a donation to Nvidia, lol).
If you need it for research purposes, then I would say $1K is not bad, because if it's work related, you could probably claim it on taxes or something.

But I feel ya, everything is becoming a waste of money. Heck, the PS5 Pro is a waste of money over the normal PS5. I used to buy brand new many, many years ago, but most of the time now I will not. I just can't really comprehend most prices these days, not in CPUs and not in GPUs. The CPU side isn't as bad, though, since I remember the Intel Extreme processors going for over a thousand.
#57
Zazigalka
RogueSixAwesome. This is relentless innovation.
This is just the usual marketing talk we've all heard a thousand times. Let's wait and see what's behind it.
#58
Vayra86
Neo_MorpheusI personally don't care for upscaling, but if I had no choice but to use it given the current options, I would always use the one that works for everyone instead of the one that takes away my options, even if that option is not the absolute best.

See above.

I don't use FSR and, obviously, can't use DLSS.

That said, I have been perplexed by the claims that FSR is absolute trash and DLSS is bigger than the second coming, so I have read and watched many videos where some unbiased reviewers (very few these days, sadly) got to the point where they say FSR is good enough and that, depending on the game and dev, the same flaws observed in one show up in the other.

So when I read comments like that (FSR is trash, AMD lost, etc.), it confuses me and makes me believe it's someone simply repeating other non-AMD customers' baseless attacks to defend Ngreedia.

Same group that still claims that all AMD drivers are trash.

If they are the only ones providing the tool that you need, I can understand and support the purchase of a 4090.
But that's the thing, FSR support isn't better than DLSS support. Each upscale method has its own approach to the market, developers need to implement it, and they need to implement the best version of it. The DLSS push in that respect is better. FSR's open nature does not make it appear in more games and does not improve the solution itself by a meaningful margin.

AMD's FSR strategy, therefore, while I applaud it for the approach in a general sense, is not as effective as Nvidia's approach.
With FreeSync you saw a different result, for example. And why? Because the technology just works, and works everywhere. Support.

I'm just observing the market. I don't have any favoritism towards any brand. I view AMD's open approach as marketing, as much as I view Nvidia's ecosystem approach as marketing. In the end it doesn't matter much: what matters is what technologies survive and deliver the best results. And then, we hope the industry embraces them ubiquitously.
#59
evernessince
TSiAhmatAren't CPU characters already powered by "AI", and haven't they been since their inception? Yeah, I wouldn't call that AI intelligent, but the same could be argued about today's.

Also, the AI most people know about is LLMs, which are text-based (chatbots, in a way), and that wouldn't really work that well in a 3D movement-type area. At least I assume it wouldn't.
Game AI just uses a set of branching conditions to determine behavior. You can make it somewhat decent by having a ton of those branching conditions, but because it relies on branching logic, the complexity of coding it increases exponentially with size. It's extremely tedious and absolutely not a good fit for making character AI smarter.

On the flip side, LLM-based AI can consider billions of parameters right now, and that number will only increase. I'm not sure if you've ever modded Bethesda games, but the number of AI parameters there is in the tens, and I expect more "advanced" traditional AI in games like Elden Ring to use 150 or fewer. They really are worlds apart, but it makes sense: LLMs are designed similarly to the neural networks in your brain.

I suspect that once tools become available for devs to add LLM-based AI, we might start seeing it. The problem right now is that there is no pre-made infrastructure for devs to do so, and thus you either have to make a bespoke implementation or wait.

There's also a tendency for the video game industry to put a lot of its funding towards graphics. Take a look around the gaming industry: most of the improvements have been to how good a game looks, with little to none going to other systems like audio, physics, etc.
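(To illustrate what "a set of branching conditions" looks like in practice, here is a minimal, invented sketch of the hand-written state machine traditional game AI is typically built from; the names and thresholds are made up for the example. Every new behavior is another hand-authored branch, which is why this style is so tedious to scale.)

# Minimal, invented sketch of "branching condition" game AI: a hand-written
# finite-state machine. Every extra behavior means more hand-authored branches
# and tuning constants.
from dataclasses import dataclass

@dataclass
class Bot:
    health: int = 100
    ammo: int = 30
    state: str = "patrol"

def update(bot: Bot, can_see_player: bool, player_distance: float) -> str:
    # Each transition below is a hard-coded rule rather than learned behavior.
    if bot.health < 25:
        bot.state = "retreat"
    elif can_see_player and bot.ammo == 0:
        bot.state = "melee" if player_distance < 2.0 else "find_ammo"
    elif can_see_player and player_distance < 30.0:
        bot.state = "attack"
    elif can_see_player:
        bot.state = "advance"
    else:
        bot.state = "patrol"
    return bot.state

bot = Bot(health=20, ammo=5)
print(update(bot, can_see_player=True, player_distance=10.0))  # -> "retreat"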
#60
Nater
evernessinceThere's also a tendency for the video game industry to put a lot of its funding towards graphics. Take a look around the gaming industry: most of the improvements have been to how good a game looks, with little to none going to other systems like audio, physics, etc.
That's what got me thinking about it. I threw a grenade near a pile of tires in Warzone the other day and they all...MOVED! OMG. Then I accidentally shot a traffic cone while in a gunfight and it tipped over!

Everything was "dead" in the environment before Black Ops 6 Warzone dropped. For me, anyways. In games of yesteryear, physics was a big deal. The sandbox was destructible in quite a few games. It all went away, to the point that tipping over a traffic cone with a 5.56 round surprised me!

It'd be nice if, with all this horsepower, they could make bots not stand in front of you and plate up instead of finishing you off when they have you dead to rights.

A fully armored APC shouldn't get stopped dead in its tracks by a stick-built wall lined with gypsum, let alone by a lazy 3D-sprite plant in an open field.
#61
Onasi
evernessinceThere's also a tendency for the video game industry to put a lot of its funding towards graphics. Take a look around the gaming industry: most of the improvements have been to how good a game looks, with little to none going to other systems like audio, physics, etc.
That feeling when Valve creates a drop-in solution for actual simulated realistic 3D sound with proper HRTF and just makes it open and available for everyone to use and then nobody does.
#62
evernessince
NaterThat's what got me thinking about it. I threw a grenade near a pile of tires in Warzone the other day and they all...MOVED! OMG. Then I accidentally shot a traffic cone while in a gunfight and it tipped over!

Everything was "dead" in the environment before Black Ops 6 Warzone dropped. For me, anyways. In games of yesteryear, physics was a big deal. The sandbox was destructible in quite a few games. It all went away, to the point that tipping over a traffic cone with a 5.56 round surprised me!

It'd be nice if, with all this horsepower, they could make bots not stand in front of you and plate up instead of finishing you off when they have you dead to rights.

A fully armored APC shouldn't get stopped dead in its tracks by a stick-built wall lined with gypsum, let alone by a lazy 3D-sprite plant in an open field.
Yep, some games back in the late 2000s had decent physics systems, but then game devs stopped caring and we saw regression on that front for a while.

A lot of that physics in online games to this day still isn't synchronized either. In other words, the physics is just for show. It'll look different depending on the client, so at the end of the day, just like most things in games, there's no depth; it's only done for looks.
OnasiThat feeling when Valve creates a drop-in solution for actual simulated realistic 3D sound with proper HRTF and just makes it open and available for everyone to use and then nobody does.
It's crazy how much of an improvement it is to audio, and pretty much no one outside of VR uses it.
#63
Onasi
@evernessince
I may not be much of a CS guy myself, but the HRTF implementation there is amazing, and the fact that you can customize it and there is even an in-game EQ… yeah. Meanwhile, AAA games release with the same mediocre "Hollywood-ish" home theater mix with barely any channel separation, and it sounds so flat on headphones that you wonder why we bother spending millions on "muh graphics" when EAX-enabled games from the late 90s had a better soundscape.
#64
Nater
Onasi@evernessince
I may not be much of a CS guy myself, but the HRTF implementation there is amazing, and the fact that you can customize it and there is even an in-game EQ… yeah. Meanwhile, AAA games release with the same mediocre "Hollywood-ish" home theater mix with barely any channel separation, and it sounds so flat on headphones that you wonder why we bother spending millions on "muh graphics" when EAX-enabled games from the late 90s had a better soundscape.
The first time anyone played Unreal with a Sound Blaster is a core memory. Absolutely haunting.
#65
Dawora
sepheronxYou mean ATI? Yeah, me too, and earlier. What's your point? That isn't even an argument.

Enjoy being ripped off. As someone else said, if AMD or Intel came out with a better GPU with similar performance, you wouldn't buy it.
Can cows fly, or can AMD/Intel make similar performance? No, not ATM.

Ripped off? Let's say something costs $300 more and you're willing to pay it.
You use it for 24 months.
That's only $12.50/month; I say it's cheap.

GPU cost is VERY LOW vs. anything else we need for daily life: food, car, gasoline...

And still we cry here like something costs a kidney or two.
#66
sepheronx
DaworaCan cows fly, or can AMD/Intel make similar performance? No, not ATM.

Ripped off? Let's say something costs $300 more and you're willing to pay it.
You use it for 24 months.
That's only $12.50/month; I say it's cheap.

GPU cost is VERY LOW vs. anything else we need for daily life: food, car, gasoline...

And still we cry here like something costs a kidney or two.
What?

Huh, judging by what other senior techs here are saying too, it does seem to be a rip-off.

But believe whatever you like. Just because you like getting gouged on prices doesn't mean the rest of us do.
Vayra86But that's the thing, FSR support isn't better than DLSS support. Each upscale method has its own approach to the market, developers need to implement it, and they need to implement the best version of it. The DLSS push in that respect is better. FSR's open nature does not make it appear in more games and does not improve the solution itself by a meaningful margin.

AMD's FSR strategy, therefore, while I applaud it for the approach in a general sense, is not as effective as Nvidia's approach.
With FreeSync you saw a different result, for example. And why? Because the technology just works, and works everywhere. Support.

I'm just observing the market. I don't have any favoritism towards any brand. I view AMD's open approach as marketing, as much as I view Nvidia's ecosystem approach as marketing. In the end it doesn't matter much: what matters is what technologies survive and deliver the best results. And then, we hope the industry embraces them ubiquitously.
Well, I agree to a degree, but if one just plays STALKER 2 and sees how bad DLSS and everything else has been, I would wager that it is also trash. Just because FSR is worse doesn't actually mean DLSS is much better.


#67
StimpsonJCat
AI this, AI that... All marketing BS with little to no truth to it whatsoever. These are things that are most likely driver-based and artificially limited to the 50x0 series. At best I would assume that NV has a new RT denoiser algorithm, most likely with a perf hit. NV can see that their "amazing" RT features are beginning to be caught up with. They are nowhere near as good as they thought they were, and now they have to start marketing fake reasons why their RT is better than AMD's RDNA 4 or Intel's B580, an almost bottom-of-the-range chip doing great RT now!

I throw up just a little in my mouth when I see companies plastering AI all over their marketing when, 9.9 times out of ten, the only actual AI going on is the text used in the marketing blurb. NV promised years ago that they would use AI to remake their drivers because it was so much better than "human" code. What is it, 3 years now? Yep, more BS.
#68
Bwaze
"NVIDIA today reported revenue for the third quarter ended October 27, 2024, of $35.1 billion, up 17% from the previous quarter and up 94% from a year ago. "

All this is due to AI. So you bet they're going to shove it into everything, whether it's usable or not.
#69
StimpsonJCat
Bwaze"NVIDIA today reported revenue for the third quarter ended October 27, 2024, of $35.1 billion, up 17% from the previous quarter and up 94% from a year ago. "

All this is due to AI. So you bet they're going to shove it into everything, whether it's usable or not.
More like real or not.
#70
Bwaze
Record-breaking $35 billion revenue is real enough. And that's at the end of the generation, not at the release of the new one. To put this into some perspective:

#71
TheinsanegamerN
Neo_MorpheusJust so Ngreedia didn’t charge you as much as they do.

People like you will never buy an AMD GPU.
"people like you" constantly say this, then wonder why AMD keeps losing consumers.

People have said that about me. After cycling through an RX 480, a Vega 64, and now a 6800 XT, apparently I only want a 5090 competitor from AMD so "I can buy Nvidia".

This is just silly hostility that gets us nowhere.
#72
Neo_Morpheus
TheinsanegamerN"People like you" constantly say this, then wonder why AMD keeps losing consumers.
Those people were just looking for excuses if the opinion of a nobody like me mattered that much to them.

No, AMD lost via mindshare and marketing.
TheinsanegamerNThis is just silly hostility that gets us nowhere.
Nah, misinformation is one of the things killing them.

Example: mention bad drivers and watch the responses, yet Intel has garbage drivers for their GPUs and nobody will believe you if you say that.

You might want and would buy a 5090 equivalent from AMD, but don't be silly in assuming that everyone will.

Hell, the 7900 XTX is often cheaper than the 4080, and in many situations is faster, yet if you were to believe people's comments, those 7900 XTXs are absolute trash.

There are people (those people) that will only buy Ngreedia regardless.

Just look at the Steam survey (which I don't think is entirely accurate, but it's the closest we have) and all you see is Ngreedia, even though AMD does provide options and, in some cases, better options.
#73
Bwaze
Neo_MorpheusJust look at the Steam survey (which I don't think is entirely accurate, but it's the closest we have) and all you see is Ngreedia, even though AMD does provide options and, in some cases, better options.
Like what, bugs with pedigree? They have constant problems with high idle power draw with multiple monitors - not just the slightly higher power draw that reviews show, but actual bugs where power draw shoots through the roof, and it's a recurring thing that's been going on for more than a decade...
#74
TheinsanegamerN
Neo_MorpheusThose people were just looking for excuses if the opinion of a nobody like me mattered that much to them.

No, AMD lost via mindshare and marketing.

Nah, misinformation is one of the things killing them.

Example: mention bad drivers and watch the responses, yet Intel has garbage drivers for their GPUs and nobody will believe you if you say that.

You might want and would buy a 5090 equivalent from AMD, but don't be silly in assuming that everyone will.

Hell, the 7900 XTX is often cheaper than the 4080, and in many situations is faster, yet if you were to believe people's comments, those 7900 XTXs are absolute trash.

There are people (those people) that will only buy Ngreedia regardless.

Just look at the Steam survey (which I don't think is entirely accurate, but it's the closest we have) and all you see is Ngreedia, even though AMD does provide options and, in some cases, better options.
IDK what forums you're looking at. Mention Intel here and every third comment talks about how their drivers make AMD look amazing.

The 7900 XTX isn't bad, but it WAS a disappointment. After AMD came out swinging with the 6900 XT, going toe to toe with Nvidia, their next high end was a tier below Nvidia's top, with power consumption issues and FAR slower RT. Despite that, I remember reviewers singing its praises and the card being out of stock for months after release.

As for being faster... well, I looked into it here on TPU:
Stalker 2: nah
BO6: yup (activision CoD engine has loved AMD for awhile)
silent hill 2: nah
Silent hill 2 RT: OH HELL NAH
GoW: ragnarok: Nah (but gets close at 4k)
FF XVI: about equal
Space marine 2: Yeah
Star wars outlaws: nah
Black myth wukong: nah
Black myth pathtracing: LMFAOROFL no
The first descendent: about equal
Ghost of Tsushima: yeah
Homeworld 3: nah
Horizon: yeah
Avatar: frontier: nah
Alan wake 2: equal
Alan wake 2 RT: LOL no
Alan wake 2 path trace: LOL no
AC mirage: nah
Lords of the fallen: nah
Lords of the fallen RT: also nah, but not that bad

So that's 9 nah, 4 yeah, and 3 equal on performance, out of the last 16 games. Over 50% of the time, the 7900 XTX is slower than the 4080.

With RT, that changes to 10 nah, 4 yeah, and 2 equal. Also, with RT enabled, not only is the 7900 XTX slower than a 4080, it frequently ties with or is slightly slower than the 4070 Ti, an $800 card, which would explain why the 7900 XT has dropped to near $800 more than once, which is actually a good price for it.

It wasn't a bad card, but it DOES feel second-rate to Nvidia with RT, and the price reflects that. For people like me, I had to way overpay to get my 6800 XT, so I'm not really about to jump on another near-$1,000 card just 2 years later. And it's clearly not just me; Nvidia's sales were way down too, until AI hit.
#75
evernessince
BwazeLike what, bugs with pedigree? They have constant problems with high idle power draw with multiple monitors - not just the slightly higher power draw that reviews show, but actual bugs where power draw shoots through the roof, and it's a recurring thing that's been going on for more than a decade...
High idle multi-monitor power draw, sure, but I'm not so sure about random spikes in power draw. Unless you are referring to the RTX 3000 series or RX 6000 series, I have never heard of or seen any post related to the claimed random high power draw bug. Mind you, those two gens don't have a "bug"; that's just their design.

You want bugs with pedigree? Nvidia has had a DPC latency bug for over a decade now, and it's only been semi-fixed on the RTX 3000 series (no idea why that gen only). It's still an issue on the 4000 series and on gens prior to the 3000 series. The RTX 3000 series also feeds noise back into the 12V sense pin, which causes the OCP on certain PSUs, like the Seasonic Prime series, to trip. Now that's a hardware-level bug that Nvidia never bothered to fix and will never fix.

If I'm listing off AMD bugs vs. Nvidia bugs over the past few years, the Nvidia ones are way worse. The fact that cards could cook themselves in New World alone was extremely severe, let alone the hardware-level screwups they've had with the 3000 series and the connectors on the 4000 series. Just like AMD, they have software bugs as well: flickering in COD, stuttering in VR, Discord lowering clocks, etc. All of the above were officially confirmed in driver patch notes and eventually fixed (the VR issue took 6 months to fix).

I agree that companies should fix their existing and long-standing bugs, but I find the one-sidedness of arguments around this topic rather telling. People need to hold Nvidia to account as well; they have been getting away with extremely bad bugs and hardware issues because not enough people call them out. If AMD had a driver that allowed their cards to cook themselves like Nvidia did, we'd not hear the end of it for at least a decade and a half (and that'd be kind of fair given the severity). Nvidia? Doesn't register, apparently.