Tuesday, December 17th 2024
NVIDIA Blackwell RTX and AI Features Leaked by Inno3D
NVIDIA's RTX 5000 series GPU hardware has been leaked repeatedly in the weeks and months leading up to CES 2025, with previous leaks tipping significant updates for the RTX 5070 Ti in the VRAM department. Now, Inno3D is apparently hinting that the RTX 5000 series will also introduce updated machine learning and AI tools to NVIDIA's GPU line-up. An official CES 2025 teaser published by Inno3D, titled "Inno3D At CES 2025, See You In Las Vegas!," mentions potential updates to NVIDIA's AI acceleration suite for both gaming and productivity.
The Inno3D teaser specifically points out "Advanced DLSS Technology," "Enhanced Ray Tracing" with new RT cores, "better integration of AI in gaming and content creation," "AI-Enhanced Power Efficiency," AI-powered upscaling tech for content creators, and optimizations for generative AI tasks. All of this sounds like it builds on previous NVIDIA technology, like RTX Video Super Resolution, although the mention of content creation suggests it will be more capable than previous efforts, which were seemingly mostly consumer-focused. Improved RT cores in the new RTX 5000 GPUs are also expected, although this would seemingly be the first time NVIDIA uses AI to manage power draw, suggesting that the CES announcement will come with new features for the NVIDIA App. The real standout features, though, are "Neural Rendering" and "Advanced DLSS," both of which are new nomenclature. Of course, Advanced DLSS may simply be Inno3D marketing copy, but Neural Rendering suggests that NVIDIA will "Revolutionize how graphics are processed and displayed," which is about as vague as one could be.

Just based on the information Inno3D has revealed, we can speculate that there will be a new DLSS technology, perhaps DLSS 4. As for Neural Rendering, NVIDIA has a page detailing research it has done on new methods of AI-generated textures, shading, and lighting, although it's unclear which of these methods—which seem like they will also need to be added to games on the developer side—it will implement. Whatever it is, NVIDIA will likely divulge the details when it reveals its new 5000 series GPUs.
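NVIDIA has not disclosed how Neural Rendering or a new DLSS revision would actually work, so the following is only a toy sketch of the general idea behind a learned upscaler: a small network takes a low-resolution frame plus motion vectors and produces a higher-resolution image. The layer choices and names here are illustrative assumptions, not NVIDIA's architecture.

```python
# Toy sketch of a learned ("DLSS-style") upscaler's inference pass.
# This is NOT NVIDIA's architecture; all layer choices and names are invented
# purely to illustrate the concept of AI upscaling conditioned on motion vectors.
import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    """Upscales a low-res frame 2x, conditioned on per-pixel motion vectors."""
    def __init__(self, scale: int = 2):
        super().__init__()
        # Input: 3 color channels + 2 motion-vector channels.
        self.features = nn.Sequential(
            nn.Conv2d(5, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
        )
        # PixelShuffle rearranges channels into a scale-times-larger image.
        self.shuffle = nn.PixelShuffle(scale)

    def forward(self, low_res_rgb, motion_vectors):
        x = torch.cat([low_res_rgb, motion_vectors], dim=1)
        return self.shuffle(self.features(x))

# One small fake frame (batch, channels, height, width), upscaled 2x.
frame = torch.rand(1, 3, 270, 480)
motion = torch.rand(1, 2, 270, 480)
with torch.no_grad():
    upscaled = ToyUpscaler()(frame, motion)
print(upscaled.shape)  # torch.Size([1, 3, 540, 960])
```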
Sources:
HardwareLuxx, NVIDIA
87 Comments on NVIDIA Blackwell RTX and AI Features Leaked by Inno3D
PhysX was cool when it came out and ragdoll physics took over; now it's kind of a default working behind the scenes in all our games.
Also, the AI most people know about is the LLM, which is text-based (chatbots, in a way) and wouldn't really work that well in a 3D movement-type area. At least I assume it wouldn't.
But I feel ya, everything is becoming a waste of money. Heck, the PS5 Pro is a waste of money over the normal PS5. I used to buy brand new many, many years ago, but most of the time now I won't. I just can't really comprehend most prices these days, not for CPUs and not for GPUs. The CPU side isn't as bad, though, since I remember the Intel Extreme processors going for over a thousand.
AMD's FSR strategy, therefore, while I applaud the approach in a general sense, is not as effective as Nvidia's.
With FreeSync you saw a different result, for example. And why? Because the technology just works, and works everywhere. Support.
I'm just observing the market. I don't have any favoritism towards any brand. I view AMD's open approach as marketing, as much as I view Nvidia's ecosystem approach as marketing. In the end it doesn't matter much: what matters is what technologies survive and deliver the best results. And then, we hope the industry embraces them ubiquitously.
On the flip side, LLM-based AI can consider billions of parameters right now, and that number will only increase. I'm not sure if you've ever modded Bethesda games, but the number of AI parameters there is in the tens, and I expect more "advanced" traditional AI in games like Elden Ring to be 150 or fewer. They really are worlds apart, but it makes sense; LLMs are designed similarly to the neural networks in your brain.
I suspect that once tools become available for devs to add LLM-based AI, we might start seeing it. The problem right now is that there is no pre-made infrastructure for devs to do so, and thus you either have to make a bespoke implementation or wait.
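As a rough illustration of the bespoke glue the post above describes, here is a minimal sketch of an NPC dialogue call against a locally hosted LLM. The endpoint URL, JSON schema, and function names are hypothetical placeholders, not any real engine or inference API.

```python
# Hypothetical sketch: driving NPC dialogue with a locally hosted LLM.
# The endpoint and JSON fields below are assumptions for illustration only.
import json
import urllib.request

LLM_ENDPOINT = "http://localhost:8080/generate"  # assumed local inference server

def npc_reply(npc_persona, game_state, player_line):
    """Build a prompt from game state and ask the local model for one NPC line."""
    prompt = (
        f"You are {npc_persona}. Current game state: {json.dumps(game_state)}.\n"
        f'The player says: "{player_line}"\n'
        "Respond in character with a single short line of dialogue."
    )
    payload = json.dumps({"prompt": prompt, "max_tokens": 60}).encode("utf-8")
    request = urllib.request.Request(
        LLM_ENDPOINT, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["text"]

# Example call (only works if the hypothetical server above is running):
# npc_reply("a weary blacksmith", {"time": "night", "quest": "lost sword"},
#           "Can you repair this blade?")
```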
There's also a tendency in the video game industry to put a lot of its funding toward graphics. Take a look around the gaming industry: most of the improvements have been to how good a game looks, with little to none going to other systems like audio, physics, etc.
Everything was "dead" in the environment before Black Ops 6 Warzone dropped. For me anyways. In games of yesteryear, physics was a big deal. The sandbox was destructible in quite a few games. It's all went away to the point that tipping over a traffic cone w/ a 5.56 round surprised me!
It'd be nice if, with all this horsepower, they could make bots not stand in front of you and plate up instead of finishing you off when they have you dead to rights.
A fully armored APC shouldn't get stopped dead in its tracks by a stick-built wall lined with gypsum, let alone a lazy 3D-sprite plant in an open field.
A lot of that physics in online games still isn't synchronized to this day, either. In other words, the physics is just for show. It'll look different depending on the client, so at the end of the day, just like most things in games, there's no depth and it's only done for looks. It's crazy how much of an improvement good positional audio is, too, and pretty much no one outside of VR uses it.
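On the synchronization point, here is a minimal sketch of why purely client-side "cosmetic" physics diverges between machines, and how sharing a server-chosen seed with the impact event keeps every client identical. The debris model is a toy assumption; real engines replicate the gameplay-relevant state rather than individual fragment positions.

```python
# Toy illustration of unsynchronized vs. seed-synchronized cosmetic physics.
import random

def scatter_debris(impact_x, impact_y, seed=None):
    """Return positions for 5 debris chunks around an impact point."""
    rng = random.Random(seed)  # seed=None -> every client rolls its own debris
    return [(impact_x + rng.uniform(-1, 1), impact_y + rng.uniform(-1, 1))
            for _ in range(5)]

# Two clients, no shared seed: visuals diverge.
print(scatter_debris(10.0, 5.0) == scatter_debris(10.0, 5.0))          # False
# Two clients, server sends seed 42 alongside the impact event: identical.
print(scatter_debris(10.0, 5.0, 42) == scatter_debris(10.0, 5.0, 42))  # True
```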
I may not be much of a CS guy myself, but the HRTF implementation there is amazing, and the fact that you can customize it and there is even an in-game EQ… yeah. Meanwhile, AAA games release with the same mediocre “Hollywood-ish” home theater mix with barely any channel separation, and it sounds so flat on headphones that you wonder why we bother spending millions on “muh graphics” when EAX-enabled games from the late '90s had a better soundscape.
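For anyone unfamiliar with what an HRTF pipeline actually does, here is a bare-bones sketch: a mono source is convolved with a left-ear and a right-ear impulse response measured for the sound's direction, producing a stereo signal with positional cues. The impulse responses below are random placeholders standing in for measured HRIR data.

```python
# Bare-bones HRTF spatialization: convolve a mono source with per-ear
# impulse responses. The HRIRs here are random placeholders, not real data.
import numpy as np

def spatialize(mono_signal, hrir_left, hrir_right):
    """Return a stereo signal with directional cues baked in via convolution."""
    left = np.convolve(mono_signal, hrir_left)
    right = np.convolve(mono_signal, hrir_right)
    return np.stack([left, right], axis=0)

samplerate = 48_000
mono = np.random.randn(samplerate)      # 1 second of placeholder audio
hrir_l = np.random.randn(256) * 0.05    # fake 256-tap impulse responses
hrir_r = np.random.randn(256) * 0.05
stereo = spatialize(mono, hrir_l, hrir_r)
print(stereo.shape)                     # (2, 48255)
```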
Ripped off? Let's say something costs $300 more than you're willing to pay.
You use it for 24 months.
That's only $12.50/month; I'd say that's cheap.
GPU cost is VERY LOW compared to anything else we need for daily life: food, cars, gasoline…
And still we cry here like something costs a kidney or two.
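For what it's worth, the amortization math above is just the price premium divided by the months of use:

```python
# The poster's per-month math, spelled out.
premium_usd = 300
months_of_use = 24
print(premium_usd / months_of_use)  # 12.5 dollars per month
```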
Huh, judging by what other senior techs here are saying too, it does seem to be a rip-off.
But believe whatever you like. Just because you like getting gouged on prices doesn't mean the rest of us do. Well, I agree to a degree, but if one just plays STALKER 2 and sees how bad DLSS and everything else has been, I would wager that it is also trash. Just because FSR is worse doesn't actually mean DLSS is much better.
I throw up just a little in my mouth when I see companies plastering AI all over their marketing, when, 9.9 times out of ten, the only actual AI going on is the text used in the marketing blurb. NV promised years ago that they would use AI to remake their drivers because it was so much better than "human" code. What is it, three years now? Yep, more BS.
All of this is due to AI. So you can bet they're going to shove it into everything, whether it's usable or not.
People have said that about me. After cycling through an RX 480, a Vega 64, and now a 6800 XT, apparently I only want a 5090 competitor from AMD so "I can buy Nvidia".
This is just silly hostility that gets us nowhere.
No, AMD lost via mindshare and marketing. Nah, misinformation is one of the things killing them.
For example, mention bad drivers and watch the responses; yet Intel has garbage drivers for their GPUs, and nobody will believe you if you say that.
You might want, and would buy, a 5090 equivalent from AMD, but don't be silly in assuming that everyone will.
Hell, the 7900 XTX is often cheaper than the 4080 and in many situations faster, yet if you were to believe people's comments, the 7900 XTX is absolute trash.
There are people (those people) that will only buy Ngreedia regardless.
Just look at the Steam survey (which I don't think is entirely accurate, but it's the closest we have) and all you see is Ngreedia, even though AMD does provide options and, in some cases, better options.
The 7900 XTX isn't bad, but it WAS a disappointment. After AMD came out swinging with the 6900 XT, going toe to toe with Nvidia, their next high end was a tier below Nvidia's top card, with power consumption issues and FAR slower RT. Despite that, I remember reviewers singing its praises and the card being out of stock for months after release.
As for being faster… well, I looked into it here on TPU:
Stalker 2: nah
BO6: yup (Activision's CoD engine has loved AMD for a while)
Silent Hill 2: nah
Silent Hill 2 RT: OH HELL NAH
GoW Ragnarok: nah (but gets close at 4K)
FF XVI: about equal
Space Marine 2: yeah
Star Wars Outlaws: nah
Black Myth Wukong: nah
Black Myth Wukong path tracing: LMFAOROFL no
The First Descendant: about equal
Ghost of Tsushima: yeah
Homeworld 3: nah
Horizon: yeah
Avatar: Frontiers of Pandora: nah
Alan Wake 2: equal
Alan Wake 2 RT: LOL no
Alan Wake 2 path tracing: LOL no
AC Mirage: nah
Lords of the Fallen: nah
Lords of the Fallen RT: also nah, but not that bad
So that's 9 nah, 4 yeah, and 3 equal on performance out of the last 16 games. Over 50% of the time, the 7900 XTX is slower than the 4080.
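For reference, a quick tally of the raster results listed above reproduces that 9/4/3 split; the verdicts are copied from the list, so treat them as the poster's calls rather than fresh benchmark data.

```python
# Tally of the raster (non-RT) verdicts from the list above.
from collections import Counter

raster_results = {
    "Stalker 2": "nah",              "BO6": "yeah",
    "Silent Hill 2": "nah",          "GoW Ragnarok": "nah",
    "FF XVI": "equal",               "Space Marine 2": "yeah",
    "Star Wars Outlaws": "nah",      "Black Myth Wukong": "nah",
    "The First Descendant": "equal", "Ghost of Tsushima": "yeah",
    "Homeworld 3": "nah",            "Horizon": "yeah",
    "Avatar: Frontiers of Pandora": "nah", "Alan Wake 2": "equal",
    "AC Mirage": "nah",              "Lords of the Fallen": "nah",
}
tally = Counter(raster_results.values())
print(tally)                               # Counter({'nah': 9, 'yeah': 4, 'equal': 3})
print(tally["nah"] / len(raster_results))  # 0.5625 -> "over 50% of the time"
```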
With RT, that changes to 10 nah, 4 yeah, and 2 equal. Also, with RT enabled, not only is the 7900 XTX slower than a 4080, it frequently ties with or falls slightly behind the 4070 Ti, an $800 card, which would explain why the 7900 XT has dropped to near $800 more than once; that is actually a good price for it.
It wasn't a bad card, but it DOES feel second-rate to Nvidia with RT, and the price reflects that. For people like me, who had to way overpay to get a 6800 XT, jumping on another near-$1,000 card just two years later isn't appealing. And it's clearly not just me; Nvidia's sales were way down too, until AI hit.
You want bugs with pedigree? Nvidia has had a DSP latency bug for over a decade now, and it's only been semi-fixed on the RTX 3000 series (no idea why that gen only). It's still an issue on the 4000 series and on generations prior to the 3000 series. The RTX 3000 series also feeds noise back into the 12V sense pin, which causes the OCP on certain PSUs, like the Seasonic Prime series, to trip. Now that's a hardware-level bug that Nvidia never bothered to fix and never will.
If I'm listing off AMD bugs versus Nvidia bugs over the past few years, the Nvidia ones are way worse. The fact that cards could cook themselves in New World alone was extremely severe, let alone the hardware-level screwups they've had with the 3000 series and the connectors on the 4000 series. Just like AMD, they have software bugs as well: flickering in CoD, stuttering in VR, Discord lowering clocks, etc. All of the above were officially confirmed in driver patch notes and eventually fixed (the VR issue took six months).
I agree that companies should fix their existing and long-standing bugs, but I find the one-sidedness of arguments around this topic rather telling. People need to hold Nvidia to account as well; they have been getting away with extremely bad bugs and hardware issues because not enough people call them out. If AMD had a driver that allowed their cards to cook themselves like Nvidia did, we'd not hear the end of it for at least a decade and a half (and that would be kind of fair given the severity). Nvidia? It doesn't register, apparently.