Tuesday, December 17th 2024

NVIDIA Blackwell RTX and AI Features Leaked by Inno3D

NVIDIA's RTX 5000 series GPU hardware has been leaked repeatedly in the weeks and months leading up to CES 2025, with previous leaks tipping significant updates for the RTX 5070 Ti in the VRAM department. Now, Inno3D is apparently hinting that the RTX 5000 series will also introduce updated machine learning and AI tools to NVIDIA's GPU line-up. An official CES 2025 teaser published by Inno3D, titled "Inno3D At CES 2025, See You In Las Vegas!" makes mention of potential updates to NVIDIA's AI acceleration suite for both gaming and productivity.

The Inno3D teaser specifically points out "Advanced DLSS Technology," "Enhanced Ray Tracing" with new RT cores, "better integration of AI in gaming and content creation," "AI-Enhanced Power Efficiency," AI-powered upscaling tech for content creators, and optimizations for generative AI tasks. All of this sounds like it builds on previous NVIDIA technology, like RTX Video Super Resolution, although the mention of content creation suggests it will be more capable than previous efforts, which were seemingly mostly consumer-focused. Of course, improved RT cores in the new RTX 5000 GPUs are also expected, although this will seemingly be the first time NVIDIA uses AI to manage power draw, suggesting that the CES announcement will come with new features for the NVIDIA App. The real standout features, though, are "Neural Rendering" and "Advanced DLSS," both of which are new nomenclature. Of course, Advanced DLSS may simply be Inno3D marketing copy, but Neural Rendering suggests that NVIDIA will "Revolutionize how graphics are processed and displayed," which is about as vague as one could be.
Just based on the information Inno3D has revealed, we can speculate that there will be a new DLSS technology, perhaps DLSS 4. As for Neural Rendering, NVIDIA has a page detailing research it has done relating to new methods of AI-generated textures, shading, and lighting, although it's unclear which of these new methods—which seem like they will also need to be added to games on the developer side—it will implement. Whatever it is, though, NVIDIA will likely divulge the details when it reveals its new 5000 series GPUs.
Sources: HardwareLuxx, NVIDIA

90 Comments on NVIDIA Blackwell RTX and AI Features Leaked by Inno3D

#26
sepheronx
Guwapo77People like me have been using AMD GPUs since 9700PRO you need to stop with the nonsense.


I've been gaming at 1440p for damn near a decade; I upgraded my monitor when I felt 4K gaming with a respectable frame rate could be achieved. Again, your 2K vs my 2K...


I've been playing at 1440p for nearly a decade and now I am upgrading to 4K! You enjoy that 1440p, it's a nice place to be at... I know. Now it's time for me to move on. QD-OLED and 4K is glorious. At 47, I can pretty much do wtf I want to.


All it would have taken was a hot second to look at my current system specs. I've only bought AMD for the last 2 decades...


Not going to put effort into this reply, look at the others. No competition and they got the best product, yet it's my fault. Go play in traffic.
You mean ATI? Yeah, me too and earlier. What's your point? That isn't even an argument.

Enjoy being ripped off. As someone else said, if AMD or Intel came out with a better GPU with similar performance, you wouldn't buy it.
Posted on Reply
#28
igormp
sepheronxOf course.

When the market is screwing you, you try to protest the prices. Most countries do. I find everyone here too lazy and complacent. I actually do complain about people who buy certain cars for the exact same reason.
Given how they can keep jacking up prices and still sell without issues, I believe the market does not feel it's being screwed up at all.

If products got out of your price range, that's a "you" problem. Many folks are also getting those GPUs for things other than games (especially with the AI hype), so those cards end up as an investment with an eventual return.

I'd say that a 5090 even at $3k is still an amazing price for the level of compute it's supposed to bring.
Posted on Reply
#29
Guwapo77
sepheronxYou mean ATI? Yeah, me too and earlier. What's your point? That isn't even an argument.

Enjoy being ripped off. As someone else said, if AMD or Intel came out with a better GPU with similar performance, you wouldn't buy it.
You seem like an awfully immature child
freeagentWho doesn't love ATi lol..
ATi got so much of my lunch money over the years and I couldn't have been happier. I just need more horsepower; if they come back with a vengeance, I'll be Team Red yet again.
Posted on Reply
#30
sepheronx
I'm probably older than you, but sure. Whatever.
igormpGiven how they can keep jacking up prices and still sell without issues, I believe the market does not feel it's being screwed up at all.

If products got out of your price range, that's a "you" problem. Many folks are also getting those GPUs for things other than games (especially with the AI hype), so those cards end up as an investment with an eventual return.

I'd say that a 5090 even at $3k is still an amazing price for the level of compute it's supposed to bring.
Found another one.

No, it isn't a me problem. Look at the general debt levels of the average person. Notice it's going up all over? But then again, people sure do spend far too much for so little. Far more than what they themselves make.

Edit: Also, where do you get the idea that even at $3K it's a good price? From what arbitrary number did you determine that? What exactly do you know about it that would suggest a price in the thousands is a good idea for a GPU? I imagine you felt the same about the Titan back in the day too?
Posted on Reply
#31
nguyen
Man, having played 4 path tracing games this year (CP2077 Phantom Liberty, Alan Wake 2, Black Myth Wukong and now Indiana Jones), I can't wait for a better GPU with more advanced RT capability.

2025 will be PT lit for sure :D
Posted on Reply
#32
Hecate91
igormpGiven how they can keep jacking up prices and still sell without issues, I believe the market does not feel it's being screwed up at all.

If products got out of your price range, that's a "you" problem. Many folks are also getting those GPUs for things other than games (especially with the AI hype), so those cards end up as an investment with an eventual return.

I'd say that a 5090 even at $3k is still an amazing price for the level of compute it's supposed to bring.
It isn't a problem for anyone wanting a decent GPU without getting screwed, when even enthusiasts that usually spend thousands just to have the latest shiny thing are starting to realize the prices and the market are screwed up. The gaming market's consumers can't tell when they're being screwed, especially when they keep buying the latest AAA garbage at full price instead of waiting for sales.
Those buying a GPU for other things aren't complaining about the price, or shouldn't be complaining, though these cards being good at compute is also a problem. AI is like the next crypto; it's a scam for a majority of consumers, yet Nvidia is still convincing gamers it cares about the gaming market.
Guwapo77You seem like an awfully immature child
I wouldn't consider getting overly defensive telling someone to "go play in traffic" to be mature, but you do you.
People finding excuses to justify a hobby isn't anything new, but if you complain and buy it anyway, it's not going to help prices go down because you're supporting the problem.
Posted on Reply
#33
sepheronx
Hecate91It isn't a problem when even enthusiasts that usually spend thousands just to have the latest shiny thing are starting to realize the prices and the market are screwed up. The gaming market's consumers can't tell when they're being screwed, especially when they keep buying the latest AAA garbage at full price instead of waiting for sales.
Those buying a GPU for other things aren't complaining about the price, or shouldn't be complaining, though these cards being good at compute is also a problem. AI is like the next crypto; it's a scam for a majority of consumers, yet Nvidia is still convincing gamers it cares about the gaming market.

I wouldn't consider getting overly defensive telling someone to "go play in traffic" to be mature, but you do you.
People finding excuses to justify a hobby isn't anything new, but if you complain and buy it anyway, it's not going to help prices go down because you're supporting the problem.
He's butthurt because he knows he is wrong, but whatever. Some people just have trouble accepting the truth more than anything.

There is a good set of videos from Threat Interactive that look at the issue of gaming and the engines used (he really goes after UE titles). But essentially, in the end, games are shit-optimized and thus you need to overspend on a GPU to brute-force the issue. The situation never gets better because instead of just saying no, people will end up spending on the hardware for it. I myself am partially at fault, because I too ended up spending way more on a GPU than I should have in the past.
Posted on Reply
#34
igormp
sepheronxFound another one.
I don't even play games, nor did I say I was going to buy a 5090.
sepheronxNo, it isn't a me problem. There is a general term of debt levels of average person. Notice its going up all over? But then again, people sure do spend far too much for so little. Far more than what they themselves make.
People should be aware of their own finances. If they aren't, that's their problem.
On the other hand, there are plenty of people that can afford such a product. And as I said, that kind of product has an actual return on investment for many users that buy it for more than playing games.
sepheronxEdit: Also, where do you get the idea that even at $3K it's a good price? From what arbitrary number did you determine that? What exactly do you know about it that would suggest a price in the thousands is a good idea for a GPU? I imagine you felt the same about the Titan back in the day too?
An RTX 5000 Ada with its 32GB sells for $4k; a 5090 with the same 32GB but way faster memory and core config, while being cheaper than that, is great value without a doubt.

Once again, if you can't afford it, then it's your problem. And as long as the market keeps accepting those prices the trend will continue, like it or not. Complaining about what other people do with their money just sounds like jealousy.
Hecate91It isn't a problem for anyone wanting a decent GPU without getting screwed, when even enthusiasts that usually spend thousands just to have the latest shiny thing are starting to realize the prices and the market are screwed up. The gaming market's consumers can't tell when they're being screwed, especially when they keep buying the latest AAA garbage at full price instead of waiting for sales.
Those buying a GPU for other things aren't complaining about the price, or shouldn't be complaining, though these cards being good at compute is also a problem. AI is like the next crypto; it's a scam for a majority of consumers, yet Nvidia is still convincing gamers it cares about the gaming market.
It has been quite some time since GPUs have been shown to be great at many things other than games, which means gamers now need to share the market with people that actually profit from those products.
If one side is willing to spend more since they'll get a return on that money spent, people who only buy those for entertainment should either not buy them anymore (and pray this makes enough of a dent to lower prices), or suck it up and pay the price.
I've said this many times before: I do believe the dGPU market for gamers will keep shrinking (while actual revenue to companies keeps rising due to other buyers), and SoCs like Strix Halo will eventually become the standard entertainment platform.
Posted on Reply
#35
john_
AI skyrocketed pricing.
Posted on Reply
#36
Bwaze
Of course, "Everything is AI" will inevitably mean that nothing will work on the RTX 4090 and other old cards - because they lack "intelligence"!

So if you bought a card for $2,000, all of a sudden it won't do the new DLSS, it won't do the new ray tracing, etc... We have seen it with Ada Lovelace to a limited extent, where only new cards could do Frame Generation. Even the RTX 3090 Ti suddenly wasn't enough.

Now they're going all in.

And I won't be surprised if we see a bunch of new "PhysX" equivalent Nvidia specific AI technologies inserted into games that only work on Nvidia, and only on RTX 50x0 cards - "The Way it's Meant To Be Played!". Nvidia couldn't pull it off with PhysX, but now they really are in a position to shove their tech into every gaming studio, they basically are the sole makers of gaming cards, AMD just dabbles a bit.
Posted on Reply
#37
95Viper
Stick to the topic.
Stop berating, insulting, trolling, bickering... discuss the topic and stop the BS.
Also, If there is a problem... report it, the moderation team will deal with it.

If you cannot discuss civilly then don't post.
Posted on Reply
#38
Neo_Morpheus
BwazeAnd I won't be surprised if we see a bunch of new "PhysX" equivalent Nvidia specific AI technologies inserted into games that only work on Nvidia, and only on RTX 50x0 cards - "The Way it's Meant To Be Played!". Nvidia couldn't pull it off with PhysX, but now they really are in a position to shove their tech into every gaming studio
I have said exactly that for over a decade, and it's exactly why I stopped buying Ngreedia.

I hate how nobody has a problem ignoring the current infestation of DLSS in games. They tried over and over until they finally found the perfect storm of influencers and weak-minded followers/consumers.

Hell, W1zzard really dropped the ball when he said it was a "Con" on Intel's part because the GPU didn't support DLSS.

We are in some truly dark times.
Posted on Reply
#39
Legacy-ZA
Neo_MorpheusI have said exactly that for over a decade, and it's exactly why I stopped buying Ngreedia.

I hate how nobody has a problem ignoring the current infestation of DLSS in games. They tried over and over until they finally found the perfect storm of influencers and weak-minded followers/consumers.

Hell, W1zzard really dropped the ball when he said it was a "Con" on Intel's part because the GPU didn't support DLSS.

We are in some truly dark times.
I have a love/hate relationship with DLSS. The thing I like the most is that it removes the shimmering/distortions on background objects like fences, poles, and bridges. I never could stand that and never will; I would pick up on it immediately and it would break my immersion.

As for the slight increase in blur, it sucks, but you can counter it by increasing the sharpness, so it has less of an impact.
Overall, I'd say it's an improvement and it will only get better; just look at how far we have come from DLSS 1 to where we are now.

I much prefer DLAA, though, but with the performance cost and the VRAM issues, you probably need an xx90 to appreciate it fully.
Posted on Reply
#40
Vayra86
Looks like a perfect prediction once more: Nvidia will sell you DLSS now. Not a better GPU. Just DLSS, with a topping of AI to seal the deal.

Perf/$ improvement? Sure, turn on DLSS4!

As I've done before, I will judge Nvidia exclusively on its raw non-upscaled performance. I hope reviewers keep doing the same, and most certainly stop propping up the lack of it as a disadvantage. @W1zzard (this is aimed at you). It is a situational advantage at best, one that comes with vendor lock-in, just like G-Sync. We don't tell readers of a Windows laptop review that not having Apple's OSX is a disadvantage, do we?
Neo_MorpheusI hate how nobody has a problem in ignoring the current infestation of dlss on games
Upscaling. DLSS is just the best implementation of it, and that is hard to deny. I think gamers have accepted that upscaling is everywhere now, but they do always complain about shitty implementations of it. Nvidia executes well on that front, and even Intel seems to surpass AMD's efforts. If anything, it is AMD that should do better here. They gambled on an approach and, so far, they lost.

It's a bit like complaining about traffic and then buying a new car. It's a given, it comes with the territory, but you're really better off taking the bicycle whenever you can. Upscaling is a bit like that. If performance allows, you don't really need or want it, but you can still get more frames out of it (go faster!), which allows you, for example, to cap the framerate and run at a lower TDP. The bigger issue in gaming now isn't the upscaling tech per se, but the development in game engines themselves that pseudo-require TAA and upscaling to avoid graphical artifacts.

And that ain't on Nvidia; it's a developer problem supported by an industry push.
Posted on Reply
#41
Ahhzz
There are a lot of personal attacks in here. They need to stop.
Posted on Reply
#42
sepheronx
The blurring that comes from upscaling and TAA is abysmal. Stalker 2 looks like utter shit, and anyone who says otherwise needs their eyes checked. The smearing and ghosting are beyond ridiculous and the game still runs like crap.

Silent Hill 2 remake isn't nearly as bad, but it still isn't excusable. Fog now hurts performance, when in the past it was used to help hide limitations while creating a great atmosphere.
Posted on Reply
#43
LittleBro
Distortions and fake frames FTW!

So, Nvidia is gonna release a new DLSS and make up some-unknown-but-very-important-reason for this technology to be compatible only with the RTX 5000 series?
Please don't consider this an attack; this is just a question based on empirical knowledge. Well, as you know, this already happened, if you recall.
sepheronxThe blurring that comes from upscale and TAA is abysmal.
I needed quite some time to get used to TAA due to the blurring.
Posted on Reply
#44
Bwaze
Frame generation limiting only to Ada cards was never really explained, other than "screw you, buy a new card".

So they don't really need to explain themselves.
Posted on Reply
#45
DemonicRyzen666
sepheronxThe blurring that comes from upscaling and TAA is abysmal. Stalker 2 looks like utter shit, and anyone who says otherwise needs their eyes checked. The smearing and ghosting are beyond ridiculous and the game still runs like crap.

Silent Hill 2 remake isn't nearly as bad, but it still isn't excusable. Fog now hurts performance, when in the past it was used to help hide limitations while creating a great atmosphere.
Isn't it the same with depth of field now too?
It's supposed to blur things far away to make them less work to render,
yet everyone enables it and then complains they can't read text far away.
Then they enable DLSS to make it readable again, instead of just turning off depth of field.
I feel like we're going backwards in graphics.
Posted on Reply
#46
LittleBro
DOF aims to blur the background while the foreground remains unblurred. It kind of tries to replicate what a DSLR lens with a low aperture value naturally does. TAA and DLSS affect everything on the screen.
Posted on Reply
#47
SIGSEGV
For me, I got a 4090 (although it's second-hand and like new) for 1K, and I still felt like I was getting ripped off. On the other hand, I really need this stuff to support my research project.
I am done spending more on a GPU for gaming and will put that spending toward a console instead (PS4).
I plan to get a PS4 Pro in the near future (for me, it's way more logical than making a donation to Nvidia, lol).
Posted on Reply
#48
Legacy-ZA
DemonicRyzen666Isn't it the same with depth of field now too?
It's supposed to blur things far away to make them less work to render,
yet everyone enables it and then complains they can't read text far away.
Then they enable DLSS to make it readable again, instead of just turning off depth of field.
I feel like we're going backwards in graphics.
Motion Blur and DoF are always the first settings I disable in any game. :D
Posted on Reply
#49
Neo_Morpheus
Vayra86Upscaling. DLSS is just the best implementation of it and it is hard to deny.
I personally don't care for upscaling, but if I had no choice but to use it given the current options, I would always use the one that works for everyone, instead of the one that takes away my options, even if that option is not the absolute best.
Vayra86Nvidia executes well on that front and even Intel seems to surpass AMD's efforts.
See above.
Vayra86If anything, it is AMD that should do better here. They gambled on an approach and so far, they lost.
I don't use FSR and, obviously, can't use DLSS.

That said, I have been perplexed by the claims that FSR is absolute trash and DLSS is bigger than the second coming, so I have read and watched many videos where some unbiased reviewers (very few these days, sadly) got to the point where they say FSR is good enough, and that depending on the game and dev, the same flaws observed in one show up in the other.

So when I read comments like that (FSR is trash, AMD lost, etc.), it confuses me and makes me believe someone is simply repeating other non-AMD customers' baseless attacks to defend Ngreedia.

Same group that still claims that all AMD drivers are trash.
SIGSEGVI really need to get this stuff to support my research project.
If they are the only ones providing the tool that you need, I can understand and support the purchase of a 4090.
Posted on Reply
#50
Lycanwolfen
Legacy-ZAMotion Blur and DoF are always the first settings I disable in any game. :D
Yep, I always want to see what the card can do without software gimmicks. You know, people complained about PS3 graphics back in the day, but when a game was programmed correctly it was much better than some PS4 games. Like FF13 on the PS3: I could see anything in the distance clear as a bell. Then I played FF15 on the PS4 Pro and everything in the distance was blurry. When I saw that, I knew right away they had to cut corners to make the game playable. Even most of the newer games on PS4/PS5 blur distant objects. Sad, really sad.
Posted on Reply