Tuesday, December 17th 2024

NVIDIA Blackwell RTX and AI Features Leaked by Inno3D

NVIDIA's RTX 5000 series GPU hardware has been leaked repeatedly in the weeks and months leading up to CES 2025, with previous leaks tipping significant updates for the RTX 5070 Ti in the VRAM department. Now, Inno3D is apparently hinting that the RTX 5000 series will also introduce updated machine learning and AI tools to NVIDIA's GPU line-up. An official CES 2025 teaser published by Inno3D, titled "Inno3D At CES 2025, See You In Las Vegas!" makes mention of potential updates to NVIDIA's AI acceleration suite for both gaming and productivity.

The Inno3D teaser specifically points out "Advanced DLSS Technology," "Enhanced Ray Tracing" with new RT cores, "better integration of AI in gaming and content creation," "AI-Enhanced Power Efficiency," AI-powered upscaling tech for content creators, and optimizations for generative AI tasks. All of this sounds like it builds on previous NVIDIA technology, like RTX Video Super Resolution, although the mention of content creation suggests that it will be more capable than previous efforts, which were seemingly mostly consumer-focused. Of course, improved RT cores in the new RTX 5000 GPUs are also expected, although this will seemingly be the first time NVIDIA has used AI to manage power draw, suggesting that the CES announcement will come with new features for the NVIDIA App. The real standout features, though, are "Neural Rendering" and "Advanced DLSS," both of which are new names. Of course, Advanced DLSS may simply be Inno3D marketing copy, but Neural Rendering suggests that NVIDIA will "revolutionize how graphics are processed and displayed," which is about as vague as one could be.
Just based on the information Inno3D has revealed, we can speculate that there will be a new DLSS technology, perhaps DLSS 4. As for Neural Rendering, NVIDIA has a page detailing its research into new methods of AI-generated textures, shading, and lighting, although it's unclear which of these methods (which seem like they will also need to be added to games on the developer side) it will implement. Whatever it is, though, NVIDIA will likely divulge the details when it reveals its new 5000 series GPUs.
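For a sense of what "AI-generated textures" could mean in practice, NVIDIA's published research includes ideas like neural texture compression, where a tiny network decodes a compact latent grid into material values at shading time. The PyTorch sketch below is purely illustrative of that concept; every class name, dimension, and layer size is our own assumption, not a confirmed Blackwell feature.

# Illustrative sketch of the "neural texture" idea from NVIDIA's research:
# a small network decodes compressed latent features into texel values at
# shading time. All names and dimensions are assumptions, not NVIDIA's API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuralTexture(nn.Module):
    def __init__(self, latent_channels: int = 8, resolution: int = 64):
        super().__init__()
        # A low-resolution latent grid stands in for a compressed texture.
        self.latent = nn.Parameter(torch.randn(1, latent_channels, resolution, resolution))
        # Tiny MLP decoder: latent features -> RGB albedo.
        self.decoder = nn.Sequential(
            nn.Linear(latent_channels, 32), nn.ReLU(),
            nn.Linear(32, 3), nn.Sigmoid(),
        )

    def forward(self, uv: torch.Tensor) -> torch.Tensor:
        # uv: (N, 2) texture coordinates in [0, 1]; grid_sample expects [-1, 1].
        grid = (uv * 2.0 - 1.0).view(1, -1, 1, 2)
        feats = F.grid_sample(self.latent, grid, align_corners=True)  # (1, C, N, 1)
        feats = feats.squeeze(0).squeeze(-1).permute(1, 0)            # (N, C)
        return self.decoder(feats)                                    # (N, 3) RGB

uv = torch.rand(4, 2)       # four sample points on the texture
print(NeuralTexture()(uv))  # decoded RGB values

The appeal, at least in the research, is that the latent grid plus the tiny decoder can be far smaller than the full-resolution texture they reproduce, which would matter on VRAM-constrained gaming GPUs.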
Sources: HardwareLuxx, NVIDIA

87 Comments on NVIDIA Blackwell RTX and AI Features Leaked by Inno3D

#1
esserpain
Is NVIDIA trying to give me buyer's remorse for having just bought an RTX 4070 SUPER? :laugh:

Regardless, this article is right about Inno3D's taglines being able to mean anything. Marketing is marketing for a reason, after all. Improved AI features also aren't surprising, though most of those probably won't benefit the average consumer much and will be targeted towards the enterprise sectors that would've bought an RTX 5090 anyways.
Posted on Reply
#3
Guwapo77
This is a practice I absolutely cannot stand with Nvidia. I really wish AMD had a true competitor at the 5090 level, but I'm stuck with buying a $2000 GPU next year (I hope it doesn't cost more than that).
Posted on Reply
#4
RogueSix
Awesome. This is relentless innovation. I was a little worried that nVidia would completely forget about gaming as long as 90%+ of their revenue is coming from AI/DC but looks like that worry was unfounded. They still keep trucking on the feature set front.

I can't wait to see what they have cooked up for the RTX 5000 series and I will grab a RTX 5090 as soon as the initial onslaught has died down. I made the (minor) mistake of being too early an adopter of the RTX 4090, but no one really knew if we'd see a repeat of the COVID and crypto scalping craze when those cards were released. I bought mine for ~€2400 and three months later the cards were consistently sold for under €2000.

This time I will wait two or three months until prices and availability have normalized. Then that sweet RTX 5090 ass is mine! :D
Posted on Reply
#5
motov8
AI = script
Card with "AI" label = $2000
Card with script = $500
Posted on Reply
#6
theouto
Absolutely nothing new, the only new thing here are the names.

"AI-Enhanced power efficiency", what are you on about nvidia? You would rather sell smoke than actually push for a lower TDP?
Posted on Reply
#7
sepheronx
Guwapo77You need to insult for what reason huh? 4K gaming @ 240Hz with maxed settings - options?
Nah, just the simple fact that there is no such thing as 4K gaming @ 240Hz. I seem to recall the 4090 was said to be just that and guess what? It isn't. Funny thing is, it needs make-believe frames in order to even operate at playable levels for a ton of games, so it doesn't even run at 4K settings.

You are chasing a ghost. Well, you and everyone like you. And it's funny because jacket man can look at people like you as an easy cash grab, because you will fall for such marketing.

Or do you mean 4K at 240Hz playing Terraria or Stardew Valley?
Posted on Reply
#8
evernessince
Neural Rendering is interesting. I have to wonder if Path Traced lighting could be cheated in at a lower resource cost by having an AI generate just the lighting instead of actually simulating the light rays. It would require a very slim and focused generative AI model, as current high-quality models are limited to 1024x1024 and use 24GB of VRAM, but those are general-purpose models. Still, the model size would have to be vastly smaller in order to run as just the lighting step.

I also have to wonder how additional AI will impact VRAM usage. The AI is going to have an overhead but it's also feasible that if the GPU doesn't need to store as much lighting data in the VRAM it could reduce VRAM usage. Then again the AI might need that data and simply be additive to VRAM consumption.

Even though I enjoy tinkering with AI a lot, I feel like it should be pointed out that AI has downsides. In the case of generating assets, that's typically a reduction in quality, heavy limits to resolution, added visual artifacts, concept bleed, and noticeable patterns in output content. In regards to the latter, you can often start to notice patterns in content generated by AI even when the input is different. It gets worse when you start feeding AI-generated content into AI, as those downsides tend to stack.
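For the curious, the kind of "very slim and focused" model described above already exists in research form as a neural radiance cache: a tiny MLP that learns to predict incoming light from G-buffer features, trained on the fly against sparse path-traced samples. Here is a toy PyTorch sketch of that idea; all sizes and feature choices are illustrative guesses, not anything NVIDIA has announced.

# Toy version of the idea: instead of tracing rays for every shading point,
# a small MLP predicts incoming radiance from G-buffer features, trained
# against sparse ray-traced samples. Sizes and features are illustrative.
import torch
import torch.nn as nn

# Inputs: world position (3) + surface normal (3) + view direction (3).
radiance_net = nn.Sequential(
    nn.Linear(9, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 3),  # predicted RGB radiance
)

optimizer = torch.optim.Adam(radiance_net.parameters(), lr=1e-3)

def train_step(gbuffer: torch.Tensor, traced_radiance: torch.Tensor) -> float:
    # gbuffer: (N, 9) features; traced_radiance: (N, 3) sparse ground truth
    # from the path tracer. The cache amortizes those expensive samples.
    pred = radiance_net(gbuffer)
    loss = nn.functional.mse_loss(pred, traced_radiance)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example: one update from a batch of 256 sparse path-traced samples.
loss = train_step(torch.randn(256, 9), torch.rand(256, 3))

# A network this small is a few tens of KB of weights,
# far from the 24GB image-generation models mentioned above.
n_params = sum(p.numel() for p in radiance_net.parameters())
print(f"{n_params} parameters (~{n_params * 2 / 1024:.0f} KB at FP16)")

A model at that scale would also have negligible VRAM overhead, which speaks to the question above of whether the AI adds to or reduces VRAM usage.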
RogueSixAwesome. This is relentless innovation. I was a little worried that nVidia would completely forget about gaming as long as 90%+ of their revenue is coming from AI/DC but looks like that worry was unfounded. They still keep trucking on the feature set front.
Gaming occupies about 13% of Nvidia's mind, which is about the proportion of sales it represents to them.

Make no mistake, it's pushing AI improvements for it's enterprise customers and not gamers.
Posted on Reply
#9
Vya Domus
"AI-Enhanced Power Efficiency" lol, what does that even mean.
Posted on Reply
#10
LabRat 891
More meaningless marketing wank.
woohoo :rolleyes:
Posted on Reply
#11
wolf
Better Than Native
Curious to see the new features detailed beyond these uhh.. buzzwords.
Posted on Reply
#12
Scircura
sepheronxEasy, people like him and possibly you are the reason for high prices for gpus. Cause "you have no choice"
No no, he's buying the card for $500 while giving Nvidia a $1500 R&D grant to develop better economies of scale. It's for the good of all.

Anyway, it's not a healthy mindset to point one's finger at whoever's closest by, and blame them for the things that aren't right in the world. Let's enjoy each other's presence here! :peace:
Posted on Reply
#13
RogueSix
evernessinceMake no mistake, it's pushing AI improvements for it's enterprise customers and not gamers.
* its

... and I'm not making that mistake. It still takes "translation" and a lot of work on the software and feature set side to adapt it for the consumer market. That remains impressive, no matter how much people want to downplay it.

I mean, nVidia could be a complacent and lazy company like AMD :p and only release cards that are 10% faster every generation but they actually choose to keep pushing the envelope, in spite of the fact that less than 10% of their business is gaming these days.

I was fully expecting them to take a break with regard to gaming because, given the extreme boost in revenue, it would have only made sense to shift all of their engineers to work on AI/DC stuff but here we are with all new features on the consumer RTX 5000 front. That is commendable and I'm really looking forward to that RTX 5090 masterpiece. Bring it on, Jen-Hsun! Let's do this shit! :D
Posted on Reply
#14
sephiroth117
I currently have an RTX 2060; for 4K, I wanted to finally upgrade to a 5080/5090 depending on price.


But I have so many remarks:
  • 24GB was more than plenty, those 32GB are here for AI first and foremost...whilst the 5080 doesn't have enough with 16GB, that really is an absurd product segmentation.
    • I thought Quadro was there for workstations...the gaming segment should still be focused on gaming...
  • Why still 5nm TSMC nodes? (4NP is 5nm, not 4nm.) If those 5090s are really rumoured to be 2,500 EUR, they should have picked a more efficient node.
  • AI like DLSS is great, but I do not want my GPU to "guess" textures and physics (besides upscaling them). I still want actual developers and creative people to design games and their assets, so I hope that "neural rendering" is just a marketing gimmick.
  • AI power efficiency? lol, come on, if they cared so much about efficiency they'd use 2 or 3 nm nodes alongside software optimisations.
  • AI thermal management? Unless it means smart undervolting, it's also a pure gimmick imho.
Posted on Reply
#15
RogueSix
sephiroth117
  • 24GB was more than plenty, those 32GB are here for AI first and foremost
Depends. If you play something like Microsoft Flight Simulator then it is pretty easy to push it past 24GB VRAM usage if you install a nice, juicy 8K texture pack for e.g. the FBW Airbus A380. Been there, done that.
sephiroth117
  • Why still 5nm TSMC nodes? (4NP is 5nm, not 4nm.) If those 5090s are really rumoured to be 2,500 EUR, they should have picked a more efficient node.
Same reason AMD still went with 5nm ("4nm") for Zen 5. They both need all their 3nm capacities for AI/DC.
sephiroth117
  • AI power efficiency? lol, come on, if they cared so much about efficiency they'd use 2 or 3 nm nodes alongside software optimisations.
2nm is nowhere near ready yet. 3nm is ready, but it is extremely expensive, and both nVidia and AMD are wise to only use it for AI/DC for now. A 3nm RTX 5090 would likely carry a price tag of $2,999 (or more).
Posted on Reply
#16
Neo_Morpheus
Guwapo77I really wish AMD had a true competitor at the 5090 level
Just so Ngreedia didn’t charge you as much as they do.

People like you will never buy an AMD gpu.

Well, maybe if they released a gpu that's faster than the 5090 and costs 500, mayyybe some of you would consider it.

Anyways, sounds to me that they will pull the same shenanigans they pulled with DLSS and the 40 series.
Posted on Reply
#17
sepheronx
Visible NoiseDo you complain about people buying eggs too? How about cars, are people that like expensive cars stupid also?

Judging someone on how they choose to use their disposable income is one of the absolute stupidest things I see on the internet.

Do I get to judge you for living in Detroit’s armpit?
Of course.

When the market is screwing you, you try to protest the prices. Most countries do. I find everyone here too lazy and complacent. I actually do complain about people who buy certain cars for exact same reason.
Posted on Reply
#18
evernessince
Guwapo77This is a practice I absolutely cannot stand with Nvidia. I really wish AMD had a true competitor at the 5090 level, but I'm stuck with buying a $2000 GPU next year (I hope it doesn't cost more than that).
No one is forcing you to shell out that kind of money for a GPU. Your purchase is an endorsement of their pricing. If I'm Nvidia looking at this post, I'd wager you're rather cavalier about $2,000, so I might as well make it $2,300. You are going to buy anyways.
Posted on Reply
#19
freeagent
I can honestly think of way better things to spend 2K on, I am priced out of that market for sure. I don't care whose name is on the box, not worth 2K.

The worst part is, my 2K is not your 2K lol, it's even lower.

But our government just imploded, so there is that.
Posted on Reply
#20
sepheronx
freeagentI can honestly think of way better things to spend 2K on, I am priced out of that market for sure. I don't care whose name is on the box, not worth 2K.

The worst part is, my 2K is not your 2K lol, it's even lower.

But our government just imploded, so there is that.
Yes, us Canadians are pooched for the most part, as the prices of these goods may be 2K in the US but around 3K here. With wages also not even meeting inflation for the most part, I would say it is a rich man's game (70 series is over 1000 dollars here).

Nvidia being the worst, but AMD not much better. Hence why I'm gonna sit tight till I see what the B7 series from Intel delivers.
Posted on Reply
#21
freeagent
sepheronx(70 series is over 1000 dollars here).
Got a good deal on mine, summer of 2023..

I remember when I bought my used 980 Classified, watching J2C talk about his and how he paid 700USD for his.. like wow.. that's crazy talk bro.
Posted on Reply
#22
sepheronx
freeagentGot a good deal on mine, summer of 2023..

I remember when I bought my used 980 Classified, watching J2C talk about his and how he paid 700USD for his.. like wow.. that's crazy talk bro.
That's still a ripoff
Posted on Reply
#23
Dr. Dro
sepheronxNah, just the simple fact that there is no such thing as 4K gaming @ 240Hz. I seem to recall the 4090 was said to be just that and guess what? It isn't. Funny thing is, it needs make-believe frames in order to even operate at playable levels for a ton of games, so it doesn't even run at 4K settings.

You are chasing a ghost. Well, you and everyone like you. And it's funny because jacket man can look at people like you as an easy cash grab, because you will fall for such marketing.

Or do you mean 4K at 240Hz playing Terraria or Stardew Valley?
I'm gonna have to disagree. Extrapolating a 100% performance uplift vs. my RTX 4080, with some DLSS the RTX 5090 should achieve 240 fps at 4K in practically every game where my 4080 manages to pull 120. And those are a lot, unless they're 2024 AAAs.
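Spelled out, the extrapolation is just arithmetic (the 100% uplift being a rumor-based assumption, not a benchmark):

# Sanity check of the extrapolation: an assumed 100% generational uplift
# simply doubles the 4080's frame rate, before any DLSS frame generation.
rtx_4080_fps = 120       # what the 4080 pulls at 4K in a given game
assumed_uplift = 1.00    # rumored doubling; an assumption, not a measurement
rtx_5090_fps = rtx_4080_fps * (1 + assumed_uplift)
print(rtx_5090_fps)      # 240.0 fps, matching a 240 Hz display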
Posted on Reply
#24
sepheronx
Dr. DroI'm gonna have to disagree. Extrapolating a 100% performance uplift vs. my RTX 4080, with some DLSS the RTX 5090 should achieve 240 fps at 4K in practically every game where my 4080 manages to pull 120. And those are a lot, unless they're 2024 AAAs.
Try even earlier titles, but yeah, sure. Games have been an unoptimized mess. Some DLSS? Any kind of DLSS is a joke like the rest.

I've been consistent with this, and DLSS makes everything a blurry smear of a mess.

It's far better to play at 1440p without DLSS than higher with DLSS. But paying over 2K to achieve that is a joke in itself.
Posted on Reply
#25
Guwapo77
sepheronxSounds to me you have more money than sense.
sepheronxEasy, people like him and possibly you are the reason for high prices for gpus. Cause "you have no choice"
People like me have been using AMD GPUs since the 9700 PRO; you need to stop with the nonsense.
freeagentI can honestly think of way better things to spend 2K on, I am priced out of that market for sure. I don't care whose name is on the box, not worth 2K.

The worst part is, my 2K is not your 2K lol, it's even lower.

But our government just imploded, so there is that.
I've been gaming at 1440p for damn near a decade; I upgraded my monitor when I felt 4K gaming with a respectable frame rate could be achieved. Again, your 2K vs my 2K...
sepheronxTry even earlier titles, but yeah, sure. Games have been an unoptimized mess. Some DLSS? Any kind of DLSS is a joke like the rest.

I've been consistent with this, and DLSS makes everything a blurry smear of a mess.

It's far better to play at 1440p without DLSS than higher with DLSS. But paying over 2K to achieve that is a joke in itself.
I've been playing at 1440p for nearly a decade and now I am upgrading to 4K! You enjoy that 1440p, it's a nice place to be at...I know. Now it's time for me to move on. QD-OLED and 4K is glorious. At 47, I can pretty much do wtf I want to.
Neo_MorpheusJust so Ngreedia didn’t charge you as much as they do.

People like you will never buy an AMD gpu.

Well, maybe if they released a gpu that's faster than the 5090 and costs 500, mayyybe some of you would consider it.

Anyways, sounds to me that they will pull the same shenanigans they pulled with DLSS and the 40 series.
All it would have taken is a hot second to look at my current system specs. I've only bought AMD for the last 2 decades...
evernessinceNo one is forcing you to shell out that kind of money for a GPU. Your purchase is an endorsement of their pricing. If I'm Nvidia looking at this post, I'd wager you're rather cavalier about $2,000, so I might as well make it $2,300. You are going to buy anyways.
Not going to put effort into this reply; look at the others. No competition and they've got the best product, yet it's my fault. Go play in traffic.
Posted on Reply