
NVIDIA 2025 International CES Keynote: Liveblog

At least with FG you know they're running the same game
By your own previous argument, it's running in a different way and should not be comparable.
I don't think it's even about fairness; it's just a shitty way of presenting those performance metrics for the tiny percentage of people who would even care. Wouldn't you want to know that this now supports a smaller data type and that you can now run smaller models?
The tiny percentage of people that care about it should be aware that those performance increases come from the extra bandwidth.
Be honest: when you saw that, did you assume it's the same data type, or did you magically understand there must be more to it before squinting at the footnotes (if you ever did before someone else pointed it out for you)? I for one admit I missed it before I saw someone else talk about it.
Being 100% honest, I hadn't even seen that Flux was in that graph at first lol
I just saw one game, then another one with DLSS stuff on top, and another one like that, and stopped looking, because I don't even care about games, and the comparison between DLSS setups is kinda moot, as everyone here already agreed.
I only noticed it when someone else brought up the FP4 vs FP8 stuff earlier; I was confused about what they were talking about until I noticed Flux was in there (and only in the 4090 vs 5090 comparison, not the others), so I was already aware of it by the time I saw it.

Does the AI industry even buy 5070/5080 level cards? I mean, home users getting their feet wet in AI, sure, but the wealthiest AI corps need a lot more oomph, don't they? That's who uber expensive professional cards are for. To them, everything you say about the 5070/5080 is meaningless.
5070s I don't think so, but 5080s would still be useful. If your model fits within 16GB, you can use these as cheap-yet-powerful inference stations.
Not a big company doing colocation in a big DC, but plenty of startups do something like that. Just take a look at vast.ai and you'll notice systems like this one (attached screenshot):

8x 4070S Ti on an EPYC node with 258 GB of RAM.

No, they are buying B200s, which also used FP4 claims (vs FP8 for Hopper) in their marketing slides.


Look, the thing is, there was another company at CES that compared their 120W CPU to the competition's 17W chip. With no fine print, btw. No one is talking about that being misleading, but we have 50 different threads, 20 pages long, complaining about Nvidia. Makes you wonder.
B200s would be for the big techs, most others either buy smaller models (or even consumer GPUs), or just subscribe to some cloud provider.

Fair enough. Still wrong, imo, but as long as buyers are fine with it, who am I to argue.
They are more than fine with it; it means lots of VRAM savings and more perf.
Really? That's poor as well. I guess no one was really interested in that CPU. I don't even know which one you're talking about; it completely passed me by (although I admit, I only looked at GPUs this time around).
That was AMD comparing Strix Halo vs Lunar Lake (and against the shitty 30W Lunar Lake model, instead of the regular 17W one).

The RTX 4000 series changed the old pattern of the next gen's near-top SKU beating the previous gen's top SKU. That no longer works, dudes.

Nvidia widened the gap between SKUs. The RTX 4080 has only 60% of the 4090's compute units.
With the RTX 5000 series, the gap widens even further. The RTX 5080 will have only 50% of the RTX 5090's compute units, and just 65% of the RTX 4090's.
The RTX 5080 is basically an RTX 4080S with 5% more compute units and slightly higher clocks.
No room for any significant performance boost this time (no significant node change).
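For anyone who wants to sanity-check those percentages, here's a quick sketch using the publicly listed SM counts (spec-sheet figures, so treat them as approximate):

```python
# Sanity check of the compute-unit ratios quoted above, using public SM counts.
sm = {
    "RTX 4080":  76,   # AD103, cut down
    "RTX 4080S": 80,   # AD103, full
    "RTX 4090":  128,  # AD102, cut down
    "RTX 5080":  84,   # GB203
    "RTX 5090":  170,  # GB202
}

print(f"4080  vs 4090 : {sm['RTX 4080'] / sm['RTX 4090']:.0%}")        # ~59%
print(f"5080  vs 5090 : {sm['RTX 5080'] / sm['RTX 5090']:.0%}")        # ~49%
print(f"5080  vs 4090 : {sm['RTX 5080'] / sm['RTX 4090']:.0%}")        # ~66%
print(f"5080  vs 4080S: {sm['RTX 5080'] / sm['RTX 4080S'] - 1:+.0%}")  # +5%
```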

Seriously, do your math, people. I'll say it once more: the RTX 5080 won't beat the RTX 4090 in native rendering (no DLSS, no FG) because it lacks the hardware resources.
We should come back to this discussion after the 5080 reviews are up. I have no problem admitting I was wrong if I turn out to be wrong.

As for the RX 9070 XT, no one really expects it to go toe to toe with the RX 7900 XTX; with the RX 7900 XT, maybe. (I'm not taking RT into account here.)
AMD clearly stated that they want to focus on making a mainstream card for the masses with vastly improved RT performance over the previous generation.
I personally estimate the RX 9070 XT will land about 5% below the RX 7900 XT in raster while beating the RX 7900 XT in RT. Unfortunately, I don't give a f* about RT right now.
RT may become a reasonable thing once it's less of a burden on hardware, meaning that turning RT up degrades performance by no more than 15-20%.
Anything above that is just too much. My personal expectation is that AMD will move with RDNA4 from ~60% performance degradation to 30-35% degradation.
To be honest, even though the 4090 had almost 70% more cores, that doesn't mean it had 70% more performance in games, in the same way the 5090 won't have 100% higher perf than the 5080 in this scenario.
The 4090 was really bottlenecked by memory bandwidth for games, and the 5080 has a bandwidth pretty similar to it, so the gap between those two may not be as big as the difference in SMs.
Will it be faster or equal in games? I don't know, reviews should reveal that once they're available, but I wouldn't be surprised if it is (in the same sense I wouldn't be surprised if it isn't). Game perf doesn't really scale linearly with either memory bandwidth or compute units, so it's hard to estimate anything.
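To put rough numbers on that argument (a back-of-envelope sketch from public spec-sheet figures; where a given game actually lands between the two ratios is anyone's guess until reviews):

```python
# Compare SM-count ratio vs memory-bandwidth ratio; game performance usually
# lands somewhere between these two, not on either extreme. Public spec figures.
specs = {
    #            SMs, memory bandwidth (GB/s)
    "RTX 4090": (128, 1008),
    "RTX 5080": ( 84,  960),
    "RTX 5090": (170, 1792),
}

def ratios(a: str, b: str) -> tuple[float, float]:
    sm_a, bw_a = specs[a]
    sm_b, bw_b = specs[b]
    return sm_a / sm_b, bw_a / bw_b

for a, b in [("RTX 4090", "RTX 5080"), ("RTX 5090", "RTX 5080")]:
    sm_r, bw_r = ratios(a, b)
    print(f"{a} vs {b}: {sm_r:.2f}x the SMs, but only {bw_r:.2f}x the bandwidth")
# RTX 4090 vs RTX 5080: 1.52x the SMs, but only 1.05x the bandwidth
# RTX 5090 vs RTX 5080: 2.02x the SMs, but only 1.87x the bandwidth
```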
 
... The 4090 was really bottlenecked by memory bandwidth for games, and the 5080 has a bandwidth pretty similar to it, so the gap between those two may not be as big as the difference in SMs...
Who f**n cares? It has a higher number!
 
Whatever someone thinks about Nvidia's presentation, and as bad as you think the cards they showed us are, they were good enough to make their competitor sound the retreat. But sure, let's once again be angry because Nvidia is lying to us about the 5070 being a 4090, while completely ignoring the fact that the 5070 is good enough to make AMD change whatever they had planned. Instead they wasted their time comparing a 17W chip to a 120W chip, because hey, that in no way is misleading.

On top of all this, the CEO couldn’t be bothered to show up to the single largest event of the year.
 
Actually, I find the statement in Nvidia's slide that the 5070 has 4090 performance plausible (apart from being misleading).
Before the slide, Jensen was talking about ray tracing and AI-generated frames, saying that for every 33 million pixels produced with MFG, only 2 million are calculated through traditional rendering.
So the comparison Nvidia seemingly wants to make for the 5070 vs the 4090 is FHD native resolution with ray tracing applied, then upscaled to 4K with DLSS (Performance), with MFG in the 5070's case and FG in the 4090's case.
Apply DLSS to the results below (the 4090 is at 145 and the 4070S at 92, for example), then multiply by MFG for the 5070 and by plain FG for the 4090; it seems perfectly doable, and the experience won't be far off for many games that aren't fast-paced, since the 5070 will have, for example, a 30 fps base with up to 120 fps with MFG, while the 4090 will have a 60 fps base with up to 120 fps with FG.
[Chart: TPU relative performance, RT, 1920x1080]
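A minimal sketch of the multiplication being described there; the 30 and 60 fps base figures are the hypothetical numbers from the post above, not measurements:

```python
# Upper-bound displayed fps with frame generation: every rendered frame is
# followed by N generated ones (N = 1 for FG/2x, N = 3 for MFG/4x).
def displayed_fps(base_fps: float, generated_per_rendered: int) -> float:
    return base_fps * (1 + generated_per_rendered)

print(displayed_fps(30, 3))  # hypothetical 5070 with 4x MFG -> 120.0
print(displayed_fps(60, 1))  # hypothetical 4090 with 2x FG  -> 120.0
```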
 
Actually, I find the statement in Nvidia's slide that the 5070 has 4090 performance plausible (apart from being misleading). [...]
If I apply Lossless Scaling 4X frame gen on my 7900 XTX, or even better FSR FG with AFMF2, which gives me 2 interpolated frames, is my GPU suddenly faster than a 4090? And can that be put into benchmarks? If not, then the 5070 is not equal to the 4090.
Don't get me wrong, I might jump on a 5080 myself, just for the RT perf and tbh just because, but bullshit marketing is bullshit marketing.
 
Even if a GPU can create a million fake frames in a second, we should never accept this as a performance metric.
Yes, it helps / is necessary for some PT games, but it's not a deal-breaking feature.
DLSS is the most important asset Nvidia has, not FG.
 
Actually, I find the statement in Nvidia's slide that the 5070 has 4090 performance plausible (apart from being misleading). [...]
That's exactly what's happening. The thing is that without knowing how MFG affects your gameplay experience, this isn't valid information. It's almost like saying that the 5070 runs games faster at 720p low than the 4090 does at 4K ultra. Of course it does, duh. ;)

First we had fake resolutions, then fake frames, now we have multiple fake frames, all introducing different sorts of graphical and latency issues, and people are pissing their pants in joy because it gives them MOAR POWAH!!! What happened to just enjoying games? :(
 
Outputs from FP4 and FP8 models are not equivalent; quit thinking like an AI tourist. People actually using these for work would know this is a false comparison.

Your beloved is adding FP4.
  • The first product in the AMD Instinct MI350 Series, the AMD Instinct MI350X accelerator, is based on the AMD CDNA 4 architecture and is expected to be available in 2025. It will use the same industry standard Universal Baseboard server design as other MI300 Series accelerators and will be built using advanced 3nm process technology, support the FP4 and FP6 AI datatypes and have up to 288 GB of HBM3E memory.
 
Your beloved is adding FP4. [...]
You're missing the point. Vya wasn't discussing FP4 support, but the fact that it was compared to FP8 performance in the presentation as if they were equal.
 
You're missing the point. Vya wasn't discussing FP4 support, but the fact that it was compared to FP8 performance in the presentation as if they were equal.
They did that to showcase a new feature. Anyone who's actually in the field would find this a nice thing.
Anyone in the field would also know that they could have been using 1-bit weights and the graph would still be pretty similar; the data type for that model is almost negligible since it's mostly memory-bound, and that's where the perf uplift came from.
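For what it's worth, here's the back-of-envelope estimate being argued about, assuming a purely memory-bound pass over a roughly Flux-sized model (both the model size and the memory-bound assumption are illustrative, not measured):

```python
# Roofline-style floor for a memory-bound pass: time >= weight bytes / bandwidth.
GB = 1e9
params = 12e9                                    # ~Flux.1-dev sized, for illustration
bandwidth = {"RTX 4090": 1008 * GB, "RTX 5090": 1792 * GB}
bytes_per_weight = {"FP8": 1.0, "FP4": 0.5}

for gpu, bw in bandwidth.items():
    for dtype, nbytes in bytes_per_weight.items():
        t_ms = params * nbytes / bw * 1e3
        print(f"{gpu} {dtype}: >= {t_ms:.1f} ms per pass if bandwidth-limited")
```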
 
they could have been using 1-bit weights and the graph would still be pretty similar, the data type for that model is almost negligible since ...
100% false. I don't know why you insist the data type is irrelevant and totally interchangeable, but it's total nonsense.
 
FP4 has a place, mostly because 80B parameter models produce better results (even heavily compressed / quantized to FP4) than 11B parameter models at FP16.

So running an 80B parameter model at 4-bit is better than running an 11B parameter model at 16-bit. But everyone would prefer to run the 80B parameter model at 16-bit if at all possible. Alas, the 80B parameter model needs too much RAM.
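The weight-memory math behind that trade-off (weights only; activations and KV cache excluded, and the 80B/11B sizes are just the examples from the post):

```python
# Weight footprint = parameter count * bits per weight / 8.
def weights_gb(params_billion: float, bits: int) -> float:
    return params_billion * 1e9 * bits / 8 / 1e9

print(f"80B @ FP16: {weights_gb(80, 16):.0f} GB")  # ~160 GB, no single card fits it
print(f"80B @ FP4 : {weights_gb(80, 4):.0f} GB")   # ~40 GB
print(f"11B @ FP16: {weights_gb(11, 16):.0f} GB")  # ~22 GB
```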

Comparing an FP8 benchmark on the old card and an FP4 benchmark on the new cards is 100% shenanigans. It's just false marketing, dare I say. In fact, because the FP4 model uses half the RAM, I bet that running FP4 on the old hardware would still be a dramatic speedup (it cuts memory bandwidth needs by 50%!). Even without any FP4 tensor units, you can just do the FP8 or FP16 multiply-and-accumulate, then downsample to FP4 for storage.
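A minimal sketch of that last idea (illustration only, not anything any vendor ships): keep the weights quantized to 4 bits in memory and dequantize to FP16 right before the multiply-accumulate, so the bandwidth saving applies even without native FP4 math units. The block size and scaling scheme here are arbitrary choices for the example:

```python
import numpy as np

# Symmetric block-wise 4-bit weight quantization, FP16 compute.
def quantize_4bit(w: np.ndarray, block: int = 64):
    flat = w.reshape(-1, block)
    scale = np.abs(flat).max(axis=1, keepdims=True) / 7.0   # signed 4-bit range: -8..7
    q = np.clip(np.round(flat / scale), -8, 7).astype(np.int8)
    return q, scale.astype(np.float16)

def dequantize_4bit(q: np.ndarray, scale: np.ndarray, shape) -> np.ndarray:
    return (q.astype(np.float16) * scale).reshape(shape)

rng = np.random.default_rng(0)
w = rng.standard_normal((256, 256)).astype(np.float16)   # toy weight matrix
x = rng.standard_normal((8, 256)).astype(np.float16)     # toy activations

q, s = quantize_4bit(w)                 # real kernels would pack 2 weights per byte
w_hat = dequantize_4bit(q, s, w.shape)  # back to FP16 for the multiply-accumulate
err = np.abs(x @ w_hat.T - x @ w.T).mean()
print(f"mean abs error introduced by 4-bit weights: {err:.3f}")
```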
 
Actually, I find the statement in Nvidia's slide that the 5070 has 4090 performance plausible (apart from being misleading). [...]
Of course it's plausible when you use MFG, but that's the point, if the 5070 with MFG matches the 4090 with FG, that literally makes the 4090 twice as fast in raw horsepower.

Something I'm puzzled by, both reading the comments here and on other platforms: using the data from TPU's latest GPU testing (https://www.techpowerup.com/review/gpu-test-system-update-for-2025/2.html), it looks like Nvidia will have the 5 (maybe 6, we have to see where the 5070 Ti lands) fastest cards in pure raster. Again, just raster, not even touching RT. Looking at RT, it has the top 9-11 fastest cards depending on resolution (assuming the 5070 Ti will be faster than the XTX, which is likely the case). And yet we are complaining that they are ignoring raw performance for AI...? They have cards from 2020 (LOL) that are faster in pure RT performance than AMD's latest and greatest.

So, do they have to have the 50 top fastest cards in both RT and raster to stop complaining, or what am I missing?
 
... The 4090 was really bottlenecked by memory bandwidth for games, and the 5080 has a bandwidth pretty similar to it, so the gap between those two may not be as big as the difference in SMs. ...
5080 has 75% memory bandwidth of 4090. I wouldn't call it "pretty similar". [EDIT: 5080 has 95% of 4090's memory bandwidth. My bad.]
Even though you made a valid point, this is (IMHO) still not enough for the 5080 to beat the 4090 in native raster performance.

[...] First we had fake resolutions, then fake frames, now we have multiple fake frames [...] What happened to just enjoying games? :(
Next is fake games! I've already mentioned that before, if you recall (AI-generated graphics, sounds, even scripts).
The gaming industry will face tremendous difficulties once gamers get their hands on tools to create AI-generated games for free.
I have no problem playing older games as long as there is multiplayer/co-op support (servers still running).
I doubt that games such as L4D2, Diablo, Borderlands, Red Alert, StarCraft 2, Battlefield BC2, Jagged Alliance 2, etc. will ever be beaten in quality. I've lost thousands of hours of my life to these games.
I'm not trying to be pessimistic here, just realistic. Just look at the quality of games today...

Of course it's plausible when you use MFG, but that's the point, if the 5070 with MFG matches the 4090 with FG, that literally makes the 4090 twice as fast in raw horsepower.
Exactly.
 
once gamers get their hands on tools to create AI generated games for free.
That stuff is still ages away from being close to usable, if it ever will be. The concern is misguided; it's these corporations who will try to fill games with AI-generated trash the moment it becomes feasible.
 
Next is fake games! I've already mentioned that before, if you recall (AI-generated graphics, sounds, even scripts). [...]
Oh there are lots of amazing games out there, believe me! Just look further than the overhyped, mass produced usual EA / Ubisoft AAA crap. The future is in indie and lesser known titles, I've been saying this for years. Stray and Hellblade: Senua's Sacrifice both made me cry. Abzu is also a great one to recommend. They run great on a Steam Deck, you don't even need high-end hardware for them. I have a few more on my list that I've yet to play (Lost Ember, Star Trucker, Bramble: The Mountain King just to name a few).
 
Oh there are lots of amazing games out there, believe me! Just look further than the overhyped, mass produced usual EA / Ubisoft AAA crap. [...]
Kena: Bridge of Spirits is pretty damn good too.
 
Kena: Bridge of Spirits is pretty damn good too.
How did I forget, that's on my list of games to play, too! I just installed it on my Deck a few days ago. :)
 
Actually, I find the statement in Nvidia's slide that the 5070 has 4090 performance plausible (apart from being misleading). [...]
The problem is that pathtracing/raytracing requires a lot of VRAM and the 5070 only has 12 GB of it. The 4070 with its 12GB runs out of VRAM when enabling even the Medium Path Tracing (Full Ray Tracing) setting in Indiana Jones and the Great Circle.
5080 has 75% memory bandwidth of 4090.
The 5080 has 960 GB/s and the 4090 has 1008 GB/s; that's 95%. But the 5080's 16 GB of VRAM is a problem. It feels like planned obsolescence, especially because pathtracing/raytracing requires so much VRAM, and I'd have to look up how long 16 GB is going to be enough even for pure raster.
 
5080 has 960 GB/s and 4090 has 1008 GB/s, that's 95%. [...]
Thanks for pointing that out. My bad, I must have been looking at another card's spec tab, most probably the 4080S's.
 
Next is fake games! I've already mentioned that before, if you recall (AI-generated graphics, sounds, even scripts). [...]
On that one I'm totally not worried at all. Gaming will survive. Compare it to young kids you give a piece of paper and a pencil: something nice will come out of it, sooner or later. Imagination never ends. It's also why PC gaming has never died, and never will. If it's not happening on Windows, it happens on Linux, and if GPUs are too expensive, we'll code for IGPs. Look at the Deck and the numerous indies: that reality is happening as we speak.

You see this even today: between the oceans of salted plastic AAA soup you can find freshwater rivers of indie games that show the world what gaming was all about to begin with: plain fun, immersion in a set of mechanics and systems and worlds, and taking you deep into it. Escapism also happens at that point, and not when you're playing the umpteenth triple-A with the same mechanics ad infinitum. That's just braindead entertainment, like watching TV. Also fine, but not what gaming is really about; just watch TV then, so you can actually do nothing. Gaming is, after all, engagement, being active, not passive. And that is also the biggest issue AI-driven gaming is going to face: how much is generated, and how much is left to player agency? How real and how far can it go?

It's paradoxical in a way, the same way AI-driven opponents are: the AI will always have a responsiveness and data advantage over the player, because the player responds to the end output while the AI has all the steps before it. So how do you fix this imbalance of power? By coding the AI, giving it limitations, making it slower... effectively negating or notably reducing its advantages. Is the end result going to be better than a scripted NPC? Doubtful; at best it is going to be different and perhaps more dynamic. But not too dynamic, because how then, as a player, can you ever get good, or better than the AI? Skill caps are not infinite. A good example of this problem has been out in the wild for a long time: automated matchmaking in games, based on skill rankings. If you want a dynamic AI, you will want a similar ranking system to categorize player skill and match the AI's skill to it. But it's not fun, and never surprising, unless you as a player are actively trying to get better. How much effort are you willing to put into that? Weren't you just trying to have fun?
 
If I apply Lossless Scaling 4X frame gen on my 7900 XTX, or even better FSR FG with AFMF2, which gives me 2 interpolated frames, is my GPU suddenly faster than a 4090? [...]
I agree, and that's why I mentioned it's misleading; with my post I just wanted to clarify in what terms Nvidia (probably) makes the comparison...

Of course it's plausible when you use MFG, but that's the point, if the 5070 with MFG matches the 4090 with FG, that literally makes the 4090 twice as fast in raw horsepower. [...]

So, do they have to have the 50 top fastest cards in both RT and raster to stop complaining, or what am I missing?
As I said above, I agree it's misleading; I just wanted to clarify in what terms Nvidia (probably) makes the comparison.
Regarding raster: only the 5090/4090/5080 are clearly faster than the RX 7900 XTX. The 4080S/4080/5070 Ti (probably) are similar, and in reality a little worse than the RX 7900 XTX in 4K raster. Don't go by the latest W1zzard data alone, because I'm not convinced the latest game selection is entirely representative; the difference is minor, of course, but in my view enough for the RX 7900 XTX not to lose to the 4080S at 4K, which is the intended resolution for these cards. And I don't consider Unreal's Lumen pure raster (although with the success and adoption of Unreal 5, maybe counting it is fair in general).

The problem is that pathtracing/raytracing requires a lot of VRAM and the 5070 only has 12 GB of it. The 4070 with its 12GB runs out of VRAM when enabling even the Medium Path Tracing (Full Ray Tracing) setting in Indiana Jones and the Great Circle.
I agree that 12 GB is a problem. I haven't seen the VRAM usage in Indiana Jones with path tracing at an FHD base resolution to check it, and the 5070 will also support neural texture compression, enabling similar-quality textures with a smaller memory footprint (or higher-quality textures within a given memory budget). But even if we suppose what you said isn't true today (at an FHD base), it certainly will be in the near future.
 
The problem is that pathtracing/raytracing requires a lot of VRAM and the 5070 only has 12 GB of it. [...]

[...] But the only 16GB VRAM of the 5080 is a problem. [...]
Maybe it will use less VRAM.
12 GB is still OK at that price; maybe not a great deal, but you can play just fine for a couple of years, or until the Super series comes out.
There are still lots of people using 8/10 GB cards with no issues.
 
Happy I took the bite on the 4080 Super; the 5000 series seems 'meh', as I gambled it would be. The rumoured VRAM boost never really materialised, and most of the gains come down to better AI (multi frame gen).

Will still get the better DLSS/DLAA, so I'm happy.

For what it's worth, my prediction is we're going to see another delayed price drop; I don't think these will sell that well compared to older gens. So I wouldn't buy now, and would hold off if your plan is to get a 5000 series card.
 