
NVIDIA 2025 International CES Keynote: Liveblog

Fair enough. Still wrong, imo, but as long as buyers are fine with it, who am I to argue.


Really? That's poor as well. I guess no one was really interested in that CPU. I don't even know which one you're talking about; it completely passed me by (although I admit, I only looked at GPUs this time around).
Just an example from Nvidia's Computex presentation regarding the B200


[Attached image: Blackwell compute slide from the Computex presentation]


The CPU in question was Strix Point (the 390 AI). But you know, it's AMD, so it's not trying to mislead us :D
 
Just an example from Nvidia's Computex presentation regarding the B200


Oh, but that clearly states the precision level right below the number. You don't need to see the small print for that.

The CPU in question was Strix Point (the 390 AI). But you know, it's AMD, so it's not trying to mislead us :D
Ah OK, fair point then. I'm happy to call out bullshit on any side (although I personally skipped that part entirely, as I only cared for GPUs this time around).
 
Oh, but that clearly states the precision level right below the number. You don't need to see the small print for that.


Ah OK, fair point then. I'm happy to call out bullshit on any side.
OK, there are 67 pages; I can find you one without the number. That's not the point, though. The point is that, according to Nvidia and the AI industry following them, FP4 support is a good thing, not a bad one, so the argument here is that Nvidia is trying to hide something good they have. Which, you understand, doesn't make sense.

Again, to clarify, I know nothing about AI; I just know the trend is to move to lower-precision calculations for whatever reason.
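
For context, the usual pitch for lower precision is simple arithmetic: fewer bits per weight means less memory and more math per clock. A rough, purely illustrative sketch, assuming a hypothetical 70-billion-parameter model (not any specific product):

# Illustrative only: weight memory footprint at different precisions
params = 70e9                                   # hypothetical 70B-parameter model
bytes_per_param = {"FP16": 2.0, "FP8": 1.0, "FP4": 0.5}
for fmt, b in bytes_per_param.items():
    print(f"{fmt}: {params * b / 1e9:.0f} GB of weights")
# FP16: 140 GB, FP8: 70 GB, FP4: 35 GB (hence the industry's interest in FP4)

Each halving of precision also roughly doubles the headline TFLOPS figure, which is why vendors like quoting FP4 numbers.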
 
OK, there are 67 pages; I can find you one without the number. That's not the point, though. The point is that, according to Nvidia and the AI industry following them, FP4 support is a good thing, not a bad one, so the argument here is that Nvidia is trying to hide something good they have. Which, you understand, doesn't make sense.
No, the argument is that they're trying to hide something they don't have, which is a meaningful improvement in raw performance.

You can find the AMD slide if you want, but I don't think you need to. I already agreed with you that it's a bullshit comparison just the same (and I'm not in the market for a Ryzen Pro Plus Uber Ultra Super AI Bla Bla Bla 390 XXXTXT AIAI Whateveritscalled anyway).

I don't side with any company. I side with fairness and honesty. :)
 
No, the argument is that they're trying to hide something they don't have, which is a meaningful improvement in raw performance.

You can find the AMD slide if you want, but I don't think you need to. I already agreed with you that it's a bullshit comparison just the same (and I'm not in the market for a Ryzen Pro Plus Uber Ultra Super AI Bla Bla Bla 390 XXXTXT AIAI Whateveritscalled anyway).

I don't side with any company. I side with fairness and honesty. :)
OK, I saw this on a YouTube video. Let's all agree that this would be fairer than what Jensen showed us, AMEN
[Attached image: "the more you buy" chart parody]
 
How the hell is that even possible? :eek:

I'm on a 6750 XT and only now starting to find it slightly lacking at maximum detail. I know my expectations aren't the greatest in the gaming world, but yours must be through the roof.

I’m also running a 4090 at 1440p. DLDSR is wonderful.
 
You failed at interpreting this bar chart.

The left-most bars indeed don't say DLSS, but they do say RT.
Raster performance might be at a complete standstill and only RT-on performance improved, going by this chart. It doesn't say a thing about raster perf.
I didn't mention RT at all; not sure why you brought it up. Frame generation is a DLSS feature that works with or without ray tracing.

I don't think I interpreted that chart incorrectly. The three games that show double the performance are comparing 4x framegen on the 5080 to 2x framegen on the 4080. That's not an apples-to-apples comparison, IMO.

The two games on the left are apples-to-apples. FC6 is running with no DLSS at all, so there are no fake frames. Plague Tale is running on the old DLSS 3, putting the 4080 and 5080 on equal ground when it comes to the number of faked frames from framegen.
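
To put rough numbers on why mixing framegen modes skews the bars (hypothetical figures, ignoring framegen overhead):

# Hypothetical: identical raw rendering, different frame-generation modes
raw_fps = 60                  # frames actually rendered per second (assumed)
fg_2x = raw_fps * 2           # DLSS 3 FG: one generated frame per rendered frame
mfg_4x = raw_fps * 4          # DLSS 4 MFG: three generated frames per rendered frame
print(fg_2x, mfg_4x)          # 120 vs 240 "fps" with zero raw performance difference

In practice framegen has overhead, so the real multipliers are lower, but the point stands: a 4x bar next to a 2x bar says very little about raw performance.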
 
Are we seriously suggesting that the whole AI industry bought into the B200 because they were misled and didn't understand what FP4 is?

It appears that someone could become very, very rich by showing the entire AI industry that they're doing it wrong.
 
so the argument here is that Nvidia is trying to hide something good they have. Which, you understand, doesn't make sense.
Everything here makes sense; you just pretend not to understand what this is about. Wherever that picture you posted is from, it clearly states the precision and TFLOPS at every data point, while the bar chart on their website about the 50 series doesn't. It's pretty clear-cut why one of these is far more disingenuous than the other.
 
I didn't mention RT at all; not sure why you brought it up. Frame generation is a DLSS feature that works with or without ray tracing.

I don't think I interpreted that chart incorrectly. The three games that show double the performance are comparing 4x framegen on the 5080 to 2x framegen on the 4080. That's not an apples-to-apples comparison, IMO.

The two games on the left are apples-to-apples. FC6 is running with no DLSS at all, so there are no fake frames. Plague Tale is running on the old DLSS 3, putting the 4080 and 5080 on equal ground when it comes to the number of faked frames from framegen.
OK, I'll spell it out.

The reason I think you misinterpreted this chart is that you might think there is actual raster / raw performance on tap here based on the left-most bar that only says RT... but this actual performance increase could also just be coming from improved RT handling in the game(s) in question. It does not necessarily speak to raster performance, which is the basis for all performance anyway - DLSS included.
 
Everything here makes sense; you just pretend not to understand what this is about. Wherever that picture you posted is from, it clearly states the precision and TFLOPS at every data point, while the bar chart on their website about the 50 series doesn't. It's pretty clear-cut why one of these is far more disingenuous than the other.
Yeah, outrageous. I've never seen you so mad about misleading charts since those 5900XT charts... oh, never mind.
 
"Uhm, whatever. What about AMD though ?
 
"Uhm, whatever. What about AMD though ?
It was kind of a question. Sorry, I'll ask plainly. Are you really against whatever you perceive as misleading marketing, or just against whatever you perceive as misleading marketing from a specific company? Because it really, really, really looks like the latter.
 
Following this logic, there would never be a generational uplift over the previous halo part, yet that has been the case for the past few generations. It is possible, and personally I'm optimistic about at least a match. We'll have to wait and see. After all, it's pretty much what AMD is proposing with the 9070 XT: a leaner and meaner chip that will go toe to toe with their previous-generation flagship using fewer raw hardware resources.
You will be so surprised and silent after you see that the 5080 is faster than the 4090.


That list has to be from a dream you had last night?

A 5080 Ti slower than a 4090?
Really? Are you joking?
The RTX 4000 series changed the pattern of the next gen's near-top SKU beating the previous gen's top SKU. That no longer works, dudes.

Nvidia widened the gap between SKUs. The RTX 4080 has only 60% of the 4090's compute units.
With the RTX 5000 series, the gap widens even further: the RTX 5080 will have only 50% of the RTX 5090's compute units and just 65% of the RTX 4090's.
The RTX 5080 is basically an RTX 4080S with 5% more compute units and slightly higher clocks.
There's no room for any significant performance boost this time (no significant node change).

Seriously, do your math, people. I'll say it once more: the RTX 5080 won't beat the RTX 4090 in native rendering (no DLSS or FG) because it lacks the hardware resources.
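
A quick sanity check of those ratios, using the publicly listed CUDA core counts as a stand-in for compute units:

# Published CUDA core counts (cores used here as a proxy for SM/compute-unit ratios)
cores = {"4080": 9728, "4080S": 10240, "4090": 16384, "5080": 10752, "5090": 21760}
print(cores["4080"] / cores["4090"])    # ~0.59 -> roughly 60% of the 4090
print(cores["5080"] / cores["5090"])    # ~0.49 -> roughly 50% of the 5090
print(cores["5080"] / cores["4090"])    # ~0.66 -> roughly 65% of the 4090
print(cores["5080"] / cores["4080S"])   # ~1.05 -> about 5% more than the 4080S
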
We should come back to this discussion after reviews of the 5080 are up. I have no problem admitting I was wrong when I am wrong.

As for the RX 9070 XT, no one is really expecting it to go toe to toe with the RX 7900 XTX; with the RX 7900 XT, maybe. (I'm not taking RT into account here.)
AMD clearly stated that they want to focus on making a mainstream card for the masses with vastly improved RT performance over the previous generation.
I personally estimate the RX 9070 XT will land about 5% below the RX 7900 XT but beat it in RT. Unfortunately, I don't give a f* about RT right now.
RT may become a reasonable thing once it becomes less of a burden on hardware, meaning that turning RT up degrades performance by no more than 15-20%.
Anything above that is just too much of a penalty. My personal expectation is that AMD will move with RDNA4 from a -60% RT performance hit to "just" a 30-35% hit.
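
In fps terms (illustrative numbers only, not measurements):

# Illustrative: what an RT performance hit means against a 100 fps raster baseline
raster_fps = 100
print(f"{raster_fps * (1 - 0.60):.0f} fps")   # -60% hit -> 40 fps
print(f"{raster_fps * (1 - 0.35):.0f} fps")   # -35% hit -> 65 fps
print(f"{raster_fps * (1 - 0.20):.0f} fps")   # -20% hit -> 80 fps, what I'd call acceptable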

EDIT: Some fragments about RX 9070XT performance here and here. If this comes true, then ... holy shit ... I guess.
 
Are you really against whatever you perceive as misleading marketing, or just against whatever you perceive as misleading marketing from a specific company? Because it really, really, really looks like the latter.
In case you haven't noticed, this thread is about Nvidia and what they announced. But say I only care when Nvidia lies and not when AMD lies; sure, whatever. Logically speaking, wouldn't you be more concerned about the marketing lies of the company that you and over 90% of all consumers buy from, according to the stats?
 
Whatever someone thinks about Nvidia's presentation, as bad as you think the cards they showed us are, they were good enough to make their competitor sound the retreat. But sure, let's once again be angry because Nvidia is lying to us about the 5070 being a 4090, while completely ignoring the fact that the 5070 is good enough to make AMD change whatever they had planned. Instead, they wasted their time comparing a 17 W chip to a 120 W chip, because hey, that's in no way misleading.
 
Far Cry 6 and Plague Tale Requiem are examples of the raw performance improvement because they clearly don't support DLSS4 MFG fakery.

[Attached image: Nvidia's RTX 50-series comparison chart]

That 30% improvement there is likely what we can really expect in the overwhelming majority of games. The 5080 has 15% more compute (cores × clocks) and sucks down more power despite being on a newer, more efficient node, so the other 15% likely comes from the 4080 being sandbagged by power limits.
W1z says the 4080S isn't particularly power limited; going from 320 W to 355 W only improved performance by about 1%.
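
For reference, a rough check of the "15% more compute" figure above, assuming the published core counts and boost clocks (shipping boost behavior will vary):

# Back-of-the-envelope cores x clocks comparison, based on published specs
cores_5080, boost_5080 = 10752, 2.62    # boost clock in GHz
cores_4080, boost_4080 = 9728, 2.51     # boost clock in GHz
ratio = (cores_5080 * boost_5080) / (cores_4080 * boost_4080)
print(f"~{(ratio - 1) * 100:.0f}% more raw compute")   # ~15%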
 
Whatever someone thinks about Nvidia's presentation, as bad as you think the cards they showed us are, they were good enough to make their competitor sound the retreat. But sure, let's once again be angry because Nvidia is lying to us about the 5070 being a 4090, while completely ignoring the fact that the 5070 is good enough to make AMD change whatever they had planned. Instead, they wasted their time comparing a 17 W chip to a 120 W chip, because hey, that's in no way misleading.
Even more "but what about AMD" lol
 
All I am saying is, this is comical:


[Attached images: RTX 3070 Ti and RTX 5070 spec sheets, side by side]

But yeah, of course the clock speeds/cache and so on will make the RTX 5070 faster, but just how much faster is what I am wondering. The 5070 will obliterate it in AI workloads, that is for sure.

When I owned my 3070 Ti, I used "Lossless Scaling", which you can buy from Steam. You could get impressive FPS numbers if you enabled frame generation, but the 8 GB of VRAM never allowed it to be used for long periods. Anyway, erm, I am glad I got rid of my RTX 3070 Ti, and hell no, I would not buy an RTX 5070, especially with 12 GB of VRAM. Why? Because if you start using those AI features, you will run into that VRAM limit very fast and wish you still had a GTX 1070, because performance will drop way below that when you hit the VRAM ceiling (at 1440p and above).

I don't think I would take one for free; it would be just too much of a pain in the ass to manage in many scenarios instead of just enjoying playing games.
If you see a car at a dealer, do you start raging and yelling "I won't buy that car!!"?
Nope, you won't; you just look at another car you like more.

You can buy the 5070, but you don't have to.
 
If you see a car at a dealer, do you start raging and yelling "I won't buy that car!!"?
Nope, you won't; you just look at another car you like more.

You can buy the 5070, but you don't have to.
I want nvidia to destroy itself and only make raster products and I want them to sell it at a loss and I am owed this because mUh G4m1ng PC!!!!! also make all women hot and naked in games pls MASTER RACE!!!!!!!!!!!!!!
 
You failed at interpreting this bar chart.

The left-most bars indeed don't say DLSS, but they do say RT.
Raster performance might be at a complete standstill and only RT-on performance improved, going by this chart. It doesn't say a thing about raster perf.
4080, FC6 4K: 105.5 fps; FC6 + RT 4K: 91.8 fps. Even if the 50 series made RT literally free, it would also have to improve on raster.
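
In other words, using the 4080 numbers quoted above:

# Best case: RT becomes free, so the RT-on result can at most reach the raster result
raster, rt_on = 105.5, 91.8
print(f"free RT recovers at most ~{(raster / rt_on - 1) * 100:.0f}%")   # ~15%

Anything beyond that ~15% in the RT-on bar has to come from raster gains.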
 
OK, I'll spell it out.

The reason I think you misinterpreted this chart is that you might think there is actual raster / raw performance on tap here based on the left-most bar that only says RT... but this actual performance increase could also just be coming from improved RT handling in the game(s) in question. It does not necessarily speak to raster performance, which is the basis for all performance anyway - DLSS included.
Ah, no, that's not what I'm thinking.
Nvidia has been trying to sweep raster performance under the rug for two whole years already, hiding raw performance stagnation behind software tricks. Remember, the only numbers we have right now are Nvidia's, so raster performance numbers are intentionally absent.

The 4060 is barely more performant than the 3060 in raw raster performance if you disable all of the RT/DLSS-focused features, and the same is true of the 4060 Ti vs the 3060 Ti. Actual raw raster performance gains this generation are likely to come from simply having more cores and higher clocks at higher power consumption, and that's only obviously true for the 5090. The 5080 gets about 10% more cores than the 4080, plus more power to clock them higher, so if there's any raw raster performance gain once you take away all the RT/DLSS improvements, it's only going to be a direct result of those extra cores and clocks, IMO.
 
Honestly, the 5090 is an unmatched and uncontested halo product; the 5080 is really what they want high-end gamers to buy. I feel Blackwell has just about arrived at the performance level, memory capacity, and overall engineering quality where it's a GPU I can own for many years. Ada was not quite there yet.

Yep, the 5090 stands on an entirely different plane, deliberately positioned with a vast hardware advantage over the 5080. In comparison, the RTX 5080 feels more akin to an upgraded 4070-class card rather than a true peer to the 5090. It's a tough one to stomach at $1000. The real story will unfold through benchmarks... I suspect it will show only a modest performance improvement, likely positioning it somewhere between the 4080S and the 4090. I suspect a 5080 Ti will eventually be triggered to match/outperform the 4090. It'll undoubtedly be impressive in terms of hardware and performance, but price? Nah! Who knows, a 5080 Ti for $1200 and perhaps a 5080 Ti SUPER for $1500... so much 80-territory to play with. I thought last gen's performance segmentation was naughty, with Lovelace forgetting to give the 80 series the love it deserved, but it seems the Blackwell successor has dug the well even deeper. Outside of artificial embellishments, I hope I'm proven wrong, as I don't like the idea of settling for the 5070 Ti.
 