Monday, January 6th 2025

NVIDIA 2025 International CES Keynote: Liveblog

NVIDIA kicks off the 2025 International CES with a bang. The company is expected to debut its new GeForce "Blackwell" RTX 5000 generation of gaming graphics cards, and to introduce new technologies such as neural rendering and DLSS 4. The company is also expected to highlight a new piece of silicon for Windows on Arm laptops, showcase the next generation of its Drive FSD hardware, and probably even talk about its next-generation "Blackwell Ultra" AI GPU, and if we're lucky, even namedrop "Rubin." Join us as we liveblog CEO Jensen Huang's keynote address.

02:22 UTC: The show is finally underway!
02:35 UTC: CTA president Gary Shapiro kicks off the show and introduces Jensen Huang.
02:46 UTC: "Tokens are the building blocks of AI"

02:46 UTC: "Do you like my jacket?"
02:47 UTC: NVIDIA recounts its progress all the way back to NV1 and UDA.
02:48 UTC: "CUDA was difficult to explain, it took 6 years to get the industry to like it"
02:50 UTC: "AI is coming home to GeForce". NVIDIA teases neural material and neural rendering. Rendered on "Blackwell"
02:55 UTC: Every single pixel is ray traced, thanks to AI rendering.
02:55 UTC: Here it is, the GeForce RTX 5090.
03:20 UTC: At least someone is pushing the limits for GPUs.
03:22 UTC: Incredible board design.
03:22 UTC: RTX 5070 matches RTX 4090 at $550.
03:24 UTC: Here's the lineup, available from January.
03:24 UTC: RTX 5070 Laptop starts at $1299.
03:24 UTC: "The future of computer graphics is neural rendering"
03:25 UTC: Laptops powered by RTX Blackwell; starting prices:
03:26 UTC: AI has come back to power GeForce.
03:28 UTC: Supposedly the Grace Blackwell NVLink72.
03:28 UTC: 1.4 ExaFLOPS.
03:32 UTC: NVIDIA very sneakily teased a Windows AI PC chip.

03:35 UTC: NVIDIA is teaching generative AI basic physics. NVIDIA Cosmos, a world foundation model.
03:41 UTC: NVIDIA Cosmos is trained on 20 million hours of video.

03:43 UTC: Cosmos is open-licensed on GitHub.

03:52 UTC: NVIDIA onboards Toyota; its next-generation EVs will use NVIDIA hardware for full self-driving.

03:53 UTC: NVIDIA unveils Thor Blackwell robotics processor.
03:53 UTC: Thor has 20x the processing capability of Orin.

03:54 UTC: CUDA is now functionally safe, thanks to automotive certifications.
04:01 UTC: NVIDIA brought a dozen humanoid robots to the stage.

04:07 UTC: Project DIGITS is a shrunk-down AI supercomputer.
04:08 UTC: NVIDIA GB110 "Grace-Blackwell" chip powers DIGITS.

446 Comments on NVIDIA 2025 International CES Keynote: Liveblog

#401
JustBenching
AusWolfOh, but that clearly states the precision level right below the number. You don't need to see the small print for that.


Ah OK, fair point then. I'm happy to call out bullshit on any side.
OK, there are 67 pages - I can find you one without the number. But that's not the point. The point is, according to Nvidia and the AI industry that's following them, FP4 support is a good thing, not a bad one, so the argument here is that Nvidia is trying to hide something good they have. Which, you understand, doesn't make sense.

Again, to clarify, I know nothing about AI, I just know the trend is to move to lower precision calculations for whatever reason.
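For what it's worth, the "whatever reason" is mostly memory and throughput: halving the precision halves the bytes per parameter, and hardware that supports the smaller format can roughly double peak tensor throughput. A minimal sketch of the VRAM side, assuming a hypothetical 12-billion-parameter model:

```python
# Rough sketch: VRAM needed just for the weights at each precision.
# The 12B parameter count is a made-up example, not any specific model.
params = 12e9

for fmt, bits in [("FP16", 16), ("FP8", 8), ("FP4", 4)]:
    gib = params * bits / 8 / 2**30  # bits -> bytes -> GiB
    print(f"{fmt}: {gib:.1f} GiB")
# FP16: 22.4 GiB, FP8: 11.2 GiB, FP4: 5.6 GiB
```

Which is also why vendors like quoting FLOPS at the lowest precision they support: the number gets bigger every time the format gets smaller.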
Posted on Reply
#402
AusWolf
JustBenchingOK, there are 67 pages - I can find you one without the number. But that's not the point. The point is, according to Nvidia and the AI industry that's following them, FP4 support is a good thing, not a bad one, so the argument here is that Nvidia is trying to hide something good they have. Which, you understand, doesn't make sense.
No, the argument is that they're trying to hide something they don't have, which is a meaningful improvement in raw performance.

You can find the AMD slide if you want, but I don't think you need to. I already agreed with you that it's a bullshit comparison just the same (and I'm not in search of a Ryzen Pro Plus Uber Ultra Super AI Bla Bla Bla 390 XXXTXT AIAI Whateveritscalled anyway).

I don't side with any company. I side with fairness and honesty. :)
Posted on Reply
#403
JustBenching
AusWolfNo, the argument is that they're trying to hide something they don't have, which is a meaningful improvement in raw performance.

You can find the AMD slide if you want, but I don't think you need to. I already agreed with you that it's a bullshit comparison just the same (and I'm not in search of a Ryzen Pro Plus Uber Ultra Super AI Bla Bla Bla 390 XXXTXT AIAI Whateveritscalled anyway).

I don't side with any company. I side with fairness and honesty. :)
OK, saw this on a YouTube video - let's all agree that this would be fairer than what Jensen showed us, AMEN
Posted on Reply
#404
AusWolf
JustBenchingOK, saw this on a YouTube video - let's all agree that this would be fairer than what Jensen showed us, AMEN
Man, this will be a classic! :D
Posted on Reply
#405
Visible Noise
AusWolfHow the hell is that even possible? :eek:

I'm on a 6750 XT and only now starting to find it slightly lacking at maximum detail. I know my expectations aren't the greatest in the gaming world, but yours must be through the roof.
I'm also running a 4090 at 1440p. DLDSR is wonderful.
Posted on Reply
#406
Chrispy_
Vayra86You failed at interpreting this bar chart.

The left-most bars indeed don't say DLSS, but they do say RT.
Raster performance might be at a complete standstill with only RT ON improved, going by this chart. It does not say a thing about raster perf.
I didn't mention RT at all, not sure why you brought it up? Frame-generation is a DLSS feature that works with or without raytracing.

I don't think I interpreted that chart incorrectly. The three games that show double the performance are comparing 4x framegen on the 5080 to 2x framegen on the 4080. That's not an apples to apples comparison, IMO.

The two games on the left are apples-to-apples. FC6 is running with no DLSS at all, so there are no fake frames. Plague Tale is running on the old DLSS 3, putting the 4080 and 5080 on equal ground when it comes to the number of faked frames using framegen.
Posted on Reply
#407
Visible Noise
JustBenchingAre we seriously suggesting that the whole AI industry bought into B200 'cause they were misled - they didn't understand what FP4 is?
It appears that someone should become very very rich by showing the entire AI industry that they are doing it wrong.
Posted on Reply
#408
Vya Domus
JustBenchingso the argument here is that Nvidia is trying to hide something good they have. Which, you understand, doesn't make sense.
Everything here makes sense; you just pretend not to understand what this is about. Wherever that picture you posted is from, it clearly states the precision and TFLOPS at every data point. That bar chart on their website about the 50 series doesn't, so it's pretty clear-cut why one of these is far more disingenuous than the other.
Posted on Reply
#409
Vayra86
Chrispy_I didn't mention RT at all, not sure why you brought it up? Frame-generation is a DLSS feature that works with or without raytracing.

I don't think I interpreted that chart incorrectly. The three games that show double the performance are comparing 4x framegen on the 5080 to 2x framegen on the 4080. That's not an apples to apples comparison, IMO.

The two games on the left are apples-to-apples. FC6 is running with no DLSS at all, so there are no fake frames. Plague Tale is running on the old DLSS 3, putting the 4080 and 5080 on equal ground when it comes to the number of faked frames using framegen.
OK, I'll spell it out.

The reason I think you misinterpreted this chart is that you might think there is actual raster / raw performance on tap here, based on the left-most bar that only says RT... but this actual performance increase could also just be coming from improved RT handling in the game(s) in question. It does not necessarily speak of raster performance, which is the basis for all performance anyway - DLSS included.
Posted on Reply
#410
JustBenching
Vya DomusEverything here makes sense; you just pretend not to understand what this is about. Wherever that picture you posted is from, it clearly states the precision and TFLOPS at every data point. That bar chart on their website about the 50 series doesn't, so it's pretty clear-cut why one of these is far more disingenuous than the other.
Yeah, outrageous. I've never seen you this mad about misleading charts since those 5900XT charts... oh, never mind.
Posted on Reply
#411
Vya Domus
"Uhm, whatever. What about AMD though ?
Posted on Reply
#412
JustBenching
Vya Domus"Uhm, whatever. What about AMD though ?
It was kind of a question. Sorry, I'll ask plainly: are you really against whatever you conceive as misleading marketing, or just against whatever you conceive as misleading marketing from a specific company? 'Cause it really, really, really, really looks like the latter.
Posted on Reply
#413
LittleBro
Dr. DroFollowing this logic, there would never be a generational uplift over the previous halo part. This has been the case for the past few generations. It is possible, but personally I'm optimistic on at least a match. We'll have to wait and see. After all, it's pretty much what AMD is proposing with the 9070 XT. A leaner and meaner chip that will go toe to toe with their previous generation flagship with less raw hardware resources.
DaworaU will be so suprised and silent afther u see 5080 is faster than 4090


That list have to be u dream from last night?

5080Ti slower than 4090
Realy? u joking?
The RTX 4000 series broke the pattern of the next-gen near-top SKU beating the previous-gen top SKU. That no longer works, dudes.

Nvidia widened the gap between SKUs. The RTX 4080 has only 60% of the 4090's compute units.
With the RTX 5000 series, the gap widens even further: the RTX 5080 has only 50% of the RTX 5090's compute units and just 65% of the RTX 4090's.
www.techpowerup.com/gpu-specs/geforce-rtx-4080-super.c4182
www.techpowerup.com/gpu-specs/geforce-rtx-5080.c4217
The RTX 5080 is basically an RTX 4080S with 5% more compute units and slightly higher clocks.
There's no room for any significant performance boost this time (no significant node change).

Seriously, do your math, people. I'll say it once more: the RTX 5080 won't beat the RTX 4090 in native rendering (no DLSS and no FG) because it lacks the hardware resources.
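To spell out that math, a quick Python sketch using the shader counts from the TPU database pages linked above:

```python
# Shader (CUDA core) counts per TechPowerUp's GPU database.
shaders = {
    "RTX 4080S": 10240,
    "RTX 4090": 16384,
    "RTX 5080": 10752,
    "RTX 5090": 21760,
}

def ratio(a, b):
    return shaders[a] / shaders[b]

print(f"{ratio('RTX 5080', 'RTX 5090'):.0%}")   # ~49% -> "only 50%"
print(f"{ratio('RTX 5080', 'RTX 4090'):.0%}")   # ~66% -> "just 65%"
print(f"{ratio('RTX 5080', 'RTX 4080S'):.0%}")  # ~105% -> "5% more"
```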
We should come back to this discussion after the 5080 reviews are up. I have no problem admitting I was wrong WHEN I was wrong.

As for the RX 9070 XT, no one is really expecting it to go toe to toe with the RX 7900 XTX - with the RX 7900 XT, maybe. (I'm not taking RT into account here.)
AMD clearly stated that they want to focus on making a mainstream card for the masses with vastly improved RT performance over the previous generation.
I personally estimate the RX 9070 XT to land 5% below the RX 7900 XT while beating it in RT. Unfortunately, I don't give a f* about RT now.
RT may become a reasonable thing when it becomes less burdensome on hardware, meaning ramping it up degrades performance by only 15-20%.
Anything above that is just too much of a penalty. My personal expectation is that RDNA4 will move AMD from -60% RT performance degradation to "just" 30-35%.

EDIT: Some fragments about RX 9070XT performance here and here. If this comes true, then ... holy shit ... I guess.
Posted on Reply
#414
Vya Domus
JustBenchingAre you really against whatever you conceive as misleading marketing, or just against whatever you conceive as misleading marketing from a specific company? 'Cause it really, really, really, really looks like the latter.
In case you haven't noticed, this thread is about Nvidia and what they announced. But say I only care when Nvidia lies and not when AMD lies - sure, whatever. Logically speaking, wouldn't you be more concerned about the marketing lies of the company that you and over 90% of all consumers buy from, according to the stats?
Posted on Reply
#415
JustBenching
Whatever someone thinks about Nvidia's presentation, as bad as you think the cards they showed us are, they were good enough to make their competitor sound the retreat. But sure, let's once again be angry because Nvidia is lying to us about the 5070 being a 4090, while completely ignoring the fact that the 5070 is good enough to make AMD change whatever they had planned. Instead, they wasted their time comparing a 17 W chip to a 120 W chip, because hey, that in no way is misleading.
Posted on Reply
#416
Scircura
Chrispy_Far Cry 6 and Plague Tale Requiem are examples of the raw performance improvement because they clearly don't support DLSS4 MFG fakery.



That 30% improvement there is likely what we can really expect in the overwhelming majority of games. The 5080 has 15% more compute (cores*clocks) and sucks down more power despite being a newer, more efficient node, so the other 15% likely comes from the 4080 being sandbagged by power limits.
W1z says the 4080S is not particularly power limited: 320 W -> 355 W only improved performance by 1%.
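For reference, the "15% more compute" figure in the quote checks out if you take shader counts and boost clocks as listed in TPU's database (treat the clocks as assumptions until reviews land):

```python
# cores * boost clock (GHz) as a crude compute proxy.
cards = {
    "RTX 4080": (9728, 2.51),
    "RTX 5080": (10752, 2.62),
}
compute = {name: cores * ghz for name, (cores, ghz) in cards.items()}
print(compute["RTX 5080"] / compute["RTX 4080"])  # ~1.15, i.e. +15%
```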
Posted on Reply
#417
Dawora
Legacy-ZAAll I am saying is, this is comical:


[spec sheet comparison: RTX 3070 Ti vs. RTX 5070]


But yeah, of course the clock speeds/cache and so on will make the RTX 5070 faster, but just how much faster is what I am wondering. The 5070 will obliterate it in A.I. workloads, that is for sure.

When I owned my 3070 Ti, I used "Lossless Scaling", which you can buy from Steam; you could also get impressive FPS numbers if you enabled frame generation, but the 8 GB VRAM never allowed it to be used for long periods. Anyways, erm, I am glad I got rid of my RTX 3070 Ti, and hell no, I would not buy an RTX 5070, especially with 12 GB VRAM. Why? Because if you start using those A.I. features, you will run into that VRAM limit very fast, and you will wish you still had a GTX 1070, because performance will drop way below that when you hit that VRAM ceiling (at 1440p and above).

I don't think I would take one for free; it would be just too much of a pain in the ass in many scenarios to manage, instead of just enjoying playing games.
If u see a car in a dealer.. are u starting to rage and yelling " i wont buy that car!!"
Nope u wont, u just look other car u like more.

U can buy 5070 but u dont have to.
Posted on Reply
#418
SOAREVERSOR
DaworaIf u see a car in a dealer.. are u starting to rage and yelling " i wont buy that car!!"
Nope u wont, u just look other car u like more.

U can buy 5070 but u dont have to.
I want nvidia to destroy itself and only make raster products and I want them to sell it at a loss and I am owed this because mUh G4m1ng PC!!!!! also make all women hot and naked in games pls MASTER RACE!!!!!!!!!!!!!!
Posted on Reply
#419
Scircura
Vayra86You failed at interpreting this bar chart.

The left-most bars indeed don't say DLSS, but they do say RT.
Raster performance might be at a complete standstill with only RT ON improved, going by this chart. It does not say a thing about raster perf.
4080 FC6 4K: 105.5 fps; FC6 + RT 4K: 91.8 fps. Even if the 50 series made RT literally free, it would also have to improve on raster.
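Spelled out, with the 4080 numbers from TPU's testing quoted above and the ~1.3x factor read off NVIDIA's own chart (so treat that factor as an assumption):

```python
fc6_4080_raster = 105.5  # 4080, FC6 4K, RT off (fps)
fc6_4080_rt     = 91.8   # 4080, FC6 4K, RT on (fps)
nvidia_factor   = 1.3    # uplift claimed by NVIDIA's 5080 chart

fc6_5080_rt = fc6_4080_rt * nvidia_factor  # ~119 fps with RT on
print(fc6_5080_rt / fc6_4080_raster)       # ~1.13
# A card's RT-on fps can't exceed its own raster fps, so even with
# "free" RT the 5080's raster would have to be ~13% above the 4080's.
```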
Posted on Reply
#420
SOAREVERSOR
Scircura4080 FC6 4K: 105.5 fps; FC6 + RT 4K: 91.8 fps. Even if the 50 series made RT literally free, it would also have to improve on raster.
Raster is a dodo now. The entire industry is going the AI way of rendering. That's why AMD is fucked until they pick up in AI.
Posted on Reply
#421
Chrispy_
Vayra86OK, I'll spell it out.

The reason I think you misinterpreted this chart is that you might think there is actual raster / raw performance on tap here, based on the left-most bar that only says RT... but this actual performance increase could also just be coming from improved RT handling in the game(s) in question. It does not necessarily speak of raster performance, which is the basis for all performance anyway - DLSS included.
Ah, no, that's not what I'm thinking.
Nvidia has been trying to sweep raster performance under the rug for two whole years already, hiding raw performance stagnation behind software tricks. Remember, the only numbers we have right now are Nvidia's, so raster performance numbers are intentionally absent.

The 4060 is barely more performant than the 3060 in raw raster performance if you disable all of the RT/DLSS-focused features, and the same is true of the 4060Ti vs 3060Ti. Actual raw raster performance gains this generation are likely to come from simply having more cores and higher clocks at higher power consumption, and that's only obviously true for the 5090. The 5080 gets 15% more cores than the 4080, and more power to clock them higher, so if there's any raw raster performance gain once you take away all the RT/DLSS improvements then it's only going to be a direct result of these extra cores and clocks, IMO.
Posted on Reply
#422
wheresmycar
Dr. DroHonestly, the 5090 is an unmatched and uncontested halo product, the 5080 is really what they want high-end gamers to buy. I feel Blackwell just about arrived at the performance level, memory capacity and overall engineering quality that I feel like it is a GPU I can own for many years. Ada was not quite there yet.
Yep, the 5090 stands on an entirely different plane, deliberately positioned with a VASTLY significant hardware advantage over the 5080. In comparison, the RTX 5080 feels more akin to an upgraded 4070-class card than a true peer to the 5090. It's a tough one to stomach at $1,000. The real story will unfold through benchmarks... I suspect they will show only a modest performance improvement, likely positioning it somewhere between the 4080S and the 4090. I suspect a 5080 Ti will eventually be triggered to match/outperform the 4090. It'll undoubtedly be impressive in terms of hardware and performance, but price? Nah! Who knows - a 5080 Ti for $1,200, and perhaps a 5080 Ti SUPER for $1,500... so much 80-territory to play with. I thought last gen's performance segmentation was naughty, with Lovelace forgetting to give the 80-series the love it deserved, but it seems the Blackwell successor has dug the well even deeper. Outside of artificial embellishments, I hope I'm proven wrong, as I don't like the idea of settling for the 5070 Ti.
Posted on Reply
#423
notoperable
Nvidia introduces: NVIDIA INDENTURE, resell value cracking and zeroing your credit card NOW!
igormpDid you folks like his jacket?
Snake, you remember what Paulie said about snakes in The Sopranos? :]
Posted on Reply
#424
igormp
Vya DomusAt least with FG you know they're running the same game
By your own previous argument, it's running in a different way and should not be comparable.
Vya DomusI don't think it's even about fairness; it's just a shitty way of presenting those performance metrics for the tiny percentage of people who would even care. Wouldn't you want to know that this now supports a smaller data type and that you can now run smaller models?
The tiny percentage of people that care about it should be aware that those performance increases come from the extra bandwidth.
Vya DomusBe honest: when you saw that, did you assume it's the same data type, or did you magically understand there must be more to it before squinting your eyes at the footnotes (if you ever did before someone else pointed it out for you)? I for one admit I missed it before I saw someone else talk about it.
Being 100% honest, I hadn't even seen that Flux was in that graph at first lol
I just saw one game, then another one with DLSS stuff on top, and another one like that, and just stopped looking, because I don't even care about games, and the comparison between DLSS stuff is kinda moot, as everyone in here already agreed.
I only noticed it when someone else brought up the FP4 vs FP8 stuff earlier; I was confused about what they were talking about until I noticed Flux was in there (and only for the 4090 vs 5090 comparison, not the others), so I was already aware of it by the time I saw it.
AusWolfDoes the AI industry even buy 5070/5080 level cards? I mean, home users getting their feet wet in AI, sure, but the wealthiest AI corps need a lot more oomph, don't they? That's who uber expensive professional cards are for. To them, everything you say about the 5070/5080 is meaningless.
5070s I don't think so, but 5080s would still be useful. If your model fits within 16GB, you can use these as cheap-yet-powerful inference stations.
Not a big company doing colocation in a big DC, but plenty of startups do something like that. Just take a look at vast.ai and you'll notice systems like this:

8x 4070S Ti on an Epyc node with 258GB of RAM.
JustBenchingNo, they are buying B200, which also used FP4 claims (vs FP8 for Hopper) in their marketing slides.


Look, the thing is, there was another company at CES that compared their 120 W CPU vs the competition's 17 W chip. With no small letters, btw. No one is talking about that being misleading, but we have 50 different threads, 20 pages long, complaining about Nvidia. Makes you wonder.
B200s would be for the big techs, most others either buy smaller models (or even consumer GPUs), or just subscribe to some cloud provider.
AusWolfFair enough. Still wrong, imo, but as long as buyers are fine with it, who am I to argue.
They're more than fine with it; it means lots of VRAM savings and more perf.
AusWolfReally? That's poor as well. I guess no one was really interested in that CPU. I don't even know which one you're talking about, it completely missed the spot with me (although I admit, I only looked for GPUs this time around).
That was AMD doing comparisons of Strix Halo vs Lunar Lake (and it was the shitty 30 W Lunar Lake model, instead of the regular 17 W one).
LittleBroThe RTX 4000 series broke the pattern of the next-gen near-top SKU beating the previous-gen top SKU. That no longer works, dudes.

Nvidia widened the gap between SKUs. The RTX 4080 has only 60% of the 4090's compute units.
With the RTX 5000 series, the gap widens even further: the RTX 5080 has only 50% of the RTX 5090's compute units and just 65% of the RTX 4090's.
www.techpowerup.com/gpu-specs/geforce-rtx-4080-super.c4182
www.techpowerup.com/gpu-specs/geforce-rtx-5080.c4217
The RTX 5080 is basically an RTX 4080S with 5% more compute units and slightly higher clocks.
There's no room for any significant performance boost this time (no significant node change).

Seriously, do your math, people. I'll say it once more: the RTX 5080 won't beat the RTX 4090 in native rendering (no DLSS and no FG) because it lacks the hardware resources.
We should come back to this discussion after the 5080 reviews are up. I have no problem admitting I was wrong WHEN I was wrong.

As for the RX 9070 XT, no one is really expecting it to go toe to toe with the RX 7900 XTX - with the RX 7900 XT, maybe. (I'm not taking RT into account here.)
AMD clearly stated that they want to focus on making a mainstream card for the masses with vastly improved RT performance over the previous generation.
I personally estimate the RX 9070 XT to land 5% below the RX 7900 XT while beating it in RT. Unfortunately, I don't give a f* about RT now.
RT may become a reasonable thing when it becomes less burdensome on hardware, meaning ramping it up degrades performance by only 15-20%.
Anything above that is just too much. My personal expectation is that RDNA4 will move AMD from -60% RT performance degradation to 30-35%.
To be honest, even though the 4090 had almost 70% more cores, that doesn't mean it had 70% more performance in games, in the same way the 5090 won't have 100% higher perf than the 5080 in this scenario.
The 4090 was really bottlenecked by memory bandwidth for games, and the 5080 has a bandwidth pretty similar to it, so the gap between those two may not be as big as the difference in SMs.
Will it be faster or equal in games? I don't know; reviews should reveal that once they're available, but I wouldn't be surprised if it is (in the same sense that I wouldn't be surprised if it isn't). Game perf is not really linear with either memory bandwidth or compute units, so it's hard to estimate anything.
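The "pretty similar bandwidth" point is easy to verify from the published memory specs (bus widths and per-pin data rates per the TPU database):

```python
# Peak bandwidth = bus width in bits / 8 * per-pin data rate in Gbps.
def bandwidth(bus_bits, gbps):
    return bus_bits / 8 * gbps  # GB/s

print(bandwidth(384, 21))  # RTX 4090: 384-bit GDDR6X @ 21 Gbps -> 1008 GB/s
print(bandwidth(256, 30))  # RTX 5080: 256-bit GDDR7  @ 30 Gbps ->  960 GB/s
```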
Posted on Reply
#425
Visible Noise
JustBenchingWhatever someone thinks about Nvidia's presentation, as bad as you think the cards they showed us are, they were good enough to make their competitor sound the retreat. But sure, let's once again be angry because Nvidia is lying to us about the 5070 being a 4090, while completely ignoring the fact that the 5070 is good enough to make AMD change whatever they had planned. Instead, they wasted their time comparing a 17 W chip to a 120 W chip, because hey, that in no way is misleading.
On top of all this, the CEO couldn’t be bothered to show up to the single largest event of the year.
Posted on Reply