Monday, January 6th 2025

NVIDIA 2025 International CES Keynote: Liveblog

NVIDIA kicks off the 2025 International CES with a bang. The company is expected to debut its new GeForce "Blackwell" RTX 5000 generation of gaming graphics cards, and to launch new technologies such as neural rendering and DLSS 4. It is also expected to highlight a new piece of silicon for Windows on Arm laptops, showcase the next generation of its Drive PX FSD hardware, probably talk about its next-generation "Blackwell Ultra" AI GPU, and, if we're lucky, even namedrop "Rubin." Join us as we liveblog CEO Jensen Huang's keynote address.

02:22 UTC: The show is finally underway!
02:35 UTC: CTA president Gary Shapiro kicks off the show, introduces Jensen Huang.
02:46 UTC: "Tokens are the building blocks of AI"

02:46 UTC: "Do you like my jacket?"
02:47 UTC: NVIDIA recounts its progress all the way back to NV1 and UDA.
02:48 UTC: "CUDA was difficult to explain, it took 6 years to get the industry to like it"
02:50 UTC: "AI is coming home to GeForce". NVIDIA teases neural material and neural rendering. Rendered on "Blackwell"
02:55 UTC: Every single pixel is ray traced, thanks to AI rendering.
02:55 UTC: Here it is, the GeForce RTX 5090.
03:20 UTC: At least someone is pushing the limits for GPUs.
03:22 UTC: Incredible board design.
03:22 UTC: RTX 5070 matches RTX 4090 at $550.
03:24 UTC: Here's the lineup, available from January.
03:24 UTC: RTX 5070 Laptop starts at $1299.
03:24 UTC: "The future of computer graphics is neural rendering"
03:25 UTC: Laptops powered by RTX Blackwell: starting prices:
03:26 UTC: AI has come back to power GeForce.
03:28 UTC: Supposedly the Grace Blackwell NVLink72.
03:28 UTC: 1.4 ExaFLOPS.
03:32 UTC: NVIDIA very sneakily teased a Windows AI PC chip.

03:35 UTC: NVIDIA is teaching generative AI basic physics. NVIDIA Cosmos, a world foundation model.
03:41 UTC: NVIDIA Cosmos is trained on 20 million hours of video.

03:43 UTC: Cosmos is open-licensed on GitHub.

03:52 UTC: NVIDIA onboards Toyota for its next-generation EVs with full self-driving.

03:53 UTC: NVIDIA unveils Thor Blackwell robotics processor.
03:53 UTC: Thor offers 20x the processing capability of Orin.

03:54 UTC: CUDA is now functionally safe, thanks to its automotive certifications.
04:01 UTC: NVIDIA brought a dozen humanoid robots to the stage.

04:07 UTC: Project DIGITS is a shrunk-down AI supercomputer.
04:08 UTC: NVIDIA GB110 "Grace-Blackwell" chip powers DIGITS.

470 Comments on NVIDIA 2025 International CES Keynote: Liveblog

#151
Visible Noise
CheeseballThe RTX 5070 and Ti comparison charts are unfair. They should be comparing them to the SUPER successors, especially since both saw significant performance jumps over the non-SUPER cards.
Are you really crying “unfair” about marketing slides?

Lol, don’t be a sucker for AMD. They aren’t your friend.
Posted on Reply
#152
oxrufiioxo
AusWolfIt's like comparing with different graphics settings. It's false data, there are no two words about it. I'm disgusted to the core.
I agree it does not belong on a benchmark graph... They are improving the 3 core technologies for all RTX owners though so that was probably my favorite slide lol. DLSS SR, DLAA, DLSS RR.


Still, I want to see it in action and whether it's better than OG frame generation. I just don't think I'll care whether I'm getting 140 fps with frame generation at 35 ms or 240 fps with multi frame generation at 35 ms, lol; it's going to feel the same and just look slightly smoother.
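Back-of-the-envelope math on why those two scenarios feel the same (illustrative numbers only; the 60 fps base rate and 20 ms pipeline overhead below are assumptions, not measurements):

```python
# Frame generation multiplies *displayed* frames, but input latency tracks the
# real rendered frame rate, so 2x and 4x FG on the same base rate feel alike.

def displayed_fps(rendered_fps: float, fg_multiplier: int) -> float:
    """Displayed frame rate: generated frames are inserted between real ones."""
    return rendered_fps * fg_multiplier

def approx_input_latency_ms(rendered_fps: float, overhead_ms: float = 20.0) -> float:
    """Latency is roughly one real render interval plus fixed pipeline overhead."""
    return 1000.0 / rendered_fps + overhead_ms

rendered = 60.0  # assumed real frames per second, before frame generation
for mult in (2, 4):
    print(f"{mult}x FG: {displayed_fps(rendered, mult):.0f} fps displayed, "
          f"~{approx_input_latency_ms(rendered):.1f} ms latency")
```

Same ~37 ms either way; only the displayed number changes.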
AusWolfWhy do you have to bring up AMD here?
I mean, they both do this, so it shouldn't be a surprise. AMD tries to fake it slightly less, but the 5900XT/5800XT was especially bad, trying to pass them off as equal to Rocket Lake in gaming. I'm going to take Geralt's stance on this one... Evil is evil. Lesser, greater, middling, it's all the same.
Posted on Reply
#153
Dr. Dro
ScircuraI used the SVG markup on Nvidia's website to calculate their advertised gen-on-gen uplift.

Comparison | Far Cry 6 improvement | A Plague Tale: Requiem improvement
5090 vs. 4090 | 27.5% | 43.2%
5080 vs. 4080 | 33.2% | 35.1%
5070 Ti vs. 4070 Ti | 33.2% | 35.1%
5070 vs. 4070 | 31.3% | 40.1%


Bar graphs for the 5080 and 5070 Ti have the exact same sizes for these two games. And who knows if any of these bars are just an "artist's rendition of performance improvements". But Nvidia won't put real numbers in their presentation, so this is the best we've got...
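The arithmetic behind those uplift figures is just a bar-height ratio; a hypothetical sketch (the pixel heights here are made up for illustration, not taken from the actual SVG):

```python
# Gen-on-gen uplift from two bar heights: (new / old - 1) * 100.

def uplift_pct(old_height: float, new_height: float) -> float:
    """Percent improvement implied by two bar heights (any consistent unit)."""
    return (new_height / old_height - 1.0) * 100.0

# e.g. a hypothetical 200 px 40-series bar vs a 255 px 50-series bar:
print(round(uplift_pct(200.0, 255.0), 1))  # 27.5
```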
W1zzard will have reviews ready in 3 weeks, don't beat yourself up over it
Posted on Reply
#154
evernessince
yfn_ratchetI was really only here for GeForce and all we get is a price and TOPS numbers? And a promise from the man who disappointed us last gen about how the 5070 "matches the 4090" and the 5090 is "double the performance of a 4090"? In what? In F#&*%^G WHAT? Room heating?! Avant garde sous vide recipes?! Generating an 8K image of a cat in a spacesuit?! I wanted specifics and all I got was vague ideas. I don't know how you can somehow become worse at presenting performance metrics than AMD or god forbid Intel. Christ.
There is zero chance you are generating an 8K cat image on a 5090 with only 32GB of VRAM unless you are using tiled rendering. FLUX Dev takes 24GB to do 1024x1024.

The focus this gen is firmly on AI. The updates to DLSS and FG are purely switches to newer models that Nvidia could easily do as a side project to its AI R&D.
Visible NoiseRTX Mega Geometry is supposed to be the solution to that.
A 4090 is cpu limited a lot of the time. Nvidia knew this and are making changes to make sure it’s not a large issue for the 50 series. They really do think ahead.
The problem is that the 4090 is CPU limited a lot regardless of whether RT is enabled or not. Mega Geometry isn't supposed to solve the overhead problem in general, just tame some of it in regards to RT. Mega Geometry itself is just batching BVH updates on the GPU.
Posted on Reply
#155
Visible Noise
AusWolfWhy do you have to bring up AMD here?
Comparing features is relevant.
Posted on Reply
#156
oxrufiioxo
Dr. DroW1zzard will have reviews ready in 3 weeks, don't beat yourself up over it
Eh, got nothing better to do, but it is interesting: if the 3 lower tiers come close to matching the Far Cry numbers in every game, that's pretty damn good, but the 5090 would be mildly disappointing. Which would be interesting, because going in, the 3 lower cards looked meh with only the 5090 looking awesome...

My guess is the 3 lower cards will be slightly lower than that overall with the 5090 being slightly higher...
Visible NoiseComparing features is relevant.
Not when we want to see actual performance improvements, not how much frame generation X3 improves stuff. If all they were announcing was MF frame generation and showing us how it's different, I'd agree, but these are actual GPUs, and this is just one thing they do; focusing on that one thing would make anyone think they are hiding actual RT/raster gains gen on gen.
Posted on Reply
#157
wolf
Better Than Native
AusWolfI don't think it is.
It's off topic so I'll leave it at agree to disagree, because I absolutely can't and won't concede that said marketing slides weren't absolutely heinous and worse than what I'm seeing here.

The common ground between both? Let's see what reputable reviews get with their 'here and now' benchmark suites of expected titles.
Posted on Reply
#158
theglaze
Pricing is make-believe, like performance:

Posted on Reply
#159
AusWolf
oxrufiioxoI agree it does not belong on a benchmark graph... They are improving the 3 core technologies for all RTX owners though so that was probably my favorite slide lol. DLSS SR, DLAA, DLSS RR.


Still I want to see it in action and if it's better than OG Frame generation I just don't think I will care if I am getting 140fps with frame generation at 35ms or 240fps with Multi frame generation at 35ms lol it's going to feel the same and just look slightly smoother.
Exactly my point. 140 or 240 FPS doesn't matter. 20 or 35 would matter, but that's not a possibility.
oxrufiioxoI mean they both do this so it shouldn't be a surprise amd tries to fake it slightly less but the 5900XT/5800XT was especially bad trying to pass them off as equal to rocketlake in gaming. I am going to take Geralts stance on this one... Evil is evil. Lesser, greater, middling, it's all the same
Well, I could use a 1050 Ti with my 35-Watt Haswell i7, or with my 7800X3D, it wouldn't matter, would it? Low-end GPUs don't care what CPUs you're using. It's just common knowledge. Trying to sell it as useful info is idiotic, but laughable at best. Trying to sell a product based on false data, though, is malicious and unethical towards the buyer base.
Posted on Reply
#160
igormp
evernessinceThere is zero chance you are generating an 8K cat image on a 5090 with only 32GB of VRAM unless you are using tiled rendering. FLUX Dev takes 24GB to do 1024x1024.
That's at FP16, at Q4 you need less than 8GB of VRAM.
A 5090 with its 32GB and FP4 support could do 4096x4096 blazing fast. If you don't give a damn about quality, Q2 should be able to do 8k.
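Rough weight-only VRAM math (the ~12B parameter count for FLUX Dev is an assumption here, and activations, the text encoder, and the VAE all add more on top):

```python
# Back-of-envelope VRAM needed just to hold model weights at different
# quantization levels: parameters * bits-per-weight / 8 bytes.

BITS = {"fp16": 16, "fp8": 8, "q4": 4, "q2": 2}

def weight_vram_gb(params_billion: float, dtype: str) -> float:
    """Weight storage in GB for a model of the given size and data type."""
    return params_billion * 1e9 * BITS[dtype] / 8 / 1e9

for dt in BITS:
    print(f"{dt}: ~{weight_vram_gb(12, dt):.1f} GB")
```

At FP16 a ~12B model already eats ~24 GB for weights alone, which is why Q4 (~6 GB) leaves headroom on a 32 GB card.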
Posted on Reply
#161
Wasteland
oxrufiioxoI'm guessing 15% up and down the stack give or take 5% and 35-45% for the 5090... Besides the 5090 all of them are cheaper though the 5080 is almost 300 usd cheaper when accounting for inflation than the 4080 was...
I have mixed feelings. On the one hand Blackwell pricing seems surprisingly sane. AMD's 9070s will have a very hard time competing.

On the other hand you're still paying a minimum of $750 for a 16 GB frame buffer, and it looks like the perf uplift on non-90 cards will be largely inconsequential. Framegen 4x is firmly, at least to me, in the "cool but not useful" category. It does make one wonder what Nvidia will do about their first-party benchmarks going forward, though--are they just going to keep increasing the framegen multiplier to juice the charts? Framegen 8x in two years? Framegen 16x in four years? At some point, the rollercoaster has to end, doesn't it? They can't keep finding ways to spin 100+% gen-on-gen gains out of nothing ... right? Right?

Anyway, it's been a fun thread. I think it's time for me to go enjoy my super-top-secret futuristic zero-input 24x Framegen tech, though. (It's a movie.)
Posted on Reply
#162
AusWolf
Visible NoiseComparing features is relevant.
So you don't care about Nvidia's products. You only care that they're better than AMD. Ok.
Posted on Reply
#163
oxrufiioxo
wolfIt's off topic so I'll leave it at agree to disagree, because I absolutely can't and won't concede that said marketing slides weren't absolutely heinous and worse than what I'm seeing here.

The common ground between both? Lets see what reputable reviews get with their 'here and now' benchmark suites of expected titles.
The only difference in my book is that if you turn on DLSS 4 you will get the % they claim, but as I said above, showing one thing a GPU does that a previous generation doesn't makes it look like they are hiding real performance gains.
AusWolfWell, I could use a 1050 Ti with my 35-Watt Haswell i7, or with my 7800X3D, it wouldn't matter, would it? Low-end GPUs don't care what CPUs you're using. It's just common knowledge. Trying to sell it as useful info is idiotic, but laughable at best. Trying to sell a product based on false data, though, is malicious and unethical towards the buyer base.
It's technically not false if you turn it on and it does what it shows though. False would be them showing 1000 fps but when reviewers get the product it maxes at 200 fps.....
Posted on Reply
#164
Visible Noise
AusWolfSo you don't care about Nvidia's products. You only care that they're better than AMD. Ok.
How did you get here from what I wrote?
Posted on Reply
#165
Cheeseball
Not a Potato
Visible NoiseAre you really crying “unfair” about marketing slides?

Lol, don’t be a sucker for AMD. They aren’t your friend.
Brother look at NVIDIA's 50-series comparison chart and the varying settings between the 40 and 50 series in the footnote.

I don't know why you keep mentioning AMD here. NVIDIA is not your friend either, especially with the misleading marketing.
igormp40 series does not support FP4, it's a new feature from blackwell (it was already known given the GB200 is already in production)
Is FP8 not doable on the RTX 50 due to architectural changes? Why can't they just run the RTX 50 series in FP8 quantization for a fair comparison instead?
Posted on Reply
#166
AusWolf
oxrufiioxoIt's technically not false if you turn it on and it does what it shows though. False would be them showing 1000 fps but when reviewers get the product it maxes at 200 fps.....
Nobody reviews with FG on because it makes the data incomparable. But that's not a problem here, apparently. I wonder why they don't turn all graphics settings to low on the 50 series while they're at it. It would make them look even better.
Visible NoiseHow did you get here from what I wrote?
We're trying to discuss the Nvidia keynote here, and you just go "AMD this, AMD that, Nvidia is so much better". There's no need for that kind of penis measuring contest.
Posted on Reply
#167
wolf
Better Than Native
AusWolfI wonder why they don't turn all graphics settings to low on the 50 series while they're at it. It would make them look even better.
I think we get it, man. How much can you beat the same drum? You've made your point several times in this thread, to the point where frankly seeing you repeat it over and over is becoming boring.
Posted on Reply
#168
oxrufiioxo
AusWolfNobody reviews with FG on because it makes the data incomparable. But that's not a problem here, apparently. I wonder why they don't turn all graphics settings to low on the 50 series while they're at it. It would make them look even better.
I think we are in agreement that they should not use frame generation in comparison benchmarks. Especially a 2x vs 4x because in that scenario they are not showing how good the cards are but how much they've improved frame generation.

I'm not a fan. That doesn't make the data false but it's only showing potentially how much better frame generation is not how good these cards are.
wolfI think we get it man, how much can you just beat the same drum over and over? you've made your point several times now in this thread to the point where frankly seeing you reply it over and over is becoming boring.
I agree with him for the most part, though; this shouldn't be a thing unless their only goal was to show us how FG improved generation over generation... If this is the best thing the new cards do, that kinda sucks after 2 years. I don't think most people are going to be thrilled we got better frame generation with 4x the frames vs 2x the frames... You have to remember the average consumer is going to think 240 fps native is the same as 240 fps framegen, and Nvidia knows that.
Posted on Reply
#169
AusWolf
wolfI think we get it man, how much can you just beat the same drum over and over? you've made your point several times now in this thread to the point where frankly seeing you reply it over and over is becoming boring.
Ah so you arguing your point to death is fine, but me replying is boring. Sure, then, have it your way, everything Nvidia does is for the consumer, frame generation is amazing and lying is not a bad thing if you make money on the hype generated by it. Better?
Posted on Reply
#170
nguyen
So DLSS4 means that there will be new DLLs for Super Res, Ray Reconstruction, Reflex 2.0 and Frame Generation to improve visuals and performance for previous RTX GPUs.

Meanwhile RTX 5000 takes it up a notch with Multi Frame Generation

Still can't find info about games that will feature Neural rendering though
Posted on Reply
#171
Visible Noise
AusWolfNobody reviews with FG on because it makes the data incomparable. But that's not a problem here, apparently. I wonder why they don't turn all graphics settings to low on the 50 series while they're at it. It would make them look even better.


We're trying to discuss the Nvidia keynote here, and just go "AMD this, AMD that, Nvidia is so much better". There's no need for that kind of penis measuring contest.
Again, I’m comparing features that were announced in the Nvidia keynote. Just because you don’t like that AMD doesn’t have them doesn’t mean the differences shouldn’t be pointed out.

Of course people review with FG. I hate to tell you but it’s a fundamental gaming graphics technology now. UE5 is literally built around it and image reconstruction. Sorry if your team is behind, but those are the facts.
Posted on Reply
#172
igormp
mrnagantWhat is Nvidia basing the AI TOPS value off of? FP8?
oxrufiioxoFP4 I think but I could be wrong.... Sad we only have Farcry to give us some idea of performance lmao fing Nvidia...
It's INT4. The standard for TOPS in most places is INT4.
CheeseballIs FP8 not doable on the RTX 50 due to architectural changes? Why can't they just run the RX 50 series in FP8 quantization for a fair comparison instead?
IMO it's pointless to compare at FP8. When tensor cores came out everyone started doing mixed-precision because it was so much faster and used half the memory.
Same goes for smaller data types.

Anyhow, Blackwell's FP4 rate is double the value of its FP8 perf, and 4x of the FP16 rate, so you can just figure those out by yourself.
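In other words, assuming ideal 1/width scaling (the FP4 baseline below is a placeholder for illustration, not an official spec number):

```python
# Dense tensor rate at a given data-type width, assuming throughput doubles
# each time the width halves (FP4 -> FP8 -> FP16).

def rate(fp4_tops: float, bits: int) -> float:
    """Tensor rate at `bits` width, given the FP4 (4-bit) rate."""
    return fp4_tops * 4 / bits

fp4 = 1000.0  # placeholder FP4 TOPS figure
print(rate(fp4, 8), rate(fp4, 16))  # FP8 is half of FP4; FP16 is a quarter
```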
Posted on Reply
#173
oxrufiioxo
AusWolfAh so you arguing your point to death is fine, but me replying is boring. Sure, then, have it your way, everything Nvidia does is for the consumer, frame generation is amazing and lying is not a bad thing if you make money on the hype generated by it. Better?
Bottom line, we have 1 game without frame generation to go by, and it looks promising: 30% up and down the stack, although that would only make the 5070 like 10-15% faster than the 4070 SUPER... so meh, but other than that, not bad.

We all know they improved frame generation and for it's fans good for them.
nguyenSo DLSS4 means that there will be new DLLs for Super Res, Ray Reconstruction, Reflex 2.0 and Frame Generation to improve visuals and performance for previous RTX GPUs.

Meanwhile RTX 5000 takes it up a notch with Multi Frame Generation

Still can't find info about games that will feature Neural rendering though
It's going to take time. Unless developers can do it the old way and the new way without any extra work, most games will still be developed on PS5 first, after all, and then PS6... I found it to be the most interesting thing.
Posted on Reply
#174
wolf
Better Than Native
AusWolfAh so you arguing your point to death is fine, but me replying is boring. Sure, then, have it your way, everything Nvidia does is for the consumer, frame generation is amazing and lying is not a bad thing if you make money on the hype generated by it. Better?
Don't put words in my mouth, I know you're better than that.
Posted on Reply
#175
AusWolf
Visible NoiseAgain, I’m comparing features that were announced in the Nvidia keynote. Just because you don’t like that AMD doesn’t have them doesn’t mean the differences shouldn’t be pointed out.
No, I don't like the features, and I don't like that they're being used to compare performance to last gen that doesn't have them.
Visible NoiseOf course people review with FG. I hate to tell you but it’s a fundamental gaming graphics technology now. UE5 is literally built around it and image reconstruction. Sorry if your team is behind, but those are the facts.
Nobody reviews with FG on. Period. And I don't have a team - I have had lots of AMD and Nvidia (and even ATi) cards through the years and loved them all. I still have way more Nvidia and Intel hardware than AMD. This is not a battlefield.
Posted on Reply