Monday, January 6th 2025

NVIDIA 2025 International CES Keynote: Liveblog

NVIDIA kicks off the 2025 International CES with a bang. The company is expected to debut its new GeForce "Blackwell" RTX 5000 generation of gaming graphics cards, and to launch new technologies such as neural rendering and DLSS 4. It is also expected to highlight a new piece of silicon for Windows on Arm laptops, showcase the next generation of its Drive PX self-driving hardware, probably talk about its next-generation "Blackwell Ultra" AI GPU, and, if we're lucky, even namedrop "Rubin." Join us as we liveblog CEO Jensen Huang's keynote address.

02:22 UTC: The show is finally underway!
02:35 UTC: CTA president Gary Shapiro kicks off the show, introduces Jensen Huang.
02:46 UTC: "Tokens are the building blocks of AI"

02:46 UTC: "Do you like my jacket?"
02:47 UTC: NVIDIA recounts its progress all the way back to NV1 and UDA.
02:48 UTC: "CUDA was difficult to explain, it took 6 years to get the industry to like it"
02:50 UTC: "AI is coming home to GeForce". NVIDIA teases neural materials and neural rendering, rendered on "Blackwell".
02:55 UTC: Every single pixel is ray traced, thanks to AI rendering.
02:55 UTC: Here it is, the GeForce RTX 5090.
03:20 UTC: At least someone is pushing the limits for GPUs.
03:22 UTC: Incredible board design.
03:22 UTC: RTX 5070 matches RTX 4090 at $550.
03:24 UTC: Here's the lineup, available from January.
03:24 UTC: RTX 5070 Laptop starts at $1299.
03:24 UTC: "The future of computer graphics is neural rendering"
03:25 UTC: Laptops powered by RTX Blackwell: starting prices:
03:26 UTC: AI has come back to power GeForce.
03:28 UTC: Supposedly the Grace Blackwell NVLink72.
03:28 UTC: 1.4 ExaFLOPS.
03:32 UTC: NVIDIA very sneakily teased a Windows AI PC chip.

03:35 UTC: NVIDIA is teaching generative AI basic physics. NVIDIA Cosmos, a world foundation model.
03:41 UTC: NVIDIA Cosmos is trained on 20 million hours of video.

03:43 UTC: Cosmos is open-licensed on GitHub.

03:52 UTC: NVIDIA onboards Toyota for its next-generation EVs with full self-driving.

03:53 UTC: NVIDIA unveils Thor Blackwell robotics processor.
03:53 UTC: Thor has 20x the processing capability of Orin.

03:54 UTC: CUDA is now a functionally safe computing platform, thanks to its automotive certifications.
04:01 UTC: NVIDIA brought a dozen humanoid robots to the stage.

04:07 UTC: Project DIGITS is a shrunk-down AI supercomputer.
04:08 UTC: NVIDIA GB110 "Grace-Blackwell" chip powers DIGITS.

435 Comments on NVIDIA 2025 International CES Keynote: Liveblog

#176
Visible Noise
nguyenSo DLSS4 means that there will be new DLLs for Super Res, Ray Reconstruction, Reflex 2.0 and Frame Generation to improve visuals and performance for previous RTX GPUs.

Meanwhile RTX 5000 takes it up a notch with Multi Frame Generation

Still can't find info about games that will feature Neural rendering though
Neural rendering, I think, is a long way out. Here's Microsoft's post about it.

devblogs.microsoft.com/directx/enabling-neural-rendering-in-directx-cooperative-vector-support-coming-soon/
Posted on Reply
#177
AusWolf
wolfDon't put words in my mouth, I know you're better than that.
I'm not doing that. I'm just saying "your argument is getting boring" is not a very elegant way to close a discussion.
Posted on Reply
#178
oxrufiioxo
Visible NoiseAgain, I’m comparing features that were announced in the Nvidia keynote. Just because you don’t like that AMD doesn’t have them doesn’t mean the differences shouldn’t be pointed out.

Of course people review with FG. I hate to tell you but it’s a fundamental gaming graphics technology now. UE5 is literally built around it and image reconstruction. Sorry if your team is behind, but those are the facts.
Regardless of how anyone feels about frame generation, that is pretty much all Nvidia showed. Throwing in a couple of apples-to-apples comparisons, so we can see what we are actually getting generation to generation with actual natively rendered frames, wouldn't have been the end of the world...

I hate using frame generation; its flaws are way too obvious on a monitor for me. If it's massively improved, awesome, but until native 240 fps has the same latency as 240 fps of frame generation, with no artifacts, benchmarks should only show natively rendered frames.
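
A quick back-of-the-envelope sketch of that latency point (plain Python; the base frame rates here are illustrative assumptions, not measured figures): the displayed frame rate scales with the frame-generation factor, but responsiveness still tracks the base render rate, because generated frames sit between real ones.

# Illustrative only: displayed fps vs. per-rendered-frame latency with frame generation.
def fg_stats(base_fps, mfg_factor):
    displayed_fps = base_fps * mfg_factor        # e.g. 4x Multi Frame Generation
    render_latency_ms = 1000.0 / base_fps        # latency still follows the base rate
    return displayed_fps, render_latency_ms

print(fg_stats(base_fps=240, mfg_factor=1))      # native 240 fps -> (240, ~4.2 ms)
print(fg_stats(base_fps=60, mfg_factor=4))       # 60 fps base, 4x MFG -> (240, ~16.7 ms)

Both read 240 fps on a counter, but the frame-generated case still responds at roughly the cadence of its 60 fps base, which is exactly the gap described above.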
Posted on Reply
#179
Visible Noise
AusWolfNo, I don't like the features, and I don't like that they're being used to compare performance to last gen that doesn't have them.


Nobody reviews with FG on. Period. And I don't have a team - I have had lots of AMD and Nvidia (and even ATi) cards through the years and loved them all. I still have way more Nvidia and Intel hardware than AMD. This is not a battlefield.
Digital Foundry for one.

There goes your nobody argument.
oxrufiioxoRegardless of how anyone feels about frame generation, that is pretty much all Nvidia showed. Throwing in a couple of apples-to-apples comparisons, so we can see what we are actually getting generation to generation with actual natively rendered frames, wouldn't have been the end of the world...

I hate using frame generation; its flaws are way too obvious on a monitor for me. If it's massively improved, awesome, but until native 240 fps has the same latency as 240 fps of frame generation, with no artifacts, benchmarks should only show natively rendered frames.
I’ve rarely seen a flaw with FG. I guess I’m too busy playing games to try to peep a flickering pixel somewhere.
Posted on Reply
#180
AusWolf
Visible NoiseDigital Foundry for one.

There goes your nobody argument.
Ah one. Perfect!

I'd still argue that data is irrelevant because it's not valid for comparison.
Posted on Reply
#181
oxrufiioxo
Visible NoiseDigital Foundry for one.

There goes your nobody argument.



I’ve rarely seen a flaw with FG. I guess I’m too busy playing games to try to peep a flickering pixel somewhere.
They weren't too bad on my LG G2, but on my Samsung G8 ultrawide, even with a high-ish base frame rate, they are easy to see in every game... Maybe as I get older and my eyesight worsens, they will be less noticeable...
Posted on Reply
#182
AusWolf
Visible NoiseI’ve rarely seen a flaw with FG. I guess I’m too busy playing games to try to peep a flickering pixel somewhere.
Have you tried using it on a game that ran at 20 FPS without it?
Posted on Reply
#183
oxrufiioxo
AusWolfHave you tried using it on a game that ran at 20 FPS without it?
To be fair, both AMD and Nvidia recommend a 60 fps+ base, and the Cyberpunk results they showed were at 71 fps with DLSS SR, so right around where you want to be at a minimum.
Posted on Reply
#184
Visible Noise
AusWolfAh one. Perfect!

I'd still argue that data is irrelevant because it's not valid for comparison.
Lol, I got whiplash from trying to watch how fast those goal posts moved.

Anyway, meetings in the morning. Good night.
Posted on Reply
#185
AusWolf
oxrufiioxoTo be fair, both AMD and Nvidia recommend a 60 fps+ base, and the Cyberpunk results they showed were at 71 fps with DLSS SR, so right around where you want to be at a minimum.
That's why I'm saying it's pointless. 60 FPS is pretty smooth in my books, I don't need to make 100 out of it.
Visible NoiseLol, I got whiplash from trying to watch how fast those goal posts moved.

Anyway, meetings in the morning. Good night.
My goal post never moved. Frame generation is useless, and frame generation enabled data is not valid for comparison. That's what I've been saying all along.
Posted on Reply
#186
Visible Noise
oxrufiioxoThey weren't too bad on my LG G2, but on my Samsung G8 ultrawide, even with a high-ish base frame rate, they are easy to see in every game... Maybe as I get older and my eyesight worsens, they will be less noticeable...
I game on LG Ultragear. 2019 and 2022 vintage I think.
Posted on Reply
#187
oxrufiioxo
AusWolfThat's why I'm saying it's pointless. 60 FPS is pretty smooth in my books, I don't need to make 100 out of it.
I hear where you're coming from, and we agree that Nvidia needs to show the FG data alongside the native data, but it's a feature people seem to like, and if Nvidia sells a ton because of it, it is what it is...
Visible NoiseI game on LG Ultragear. 2019 and 2022 vintage I think.
Don't get me wrong, man, I like the idea of frame generation and want to see it get better and better, but I just would have loved to see Cyberpunk path traced, DLSS Quality vs. DLSS Quality at 4K, or Alan Wake, or even Indiana Jones, with FG shown on the side to highlight its improvements.
Posted on Reply
#188
Visible Noise
AusWolfNobody reviews with FG on. Period.
AusWolfMy goal post never moved. Frame generation is useless,
Are you trying to convince me or yourself?
Posted on Reply
#189
Cheeseball
Not a Potato
AusWolfThat's why I'm saying it's pointless. 60 FPS is pretty smooth in my books, I don't need to make 100 out of it.


My goal post never moved. Frame generation is useless, and frame generation enabled data is not valid for comparison. That's what I've been saying all along.
Hmm, I mean, what makes FG appealing is that you can squeeze more performance out of it at the expense of image quality (and latency). Even though I am a strict native-rendering-only user, I can see why these image-enhancing and upscaling (and downscaling) techniques are appealing to gamers.
Posted on Reply
#190
AusWolf
oxrufiioxoI hear where you're coming from, and we agree that Nvidia needs to show the FG data alongside the native data, but it's a feature people seem to like, and if Nvidia sells a ton because of it, it is what it is...
I don't mind if it sells because people love it. I just wanted to see an apples-to-apples comparison with the 40 series to make it fair.
Visible NoiseAre you trying to convince me or yourself?
I don't need to convince myself. I've seen it at work, and it was either pointless or crap.
Posted on Reply
#191
oxrufiioxo
AusWolfI don't mind if it sells because people love it. I just wanted to see an apples-to-apples comparison with the 40 series to make it fair.
Same
CheeseballHmm, I mean, what makes FG appealing is that you can squeeze more performance out of it at the expense of image quality. Even though I am a strict native-rendering-only user, I can see why these image-enhancing and upscaling (and downscaling) techniques are appealing to gamers.
Back when I was gaming at 4K on a 65-inch G2 from about 10 feet away, with a controller, in single-player games, I thought it was pretty great, honestly, and if I still gamed like that I would use it. But since swapping to a Samsung G8 UW OLED, I only want to use DLAA... It kinda depends on how you game.
Posted on Reply
#192
igormp
igormpForget the 5090, I really want that Digits thingie now



If it goes for $2k or less I'll insta buy one (even though I doubt it goes for that cheap)
nvidianews.nvidia.com/news/nvidia-puts-grace-blackwell-on-every-desk-and-at-every-ai-developers-fingertips

$3k, and there go my hopes and dreams.
Honestly it's not a bad price given that it's cheaper than an equivalent Apple product, but it's still too much for what would basically be a toy for me.
Posted on Reply
#193
wolf
Better Than Native

If I'm interpreting that correctly, a new drop-in DLL will enable the new "Transformer" model, giving considerable IQ improvements. Neat that they also lowered the memory footprint of FG and made it give a bigger uplift too.
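
If the new model really does ship as a drop-in DLL, swapping it per game stays mechanical. A minimal sketch in Python, where the game folder path and the DLL filename (nvngx_dlss.dll) are my assumptions rather than anything confirmed in the keynote:

# Back up a game's bundled DLSS DLL and drop in a newer build.
# The paths and the DLL filename are placeholders, not official guidance.
import shutil
from pathlib import Path

GAME_DIR = Path(r"C:\Games\SomeGame")            # hypothetical game install folder
NEW_DLL = Path(r"C:\Downloads\nvngx_dlss.dll")   # hypothetical newer DLSS DLL

def swap_dlss_dll(game_dir: Path, new_dll: Path) -> None:
    old_dll = game_dir / new_dll.name
    if old_dll.exists():
        # keep a backup of the DLL the game shipped with
        shutil.copy2(old_dll, old_dll.with_name(old_dll.name + ".bak"))
    shutil.copy2(new_dll, old_dll)               # drop in the new build

if __name__ == "__main__":
    swap_dlss_dll(GAME_DIR, NEW_DLL)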
Posted on Reply
#194
Pumper
The 5070 is supposedly equal to the 4090, but the DLSS 4 chart shows the 5090 at around 30 FPS native 4K in Alan Wake, which is exactly what the 4090 does. So I guess the 5070 is equal to the 5090? lol
Posted on Reply
#195
DemonicRyzen666
This feels exactly like the 3000 series release after the lackluster 2000 series.
Making up for the last generation's (2000) terrible price points.
Splashing on another DLSS and some more AI bullcrap.
It's almost exactly like that one guy said:

2000: bad prices
3000: good prices
4000: bad prices
5000: good prices

The game for Nvidia is faking the supply limitation to drive prices to insanity.
Posted on Reply
#196
N/A
This is a small tweak to the 40 series, with GDDR7 and lower-accuracy DLSS for 2x the output. The 3080 was at least 50% faster than its predecessor; not accounting for FG, the 5080 is only 20% faster.
That's another 25% missing. The 5090 needs to shrink to N3 and become the 5080 for this to hold true.
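
Spelling out the arithmetic in that last line (taking the 50% and 20% figures above at face value, as claims rather than confirmed numbers):

# If the previous big jump was +50% and this one is +20% without FG,
# the shortfall relative to that cadence works out to roughly 25%.
past_gen_uplift = 1.50   # e.g. 3080 over its predecessor, per the claim above
this_gen_uplift = 1.20   # 5080 over the 4080 without frame generation, per the claim above

shortfall = past_gen_uplift / this_gen_uplift - 1
print(f"extra uplift needed to match the old cadence: {shortfall:.0%}")   # ~25%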
Posted on Reply
#197
oxrufiioxo
DemonicRyzen666This feels exactly like the 3000 series release after the lackluster 2000 series.
Making up for the last generation's (2000) terrible price points.
Splashing on another DLSS and some more AI bullcrap.
It's almost exactly like that one guy said:

2000: bad prices
3000: good prices
4000: bad prices
5000: good prices

The game for Nvidia is faking the supply limitation to drive prices to insanity.
Until we actually see the gen-on-gen performance gains in a wide variety of games, it's hard to know how good the prices are. It's looking like 20-30% across the board, but we only have one benchmark to actually compare.
Posted on Reply
#198
remekra
It's more of a DLSS 4 announcement than an RTX 50 announcement. Looking at the official specs of the 5080, for example, the only major upgrade is GDDR7 and the bandwidth it brings.
So we are stagnant on shader performance (minor upgrades), with the work going into RT cores and, more importantly, Tensor cores.
At least they didn't price it at $1,500. So the pricing strategy works: ask $1,200 for the 4080, then lower the 5080 to $999, and suddenly you get praise that the prices are good this time around.
We will see if the prices will really be $999; the FEs will probably go out of stock in 5 minutes, and then it's a rodeo for AIBs and sellers to price them at whatever they want.
Posted on Reply
#199
sbacc
These prices aren't as bad as I thought, but $2,000 for a 5090 is still very meh, and for $1,000 the 5080 is still a 256-bit bus card...

The performance improvement that Nvidia is showing in their graphs is... meh as well. I am not impressed, especially since they compare it only against the awful (except for the 4090) first batch of 4000-series cards and not the much better (but still just OK, tbh) SUPER refresh...

I miss the Nvidia of the past, before that whole RTX "revolution". One more year, one more generation that is meh as F.
Posted on Reply
#200
remekra
I also saw in a video that the MFG of DLSS 4 is a driver toggle. So I can basically get that now on AMD if I enable FSR 3.1 in-game FG and add AFMF in the driver :roll:
Don't get me wrong, it will look shitty, but to be honest the only thing AMD needs to compete with this is their improved FSR4 image scaler, and then just combine FSR FG with AFMF.
Then they can claim that the 9070 XT is 2x faster than the 7900 XTX.
I'm disappointed in both presentations, but hey, at least Nvidia actually presented some GPUs.
Posted on Reply