Monday, January 6th 2025

NVIDIA 2025 International CES Keynote: Liveblog

NVIDIA kicks off the 2025 International CES with a bang. The company is expected to debut its new GeForce "Blackwell" RTX 5000 generation of gaming graphics cards, and to launch new technologies such as neural rendering and DLSS 4. It is also expected to highlight a new piece of silicon for Windows on Arm laptops, showcase the next generation of its Drive self-driving hardware, and probably talk about its next-generation "Blackwell Ultra" AI GPU, and, if we're lucky, even namedrop "Rubin." Join us as we liveblog CEO Jensen Huang's keynote address.

02:22 UTC: The show is finally underway!
02:35 UTC: CTA president Gary Shapiro kicks off the show, introduces Jensen Huang.
02:46 UTC: "Tokens are the building blocks of AI"

02:46 UTC: "Do you like my jacket?"
02:47 UTC: NVIDIA recounts its progress all the way back to NV1 and UDA.
02:48 UTC: "CUDA was difficult to explain, it took 6 years to get the industry to like it"
02:50 UTC: "AI is coming home to GeForce." NVIDIA teases neural materials and neural rendering, rendered on "Blackwell".
02:55 UTC: Every single pixel is ray traced, thanks to AI rendering.
02:55 UTC: Here it is, the GeForce RTX 5090.
03:20 UTC: At least someone is pushing the limits for GPUs.
03:22 UTC: Incredible board design.
03:22 UTC: NVIDIA claims the RTX 5070 matches the RTX 4090, at $549.
03:24 UTC: Here's the lineup, available from January.
03:24 UTC: RTX 5070 Laptop starts at $1299.
03:24 UTC: "The future of computer graphics is neural rendering"
03:25 UTC: Laptops powered by RTX Blackwell, starting prices:
03:26 UTC: AI has come back to power GeForce.
03:28 UTC: Supposedly the Grace Blackwell NVLink72 (GB200 NVL72).
03:28 UTC: 1.4 ExaFLOPS.
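A quick sketch of where that figure plausibly comes from, assuming 72 Blackwell GPUs at roughly 20 PFLOPS of sparse FP4 each (GB200 NVL72 marketing numbers, not stated on stage):

```python
gpus_per_rack = 72          # Grace Blackwell NVL72: 72 Blackwell GPUs per rack
fp4_pflops_per_gpu = 20     # ~20 PFLOPS sparse FP4 per GPU (assumed, per GB200 specs)
print(f"{gpus_per_rack * fp4_pflops_per_gpu / 1000:.2f} ExaFLOPS")   # ~1.44 ExaFLOPS
```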
03:32 UTC: NVIDIA very sneakily teased a Windows AI PC chip.

03:35 UTC: NVIDIA is teaching generative AI basic physics. NVIDIA Cosmos, a world foundation model.
03:41 UTC: NVIDIA Cosmos is trained on 20 million hours of video.

03:43 UTC: Cosmos is openly licensed, available on GitHub.

03:52 UTC: NVIDIA onboards Toyota: its next-generation EVs will use NVIDIA hardware for full self-driving.

03:53 UTC: NVIDIA unveils Thor Blackwell robotics processor.
03:53 UTC: Thor offers 20x the processing capability of Orin.

03:54 UTC: CUDA is now functionally safe, thanks to its automotive certifications.
04:01 UTC: NVIDIA brought a dozen humanoid robots to the stage.

04:07 UTC: Project DIGITS is a shrunk-down AI supercomputer.
04:08 UTC: NVIDIA GB110 "Grace-Blackwell" chip powers DIGITS.

446 Comments on NVIDIA 2025 International CES Keynote: Liveblog

#276
Vya Domus
JustBenchingFG doesn't necessarily double your framerate, therefore I assume the new FG isn't going to quadruple it either, so whatever comparison you are trying to make from that screenshot is wrong.
Yes, it does double the framerate, that's the whole point: it inserts a frame between every 2 rendered ones, thus doubling it. The scaling isn't exactly 100% because there is a cost per interpolated frame.

I think we're entering the realm of some serious coping right here: 2X FG had ~90-95% scaling, and clearly 4X has similar scaling from 2X, perhaps ever so slightly worse, 85-90%. The 5080 with no FG is barely any faster than a 4080 Super in this game; the writing is on the wall.
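To make the scaling point concrete, here's a toy model of why FG multiplies output but lands below 100% scaling. The 1 ms per-interpolated-frame cost and the 80 fps base are made-up illustrative numbers, not measurements:

```python
def fg_fps(render_fps: float, factor: int, interp_ms: float = 1.0) -> float:
    """Displayed FPS when (factor - 1) frames are generated per rendered frame."""
    render_ms = 1000.0 / render_fps                     # time to render one real frame
    frame_time = render_ms + (factor - 1) * interp_ms   # real frame + generated ones
    return 1000.0 * factor / frame_time                 # frames shown per second

base = 80.0  # hypothetical native framerate
for factor in (2, 4):
    fps = fg_fps(base, factor)
    ideal = base * factor
    print(f"{factor}X FG: {fps:.0f} fps vs {ideal:.0f} ideal -> {fps / ideal:.0%} scaling")
```

With those assumptions you land right in the ~90-95% range at 2X and a bit lower at 4X.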
Posted on Reply
#277
JustBenching
Vya DomusYes, it does double the framerate, that's the whole point: it inserts a frame between every 2 rendered ones, thus doubling it. The scaling isn't exactly 100% because there is a cost per interpolated frame.

I think we're entering the realm of some serious coping right here: 2X FG had ~90-95% scaling, and clearly 4X has similar scaling from 2X, perhaps ever so slightly worse, 85-90%. The 5080 with no FG is barely any faster than a 4080 Super in this game; the writing is on the wall.
Clearly you haven't used it. It doesn't always translate to 90-95% scaling. Especially at 4K, I'm usually seeing 40-70% scaling, e.g. in Ghost of Tsushima, the most recent game I played with FG.

What would I be coping about? I'm just explaining to you how the thing works. Whatever.
Posted on Reply
#278
chstamos
Here's a more general question. Is there a point of GPU price inflation at which console makers start re-examining going outside, or developing their own custom graphics? They don't have to account for backwards compatibility, after all, and that's a big barrier to entry in discrete graphics that they wouldn't have to deal with. Or are the designs simply too complex for such a project to have any hope of succeeding?

I know it sounds like craziness, but Apple isn't doing all that badly with their Imagination Technologies-derived graphics silicon, after all.
Posted on Reply
#279
Macro Device
chstamosHere's a more general question. Is there a point of GPU price inflation at which console makers start re-examining going outside, or developing their own custom graphics? They don't have to account for backwards compatibility, after all, and that's a big barrier to entry in discrete graphics that they wouldn't have to deal with. Or are the designs simply too complex for such a project to have any hope of succeeding?

I know it sounds like craziness, but Apple isn't doing all that badly with their Imagination Technologies-derived graphics silicon, after all.
It's us average Joes who go to local Micro Centers, BestBuys, Amazons, eBays, AliExpresses and the sort and pay $20 for a GPU, $30 for the cooling, $600 for NV tax, $50 for ASUS tax and $100 in other taxes, so that a die that cost NVIDIA $20 to print ends up an 8-hunnit retail GPU. Console makers buy chips in bulk on much merrier terms. There is no way they start inventing their own GPUs, because first off, reaching RTX 2000 series levels of performance is already extremely problematic if not impossible for a complete dGPU market newbie, and the chips are coming in cheap anyway.

Apple is way more experienced in this regard than any other "non-GPU" player out here.

On topic: I expected crystal clear vast nothingness from this Blackwell generation, but it slightly proved me wrong, as prices aren't THAT insane. I assume the 5070 Ti will make short work of the 4080 in virtually every scenario. 15% IPC gains will be enough for this GPU to become my likely purchase, as it'll crawl dangerously close to 8 times the RT performance I have now. And since my pure raster needs are basically covered by any GPU beefier than a 3080, it's totally a hmmmmmmmm.
Posted on Reply
#280
JustBenching
Macro DeviceIt's us average Joes who go to local Micro Centers, BestBuys, Amazons, eBays, AliExpresses and the sort and pay $20 for a GPU, $30 for the cooling, $600 for NV tax, $50 for ASUS tax and $100 in other taxes, so that a die that cost NVIDIA $20 to print ends up an 8-hunnit retail GPU. Console makers buy chips in bulk on much merrier terms. There is no way they start inventing their own GPUs, because first off, reaching RTX 2000 series levels of performance is already extremely problematic if not impossible for a complete dGPU market newbie, and the chips are coming in cheap anyway.

Apple is way more experienced in this regard than any other "non-GPU" player out here.

On topic: I expected crystal clear vast nothingness from this Blackwell generation, but it slightly proved me wrong, as prices aren't THAT insane. I assume the 5070 Ti will make short work of the 4080 in virtually every scenario. 15% IPC gains will be enough for this GPU to become my likely purchase, as it'll crawl dangerously close to 8 times the RT performance I have now. And since my pure raster needs are basically covered by any GPU beefier than a 3080, it's totally a hmmmmmmmm.
Short work? I wouldn't bet on that; I think the 70 Ti will be close to the 4080.
Posted on Reply
#281
Macro Device
JustBenchingShort work? I wouldn't bet on that; I think the 70 Ti will be close to the 4080.
70 vs. 76 SM (a slight disadvantage), but higher clocks, possibly higher IPC, and significantly higher VRAM bandwidth at a lower price and lower TGP will make it an overall better GPU. Of course, it's very subtle, and sometimes one will need a microscope to see a performance difference, but all in all, it's more interesting. Especially considering overbuilt coolers, my 1 kW PSU, and an absolute crap ton of cold days per year. If it can OC beyond 3200 MHz on air, then it's awesome. Expensive, but awesome.
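As a back-of-envelope check on that comparison (the 4080's 76 SM at ~2505 MHz boost is the public spec; the 5070 Ti clock and the +15% IPC figure are the assumptions from this post):

```python
def proxy(sms: int, clock_mhz: float, ipc: float = 1.0) -> float:
    """Naive throughput proxy: SM count x clock x assumed IPC factor.
    Ignores bandwidth, cache, and game-to-game variance; ballpark only."""
    return sms * clock_mhz * ipc

rtx_4080   = proxy(76, 2505)               # RTX 4080: 76 SM, ~2505 MHz boost (public spec)
rtx_5070ti = proxy(70, 2600, ipc=1.15)     # 5070 Ti: 70 SM; clock and +15% IPC assumed
print(f"5070 Ti / 4080 (naive): {rtx_5070ti / rtx_4080:.2f}x")   # ~1.10x
```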
Posted on Reply
#283
Vya Domus
JustBenchingClearly you haven't used it. It doesn't always translate to 90-95% scaling. Especially at 4K, I'm usually seeing 40-70% scaling, e.g. in Ghost of Tsushima, the most recent game I played with FG.

What would I be coping about? I'm just explaining to you how the thing works. Whatever.
I think the problem is most of you just don't know math; that's why everybody is so mystified. In reality, we have all the information we need.

This is from the same video: 580% to 1000% is a ~72% increase, and that's the scaling from 2X to 4X. From the previous screenshot I posted, in order to reach a final figure of 185% on 4X, the starting value must be around ~108%.

This puts the 5080 at a meager ~8-10% faster than the 4080 Super with 2X FG.
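Spelled out, using the figures above (all read off the video and the screenshots referenced):

```python
fg2x = 580.0    # relative-performance figure at 2X FG (read off the video)
fg4x = 1000.0   # figure at 4X FG

step = fg4x / fg2x                  # scaling going from 2X to 4X
print(f"2X -> 4X: {step:.2f}x, a ~{(step - 1) * 100:.0f}% increase")

final_4x = 185.0                    # 5080 vs 4080 Super at 4X FG (earlier screenshot)
implied_2x = final_4x / step        # back out the 5080's 2X FG figure
print(f"implied 5080 at 2X FG: ~{implied_2x:.0f}% of a 4080 Super")
```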
Posted on Reply
#284
JustBenching
Vya DomusI think the problem is most of you just don't know math; that's why everybody is so mystified. In reality, we have all the information we need.

This is from the same video: 580% to 1000% is a ~72% increase, and that's the scaling from 2X to 4X. From the previous screenshot I posted, in order to reach a final figure of 185% on 4X, the starting value must be around ~108%.

This puts the 5080 at a meager ~8-10% faster than the 4080 Super with 2X FG.
Ok bud
Posted on Reply
#285
remekra
Vya DomusI think the problem is most of you just don't know math; that's why everybody is so mystified. In reality, we have all the information we need.

This is from the same video: 580% to 1000% is a ~72% increase, and that's the scaling from 2X to 4X. From the previous screenshot I posted, in order to reach a final figure of 185% on 4X, the starting value must be around ~108%.

This puts the 5080 at a meager ~8-10% faster than the 4080 Super with 2X FG.
And that's in a game that heavily utilizes RT cores. What will happen in games that do not have RT, or only a light implementation?
Posted on Reply
#286
Chrispy_
Vya DomusIn the meantime DF released a video essentially confirming that basically all of that performance comes from the 4X FG.
It's slower. Twice as many fake frames, not twice as many frames per second.

The net result: input lag gets even worse, and input lag is the main reason people don't like fake frames in the first place. Not the only reason, but definitely the main one.

More seriously, of the thousand or so demanding titles from the last half decade, only a tiny, tiny handful (under 50) actually even support Nvidia's frame-gen.
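To put rough numbers on the latency point, here's a toy model. The assumption that the interpolator holds each rendered frame until the next one exists, and the 1 ms per-generated-frame cost, are illustrative guesses rather than NVIDIA's published pipeline:

```python
def delivery_latency_ms(render_fps: float, factor: int, interp_ms: float = 1.0) -> float:
    """Rough latency from render to displayed frame; ignores Reflex, vsync, etc."""
    render_ms = 1000.0 / render_fps
    if factor == 1:
        return render_ms                                 # native: ship frames as rendered
    return 2 * render_ms + (factor - 1) * interp_ms      # render + hold-back + generation

for base in (30, 80):
    for factor in (1, 2, 4):
        ms = delivery_latency_ms(base, factor)
        print(f"{base} fps base at {factor}X: {base * factor:>3d} fps shown, ~{ms:.0f} ms")
```

Displayed fps climbs with the factor while the felt latency roughly doubles versus native, and it's worst at low base framerates.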
Posted on Reply
#287
wNotyarD
Chrispy_More seriously, of the thousand or so demanding titles from the last half decade, only a tiny, tiny handful (under 50) actually even support Nvidia's frame-gen.
DLSS 4 MFG will be a driver-level toggle, won't it? Will it only apply to whatever game already has DLSS 3 FG enabled, or will it work over anything, like AFMF does?
Posted on Reply
#288
3x0
wNotyarDDLSS 4 MFG will be a driver-level toggle, won't it? Will it only apply to whatever game already has DLSS 3 FG enabled, or will it work over anything, like AFMF does?
It's based on DLSS 3 frame gen; it's not universal.
Posted on Reply
#289
LittleBro
Dr. DroArchitecture-level improvements. And 40 series are not being deprived of any feature, they just won't support frame generation at factors above 2x...
That's the thing - architectural-level improvements are non-existent. Blackwell is shrunken Ada on steroids. Brute force. Nvidia added as many new compute units as possible and tried to balance it power-wise.

You can tell from the 5090's specs that efficiency is also a problem now. The 4090 has 5,000 fewer shaders but also a much lower TGP than the 5090. Were there any significant architectural changes, it would not end up like that. Nvidia brute-forced everything towards so-called AI features (DLSS, FG). Jensen has already stated that this is the only way forward for a new stage of gaming. I have my doubts, though.

As for the new DLSS, please don't say that the 4000 series will be deprived of nothing and they basically just won't support something here, something there, and god knows where else as well. The RTX 4090 is surely capable (hardware-wise) of the new DLSS tech when the slower 5080 is capable (and anything below the 5080 as well). Or change my mind: give me one real reason why the 4090 would not be capable.

The RTX 5080 will not beat the RTX 4090 in native rendering, because:
- not enough computing power
- it would negatively affect 4090 sales, which is Jensen's golden goose; they can't just release something more powerful and price it 20-30% less, or else they would cripple their own sales
- there will be an RTX 5080 Ti with around 14k shaders, and that one may be on par with the 4090

Performance-wise, from best:
RTX 5090
RTX 4090
RTX 5080 Ti (Super) with around 400 W TGP
RTX 5080
RTX 5070 Ti (Super)
RTX 5070
Posted on Reply
#290
oxrufiioxo
JustBenchingFG doesn't necessarily double your framerate, therefore I assume the new FG isn't going to quadruple it either, so whatever comparison you are trying to make from that screenshot is wrong.
I agree, though it is looking like, apples to apples, the 5080 probably isn't much faster than the 4080.
Posted on Reply
#291
TheinsanegamerN
chstamosHere's a more general question. Is there a point of GPU price inflation at which console makers start re-examining going outside, or developing their own custom graphics? They don't have to account for backwards compatibility, after all, and that's a big barrier to entry in discrete graphics that they wouldn't have to deal with. Or are the designs simply too complex for such a project to have any hope of succeeding?

I know it sounds like craziness, but Apple isn't doing all that badly with their Imagination Technologies-derived graphics silicon, after all.
The console makers are not paying GPU inflation. Console chips are, famously, very low-margin designs for chip makers, which is part of why Nvidia is happy to let AMD have the business.

It's also monstrously expensive; if Sony/MS had to design their own, neither one would be making any profit from consoles, even with software sales. Apple gets away with it because they sell more iPhones in 6 months than the Xbox Series X/S and PS5/Pro have sold combined over the ENTIRE generation, and that tech is also used in all their iPads and Macs.
Posted on Reply
#292
AusWolf
DaworaYou take this like a personal offence? Why?
It's AMD, not you! Don't hurt your feelings if someone says something bad about a tech company.
I'm not taking it personally. I'd just like to stay on topic. I am equally disappointed in the AMD keynote, but there is a place to discuss that, which isn't here. I have expectations of certain products, but I do not have feelings for either company. It rather looks like you have feelings for Nvidia, which you're trying to justify by convincing me. Believe me, it's pointless.
Dawora60 FPS is not smooth at all when you've been playing at 100+ FPS for years.
Of course you can't see the difference if you're using a 60 Hz monitor.
I'm on 144 Hz. Our perceptions are different. What's smooth for you might not be for me, and vice versa. I want stability in my performance, and I want low input lag with no graphical glitches. Whether it's at 60 or 100 or 200 FPS, I don't care. But unfortunately, when I only have 30 FPS, I can't make 60 out of it with FG without introducing other problems; that's why I don't like the tech.
DaworaBut can you just cool off and wait for reviews? We got your point already. OK?
You asked for it.
DaworaBecause Nvidia has the best GPUs and also the best features.
No need to use FG, but it's still there when needed.
AI is the future.
Believe that if it makes you feel better. Personally, I see the same games running on Nvidia and AMD cards (yes, I have both). The colour of the box doesn't matter. Price and performance do.
TheinsanegamerNThe console makers are not paying GPU inflation. Console chips are, famously, very low-margin designs for chip makers, which is part of why Nvidia is happy to let AMD have the business.
More because Nvidia doesn't do APUs (if you don't count low-performance designs with Arm cores, like in the Nintendo Switch).
Posted on Reply
#293
oxrufiioxo
Chrispy_It's slower. Twice as many fake frames, not twice as many frames per second.

The net result: input lag gets even worse, and input lag is the main reason people don't like fake frames in the first place. Not the only reason, but definitely the main one.

More seriously, of the thousand or so demanding titles from the last half decade, only a tiny, tiny handful (under 50) actually even support Nvidia's frame-gen.
I'm annoyed with how Nvidia portrayed the 50 series, but honestly the best announcement is that DLSS RR, DLSS SR, and DLAA are getting meaningful improvements, and that's coming to all RTX owners.
Posted on Reply
#294
Baba
Nvidia didn't show any raster gains, and AMD didn't even show their GPUs. I was telling people that AMD exited the GPU business. They'll argue that UDNA is coming. They killed off RDNA. They don't want to pour any money into graphics. You'll be gaming on their compute units. It might work, it might not, but they don't care about graphics, and they made it very clear.

I was expecting the 5080 on down to be similar to the Super update. Single-digit improvements. They focused on everything other than raster. Can't even buy any old stuff; shelves are clear at Micro Center. 5080, here I come. I can't believe how hard it is to replace my 6950 XT. The 7900 XTX is only a 50% gain; the 4080 is close to the same. I prefer to at least double my fps when I upgrade. This will be the saddest upgrade ever for me. I previously went from a 1060 to a 6950 XT; that's a 4-5x fps improvement. Are we reaching diminishing returns? The good news is that we won't have to upgrade as often with such measly gains.
Posted on Reply
#295
Dr. Dro
Vya DomusYou're not being serious when you're saying that you also believe a 5080 is going to be 30% faster than a 4090?
We'll have to wait and see, but I personally don't think it's impossible.
Posted on Reply
#296
AusWolf
JustBenchingI've read comments similar to yours a hundred times this past week. You are not doing your side any favors. Honestly, on the list of reasons why I'm not buying AMD GPUs, "obnoxious comments by the company's fans" is at the top.
If I limited my choices based on a few idiots and blind fans on an online forum, I wouldn't have a PC at all (let alone three). There are plenty of them in every camp.
Posted on Reply
#297
Dr. Dro
LittleBroThat's the thing - architectural-level improvements are non-existent. Blackwell is shrunken Ada on steroids. Brute force. Nvidia added as many new compute units as possible and tried to balance it power-wise.

You can tell from the 5090's specs that efficiency is also a problem now. The 4090 has 5,000 fewer shaders but also a much lower TGP than the 5090. Were there any significant architectural changes, it would not end up like that. Nvidia brute-forced everything towards so-called AI features (DLSS, FG). Jensen has already stated that this is the only way forward for a new stage of gaming. I have my doubts, though.

As for the new DLSS, please don't say that the 4000 series will be deprived of nothing and they basically just won't support something here, something there, and god knows where else as well. The RTX 4090 is surely capable (hardware-wise) of the new DLSS tech when the slower 5080 is capable (and anything below the 5080 as well). Or change my mind: give me one real reason why the 4090 would not be capable.

The RTX 5080 will not beat the RTX 4090 in native rendering, because:
- not enough computing power
- it would negatively affect 4090 sales, which is Jensen's golden goose; they can't just release something more powerful and price it 20-30% less, or else they would cripple their own sales
- there will be an RTX 5080 Ti with around 14k shaders, and that one may be on par with the 4090

Performance-wise, from best:
RTX 5090
RTX 4090
RTX 5080 Ti (Super) with around 400 W TGP
RTX 5080
RTX 5070 Ti (Super)
RTX 5070
Following this logic, there would never be a generational uplift over the previous halo part, yet that has been the case for the past few generations. It is possible, but personally I'm optimistic about at least a match. We'll have to wait and see. After all, it's pretty much what AMD is proposing with the 9070 XT: a leaner and meaner chip that will go toe to toe with their previous-generation flagship using fewer raw hardware resources.
Posted on Reply
#298
oxrufiioxo
JustBenchingAs much as I don't get all the hating on DLSS 4 etc., the fact of the matter is they are not actual frames (they do not affect the game engine), and therefore they just shouldn't be on a framerate graph/slide from Nvidia's marketing or from reviewers.

On the other hand, it's really hard to demonstrate what FG actually does, so what other way do you actually have besides putting them on a graph?
I have no issues with them showing how they've improved frame generation; personally, I think it's awesome that they are. I'm just not a fan of them omitting the actual apples-to-apples performance difference, especially when turning frame generation on increases latency at each step from 2x to 3x to 4x. I will say I'm impressed it isn't much higher after the first step, but until there are no noticeable artifacts and latency goes down at each step, it shouldn't be sold as extra performance.

I am happy that the core DLSS technologies are improving for all RTX owners; that's probably the best announcement, period.
Posted on Reply
#299
JustBenching
oxrufiioxoI have no issues with them showing how they've improved frame generation; personally, I think it's awesome that they are. I'm just not a fan of them omitting the actual apples-to-apples performance difference, especially when turning frame generation on increases latency at each step from 2x to 3x to 4x. I will say I'm impressed it isn't much higher after the first step, but until there are no noticeable artifacts and latency goes down at each step, it shouldn't be sold as extra performance.

I am happy that the core DLSS technologies are improving for all RTX owners; that's probably the best announcement, period.
There shouldn't be any artifacts unless you start pausing and going frame by frame. The sheer number of frames would make it impossible to detect anything out of the ordinary. The latency on FG is still a bit problematic, especially in games that have a problematic/laggy game engine.

The thing is, there is literally no other way to max out a 4K 240 Hz monitor without going with MFG. There isn't enough GPU or CPU power to do that in AAA titles, so MFG is very welcome. I wasn't really interested in upgrading, but MFG has me thinking.
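The 240 Hz arithmetic behind that is simple division (ignoring FG overhead and engine caps):

```python
target_hz = 240
for factor in (1, 2, 3, 4):
    print(f"{factor}X: ~{target_hz / factor:.0f} rendered fps needed for {target_hz} Hz")
```

At 4X you only need ~60 rendered fps to fill a 240 Hz panel, which no current GPU or CPU manages natively in heavy AAA titles at 4K.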
Posted on Reply
#300
oxrufiioxo
JustBenchingThere shouldn't be any artifacts unless you start pausing and going frame by frame. The sheer number of frames would make it impossible to detect anything out of the ordinary. The latency on FG is still a bit problematic, especially in games that have a problematic/laggy game engine.

The thing is, there is literally no other way to max out a 4K 240 Hz monitor without going with MFG. There isn't enough GPU or CPU power to do that in AAA titles, so MFG is very welcome. I wasn't really interested in upgrading, but MFG has me thinking.
Linus was easily picking out issues running on a 5090 in his video. Now, whether those issues are the same as or worse than 2x, we will have to wait for reviews.
Posted on Reply