Monday, January 6th 2025

NVIDIA 2025 International CES Keynote: Liveblog

NVIDIA kicks off the 2025 International CES with a bang. The company is expected to debut its new GeForce "Blackwell" RTX 5000 generation of gaming graphics cards. It is also expected to launch new technology such as neural rendering and DLSS 4. The company is also expected to highlight a new piece of silicon for Windows on Arm laptops, showcase the next generation of its Drive PX FSD hardware, and probably even talk about its next-generation "Blackwell Ultra" AI GPU and, if we're lucky, even namedrop "Rubin." Join us as we liveblog CEO Jensen Huang's keynote address.

02:22 UTC: The show is finally underway!
02:35 UTC: CTA president Gary Shapiro kicks off the show, introduces Jensen Huang.
02:46 UTC: "Tokens are the building blocks of AI"

02:46 UTC: "Do you like my jacket?"
02:47 UTC: NVIDIA recounts its progress all the way back to NV1 and UDA.
02:48 UTC: "CUDA was difficult to explain, it took 6 years to get the industry to like it"
02:50 UTC: "AI is coming home to GeForce". NVIDIA teases neural material and neural rendering. Rendered on "Blackwell"
02:55 UTC: Every single pixel is ray traced, thanks to AI rendering.
02:55 UTC: Here it is, the GeForce RTX 5090.
03:20 UTC: At least someone is pushing the limits for GPUs.
03:22 UTC: Incredible board design.
03:22 UTC: RTX 5070 matches RTX 4090 at $550.
03:24 UTC: Here's the lineup, available from January.
03:24 UTC: RTX 5070 Laptop starts at $1299.
03:24 UTC: "The future of computer graphics is neural rendering"
03:25 UTC: Laptops powered by RTX Blackwell: starting prices.
03:26 UTC: AI has come back to power GeForce.
03:28 UTC: Supposedly the Grace Blackwell NVLink72.
03:28 UTC: 1.4 ExaFLOPS.
03:32 UTC: NVIDIA very sneakily teased a Windows AI PC chip.

03:35 UTC: NVIDIA is teaching generative AI basic physics. NVIDIA Cosmos, a world foundation model.
03:41 UTC: NVIDIA Cosmos is trained on 20 million hours of video.

03:43 UTC: Cosmos is open-licensed on GitHub.

03:52 UTC: NVIDIA onboards Toyota for its next-generation EV with full self-driving.

03:53 UTC: NVIDIA unveils Thor Blackwell robotics processor.
03:53 UTC: Thor has 20x the processing capability of Orin.

03:54 UTC: CUDA is now a functionally safe computing platform thanks to its automotive certifications.
04:01 UTC: NVIDIA brought a dozen humanoid robots to the stage.

04:07 UTC: Project DIGITS is a shrunken-down AI supercomputer.
04:08 UTC: NVIDIA GB110 "Grace-Blackwell" chip powers DIGITS.

446 Comments on NVIDIA 2025 International CES Keynote: Liveblog

#226
Dristun
Are there any actual "no DLSS+FG shenanigans" performance estimations available? This is the only one I could find, from a 5090D vs 4090D slide. Everything else compares different versions of DLSS and FG. So this looks like what, 30-35% gen on gen?
Posted on Reply
#227
oxrufiioxo
Vya DomusNvidia has more of the same at almost no discount. Is that better ?
For me, probably not, but for 20-series and maybe 30-series owners, sure. We don't know how the 9070 will perform, but it's sitting next to a 4070 Ti / 7900 XT on AMD's chart, so who knows what that means.....

I agree, though. People are oddly super happy about the Nvidia announcement, which only has one benchmark showing a 30% uplift, and not against the Super cards, only the older, more meh options. So the 5070 Ti will probably be okay, and maybe the 5080 too, due to zero competition at $999, but the 5070 is probably not going to be that great. Then again, when the fastest RDNA4 card is just going to be ballpark in that performance class, that's what we get, I guess.
Posted on Reply
#228
AusWolf
wolfAh so my point is irrelevant. Sure, then, have it your way, everything AMD does is fantastic, the 9070XT is amazing and pulling out of the presentation is not a bad thing because AMD is pulling a 4D chess move. Better?
Have you noticed that this is an Nvidia thread? You can have my honest opinion on the non-existent 9070 XT and AMD's non-existent presentation on it if you like, just pop over to the other thread.

Spoiler alert: I'm not as happy about AMD's lack of info as you think, far from it. But I don't see the world in red and green, and I know this isn't the place to talk about AMD, and vice versa.

So no, AMD isn't relevant here. There's no need to get defensive about it.
oxrufiioxoIt's just the buzzword right now; everyone wants AI. People with iPhones and Galaxys think a random program telling you a joke is AI. Bottom line, though: Nvidia makes an insane amount of money, they bet on AI and it paid off, and now everyone else is playing catch-up.... We can hate all we want, but they are killing it, so much so that there are a ton of people super happy with the 50-series announcement even though they didn't show shit.... just a singular apples-to-apples benchmark..... All because the three cheapest cards are affordable, which likely means the 5060 Ti and 5060 will be priced okay too, and those are the cards 80-90% of people care about, especially the 20-series people, or the 10-series people still hanging on to their cards....
I get it for data centres and such, but I don't know a single common person who gives a lick about AI. If Nvidia wants to ride the AI bandwagon hard with their data centre stuff, that's fair, but with a consumer card like the 5070, I don't see the point.
Vya DomusNvidia has more of the same at almost no discount. Is that better ?
Let's wait for reviews, imo. They might be giving you more of the same, they might be giving you a lot. Based on some useless FG performance data, there's no way to know. (although I have my suspicions)
Posted on Reply
#229
oxrufiioxo
AusWolfI get it for data centres and such, but I don't know a single common person who gives a lick about AI. If Nvidia wants to ride the AI bandwagon hard with their data centre stuff, that's fair, but with a consumer card like the 5070, I don't see the point.
It's super weird, but regardless, these cards will be faster with a $50-200 lower starting MSRP, although I don't think they will look as favorable vs. the Super cards.... When Nvidia's number one competitor is abandoning anything above the $600 mark, it's not like they have to be all that good, honestly, but AMD really needs to hit it with the 9070 XT.
AusWolfLet's wait for reviews, imo. They might be giving you more of the same, they might be giving you a lot. Based on some useless FG performance data, there's no way to know. (although I have my suspicions)
Yeah, 30% more performance at each tier at a discount would actually be pretty good, but I don't think it will end up that good. Hopefully I'm wrong.
DristunAre there any actual "no DLSS+FG shenanigans" performance estimations available? This is the only one I could find, from a 5090D vs 4090D slide. Everything else compares different versions of DLSS and FG. So this looks like what, 30-35% gen on gen?
That's it. Technically, the Plague Tale one is somewhat usable because both cards are using the same version of DLSS 3 in that one, but as far as apples-to-apples goes, all cards performed about 30% better than what they replaced in FC6.

Whether we can actually trust those numbers is anyone's guess.
Posted on Reply
#230
DemonicRyzen666
DristunAre there any actual "no DLSS+FG shenanigans" performance estimations available? This is the only one I could find, from a 5090D vs 4090D slide. Everything else compares different versions of DLSS and FG. So this looks like what, 30-35% gen on gen?
If that's from any RTX 5090 vs. RTX 4090 comparison, that increase is a pure rasterization uplift coming from the increased ROP count: the RTX 5090 has 224 ROPs vs. 176 on the RTX 4090.
In fact, most card designs are limited not by shader count or RT cores but by the pure rasterization hardware that all the shaders, texture units, and RT/Tensor cores are linked to.
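The ROP arithmetic above can be sketched quickly; this is a simplification that assumes raster throughput scales linearly with ROP count (it rarely does exactly), using the spec numbers quoted in the post:

```python
# ROP counts quoted above for each card
rops_5090 = 224
rops_4090 = 176

# If raster throughput scaled purely with ROP count (a simplification),
# the expected gen-on-gen uplift would be:
uplift = rops_5090 / rops_4090 - 1
print(f"{uplift:.0%}")  # prints "27%", in line with the ~30% slide figure
```

Which lands close to the 30-35% figure people are reading off the Far Cry 6 slide.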
Posted on Reply
#231
wolf
Better Than Native
AusWolfSo no, AMD isn't relevant here
Suffice to say I disagree, but in the interest of a smoother flowing thread from here on out, I won't get into why. Too many threads already turned to crap today after all.
Posted on Reply
#232
Vya Domus
AusWolfBased on some useless FG performance data, there's no way to know.
If they had something better they would have shown it, even if it was with FG. I already pointed out in a previous comment that the CP2077 chart shows the 5080 being 2X faster, meaning that without the extra FG it's basically within 5-10% of the 4080 in that particular game, and remember, this is the best material they have.
Posted on Reply
#233
oxrufiioxo
Vya DomusIf they had something better they would have showed it, even if it was with FG. I already pointed out in a previous comment that the CP2077 chart shows the 5080 being 2X faster, meaning that without the extra FG it's basically within 5-10% of the 4080 in that particular game and remember this is the best material they have.
The first two benchmarks, Far Cry and A Plague Tale, are the only apples-to-apples ones, and they show 30-40%. For the rest, we have no idea how much DLSS 4 is adding on top... My guess is they are likely not that much faster than what they replace, but we are all guessing till W1zzard gets the cards.
Posted on Reply
#234
Vya Domus
DristunAre there any actual "no DLSS+FG shenanigans"
It's pretty funny, because Far Cry 6 used to be an AMD-biased game if you asked many Nvidia fans. I wonder why they chose this rather old and not particularly demanding game as the only non-DLSS example.
oxrufiioxoThe rest we have no idea how much DLSS4 is adding on top...
The math isn't complicated: for each extra frame you get ~90% extra FPS. It can't be 3X FG because the performance would be way lower, so it's 4X, like in every other example.
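That back-of-the-envelope can be written out; a minimal sketch, assuming the poster's figure that each generated frame contributes ~90% of a native frame's worth of FPS (the true overhead varies per game and is unknown until reviews):

```python
def fg_fps(base_fps, multiplier, per_frame_gain=0.9):
    """Estimate displayed FPS with frame generation: one rendered frame
    plus (multiplier - 1) generated frames, each worth ~90% of a frame."""
    extra_frames = (multiplier - 1) * per_frame_gain
    return base_fps * (1 + extra_frames)

# With 4X FG, 60 natively rendered FPS would read on a chart as roughly:
print(round(fg_fps(60, 4)))  # prints 222
```

So a ~3.7x chart multiplier is consistent with 4X FG, while 3X FG would top out around 2.8x, which is the reasoning in the post.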
Posted on Reply
#235
oxrufiioxo
Vya DomusThe math isn't complicated, for each extra frame you get ~90% extra FPS, it can't be 3X FG because the performance would be way lower, so it's 4X like in every other example.
Yes, but we don't know how much of a hit turning it on incurs; it's likely not free, as even vanilla DLSS SR has a cost to turn on. Also, every game scales differently according to their graph: some are 6x, others are 4x, so again, only Far Cry is actually showing apples-to-apples, no-DLSS performance.
Posted on Reply
#236
Chomiq
igormpDid you folks like his jacket?
I didn't. Looks like it should come with a purse to match.
Posted on Reply
#237
Vya Domus
oxrufiioxoYes but we don't know how much of a hit turning it on does it's likely not a free
Realistically it can't possibly be more expensive per frame than 2X FG is, but what you don't realize is that the larger the performance cost, the smaller the difference between it and the 4080 will be.
oxrufiioxoevery game scales different according to their graph some are 6x others are 4x so again only Farcry is actually showing apples to apples no dlss performance
Not when it comes to FG. They scale differently because they have varying degrees of RT; FG has a similar cost every time because its computation cost does not depend on the game.
Posted on Reply
#238
oxrufiioxo
Vya DomusRealistically it can't possibly be more expensive per frame than 2X FG is but what you don't realize is that the larger the performance cost it has the smaller the difference between it and the 4080 will be.


Not when it comes to FG, they scale differently because they have varying degrees of RT, the FG has a similar cost every time because its computation cost does not depend on the game.
A Plague Tale has DLSS 3.5 frame generation on and is a heavy-ish game, and the 5080 is 30% faster....

So multi-frame FG looks heavier, but we won't know till W1zzard gets the hardware in three weeks...
Posted on Reply
#239
LittleBro
Nvidia is definitely a so-called "AI" company now.
Instead of vastly improved rasterization performance, we get more fake/guessed frames.
The RTX 5070 will reach the RTX 4090's performance only with the help of DLSS.

The RTX 5090 might be real progress, but at the cost of 25% more TGP. As for the other 5000s, I barely see progress.

"Do you like my new jacket?" What a narcissistic, arrogant person; he basically laughs in the face of the gamers who helped his company rise.

Now, will the USA ban everything from the RTX 5070 up because it can be (mis)used by China, just like the RTX 4090 and better?
Posted on Reply
#240
Chomiq
LittleBroDo you like my new jacket? What a narcisstic arrogant person, basically laughs in face of gamers who helped his company to raise.
I wouldn't call him that; he was basically making fun of himself, as he is the "leather jacket man". Also, CES isn't a gaming show. Gamers also didn't earn Nvidia billions of dollars in the past 4 years; it's all about AI.
Posted on Reply
#241
wolf
Better Than Native
AusWolfPosts like yours are the exact reason why that is happening.
Again, from my perspective, threads have turned to crap for different reasons. I'll admit being reactionary to those I consider toxic trolls isn't exactly a moral high ground to be proud of; it's just... stooping to their level.
LittleBroDo you like my new jacket?
It's literally him jovially taking part in the meme. It's surprising to me anyone saw that and was offended by it.
Posted on Reply
#242
L'Eliminateur
igormpDid you folks like his jacket?
no... he looks like he's ready to go to epstein island for a diddy party....

the older jacket was a classic; this one screams nouveau riche pimp
Posted on Reply
#243
scooze
Nvidia wisely estimated that $2,500 before taxes would turn into madness on the shelves, so it's ONLY $2,000, which will still transform into something more, but not by that much. As I expected, the 5080 costs $1,000, because there is no way it can be much better than the 4080. The 5070 Ti is even cheaper than the 4070 Ti, because Nvidia knows it is as much crap as the 4070 Ti Super compared to the *80s. All cards except the 5090 have inflated power consumption, especially the little 5070 at 250 W, oh my god, because it can't handle or even match the 4070 Super otherwise. The 5080 looks the best, and the yet-to-be-announced 5060 will be even better because it will be a real *60, unlike the 4060 (really a 4050), and despite the 8GB, it should be on par with the 3070 Ti or even better, and it will conquer Steam.
Posted on Reply
#244
b1k3rdude
igormpDid you folks like his jacket?
He is wearing the jacket like a bloody 12-year-old; the cuff is supposed to stop at the wrist. And if that is real alligator skin, then he is even more of a cnut. Another perfect example of a rich corpo that is out of touch with reality.
Posted on Reply
#245
Dawora
AusWolfThe only use case I can imagine is if your game stutters. But then, it's useless. I'm not interested in making 200 FPS out of 100, thank you, and I can't imagine why anyone is. Especially with the same input lag.
Do u need to tell it 100 times?

U dont like it, so its fine; u go and buy AMD again, just be happy.
Let others do what they want and let others be happy also.

I like free FPS when gaming at 4K, so thats fine for me if it looks good.
3x0So 5090 is barely 30% faster than 4090 in non Frame Generation situations. Expected more of a jump, 4090 vs 3090Ti was ~40% uplift.
5080 is 20-30% faster than 4090
AusWolfWait for it. All data shown in the Nvidia keynote is FG-enabled fake shit. There's no raw performance comparison with the 40 series.


And above that it's pointless.
So over 60fps is pointless?
U still have a 60Hz monitor and never tested how smooth 100+fps at 144Hz is?
Posted on Reply
#246
AusWolf
DaworaDo u need to tell it 100times?
I've already moved on, but it looks like you want me to tell it a 101st time as well.
DaworaU dont like it so its fine, u go and buy Amd again, just be happy.
It's got nothing to do with AMD. I don't like the technology in general, because the only time it doesn't work is when I actually need it. Whether it's on AMD or Nvidia, it doesn't matter.
Daworai like free FPS when gaming 4K, so thats fine for me if it looks good.
Then by all means, get your "free FPS" (there's no such thing, but that's another matter). My different opinion surely doesn't spoil your fun, does it? ;)

There. Can we move on now? :)
DaworaSo over 60fps is pointless?
U have 60Hz monitor still and never tested how smooth +100fps 144hz is?
I have a 144 Hz monitor, but I don't feel much difference over 50-60 in 99% of games. Freesync pretty much smooths everything out for me.
Posted on Reply
#247
10tothemin9volts
5070 vs 4070: Same lame planned obsolescence 12 GB VRAM (it doesn't matter if it's GDDR6X or GDDR7, once u run out of VRAM, u run out of VRAM) and +25% increased TDP/TGP. According to NV's own graphs, the "RT" (not the other ones where "DLSS" is added) performance increased by like 20-30% (looking by eye, not counting the pixels). So the power efficiency performance increase looks very low indeed (not unexpected because almost the same manufacturing process is used).
5070 with only 12GB VRAM is a big disappointment. At least give us a 24GB clamshell design (like you did for the 4060 Ti 16GB; NV, did you get too scared of your 8GB planned obsolescence? *wink*) for people who want to run LLMs.

5070 Ti vs 4070 Ti: Increased the VRAM from 12GB to 16GB (NV got too scared of their planned obsolescence). Judging by this, a 6070 non-Ti is going to have 16GB. Of course, a switch to 3GB GDDR7 modules next year for their refresh cards would be nice. A 32GB VRAM clamshell version of the 5070 Ti would also be nice; 24GB is slowly becoming not enough (for LLM stuff).

5090: I give NV credit for offering a 512-bit, 32GB VRAM GeForce card, which many people may use (for tasks where workstation card features are not required) instead of the RTX 5000 Ada workstation card (256-bit GDDR6, clamshell, 576 GiB/s), which is twice as expensive.

The "AI TOPS" of, say, the 5070 looks to be 988 INT4, versus the 4070's 466 INT8. 988 INT4 / 2 = 494 INT8; 494 INT8 [5070] / 466 INT8 [4070] = 1.06 -> a 6% improvement, which reminds me of these 6%.
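The normalization above, written out as a quick sketch; note it rests on the poster's assumption that the 5070 figure is INT4 while the 4070 figure is INT8 (a reply below disputes that premise, saying both are INT4):

```python
# Marketing "AI TOPS" figures quoted in the post
tops_5070_int4 = 988
tops_4070_int8 = 466

# INT4 throughput is typically 2x INT8 on the same hardware,
# so halve the INT4 number to compare like for like
tops_5070_int8 = tops_5070_int4 / 2
ratio = tops_5070_int8 / tops_4070_int8
print(f"{ratio - 1:.0%}")  # prints "6%"
```

If the two figures are instead the same precision, the comparison collapses to 988/466, roughly 2.1x on paper.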

Having a 4070, so far it looks like I'm going to sit this one out and see if NV switches to 3GB GDDR7 modules for their GeForce 50 refresh cards next year (although clamshelling/doubling the VRAM this year for certain cards would be even nicer; then again, it wouldn't be NV if they only switched to 3GB modules next year, for only 1.5x the VRAM). The 4070 with its 12GB runs out of VRAM when enabling even the Medium Path Tracing (Full Ray Tracing) setting in Indiana Jones and the Great Circle.
Posted on Reply
#248
Dawora
AusWolfWe're trying to discuss the Nvidia keynote here, and just go "AMD this, AMD that, Nvidia is so much better". There's no need for that kind of penis measuring contest.
U take this like a personal offence? Why?
Its AMD, not you! Dont hurt ur feelings if someone says something bad about a tech company.
AusWolfAh so you arguing your point to death is fine, but me replying is boring. Sure, then, have it your way, everything Nvidia does is for the consumer, frame generation is amazing and lying is not a bad thing if you make money on the hype generated by it. Better?
If FG is working, then its just a good thing: extra FPS for free, thanks!
Also, when image quality is good, then its a win.
AusWolfThat's why I'm saying it's pointless. 60 FPS is pretty smooth in my books, I don't need to make 100 out of it.


My goal post never moved. Frame generation is useless, and frame generation enabled data is not valid for comparison. That's what I've been saying all along.
60 FPS is not smooth at all when you've been playing at 100+fps for years.
OFC u cant see the difference if u are using a 60Hz monitor.
AusWolfI don't mind if it sells because people love it. I just wanted to see apples-to-apples comparison with the 40 series to make it fair.


I don't need to convince myself. I've seen it in work and it was either pointless or crap.
U want it too, like everyone.

But can u just cool off and wait for reviews? We got ur point already. Ok?
AusWolfGamers buy Nvidia because o_O o_O o_O , not because it has AI.
Because Nvidia has the best GPUs and also the best features..
No need to use FG, but its still there when needed.
AI is the future.
Posted on Reply
#249
igormp
10tothemin9voltsThe "AI TOPS", of say the 5070, looks to be 988 INT4, instead of 4070's 466 INT8. 988 INT4 / 2 = 494 INT8, 494 INT8 [5070] / 466 INT8 [4070] = 1.06 -> 6% improvement, which reminds me of these 6%.
No, TOPS is INT4 for both products.
Posted on Reply
#250
LittleBro
Daworai like free FPS when gaming 4K, so thats fine for me if it looks good.
So it's perfectly fine for you when 75% of these FPS show something inaccurate, unreal, approximated?
You are okay that you paid so much for your GPU and this is what you get?
Dawora5080 is 20-30% faster than 4090
Is this a joke? How on Earth could the 5080, with 60% of the 4090's processing units, be 20-30% faster in native rendering?
Only with FG. Now imagine what would happen if the RTX 4090 supported the newest generation of FG.

www.techpowerup.com/gpu-specs/geforce-rtx-4090.c3889
www.techpowerup.com/gpu-specs/geforce-rtx-5080.c4217
Posted on Reply