
NVIDIA 2025 International CES Keynote: Liveblog

Oh, I personally don't care how they price it, but if it's just more of the same at a slight discount, we all know how that story ends...
Nvidia has more of the same at almost no discount. Is that better?
 
Are there any actual "no DLSS+FG shenanigans" performance estimates available? This is the only one I could find, from a 5090D vs 4090D slide. Everything else compares different versions of DLSS and FG. So this looks like what, 30-35% gen on gen?
 
Nvidia has more of the same at almost no discount. Is that better?

For me, probably not, but for 20-series and maybe 30-series owners, sure. We don't know how the 9070 will perform, but it's sitting next to a 4070 Ti/7900 XT on AMD's chart, so who knows what that means...

I agree, though. People are oddly super happy about the Nvidia announcement, which has only one benchmark showing a ~30% uplift, and not against the Super cards, only the older, more meh options. The 5070 Ti will probably be okay, and maybe the 5080 too thanks to zero competition at $999, but the 5070 is probably not going to be that great. Still, when the fastest RDNA4 card is only going to be in that ballpark performance class, this is what we get, I guess.
 
Ah, so my point is irrelevant. Sure then, have it your way: everything AMD does is fantastic, the 9070 XT is amazing, and pulling out of the presentation is not a bad thing because AMD is pulling a 4D chess move. Better?
Have you noticed that this is an Nvidia thread? You can have my honest opinion on the non-existent 9070 XT and AMD's non-existent presentation on it if you like, just pop over to the other thread.

Spoiler alert: I'm not as happy about AMD's lack of info as you think, far from it. But I don't see the world in red and green, and I know this isn't the place to talk about AMD, and vice versa.

So no, AMD isn't relevant here. There's no need to get defensive about it.

It's just the buzzword right now; everyone wants AI. People with iPhones and Galaxys think a random program telling you a joke is AI. Bottom line, though: Nvidia makes an insane amount of money, they bet on AI, it paid off, and now everyone else is playing catch-up... We can hate it all we want, but they are killing it, so much so that there are a ton of people super happy with the 50-series announcement even though they didn't show shit, just a single apples-to-apples benchmark... all because the three cheapest cards are affordable, which likely means the 5060 Ti and 5060 will be priced okay too, and those are the cards 80-90% of people probably care about, especially the 20-series people, or the 10-series people still hanging on to their cards...
I get it for data centres and such, but I don't know a single common person who gives a lick about AI. If Nvidia wants to ride the AI bandwagon hard with their data centre stuff, that's fair, but with a consumer card like the 5070, I don't see the point.

Nvidia has more of the same at almost no discount. Is that better?
Let's wait for reviews, imo. They might be giving you more of the same, they might be giving you a lot. Based on some useless FG performance data, there's no way to know. (although I have my suspicions)
 
I get it for data centres and such, but I don't know a single common person who gives a lick about AI. If Nvidia wants to ride the AI bandwagon hard with their data centre stuff, that's fair, but with a consumer card like the 5070, I don't see the point.

It's super weird, but regardless, these cards will be faster with a $50-200 lower starting MSRP, although I don't think they'll look as favorable against the Super cards... When Nvidia's number-one competitor is abandoning everything above the $600 mark, it's not like they have to be all that good, honestly, but AMD really needs to hit it with the 9070 XT.

Let's wait for reviews, imo. They might be giving you more of the same, they might be giving you a lot. Based on some useless FG performance data, there's no way to know. (although I have my suspicions)

Yeah, 30% more performance at each tier at a discount would actually be pretty good, but I don't think it will end up that good. Hopefully I'm wrong.

Are there any actual "no DLSS+FG shenanigans" performance estimates available? This is the only one I could find, from a 5090D vs 4090D slide. Everything else compares different versions of DLSS and FG. So this looks like what, 30-35% gen on gen?

That's it. Technically, the Plague Tale one is somewhat usable because both cards are using the same version of DLSS 3 in that one, but as far as apples-to-apples goes, all the cards performed about 30% better than what they replaced in FC6.

Whether we can actually trust those numbers is anyone's guess.
 
Are there any actual "no DLSS+FG shenanigans" performance estimates available? This is the only one I could find, from a 5090D vs 4090D slide. Everything else compares different versions of DLSS and FG. So this looks like what, 30-35% gen on gen?

If that's from any RTX 5090 vs RTX 4090 comparison, that increase is coming from pure rasterization, specifically from the extra ROPs: the RTX 5090 has 224 ROPs vs 176 on the RTX 4090.
In fact, most card designs are limited not by shader count or RT cores but by the pure rasterization hardware that all the shaders, texture units, and RT/Tensor cores are linked to.
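As a rough sanity check on that argument, the ROP ratio alone lands close to the uplift people are reading off the 5090D slide (a back-of-the-envelope sketch; the ~30-35% figure is eyeballed from NVIDIA's chart, not measured):

```python
# Back-of-the-envelope check: does the ROP increase alone roughly match the
# claimed raster uplift? The uplift figure is eyeballed from the slide.
rops_5090 = 224        # RTX 5090 ROPs
rops_4090 = 176        # RTX 4090 ROPs
claimed_uplift = 0.33  # rough mid-point of the "30-35%" read off the chart

rop_gain = rops_5090 / rops_4090 - 1
print(f"ROP increase: {rop_gain:.1%}")                   # ~27.3%
print(f"Eyeballed raster uplift: {claimed_uplift:.0%}")  # ~33%
```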
 
So no, AMD isn't relevant here
Suffice it to say I disagree, but in the interest of a smoother-flowing thread from here on out, I won't get into why. Too many threads have already turned to crap today, after all.
 
Based on some useless FG performance data, there's no way to know.
If they had something better, they would have shown it, even if it was with FG. I already pointed out in a previous comment that the CP2077 chart shows the 5080 being 2X faster, meaning that without the extra FG it's basically within 5-10% of the 4080 in that particular game, and remember, this is the best material they have.
 
If they had something better, they would have shown it, even if it was with FG. I already pointed out in a previous comment that the CP2077 chart shows the 5080 being 2X faster, meaning that without the extra FG it's basically within 5-10% of the 4080 in that particular game, and remember, this is the best material they have.

The first two benchmarks, Far Cry and A Plague Tale, are the only apples-to-apples ones, and they show 30-40%. For the rest, we have no idea how much DLSS 4 is adding on top... My guess is they're likely not that much faster than what they replace, but we're all guessing till W1z gets the cards.
 
Are there any actual "no DLSS+FG shenanigans"
View attachment 378763
It's pretty funny, because Far Cry 6 used to be an AMD-biased game if you asked many Nvidia fans. I wonder why they chose this rather old and not particularly demanding game as the only non-DLSS example.

For the rest, we have no idea how much DLSS 4 is adding on top...
The math isn't complicated: for each extra generated frame you get ~90% extra FPS. It can't be 3X FG, because the performance would be way lower, so it's 4X like in every other example.
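Spelling that reasoning out with rough numbers (the ~0.9x-per-generated-frame scaling and the 100 FPS baseline below are illustrative assumptions, not NVIDIA figures):

```python
# Rough FG scaling model: each generated frame adds ~0.9x of a rendered frame's FPS.
# All numbers here are illustrative assumptions, not measured data.
def fg_multiplier(mode: int, per_frame_gain: float = 0.9) -> float:
    """mode=2 means 2X FG (1 generated frame per rendered frame), mode=4 means 4X FG."""
    return 1 + (mode - 1) * per_frame_gain

base_4080 = 100.0                          # hypothetical native FPS for the 4080
total_4080 = base_4080 * fg_multiplier(2)  # 4080 is limited to 2X FG -> ~190 FPS
total_5080 = 2 * total_4080                # the CP2077 chart shows the 5080 bar at ~2X the 4080

# What native FPS would the 5080 need to hit that bar under 3X vs 4X FG?
for mode in (3, 4):
    implied_native = total_5080 / fg_multiplier(mode)
    print(f"{mode}X FG -> implied 5080 native FPS: {implied_native:.0f} "
          f"({implied_native / base_4080 - 1:+.0%} vs 4080)")
# Under 4X FG the implied native 5080 is only a few percent ahead of the 4080,
# which is the "within 5-10%" point made earlier in the thread.
```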
 
The math isn't complicated: for each extra generated frame you get ~90% extra FPS. It can't be 3X FG, because the performance would be way lower, so it's 4X like in every other example.

Yes, but we don't know how much of a hit turning it on takes; it's likely not free, as even vanilla DLSS SR has a cost to turn on. Also, every game scales differently according to their graph, some are 6x and others are 4x, so again only Far Cry is actually showing apples-to-apples no-DLSS performance.
 
Yes, but we don't know how much of a hit turning it on takes; it's likely not free
Realistically, it can't possibly be more expensive per frame than 2X FG is, but what you don't realize is that the larger the performance cost, the smaller the difference between it and the 4080 will be.

every game scales differently according to their graph, some are 6x and others are 4x, so again only Far Cry is actually showing apples-to-apples no-DLSS performance
Not when it comes to FG. They scale differently because they have varying degrees of RT; FG has a similar cost every time because its computation cost does not depend on the game.
 
Realistically, it can't possibly be more expensive per frame than 2X FG is, but what you don't realize is that the larger the performance cost, the smaller the difference between it and the 4080 will be.


Not when it comes to FG. They scale differently because they have varying degrees of RT; FG has a similar cost every time because its computation cost does not depend on the game.

A Plague Tale has DLSS 3.5 frame generation on and is a heavy-ish game, and the 5080 is 30% faster...

So multi-frame FG looks heavier, but we won't know till W1z gets the hardware, with about three weeks to go...
 
Nvidia is definitely a so-called "AI" company now.
Instead of vastly improved rasterization performance, we get more fake/guessed frames.
The RTX 5070 will reach the RTX 4090's performance only with the help of DLSS.

The RTX 5090 might be real progress, but at the cost of 25% more TGP. As for the other 5000s, I barely see any progress.

"Do you like my new jacket?" What a narcissistic, arrogant person; he basically laughs in the face of the gamers who helped his company rise.

Now, will the USA ban everything from the RTX 5070 and up because it can be (mis)used by China, just like the RTX 4090 and better?
 
"Do you like my new jacket?" What a narcissistic, arrogant person; he basically laughs in the face of the gamers who helped his company rise.
I wouldn't call him that; he was basically making fun of himself, as he is the "leather jacket man". Also, CES isn't a gaming show. Gamers didn't earn Nvidia billions of dollars over the past 4 years either; it's all about AI.
 
Posts like yours are the exact reason why that is happening.
Again, from my perspective, the threads have turned to crap for different reasons. I'll admit that being reactionary toward those I consider toxic trolls isn't exactly a moral high ground to be proud of; it's just... stooping to their level.
Do you like my new jacket?
It's literally him jovially taking part in the meme. It's surprising to me that anyone saw that and was offended by it.
 
Nvidia wisely estimated that $2,500 before taxes would turn into madness on the shelves, so it's "only" $2,000, which will still transform into something more, but not by that much. As I expected, the 5080 costs $1,000, because there is no way it can be much better than the 4080. The 5070 Ti is even cheaper than the 4070 Ti was, because Nvidia knows it is as much crap next to the *80s as the 4070 Ti Super was. All cards except the 5090 have inflated power consumption, especially the little 5070 at 250 W, oh my god, because it can't even match the 4070 Super otherwise. The 5080 looks the best, and the yet-to-be-announced 5060 will be even better because it will be a real *60, unlike the 4060 (really a 4050), and despite the 8GB it should be on par with the 3070 Ti or even better and will conquer Steam.
 
Did you folks like his jacket?
He is wearing the jacket like a bloody 12-year-old; the cuff is supposed to stop at the wrist. And if that is real alligator skin, then he is even more of a cnut. Another perfect example of a rich corpo that is out of touch with reality.
 
The only use case I can imagine is if your game stutters. But then, it's useless. I'm not interested in making 200 FPS out of 100, thank you, and I can't imagine why anyone is. Especially with the same input lag.
Do you need to say it 100 times?

You don't like it, that's fine; go and buy AMD again, just be happy.
Let others do what they want and let them be happy too.

I like free FPS when gaming at 4K, so that's fine for me if it looks good.

So the 5090 is barely 30% faster than the 4090 in non-Frame-Generation situations. I expected more of a jump; the 4090 vs 3090 Ti was a ~40% uplift.
The 5080 is 20-30% faster than the 4090

Wait for it. All data shown in the Nvidia keynote is FG-enabled fake shit. There's no raw performance comparison with the 40 series.


And above that it's pointless.
So over 60 FPS is pointless?
Do you still have a 60 Hz monitor and have never tested how smooth 100+ FPS at 144 Hz is?
 
Do you need to say it 100 times?
I've already moved on, but it looks like you want me to say it a 101st time as well.

You don't like it, that's fine; go and buy AMD again, just be happy.
It's got nothing to do with AMD. I don't like the technology in general, because the only time it doesn't work is when I actually need it. Whether it's on AMD or Nvidia, it doesn't matter.

I like free FPS when gaming at 4K, so that's fine for me if it looks good.
Then by all means, get your "free FPS" (there's no such thing, but that's another matter). My different opinion surely doesn't spoil your fun, does it? ;)

There. Can we move on now? :)

So over 60 FPS is pointless?
Do you still have a 60 Hz monitor and have never tested how smooth 100+ FPS at 144 Hz is?
I have a 144 Hz monitor, but I don't feel much difference over 50-60 FPS in 99% of games. FreeSync pretty much smooths everything out for me.
 
5070 vs 4070: The same lame planned-obsolescence 12 GB of VRAM (it doesn't matter if it's GDDR6X or GDDR7; once you run out of VRAM, you run out of VRAM) and a +25% increase in TDP/TGP. According to NV's own graphs, the "RT" performance (not the other bars where "DLSS" is added) increased by something like 20-30% (judging by eye, not counting pixels). So the performance-per-watt increase looks very low indeed (not unexpected, because almost the same manufacturing process is used).
A 5070 with only 12GB of VRAM is a big disappointment. At least give us a 24GB clamshell design (like you did for the 4060 Ti 16GB; NV, did you get too scared of your own 8GB planned obsolescence? *wink*) for people who want to run LLMs.

5070 Ti vs 4070 Ti: The VRAM increased from 12GB to 16GB (NV got too scared of their own planned obsolescence). Judging by this, a 6070 non-Ti is going to have 16GB. Of course, a switch to 3GB GDDR7 modules next year for their refresh cards would be nice. A 32GB clamshell version of the 5070 Ti would also be nice; 24GB is slowly becoming not enough (for LLM stuff).

5090: I give NV credit for offering a 512-bit, 32GB VRAM GeForce card, which many people may use, for tasks where workstation card features are not required, instead of the RTX 5000 Ada workstation card (256-bit GDDR6, clamshell, 576 GB/s), which is twice as expensive.

The "AI TOPS", of say the 5070, looks to be 988 INT4, instead of 4070's 466 INT8. 988 INT4 / 2 = 494 INT8, 494 INT8 [5070] / 466 INT8 [4070] = 1.06 -> 6% improvement, which reminds me of these 6%.

Having a 4070, it looks so far like I'm going to sit this one out and see if NV switches to 3GB GDDR7 modules for their GeForce 50 refresh cards next year (clamshelling/doubling the VRAM on certain cards this year would be even nicer, but it wouldn't be NV if they didn't wait until next year to merely switch to 3GB modules for only 1.5x the VRAM). The 4070 with its 12GB runs out of VRAM when enabling even the Medium Path Tracing (Full Ray Tracing) setting in Indiana Jones and the Great Circle.
 
We're trying to discuss the Nvidia keynote here, and just go "AMD this, AMD that, Nvidia is so much better". There's no need for that kind of penis measuring contest.
Are you taking this as a personal offence? Why?
It's AMD, not you! Don't get your feelings hurt when someone says something bad about a tech company.

Ah so you arguing your point to death is fine, but me replying is boring. Sure, then, have it your way, everything Nvidia does is for the consumer, frame generation is amazing and lying is not a bad thing if you make money on the hype generated by it. Better?
If FG works, then it's just a good thing: extra FPS for free, thanks!
Also, when the image quality is good, it's a win.

That's why I'm saying it's pointless. 60 FPS is pretty smooth in my books, I don't need to make 100 out of it.


My goal post never moved. Frame generation is useless, and frame generation enabled data is not valid for comparison. That's what I've been saying all along.
60 FPS is not smooth at all when you've been playing at 100+ FPS for years.
Of course you can't see the difference if you're using a 60 Hz monitor.

I don't mind if it sells because people love it. I just wanted to see an apples-to-apples comparison with the 40 series to make it fair.


I don't need to convince myself. I've seen it in action and it was either pointless or crap.
You do want it, like everyone else does.

But can you just cool off and wait for reviews? We got your point already. OK?

Gamers buy Nvidia because o_O o_O o_O , not because it has AI.
Because Nvidia has the best GPUs and also the best features...
There's no need to use FG, but it's still there when needed.
AI is the future.
 