
NVIDIA 2025 International CES Keynote: Liveblog

I mean, people who haven't learned they need to wait for reviews are a lost cause regardless... w1z isn't going to go for the MFG BS


These cards can go anywhere from great all the way down to meh depending on actual performance gains gen on gen.
I'm cautiously optimistic about the 5070. It caught my attention with its price. But yeah, reviews are key; we learned that with Ampere, Ada and RDNA 3.
 
Aye; Jensen is a master at obfuscation. As I said before, I hate the guy. Lying by omission is still lying.

The 5070 is a turd. It's only better at A.I workloads, but for rasterization workloads? lol, it's basically a 3070 Ti. :roll:

We have no idea. He did the same thing at the 40-series launch, showing every card being 2x-4x; they all ended up with decent gains, just with price hikes.


It definitely will beat a 3070 Ti; it just might not beat a 4070 by much...
 
I'm cautiously optimistic about the 5070. It caught my attention with its price. But yeah, reviews are key; we learned that with Ampere, Ada and RDNA 3.
550,- for a 12GB GPU with lots of core power in 2025? I've got a fire you can burn that money on, too. Both approaches will keep you warm for a short while.

I can't say I share the optimism here in any way, shape or form. I'm actually more optimistic that enough idiots will throw their Ada cards on the 2nd-hand market because they fell for the empty marketing just like you, because we are in complete stagnation territory between Ada and Blackwell. It's crystal clear; even Huang confirmed it by talking about DLSS 4 exclusively. I'll happily pick up a 4080 or a 4070 Ti Super at 650~700 sooner than I would even consider a poorly balanced Blackwell at 550,-.

That's the real upgrade path here: 2nd-hand last gen, as all the n00bs upgrade to the latest and greatest that didn't gain them anything.
 
550,- for a 12GB GPU with lots of core power in 2025? I've got a fire you can burn that money on, too. Both approaches will keep you warm for a short while.
You got a point there; I somehow missed this "small" detail. :ohwell:

Why did I assume that every midrange card anno 2025 has 16 GB? :confused: I guess I'm tired after work.
 
550,- for a 12GB GPU with lots of core power in 2025? I've got a fire you can burn that money on, too. Both approaches will keep you warm for a short while.

I mean, the 12GB card from Nvidia is still beating the 20GB one from AMD at 4K with RT, and while I don't necessarily disagree with you, if the 9070 sucks at RT and FSR4 is a bust, that 5070 is gonna start looking pretty good.

You got a point there; I somehow missed this "small" detail. :ohwell:

Why did I assume that every midrange card anno 2025 has 16 GB? :confused: I guess I'm tired after work.

Yeah, the 12GB is a huge bummer. I'd wait and see if they refresh it with an 18GB model...
 
I mean, the 12GB card from Nvidia is still beating the 20GB one from AMD at 4K with RT, and while I don't necessarily disagree with you, if the 9070 sucks at RT and FSR4 is a bust, that 5070 is gonna start looking pretty good.
Both cards will get stuck at RT anyway; but a few years down the line, the 20GB card will run ultra textures and max detail at similar FPS, while the 5070 will not. The 'today' performance is pretty irrelevant; all cards are fast enough for virtually anything, even in this perf segment.
 
We have no idea. He did the same thing at the 40-series launch, showing every card being 2x-4x; they all ended up with decent gains, just with price hikes.


It definitely will beat a 3070 Ti; it just might not beat a 4070 by much...

Of course it will, but not by much, that is, if it's not A.I-related work, meaning MFG, DLSS and so on. Watch and see. :)
 
Both cards will get stuck at RT anyway; but a few years down the line, the 20GB card will run ultra textures and max detail at similar FPS, while the 5070 will not. The 'today' performance is pretty irrelevant; all cards are fast enough.

They're both bad options for different reasons. I'd wait for the 3GB GDDR7 modules to release first.

Of course it will, but not by much, that is, if it's not A.I-related work, meaning MFG, DLSS and so on. Watch and see. :)

It would be pretty funny if it lost on raster; I just don't see it happening lol.
 
They're both bad options for different reasons. I'd wait for the 3GB GDDR7 modules to release first.



It would be pretty funny if it lost on raster; I just don't see it happening lol.

As I said, it won't, but it's going to be closer than people expect. ^_^
 
As I said, it won't, but it's going to be closer than people expect. ^_^

I think there is a real chance the 5070/5070 Ti are not very impressive vs the Super cards, but let's be real: if that was the case, they could have just kept selling Ada (which is likely cheaper to make) at a slight price cut and just released the 5090 for content creators.

They are also both 30% faster in Far Cry, so even if they are half that in a large selection of games, they'll be fine considering the competition.
 
I think there is a real chance the 5070/5070 Ti are not very impressive vs the Super cards, but let's be real: if that was the case, they could have just kept selling Ada (which is likely cheaper to make) at a slight price cut and just released the 5090 for content creators.

They are also both 30% faster in Far Cry, so even if they are half that in a large selection of games, they'll be fine considering the competition.

All I am saying is, this is comical:


RTX 3070 Ti vs RTX 5070 [spec sheet screenshots]


But yeah, of course the clock speeds/cache and so on will make the RTX 5070 faster, but just by how much is what I am wondering. The 5070 will obliterate it in A.I workloads, that is for sure.

When I owned my 3070 Ti, I used "Lossless Scaling", which you can buy from Steam. You could also get impressive FPS numbers if you enabled frame generation, but the 8GB VRAM never allowed it to be used for long periods. Anyways, erm, I am glad I got rid of my RTX 3070 Ti and hell no, I would not buy an RTX 5070, especially with 12GB VRAM. Why? Because if you start using those A.I features, you will run into that VRAM limit very fast and you will wish you still had a GTX 1070, because performance will drop way below that when you hit that VRAM ceiling (at 1440p and above).

I don't think I would want one for free; it would be just too much of a pain in the ass to manage in many scenarios, instead of just enjoying playing games.
 

All I am saying is, this is comical:


RTX 3070 Ti vs RTX 5070 [spec sheet screenshots]

But yeah, of course the clock speeds/cache and so on will make the RTX 5070 faster, but just by how much is what I am wondering. The 5070 will obliterate it in A.I workloads, that is for sure.

When I owned my 3070 Ti, I used "Lossless Scaling", which you can buy from Steam. You could also get impressive FPS numbers if you enabled frame generation, but the 8GB VRAM never allowed it to be used for long periods. Anyways, erm, I am glad I got rid of my RTX 3070 Ti and hell no, I would not buy an RTX 5070, especially with 12GB VRAM. Why? Because if you start using those A.I features, you will run into that VRAM limit very fast and you will wish you still had a GTX 1070, because performance will drop way below that when you hit that VRAM ceiling (at 1440p and above).

I don't think I would want one for free; it would be just too much of a pain in the ass to manage in many scenarios, instead of just enjoying playing games.

I view the $500-600 cards as 1080p cards as it is, so 12GB is fine, but I agree people should be buying 16GB cards in 2025 regardless of how fast this 12GB card is or isn't.
 
I view the $500-600 cards as 1080p cards as it is
I find that a very strange statement considering that the market still calls this layer "performance segment". 1080p isn't where performance is at / is needed these days.
 
I view the $500-600 cards as 1080p cards as it is, so 12GB is fine, but I agree people should be buying 16GB cards in 2025 regardless of how fast this 12GB card is or isn't.
No, 1440p+ cards are xx70 and up; the xx60 class and below is for 1080p and under. I will not accept it any other way, no matter the marketing or brainwashing spewed by that snake Jensen.
 
No, 1440p+ cards are xx70 and up; the xx60 class and below is for 1080p and under. I will not accept it any other way, no matter the marketing or brainwashing spewed by that snake Jensen.

I find that a very strange statement considering that the market still calls this layer "performance segment". 1080p isn't where performance is at / is needed these days.

I use a 4090 for 1440p; that's my minimum OK level of performance, just to give you context. A 4070 would be half that performance.

4080s would probably barely scrape by.

Everyone games differently though.
 
I use a 4090 for 1440p; that's my minimum OK level of performance, just to give you context.
How the hell is that even possible? :eek:

I'm on a 6750 XT and only now starting to find it slightly lacking at maximum detail. I know my expectations aren't the greatest in the gaming world, but yours must be through the roof.
 
How the hell is that even possible? :eek:

I'm on a 6750 XT and only now starting to find it slightly lacking at maximum detail. I know my expectations aren't the greatest in the gaming world, but yours must be through the roof.

There is no right or wrong way to game; if you're happy with the performance, that's all that matters... I might have to drop down a tier for my next upgrade and just be happy with it lol.
 
Maybe it's high time for another Intel-AMD cross-licensing agreement, this time in graphics. Intel could bring XeSS and ray tracing technologies to the table, seeing how useless AMD is at those, and the Radeon group could license some raster technologies back to Intel :)
 
Maybe it's high time for another Intel-AMD cross-licensing agreement, this time in graphics. Intel could bring XeSS and ray tracing technologies to the table, seeing how useless AMD is at those, and the Radeon group could license some raster technologies back to Intel :)
That's actually not a bad idea! :) (but psst... you're in the wrong thread) :ohwell:
 
Isn't the 9060 rumored to be an 8GB card? That will show them (Nvidia ain't any better in this department, but at least they have a reason for being stingy with VRAM)
Depends on pricing, but if it's over $300 it will likely use double-density GDDR6 to get capacity up to 16GB (rough math in the sketch below).

AMD is well aware that 8GB GPUs have been criticised by reviewers and developers alike for the last 2 years now. One of their only selling points during the RDNA2/RDNA3 generations was that they weren't as stingy with VRAM as Nvidia, so the cards would stay viable for longer despite a deficit in features, power efficiency, and RT performance.

I personally got rid of a 2060 6GB, a 3060 laptop (6GB), and a 3070 8GB long before they were too weak to run games, simply because they ran out of VRAM and I had to severely compromise on graphics settings at the relatively modest 1440p resolution. I'm not buying midrange $350+ GPUs to run at low settings. A PS5 was running those same games at pseudo-4K with better settings, and that whole console costs less than some of those GPUs alone.
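For anyone wondering how that capacity doubling works out, here's a back-of-envelope sketch. The 128-bit bus for the 9060 and the standard 2GB (16Gbit) GDDR6 chips are assumptions based on the rumours; the doubling would come from running two chips per 32-bit channel (clamshell mode):

```python
# Rough VRAM capacity from bus width.
# Assumptions: 128-bit bus (rumoured for the 9060), standard 2 GB (16 Gbit)
# GDDR6 chips, each with a 32-bit interface.

bus_width_bits = 128        # assumed, per rumours
chip_bus_bits = 32          # interface width per GDDR6 device
chip_capacity_gb = 2        # 16 Gbit chips

channels = bus_width_bits // chip_bus_bits      # 4 memory channels

standard = channels * chip_capacity_gb          # one chip per channel  -> 8 GB
clamshell = channels * 2 * chip_capacity_gb     # two chips per channel -> 16 GB

print(f"Standard:  {standard} GB")
print(f"Clamshell: {clamshell} GB")
```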
 
No, TOPS is INT4 for both products.
You mean NV is comparing apples to apples in this one? That would be nice (I guess since there is no fine print, it might be so). Then the AI TOPS would indeed be massively improved (+70% when adjusting for the power increase of +25% for the 5070 vs the 4070: 988 / (466 × 1.25)).
According to "nvidia-ada-gpu-architecture.pdf", the 4090 is:
Peak INT8 Tensor TOPS1 660.6/1321.22
Peak INT4 Tensor TOPS1 1321.2/2642.42
2. Effective TOPS / TFLOPS using the new Sparsity Feature
So it's either 1321 INT8 sparse or 1321 INT4 dense? Anyway, what matters more is that it's an apples-to-apples comparison.
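Just to spell out that arithmetic, here's a quick sketch using the figures quoted above (988 AI TOPS for the 5070, 466 for the 4070) and the assumed ~25% board power increase:

```python
# Power-adjusted AI TOPS uplift, 5070 vs 4070.
# The TOPS figures are as quoted above; the +25% board power increase
# is the assumption from the post, not an official spec.

tops_5070 = 988        # marketed AI TOPS, RTX 5070
tops_4070 = 466        # AI TOPS, RTX 4070
power_ratio = 1.25     # assumed 5070 power / 4070 power

raw_uplift = tops_5070 / tops_4070 - 1                        # ~+112%
adjusted_uplift = tops_5070 / (tops_4070 * power_ratio) - 1   # ~+70%

print(f"Raw uplift:            {raw_uplift:+.0%}")
print(f"Power-adjusted uplift: {adjusted_uplift:+.0%}")
```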
 
That's actually not a bad idea! :) (but psst... you're in the wrong thread) :ohwell:
Didn't AMD suggest that the 8800 XT / 9070 XT offered the RT performance of a 4080 at some point? I'm not sure what the source was for that rumour, and like all info from AMD, you don't trust it until it has been independently verified.
 
Didn't AMD suggest that the 8800 XT / 9070 XT offered the RT performance of a 4080 at some point? I'm not sure what the source was for that rumour, and like all info from AMD, you don't trust it until it has been independently verified.
I can't remember if it was RT or raster, but something like that. They did say that RDNA 4 has a totally new RT engine, though.
 
I can't remember if it was RT or raster, but something like that. They did say that RDNA 4 has a totally new RT engine, though.

Yes, that's what I heard too, perhaps their version of 2nd-generation RT cores? FSR 4.0 will also utilize A.I similar to DLSS according to the rumours, which will finally give great uplifts to image quality.

I won't forget that AMD gave us FSR without A.I, something I think was extremely awesome of them to do. Sure, it doesn't look as good in some scenarios, but it is a great option for non-RTX users. :love:
 
We can safely assume that it's going to be similar to what the PS5 Pro has: double the intersection perf and better BVH acceleration, plus better handling of divergent rays, so increased performance in harder RT workloads like GI and PT. Plus any other tech that Sony didn't disclose in detail.

 