
NVIDIA Tunes GeForce RTX 5080 GDDR7 Memory to 32 Gbps, RTX 5070 Launches at CES

No, the joke is that the 5070 has 6,400 shaders; my 3070 Ti has 6,144.

This thing will only have, what, maybe 15% more performance than mine even with the new memory?
The 4070 has 5,888 and it's already 15% faster than the 3070 Ti without GDDR7 or a clock speed increase.
By the way, that 3070 Ti is already dead at anything higher than 1080p, and 12 GB will soon go the same way.
If the 5070 has 12 GB, I'm not touching it. If these leaks are true, it smells of another mid-cycle refresh, similar to the 20/40 Super series.
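To put rough numbers on the shader argument, here's a quick back-of-the-envelope in Python; the 5070 count is from the leak, and this compares raw shader counts only, ignoring clocks, IPC, and memory:

# Raw shader-count comparison; the 5070 figure is leaked and unconfirmed.
shaders = {"RTX 3070 Ti": 6144, "RTX 4070": 5888, "RTX 5070 (leak)": 6400}
base = shaders["RTX 3070 Ti"]
for name, count in shaders.items():
    print(f"{name}: {count} shaders, {100 * (count / base - 1):+.1f}% vs 3070 Ti")

The 4070 has ~4% fewer shaders than the 3070 Ti yet beats it by ~15%, which is exactly the point: shader count alone predicts little without clocks and architecture.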

Won't the cores themselves be more powerful, though? Jensen did say in-house AI helped to develop Blackwell.
Wasn't that mentioned in the context of utilizing die space to its fullest? From what I understood, this method was just a way of coping with the diminishing returns of node shrinks.
 
Won't the cores themselves be more powerful, though? Jensen did say in-house AI helped to develop Blackwell. I think the 5000-series cards across the line are probably going to impress people, honestly; we will see early next year anyway. Am I buying one? No, I am content with my backlog of games and my 7900 XT. It will be interesting to see all this play out over the next couple of years, though, as AI helps the designs.
How is this related to the massive gap in shaders? And doesn't that undercut, even more strongly, the argument that there is plenty of development headroom left in GPUs?
 
16 GB VRAM for the 5080 and 12 GB VRAM for the 5070?
12 GB of VRAM on an $800-900 GPU (5070)?

Should have been:
24 GB + 384-bit for the 5080
16 GB + 256-bit for the 5070

I bet my ass Ngreedia will release 24 GB/384-bit as a 5080 Ti/Super later, and 20 GB/320-bit as a 5070 Ti/Super. So with the regular 5080 and 5070 you will be paying a lot for cards that are unable to run modern UE5 games at 4K at 60+ FPS natively. But hey, you'll have fake frames and distortions, and that will get you a nice FPS boost! Never mind that half of those frames will be fake ...

This is crazy. Vote with your wallets, dudes.
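For context on what those bus widths buy you, the peak-bandwidth arithmetic is simple; the 32 Gbps figure for the 5080 comes from this article, while 28 Gbps for the 5070 is just my assumption:

# Peak bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8.
def bandwidth_gb_s(data_rate_gbps, bus_bits):
    return data_rate_gbps * bus_bits / 8

configs = [
    ("Rumoured 5080: 32 Gbps x 256-bit", 32, 256),
    ("Wished-for 5080: 32 Gbps x 384-bit", 32, 384),
    ("Assumed 5070: 28 Gbps x 192-bit", 28, 192),
    ("Wished-for 5070: 28 Gbps x 256-bit", 28, 256),
]
for label, rate, bus in configs:
    print(f"{label}: {bandwidth_gb_s(rate, bus):.0f} GB/s")

For reference, the 4090's 21 Gbps GDDR6X on a 384-bit bus works out to ~1008 GB/s, so even at 32 Gbps a 256-bit 5080 only just edges past it.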
 
How is this related to the massive gap in shaders? And doesn't that undercut, even more strongly, the argument that there is plenty of development headroom left in GPUs?

Yes, it's criminal in my book: that is not a 5080, and definitely not a 5070. These cards could have been something amazing, pushing new boundaries; now all they do is leave an extremely bad taste in one's mouth. It seems as if we just can't move forward, permanently stuck with extremely poor performance/price tiers. This is deliberate sabotage of progress.
 
All jokes aside, I think we should wait for the formal announcement and reviews before passing definitive judgement, but acting like it's a BIGHUGE surprise that a company that effectively holds a monopoly on the discrete GPU market isn't rushing to provide top-tier value for customers is silly. But it is what it is. At least it has never been easier to just go "naaah, I'll save my shekels" and forgo an upgrade than it is nowadays… provided you're not one for modern AAA slop, so YMMV.
 
If the 5070 has 12 GB, I'm not touching it. If these leaks are true, it smells of another mid-cycle refresh, similar to the 20/40 Super series.
For sure, the 4070 Ti Super 16 GB surprised me the most. It's possibly the only well-balanced/positioned product in the Ada stack, albeit still somewhat too expensive. That's what the x80 should have been from the get-go.
 
I'm very curious to see if Nvidia skips the midrange below the 5070.

As for performance,

5070 = 4080
5080 = 4090
5090 = 30% higher than 4090
I wouldn't bet on the 5070 beating the 4080.

The 3080 beat out the 4070 in a lot of games. Then there's the whole 12 GB in a 70-class card in 2025 thing.

Bring on Battlemage.
 
somewhat too expensive
To put it extremely mildly. That's a true xx60 Ti for the price of an xx80 Ti. With the silliest name possible.

This will still sell great, because what else is there to buy?

Should have been:
24 GB + 384-bit for the 5080
16 GB + 256-bit for the 5070
In an ideal world, more like a 20 GB/320-bit 5070 and a 28 GB/448-bit 5080. And a 32 GB/512-bit 5090. But alas.
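Those pairings aren't arbitrary; capacity follows directly from bus width as long as each 32-bit channel carries one 2 GB (16 Gbit) GDDR7 module, the density shipping at launch. A minimal sketch:

# VRAM capacity implied by bus width, with one 2 GB GDDR7 module per 32-bit channel.
for bus_bits in (192, 256, 320, 384, 448, 512):
    channels = bus_bits // 32
    print(f"{bus_bits}-bit bus -> {channels} channels -> {channels * 2} GB")

Which is also why a 12 GB 5070 implies a 192-bit bus, and why denser 24 Gbit (3 GB) modules would be the only way to add capacity without widening the bus.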
 
To put it extremely mildly. That's a true xx60 Ti for the price of an xx80 Ti. With the silliest name possible.

This will still sell great, because what else is there to buy?
I wouldn't go that far; people have been playing that game to some real extremes, IMHO. What I find more interesting is how the products within a stack work concurrently, and both Ampere and Ada were a fcking mess in that sense. Blackwell seems to be making it worse: a 15K shader gap within a single product stack just screams 'ha ha, you're not getting the real good stuff unless you pay the maximum price'. It deliberately reflects badly on the entire stack below it, basically making everyone feel like a peasant unless they go for the real thing. An oppressive lack of VRAM is a similar middle finger. It's basically accelerating e-waste by making subpar products, and there's another slew of 12 GB midrange/high-end cards incoming.
 
It's a shame about the 5070's 12 GB; it's fine now, but in 1-2 years it will end up feeling like the 3070 8 GB all over again. Not interested in the pricing of the more expensive cards. As a happy 4070 owner I'll wait for the 6070 and try again. By then the PS6 will be coming out and there will be a good reason to see spec bumps in games.
 
people have been playing that game to some real extremes, IMHO.
What game?
Historically, products at 45±4% of the full die were named xx60 Ti and sold for $300+.
2018: $400+.
2022: doesn't exist; the 4070 Ti is a true xx60 non-Ti at $800+.
2024: appears to be $1,200 for a "5080."

Pandora's box has been opened, and it ain't gonna close. Enjoy.
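To illustrate with the current leak (Blackwell shader counts are unconfirmed, so treat this as a sketch, not fact):

# RTX 5080 as a fraction of the full flagship die, using leaked Blackwell counts.
full_gb202 = 24576   # rumoured full GB202, the 5090's die
rtx_5080 = 10752     # rumoured RTX 5080 (full GB203)
print(f"5080 as a share of the big die: {100 * rtx_5080 / full_gb202:.1f}%")

That lands at ~44%, squarely inside the historical xx60 Ti bracket above.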
 
If the gap is small, I'll go for a 4070 Ti.
I wonder if they'll lock DLSS 4 behind the 5xxx series.
 
What game?
Historically, products at 45±4% of the full die were named xx60 Ti and sold for $300+.
2018: $400+.
2022: doesn't exist; the 4070 Ti is a true xx60 non-Ti at $800+.
2024: appears to be $1,200 for a "5080."

Pandora's box has been opened, and it ain't gonna close. Enjoy.
Sure thing, but we need to appreciate the fact that the performance gap between those dies in a single stack was also a LOT smaller. People seem to forget that. You didn't play anything different on a 690 vs a 660. Or a 780 Ti. Not at a higher resolution either; it was all 1080p, maybe some medium-settings 1440p affair, perhaps 720p. Again, a gap between resolutions of no more than ~30%, whereas right now a single GPU stack caters to everything from 1080p to 4K.
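The pixel math backs that up; a trivial sketch, pure pixel counts and nothing else:

# Pixel counts of common resolutions, relative to 1080p.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080),
               "1440p": (2560, 1440), "4K": (3840, 2160)}
base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MP, {w * h / base:.2f}x 1080p")

A stack spanning 1080p to 4K has to cover a 4x spread in raw pixel load; the old all-1080p stacks never faced that.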
 
All of this suggests they likely cut down the on-die caches to cram in more CUs, as the new node doesn't increase transistor counts in a significant manner. Pretty stupid if true, but I guess there is not much else they can do.

It's pretty funny that they're literally doing the 12 GB 4080 thing again with a massively underpowered xx80 card, but this time around someone at Nvidia had more than two working brain cells and decided not to release two GPUs with the same name, so there's not going to be an outrage. But the strategy remains the same: sell less of a performance uplift at a higher price tag.
 
For sure, the 4070 Ti Super 16 GB surprised me the most. It's possibly the only well-balanced/positioned product in the Ada stack, albeit still somewhat too expensive. That's what the x80 should have been from the get-go.

I would have upgraded if it had been priced correctly.
 
Jensen is probably going to leave it to AI to design a GPU.

They are going to feed a load of blueprints into an off-grid, custom-built ChatGPT, then tell Alexa to get to work.
It will probably be the joke of the century. Hallucinating pixels; I can totally see it.
 
I'm very curious to see if Nvidia skips the midrange below the 5070.

As for performance,

5070 = 4080
5080 = 4090
5090 = 30% higher than 4090

I very much doubt a 5080 is as fast as a 4090 if there's effectively no increase in CUDA cores from 4080 > 4080S > 5080; wider bandwidth isn't gonna do much of anything if there isn't a massive IPC gain on the CUDA core side.

I fully expect a 5080 to be slower than the 4090 without the use of gimmick marketing like they used with Ada.

I'd like to be wrong for sure, but the consumer market needs a serious price correction. The fact that you get the 4060, a steaming turd in every respect, at a $300 price point is actually insane. I've been actively recommending that friends and family not invest in gaming PCs for the last year, and will continue to do so going forward.
 
Jensen is probably going to leave it to AI to design a GPU.

They are going to feed a load of blueprints into an off-grid, custom-built ChatGPT, then tell Alexa to get to work.
It will probably be the joke of the century. Hallucinating pixels; I can totally see it.

Retail AI upscaling: selling you a 1080p card at 4K-chip prices.
 
12 gigs on the 5070??? :roll: LMAO. I bet the 5070 won't even beat my poor man's 4K GPU, the 4070 Ti Super. Ngreedia at its best.
 
The gap between the 4070 and 4080 is much greater than in previous generations. I would be very surprised if the 5070 actually manages to match the 4080.
I agree.

Nvidia doesn't compete with itself anymore. New cards are slotting into different price points rather than competing at the same price point as last gen.

Now gamers have to go to the 'used' GeForce market to find the price point they want.

For example, "entry-level" RTX 4060 Ti 8 GB cards still go for $400+ USD. If you want to pay less, you can find the vanilla RTX 4060 8 GB at $300, which is outperformed by the RTX 2080 8 GB that goes for $200 on eBay.
 
I wouldn't bet on the 5070 beating the 4080.

The 3080 beat out the 4070 in a lot of games. Then there's the whole 12 GB in a 70-class card in 2025 thing.

Bring on Battlemage.
The 5070 might be more like the 4070 Ti Super.

I very much doubt a 5080 is as fast as a 4090 if there's effectively no increase in CUDA cores from 4080 > 4080S > 5080; wider bandwidth isn't gonna do much of anything if there isn't a massive IPC gain on the CUDA core side.

I fully expect a 5080 to be slower than the 4090 without the use of gimmick marketing like they used with Ada.

I'd like to be wrong for sure, but the consumer market needs a serious price correction. The fact that you get the 4060, a steaming turd in every respect, at a $300 price point is actually insane. I've been actively recommending that friends and family not invest in gaming PCs for the last year, and will continue to do so going forward.
The problem is always the performance gap. The difference between the 4080 and 4090 is around 30%. If the 5080 cannot match the 4090 in performance, then it will be less than 30% faster than the model it replaces. That is too small a difference, acting more like a CPU generational gain.
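The implied ceiling is easy to put numbers on; a tiny sketch, taking the ~30% gap as an assumption:

# If the 4090 leads the 4080 by ~30%, a 5080 that lands at or below the 4090
# caps its gen-on-gen uplift over the 4080 at that same ~30%.
GAP_4090_OVER_4080 = 0.30  # assumed average 4K gap

def uplift_vs_4080(share_of_4090):
    return (1 + GAP_4090_OVER_4080) * share_of_4090 - 1

for share in (0.90, 0.95, 1.00):
    print(f"5080 at {share:.0%} of a 4090 -> {uplift_vs_4080(share):+.1%} vs 4080")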
 
The 5070 might be more like the 4070 Ti Super.


The problem is always the performance gap. The difference between the 4080 and 4090 is around 30%. If the 5080 cannot match the 4090 in performance, then it will be less than 30% faster than the model it replaces. That is too small a difference, acting more like a CPU generational gain.
Which may actually give both AMD & Intel the opportunity to gain market share.

I am not going back to 12 GB cards.
 
A 22+ GB 5080 would have been perfect for gaming, IMHO.

These are very high-end GPUs, and sadly some games can already use more than 16 GB at 4K. Who knows what the future holds, with more UE5, Nanite, etc.?

A 4090 or 5090 for me next year, then...
 
I was hoping to buy the 5090 in Q4 2024. Guess that might have happened if AMD had stayed in the high-end GPU market.
 