Wednesday, October 9th 2024

NVIDIA Tunes GeForce RTX 5080 GDDR7 Memory to 32 Gbps, RTX 5070 Launches at CES

NVIDIA is gearing up for an exciting showcase at CES 2025, where its CEO, Jensen Huang, will take the stage and, hopefully, talk about future "Blackwell" products. According to Wccftech's sources, the anticipated GeForce RTX 5090, RTX 5080, and RTX 5070 graphics cards should arrive at CES 2025 in January. The flagship RTX 5090 is rumored to come equipped with 32 GB of GDDR7 memory running at 28 Gbps. Meanwhile, the RTX 5080 looks very interesting, with reports pointing to 16 GB of GDDR7 memory running at 32 Gbps. Earlier information suggested the RTX 5080 would feature 28 Gbps GDDR7 memory, but the newest rumors indicate we are in for a surprise: the massive gap in compute cores between the RTX 5090 and RTX 5080 will be filled... with faster memory.
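
For context, peak memory bandwidth follows directly from pin speed and bus width. A quick sanity check of what these rumored speeds would imply (the 512-bit and 256-bit bus widths below come from the same leaks and are not confirmed):

```python
# Peak bandwidth (GB/s) = pin speed (Gbps) x bus width (bits) / 8 bits per byte.
def bandwidth_gb_s(pin_speed_gbps: float, bus_width_bits: int) -> float:
    return pin_speed_gbps * bus_width_bits / 8

print(bandwidth_gb_s(28.0, 512))  # RTX 5090 (rumored): 1792 GB/s
print(bandwidth_gb_s(32.0, 256))  # RTX 5080 (rumored): 1024 GB/s
print(bandwidth_gb_s(22.4, 256))  # RTX 4080, for comparison: 716.8 GB/s
```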

The more budget-friendly RTX 5070 is also set for a CES debut, featuring 12 GB of memory. This card aims to deliver solid performance for gamers who want high-quality graphics without breaking the bank, targeting the mid-range segment. We are very curious about the pricing of these models and how they will fit into the current market. As anticipation builds for CES 2025, we are eager to see how these products will impact gaming experiences and creative workflows in the coming year. Stay tuned for more updates as the event approaches!
Sources: Wccftech, via VideoCardz

112 Comments on NVIDIA Tunes GeForce RTX 5080 GDDR7 Memory to 32 Gbps, RTX 5070 Launches at CES

#26
Vayra86
Space Lynx: Won't the cores themselves be more powerful, though? Jensen did say in-house AI helped to develop Blackwell. I think the 5000-series cards across the line are probably going to impress people, honestly; we will see early next year anyway. Am I buying one? No, I am content with my backlog of games and my 7900 XT. It will be interesting to see all this play out over the next couple of years, though, as AI helps the designs.
How is this related to the massive gap in shaders? And doesn't that even more strongly destroy the argument that there is more than enough development space in GPUs?
Posted on Reply
#27
LittleBro
16 GB VRAM for 5080 and 12 GB VRAM for 5070?
12 GB VRAM for an $800-900 GPU (5070)?

Should have been:
24 GB + 384-bit for 5080
16 GB + 256-bit for 5070

I bet my ass Ngreedia will release a 24 GB/384-bit card as the 5080 Ti/Super later, and a 20 GB/320-bit card as the 5070 Ti/Super. So, with the regular 5080 and 5070, you will be paying a lot for cards that are unable to run modern UE5 games at 4K at 60+ FPS at native resolution. But hey, you'll have fake frames and distortions, and that will get you a nice FPS boost! Never mind that half of those frames will be fake ...

This is crazy. Vote with your wallets, dudes.
Posted on Reply
#28
Daven
AnotherReader: The gap between the 4070 and 4080 is much greater than in previous generations. I would be very surprised if the 5070 actually manages to match the 4080.
The 5070 might be around the 4070 Ti Super in that case.
Posted on Reply
#29
Legacy-ZA
Vayra86: How is this related to the massive gap in shaders? And doesn't that even more strongly destroy the argument that there is more than enough development space in GPUs?
Yes, it's criminal in my book: that is not a 5080, and definitely not a 5070. These cards could have been something amazing, pushing new boundaries; now all they do is leave an extremely bad taste in one's mouth. It seems as if we just can't move forward, permanently stuck with extremely poor performance/price tiers. This is deliberate sabotage of progress.
Posted on Reply
#30
Onasi
All jokes aside, I think we should wait for the formal announcement and reviews before passing definitive judgement, but acting like it's a BIGHUGE surprise that a company that effectively holds a monopoly on the discrete GPU market isn't rushing to provide top-tier value for customers is silly. But it is what it is. At least it has never been easier than it is nowadays to just go "naaah, I'll save my shekels" and forgo an upgrade… provided you're not one for modern AAA slop, so YMMV.
Posted on Reply
#31
Vayra86
Zazigalka: If the 5070 has 12 GB, I'm not touching it. If these leaks are true, it smells of another mid-cycle refresh, similar to the 20/40 Super series.
For sure, the 4070 Ti Super 16 GB surprised me the most. It's possibly the only well-balanced/positioned product in the Ada stack, albeit still somewhat too expensive. That's what the x80 should have been from the get-go.
Posted on Reply
#32
SSGBryan
Daven: I'm very curious to see if Nvidia skips the mid-range below the 5070.

As for performance,

5070 = 4080
5080 = 4090
5090 = 30% higher than 4090
I wouldn't bet on the 5070 beating the 4080.

The 3080 beat out the 4070 in a lot of games. Then there is the whole 12 GB in a 70-class card in 2025 thing.

Bring on Battlemage.
Posted on Reply
#33
Macro Device
Vayra86: somewhat too expensive
To put it extremely mildly. That's a true xx60 Ti for the price of an xx80 Ti, with the silliest name possible.

This will still sell great, because what else is there to buy?
LittleBro: Should have been:
24 GB + 384-bit for 5080
16 GB + 256-bit for 5070
In an ideal world, more like a 20 GB/320-bit 5070 and a 28 GB/448-bit 5080. And a 32 GB/512-bit 5090. But alas.
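
Those pairings aren't arbitrary: VRAM capacity follows from bus width, since each 32-bit channel carries one GDDR7 module. A quick sketch of the relationship (the 2 GB, i.e. 16 Gbit, module density is an assumption about currently shipping parts; higher-density 3 GB modules are on vendor roadmaps):

```python
# One GDDR7 module per 32-bit memory channel; current modules are 2 GB (16 Gbit).
def vram_gb(bus_width_bits: int, module_gb: int = 2) -> int:
    """VRAM capacity implied by a bus width at a given module density."""
    return bus_width_bits // 32 * module_gb

for bus in (192, 256, 320, 384, 448, 512):
    print(f"{bus}-bit -> {vram_gb(bus)} GB")
# 192-bit -> 12 GB (rumored 5070), 256-bit -> 16 GB (rumored 5080),
# 320-bit -> 20 GB, 384-bit -> 24 GB, 448-bit -> 28 GB, 512-bit -> 32 GB (rumored 5090)
```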
Posted on Reply
#34
Vayra86
Macro Device: To put it extremely mildly. That's a true xx60 Ti for the price of an xx80 Ti, with the silliest name possible.

This will still sell great, because what else is there to buy?
I wouldn't go that far; people have been playing that game to some real extremes imho. What I find more interesting is how the products within a stack work concurrently, and both Ampere and Ada were a fcking mess in that sense. Blackwell seems to be making it worse; 15k shaders between a single tier of products just screams 'ha ha, you're not getting any real good stuff unless you pay maximum price'. It deliberately makes the entire stack below it look bad, basically making everyone feel like a peasant unless they go for the real thing. An oppressive lack of VRAM is a similar middle finger. It's basically accelerating e-waste by making subpar products, and there's another slew of 12 GB midrange/high-end cards incoming.
Posted on Reply
#35
phints
It's a shame about the 5070's 12 GB: it's fine now, but in 1-2 years it will end up feeling like the 3070 8 GB all over again. Not interested in the pricing of the more expensive cards. As a happy 4070 owner, I'll wait for the 6070 and try again. By then the PS6 will be coming out, and there will be a good reason to see spec bumps in games.
Posted on Reply
#36
Macro Device
Vayra86: people have been playing that game to some real extremes imho.
What game?
Historically, products built on 45±4% of the full die were named xx60 Ti and sold for $300+.
2018: $400+.
2022: doesn't exist; the 4070 Ti is a true xx60 non-Ti at $800+.
2024: appears to be $1,200 for a "5080."

Pandora's box has been opened, and it ain't gonna close. Enjoy.
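
For what it's worth, the current Blackwell leaks land in the same band. A quick check, using rumored shader counts that NVIDIA has not confirmed:

```python
# Both figures are leaked/rumored, not official.
full_gb202_shaders = 24576   # rumored full GB202 (5090-class silicon)
gb203_5080_shaders = 10752   # rumored RTX 5080 shader count
print(f"{gb203_5080_shaders / full_gb202_shaders:.1%}")  # 43.8%, inside 45±4%
```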
Posted on Reply
#37
shk021051
If the gap is small, I'll go for a 4070 Ti.
I wonder if they'll lock DLSS 4 behind the 5xxx series.
Posted on Reply
#38
Vayra86
Macro Device: What game?
Historically, products built on 45±4% of the full die were named xx60 Ti and sold for $300+.
2018: $400+.
2022: doesn't exist; the 4070 Ti is a true xx60 non-Ti at $800+.
2024: appears to be $1,200 for a "5080."

Pandora's box has been opened, and it ain't gonna close. Enjoy.
Sure thing, but we need to appreciate the fact that the performance gap between those dies in a single stack was also a LOT smaller. People seem to forget that. You didn't play anything different on a 690 vs. a 660, or a 780 Ti. Not at a higher resolution either; it was all 1080p, maybe some medium-settings 1440p affair, perhaps 720p. Again, a gap between resolutions of no more than 30%, whereas right now a single GPU stack caters to resolutions from 1080p all the way to 4K.
Posted on Reply
#39
Vya Domus
All of this suggests they likely cut down on-die caches to cram in more CUs, as the new node doesn't significantly increase transistor density. Pretty stupid if true, but I guess there is not much else they can do.

It's pretty funny that they're literally doing the 12 GB 4080 thing again with a massively underpowered xx80 card, but this time around someone at Nvidia had more than two working braincells and decided not to release two GPUs with the same name, so there's not going to be an outrage. But the strategy remains the same: sell less of a performance uplift at a higher price tag.
Posted on Reply
#40
Legacy-ZA
Vayra86: For sure, the 4070 Ti Super 16 GB surprised me the most. It's possibly the only well-balanced/positioned product in the Ada stack, albeit still somewhat too expensive. That's what the x80 should have been from the get-go.
I would have upgraded if it had been priced correctly.
Posted on Reply
#41
Vayra86
FreedomEclipse: Jensen is probably going to leave it to AI to design a GPU.

They are going to feed a load of blueprints into an off-grid, custom-built ChatGPT, then tell Alexa to get to work.
It will probably be the joke of the century. Hallucinating pixels; I can totally see it.
Posted on Reply
#42
rv8000
Daven: I'm very curious to see if Nvidia skips the mid-range below the 5070.

As for performance,

5070 = 4080
5080 = 4090
5090 = 30% higher than 4090
I very much doubt a 5080 is as fast as a 4090 if there's effectively no increase in CUDA cores from 4080 > 4080 Super > 5080; wider bandwidth isn't going to do much of anything if there isn't a massive IPC gain on the CUDA core side.

I fully expect a 5080 to be slower than the 4090 without the use of gimmick marketing like they used with Ada.

I'd like to be wrong, for sure, but the consumer market needs a serious price correction. The fact that you get the 4060, a steaming turd in every respect, at a $300 price point is actually insane. I've been actively recommending that friends and family not invest in gaming PCs for the last year, and I will continue to do so going forward.
Posted on Reply
#43
Wasteland
FreedomEclipse: Jensen is probably going to leave it to AI to design a GPU.

They are going to feed a load of blueprints into an off-grid, custom-built ChatGPT, then tell Alexa to get to work.
Vayra86: It will probably be the joke of the century. Hallucinating pixels; I can totally see it.
Retail AI upscaling: selling you a 1080p card at 4K-chip prices.
Posted on Reply
#44
RedelZaVedno
12 gigs on the 5070??? :roll: LMAO. I bet the 5070 won't even beat my poor man's 4K GPU, the 4070 Ti Super. Ngreedia at its best.
Posted on Reply
#45
theglaze
AnotherReader: The gap between the 4070 and 4080 is much greater than in previous generations. I would be very surprised if the 5070 actually manages to match the 4080.
I agree.

Nvidia doesn't compete with itself anymore. New cards are slotting into different price points, rather than competing at the same price point as last gen.

Now gamers have to go to the 'used' GeForce market to find the price point they want.

For example, "entry-level" RTX 4060 Ti 8GB are still +$400 USD. If you want to pay less, you can find vanilla RTX 4060 8GB at $300. Which is outperformed by RTX 2080 8GB, goes for $200 USD on eBay.
Posted on Reply
#46
Daven
SSGBryan: I wouldn't bet on the 5070 beating the 4080.

The 3080 beat out the 4070 in a lot of games. Then there is the whole 12 GB in a 70-class card in 2025 thing.

Bring on Battlemage.
5070 might be more like the 4070 Ti Super
rv8000: I very much doubt a 5080 is as fast as a 4090 if there's effectively no increase in CUDA cores from 4080 > 4080 Super > 5080; wider bandwidth isn't going to do much of anything if there isn't a massive IPC gain on the CUDA core side.

I fully expect a 5080 to be slower than the 4090 without the use of gimmick marketing like they used with Ada.

I'd like to be wrong, for sure, but the consumer market needs a serious price correction. The fact that you get the 4060, a steaming turd in every respect, at a $300 price point is actually insane. I've been actively recommending that friends and family not invest in gaming PCs for the last year, and I will continue to do so going forward.
The problem is always the performance gap. The difference between the 4080 and 4090 is around 30%. If the 5080 cannot match the 4090 in performance, then it will be less than 30% faster than the model it is replacing. That is too small a difference, more like a CPU generational gain.
Posted on Reply
#47
SSGBryan
Daven: 5070 might be more like the 4070 Ti Super


The problem is always the performance gap. The difference between the 4080 and 4090 is 20-25%. If the 5080 cannot match the 4090 in performance, then it will be less than 20% faster than the model it is replacing. That is too small a difference, more like a CPU generational gain.
Which may actually give both AMD and Intel the opportunity to gain market share.

I am not going back to 12 GB cards.
Posted on Reply
#48
sephiroth117
A 22+ GB 5080 would have been perfect for gaming imho.

These are very high-end GPUs, and sadly some games can already use more than 16 GB at 4K; what will the future hold, with more UE5, Nanite, etc.?

A 4090 or 5090 for me next year, then...
Posted on Reply
#49
PixelTech
I was hoping to buy the 5090 this Q4 2024. Guess that might have happened if AMD had stayed in the high-end GPU market.
Posted on Reply
#50
rv8000
Daven: 5070 might be more like the 4070 Ti Super


The problem is always the performance gap. The difference between the 4080 and 4090 is around 30%. If the 5080 cannot match the 4090 in performance, then it will be less than 30% faster than the model it is replacing. That is too small a difference, more like a CPU generational gain.
Half of the 4000 series was a neutral move in price-performance over the 3000 series; there's no reason for Nvidia to offer a better product given the way people throw away money. They've consistently shifted lower performance tiers up to higher price brackets. Unless the rumored specs are horribly wrong, it's going to be slower.

Based on the gap between the 4080 and 4090 in raw specs, and the performance difference being only ~30%, a 10% bump in CUDA cores over the 4080 plus a memory bandwidth increase would still require significant IPC gains, and IPC gains have been getting smaller and smaller in the last few years. It's overly optimistic to think the 5080 is going to match a 4090 based purely on the rumored specs.
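
A back-of-envelope version of that argument, taking the official 4080/4090 core counts, the ~30% gap discussed above, and the rumored 5080 core count (unconfirmed):

```python
# How much per-core uplift would a 5080 need to match a 4090,
# assuming performance scales with core count x per-core throughput (clocks ignored)?
cores_4080, cores_4090 = 9728, 16384   # official Ada core counts
cores_5080_rumored = 10752             # leaked figure, ~10% over the 4080
perf_4090_over_4080 = 1.30             # ~30%, per the discussion above

required_gain = perf_4090_over_4080 / (cores_5080_rumored / cores_4080)
print(f"~{(required_gain - 1) * 100:.0f}% per-core uplift needed")  # ~18%
```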
Posted on Reply