Thursday, September 26th 2024

NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation

Thanks to the renowned NVIDIA hardware leaker kopite7kimi on X, we are getting information about the final versions of NVIDIA's first upcoming wave of GeForce RTX 50 series "Blackwell" graphics cards. The two leaked GPUs are the GeForce RTX 5090 and RTX 5080, which now feature a more significant gap between the xx80 and xx90 SKUs. For starters, there is the highest-end GeForce RTX 5090. NVIDIA has decided to use the GB202-300-A1 die and has enabled 21,760 FP32 CUDA cores on this top-end model. Accompanying the massive 170 SM GPU configuration, the RTX 5090 carries 32 GB of GDDR7 memory on a 512-bit bus, with each GDDR7 die running at 28 Gbps. This translates to 1,792 GB/s of memory bandwidth. All of this fits within a 600 W TGP.

When it comes to the GeForce RTX 5080, NVIDIA has decided to further separate its xx80 and xx90 SKUs. The RTX 5080 has 10,752 FP32 CUDA cores paired with 16 GB of GDDR7 memory on a 256-bit bus. With GDDR7 running at 28 Gbps, the memory bandwidth is also halved, at 896 GB/s. This SKU uses the GB203-400-A1 die, which is designed to run within a 400 W TGP power envelope. For reference, the RTX 4090 has 68% more CUDA cores than the RTX 4080, while the rumored RTX 5090 has around 102% more CUDA cores than the rumored RTX 5080, meaning NVIDIA is separating its top SKUs even further. We are curious to see at what price points NVIDIA places its upcoming GPUs, so that we can compare the generational updates as well as the widened gap between the xx80 and xx90 models.
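The bandwidth and core-gap figures follow directly from the leaked bus widths and data rates. As a sanity check, the arithmetic can be sketched in a few lines of Python (the specs are rumored and unconfirmed, so treat the numbers as placeholders):

```python
# Sanity-check the leaked "Blackwell" specs: peak memory bandwidth
# and the xx80-vs-xx90 CUDA core gap, using the rumored figures.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

rtx_5090_bw = bandwidth_gb_s(512, 28)  # 512-bit GDDR7 at 28 Gbps -> 1792.0 GB/s
rtx_5080_bw = bandwidth_gb_s(256, 28)  # 256-bit GDDR7 at 28 Gbps -> 896.0 GB/s

# Rumored core counts: 21,760 (5090) vs 10,752 (5080) -> roughly a 102% gap
core_gap_pct = (21760 / 10752 - 1) * 100

print(rtx_5090_bw, rtx_5080_bw, round(core_gap_pct))
```

The same formula reproduces the Ada numbers (for example, the RTX 4090's 384-bit bus at 21 Gbps gives ~1,008 GB/s), which is a quick way to spot when a leak's bandwidth claim doesn't match its stated bus width and data rate.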
Sources: kopite7kimi (RTX 5090), kopite7kimi (RTX 5080)

185 Comments on NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation

#26
RedelZaVedno
I don't believe it, but if this rumor turns out to be true, the 5080 will be a complete shitshow. The RTX 4090 has 16,384 shading units, so good luck matching its rasterization performance with only 10,752 FP32 CUDA cores paired with a 256-bit memory bus. This thing will be around 20%, maybe 30% faster than the 4080(S) at best. Who would want to buy this if it's not priced well under a grand?
#27
pk67
RedelZaVedno: I don't believe it, but if this rumor turns out to be true, the 5080 will be a complete shitshow. The RTX 4090 has 16,384 shading units, so good luck matching its rasterization performance with only 10,752 FP32 CUDA cores paired with a 256-bit memory bus. This thing will be around 20%, maybe 30% faster than the 4080(S) at best. Who would want to buy this if it's not priced well under a grand?
Don't judge too fast. If you only care about pure rasterization performance, just consider Radeon. I wouldn't say it will be a shitshow, but if it's a 4 nm node it won't be a revolutionary performance breaker.
#28
RedelZaVedno
pk67: Don't judge too fast. If you only care about pure rasterization performance, just consider Radeon. I wouldn't say it will be a shitshow, but if it's a 4 nm node it won't be a revolutionary performance breaker.
What Radeon? RDNA4 will compete with the 4080(S) at best. As an owner of a 4070 Ti Super, I have no upgrade path besides a used 4090 or a 5090 for the next 3 years :(
#29
64K
RedelZaVedno: I don't believe it, but if this rumor turns out to be true, the 5080 will be a complete shitshow. The RTX 4090 has 16,384 shading units, so good luck matching its rasterization performance with only 10,752 FP32 CUDA cores paired with a 256-bit memory bus. This thing will be around 20%, maybe 30% faster than the 4080(S) at best. Who would want to buy this if it's not priced well under a grand?
Well, remember that the original 4080 was almost a shitshow until Nvidia released the real specs over the rumored ones, and anyway the $1,200 MSRP was the biggest joke of the entire Ada stack, though there were other jokes as well. I'm hoping Nvidia leaves the BS behind with Blackwell, but I suspect they won't.
#30
Krit
RedelZaVedno: What Radeon? RDNA4 will compete with the 4080(S) at best. As an owner of a 4070 Ti Super, I have no upgrade path besides a used 4090 or a 5090 for the next 3 years :(
3 years for a graphics card is more than half of its life :) That's a long time... Why do you need to upgrade so soon?
#31
pk67
RedelZaVedno: What Radeon? RDNA4 will compete with the 4080(S) at best. As an owner of a 4070 Ti Super, I have no upgrade path besides a used 4090 or a 5090 for the next 3 years :(
Try to look at this through rose-tinted glasses. This way you'll have more time to save up for an even more significant upgrade, like a 6080 (Ti/S) :laugh:
#32
RedelZaVedno
Krit: 3 years for a graphics card is more than half of its life :) That's a long time... Why do you need to upgrade so soon?
Because the 4070 Ti Super is hardly cutting it for flight sim VR usage, and with MS FS2024 two months away I was really hoping that the 5080 would at least trade blows with the 4090 and be priced around 1,200 bucks. I guess Ngreedia doesn't want us to buy its dies anymore, besides junk unusable for AI training. It's a sad state of affairs: AMD giving up on the high end, Intel's GPUs being the joke of the year(s), and Jensen not giving a F about us ordinary buyers with limited budgets.
#33
Krit
64K: I'm hoping Nvidia leaves the BS behind with Blackwell, but I suspect they won't.
That's not NVIDIA's fault; people are dumb and buy them like crazy maniacs. Look at the "Next-Gen GPUs: What Matters Most to You?" poll results. Something similar is going on there. :laugh:

#34
Marcus L
dgianstefani: Definitely a 5080 Ti with memory/CUDA somewhere in between this gen lol.

Assuming these rumours are accurate, I'd be surprised to see an xx90 with a full memory bus/die; the 4090 was quite cut down. I don't think they're going to jump straight to 600 W from 450 W.

My guess: 500 W.

20,000 cores.
Ti/Ti Super/Ultra/LE/Super Duper :laugh: There will likely be another couple of SKUs in between, with such a large gap between the 5090/80.
Can see the 5080 being priced the same as the 4080 @ $1,200, and likely $2-2.5k for the "mightyyyyyyyyyy" 5090 with such specs. ~1.8 TB/s of VRAM bandwidth though o_O
AMD had better pull their finger out and give me a worthy upgrade to my 6800 for $300. Almost 4 years since its release, and there is nothing without spending $500+ for a meaningful upgrade. Stagnant :mad:
#35
usiname
pk67: I didn't say it can't work fine at 12 V, but we have 2025 just around the corner now, and it would be more eco-friendly at the 24 V level with the same power cables.
BTW
Two times no - you don't need a different mobo or a new PSU design, just minor changes in the PSU, that's all.
Here we go again. I will suggest 48 V for the 5090, 24 V for the 5080, and 12 V for the rest. Of course, the 5090 and 5080 will also support 12 V. It's so simple to add a new power rail to PSUs; it's just a new standard that will take a minimum of 10 years to reach the market, just in time for the RTX 5000 launch, so pk67 won't hear electricity noises from his fanless low-end GPU :roll:
#36
pk67
RedelZaVedno: Because the 4070 Ti Super is hardly cutting it for flight sim VR usage, and with MS FS2024 two months away I was really hoping that the 5080 would at least trade blows with the 4090 and be priced around 1,200 bucks. I guess Ngreedia doesn't want us to buy its dies anymore, besides junk unusable for AI training. It's a sad state of affairs: AMD giving up on the high end, Intel's GPUs being the joke of the year(s), and Jensen not giving a F about us ordinary buyers with limited budgets.
If you are not satisfied with the performance of your super-duper GPU, you can always try RenPy games or just level up your Python skills.
#37
Macro Device
Vayra86: And why is there such a disparity between the power gap and the shader gap between these two generations?
Take AMD for example. RX 6700 XT: 40 CUs; RX 6800: 60 CUs. Wattage? Almost the same. Why? The RX 6800 runs at much lower clocks.

If this leak is correct, then it's almost certain the RTX 5080 is indeed an overclocked 4080. Which is foul. Especially if for >1000 USD we get a <50% performance uplift (which is a given, thus it's foul).

I don't believe NVIDIA will stop shitposting, but a GPU like the 4070 Ti Super for <500 USD and with >=16 GB VRAM would be much appreciated. Almost zero chance it happens before the RTX 7000 series, but wishing doesn't hurt.
pk67: Don't judge too fast.
He still has a point. GPU prices are horribly high, and there's no way they'll become reasonable. Antitrust authorities and other organisations will have a hard time proving NV guilty before the 2020s end, and even if they do, there's still no good answer. No good solution.
#38
Vya Domus
RedelZaVedno: I don't believe it, but if this rumor turns out to be true, the 5080 will be a complete shitshow.
You'd better believe it. They literally tried the same strategy this generation; originally the 4070 Ti was supposed to be a "4080", so why is this so hard to believe lol.
#39
pk67
Macro Device: He still has a point. GPU prices are horribly high, and there's no way they'll become reasonable. Antitrust authorities and other organisations will have a hard time proving NV guilty before the 2020s end, and even if they do, there's still no good answer. No good solution.
Yes and no. It depends on whether you are patient or not. What makes prices horribly high is progress at an ultra-fast pace (despite Jensen trying to fool us that Moore's Law is dead - if it were, progress would have to be dead too). A short practical lifespan makes these prices higher: after 4-5 years, a super-duper GPU is an obsolete brick even in brand-new condition.
But ultra-fast progress is what we want, isn't it? Just be more patient and you can find a used GPU at an affordable price, I'm sure.
#40
chrcoluk
There isn't a need to release a 5000 series as they are already the market leader. This shows me they are going to release it because they want to sell 4090 owners another $2,000+ GPU.

If the rumour about the power is true, it also tells me that whatever they have isn't ready for prime time if it requires jacking up the power again.
#41
AusWolf
Knowing Nvidia, that's a whole 17 product classes' worth of space between the 5080 and 5090. It makes me wonder: why 5090? Why not just call it a Titan or something?
#42
pk67
AusWolf: Knowing Nvidia, that's a whole 17 product classes' worth of space between the 5080 and 5090. It makes me wonder: why 5090? Why not just call it a Titan or something?
If it is a 4 nm node, it would be a Titan on jelly legs :laugh:
#43
natr0n
I wonder how many people have enough disposable income to buy one of these.
#44
64K
chrcoluk: There isn't a need to release a 5000 series as they are already the market leader. This shows me they are going to release it because they want to sell 4090 owners another $2,000+ GPU.
There is a big reason to release, though, besides just wanting to sell new GPUs. AMD will be releasing their next gen around the same time, and Nvidia detests being outdone; they know that if they aren't first, it means potential lost sales, even if it's not a whole lot.
natr0n: I wonder how many people have enough disposable income to buy one of these.
For the 5090, not a lot. Probably less than 1% of gamers, as usual with the xx90 GPUs. I have no idea what the appeal will be for prosumers, though. The AI craze is still a thing.
#45
AusWolf
64K: There is a big reason to release, though, besides just wanting to sell new GPUs. AMD will be releasing their next gen around the same time, and Nvidia detests being outdone; they know that if they aren't first, it means potential lost sales, even if it's not a whole lot.
AMD won't be competing with these, though.
#46
Macro Device
pk67: Just be more patient and you can find a used GPU at an affordable price, I'm sure.
Previously, it took two gens for a top-tier GPU to fall so low in price that it was hard to justify the effort of selling it (the GTX 680, for example, cost about 100 dollars in 2018, when it turned six). Today, old 2080 Tis of the same tier still sell for more than $250. Ampere cards are also still expensive. Ada Lovelace GPUs, even used, are horribly expensive. With upcoming GPUs being even more expensive, I'll have to abstain from upgrading until my very deathbed for good upgrades to become available on acceptable terms.

Progress is currently excruciatingly slow: previously, $300 GPUs made fun of $600 GPUs of the last gen (GTX 1060 vs GTX 980; HD 7870 vs HD 6970). Today's $300 GPUs match or slightly beat yesterday's $300 GPUs. You can account for inflation and DLSS all you please; it doesn't change the fact that we need a full-scale price war, and we can't get one because there's literally not a single company that can make NVIDIA touch grass. Notice I'm not even saying "enough grass."

The 5080 will be a very expensive piece of rubbish. The 5090 will be even more out of reach. The 5070 and below will just be a smidge better than their predecessors and won't cost noticeably less. We'll probably see some good news at the very low end, where Intel and AMD still have a say, but anything beyond casual gaming will be a millionaires' thing for a long while.
#47
64K
AusWolf: AMD won't be competing with these, though.
Not with the 5090, as they've already said, but with the 5080 I think they will. Why wouldn't they? AMD had a competitor for the 1080, 2080, 3080, and 4080. When AMD says high end, I think they are referring to the xx90 GPUs, which a lot of people call high end but some call enthusiast.
#48
FoulOnWhite
Nvidia can do near enough whatever they like and still sell the cards.
#49
Darmok N Jalad
Just look at the transistor counts we're at these days. Gaining significantly more performance costs a lot. It's an exponential growth curve, where each 15% more cores gets steeper with each generation. The only other way to gain is more clock speed, but again, more clocks on top of more cores carries the same exponential penalty. It's why new products are slowing down. New nodes only fix this so much, and I get the feeling they can't keep pace with the demands needed for "generational" performance gains.
#50
RandallFlagg
Macro Device: Previously, it took two gens for a top-tier GPU to fall so low in price that it was hard to justify the effort of selling it (the GTX 680, for example, cost about 100 dollars in 2018, when it turned six). Today, old 2080 Tis of the same tier still sell for more than $250. Ampere cards are also still expensive. Ada Lovelace GPUs, even used, are horribly expensive. With upcoming GPUs being even more expensive, I'll have to abstain from upgrading until my very deathbed for good upgrades to become available on acceptable terms.

Progress is currently excruciatingly slow: previously, $300 GPUs made fun of $600 GPUs of the last gen (GTX 1060 vs GTX 980; HD 7870 vs HD 6970). Today's $300 GPUs match or slightly beat yesterday's $300 GPUs. You can account for inflation and DLSS all you please; it doesn't change the fact that we need a full-scale price war, and we can't get one because there's literally not a single company that can make NVIDIA touch grass. Notice I'm not even saying "enough grass."

The 5080 will be a very expensive piece of rubbish. The 5090 will be even more out of reach. The 5070 and below will just be a smidge better than their predecessors and won't cost noticeably less. We'll probably see some good news at the very low end, where Intel and AMD still have a say, but anything beyond casual gaming will be a millionaires' thing for a long while.
Agreed, the price ramp-up of GPUs is ridiculous. I see it in everything, though, and I believe it's reflective of larger social forces: the concentration of wealth. In the US, around 8% of households are "millionaires". They have tremendous buying power; that's where the market is, so everything gravitates toward serving that group. Midrange and lower-end GPUs act in a fashion that reminds me of "shrinkflation": you pay the same because that's what that segment can afford, but relative to the top end, what you get is less and less.