Thursday, September 26th 2024

NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation

Thanks to the renowned NVIDIA hardware leaker kopite7kimi on X, we are getting information about the final versions of NVIDIA's first wave of upcoming GeForce RTX 50 series "Blackwell" graphics cards. The two leaked GPUs are the GeForce RTX 5090 and RTX 5080, which now show a much larger gap between the xx80 and xx90 SKUs. For starters, we have the highest-end GeForce RTX 5090. NVIDIA has decided to use the GB202-300-A1 die and enable 21,760 FP32 CUDA cores on this top-end model. Accompanying the massive 170 SM GPU configuration, the RTX 5090 has 32 GB of GDDR7 memory on a 512-bit bus, with the GDDR7 running at 28 Gbps per pin. This translates to 1,792 GB/s of memory bandwidth. All of this is confined to a 600 W TGP.
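The bandwidth figure follows directly from the leaked bus width and per-pin data rate; a minimal Python sketch of that arithmetic (the helper name is ours, the inputs are the leaked numbers):

```python
def gddr_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width (bits) times per-pin data rate (Gbps), divided by 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

print(gddr_bandwidth_gb_s(512, 28))  # rumored RTX 5090: 1792.0 GB/s
print(gddr_bandwidth_gb_s(256, 28))  # rumored RTX 5080: 896.0 GB/s
```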

When it comes to the GeForce RTX 5080, NVIDIA has decided to separate its xx80 and xx90 SKUs further. The RTX 5080 has 10,752 FP32 CUDA cores paired with 16 GB of GDDR7 memory on a 256-bit bus. With GDDR7 running at 28 Gbps, the memory bandwidth is also halved, at 896 GB/s. This SKU uses the GB203-400-A1 die and is designed to run within a 400 W TGP. For reference, the RTX 4090 has 68% more CUDA cores than the RTX 4080, while the rumored RTX 5090 has around 102% more CUDA cores than the rumored RTX 5080, which means NVIDIA is separating its top SKUs even more. We are curious to see at what price points NVIDIA places its upcoming GPUs, both to gauge the generational uplift and to see how the widened gap between the xx80 and xx90 models is reflected in pricing.
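The segmentation claim is easy to check from the core counts; a quick sketch (Ada figures are public specs, Blackwell figures are the rumored ones above):

```python
# Quantifying the xx80-to-xx90 gap from FP32 CUDA core counts.
cores = {
    "RTX 4080": 9728,
    "RTX 4090": 16384,
    "RTX 5080": 10752,  # rumored
    "RTX 5090": 21760,  # rumored
}

def core_gap(lower: str, upper: str) -> float:
    """How many percent more CUDA cores the upper SKU has over the lower one."""
    return (cores[upper] / cores[lower] - 1) * 100

print(f"RTX 4090 vs RTX 4080: +{core_gap('RTX 4080', 'RTX 4090'):.0f}%")  # ~+68%
print(f"RTX 5090 vs RTX 5080: +{core_gap('RTX 5080', 'RTX 5090'):.0f}%")  # ~+102%
```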
Sources: kopite7kimi (RTX 5090), kopite7kimi (RTX 5080)

111 Comments on NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation

#1
dgianstefani
TPU Proofreader
Definitely 5080 Ti with memory/cuda somewhere in between this gen lol.

Assuming these rumours are accurate. I'd be surprised to see an xx90 with full memory bus/die. 4090 was quite cut down. I don't think they're going to jump straight to 600 W from 450 W.

My guess 500 W.

20,000 cores.
#2
64K
dgianstefani: Definitely 5080 Ti with memory/cuda somewhere in between this gen lol.
Agreed. There's inevitably going to be some salvage on the big chip that will have to be sold somehow.
#3
potsdaman
70 series again with 12GB :mad::rolleyes:
#4
Ravenmaster
Take everything Kimi says with a massive grain of salt. He changes his predictions every couple of weeks.
#5
N3utro
600W and 400W, that's a 150W (33%) and 80W (25%) increase compared to the 4090 and 4080. They'd better offer a performance uplift significantly higher than those percentages, or it will be a flop.
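A quick sketch of the arithmetic behind those percentages, assuming the rumored TGPs hold; any uplift below these figures would mean a regression in performance per watt:

```python
# Ada boards use released specs, Blackwell boards use the rumored figures.
tgp_w = {"RTX 4080": 320, "RTX 5080": 400, "RTX 4090": 450, "RTX 5090": 600}

for old, new in [("RTX 4080", "RTX 5080"), ("RTX 4090", "RTX 5090")]:
    delta = tgp_w[new] - tgp_w[old]
    print(f"{new}: +{delta} W ({delta / tgp_w[old]:.0%} over the {old})")
# RTX 5080: +80 W (25% over the RTX 4080)
# RTX 5090: +150 W (33% over the RTX 4090)
```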
#6
phints
Nvidia can't do a massive lithography jump like RTX 3000 to 4000 this time (Samsung 8nm to TSMC 4nm), so sadly we are looking at a TDP increase for most of the added performance.
#7
pk67
dgianstefani: Definitely 5080 Ti with memory/cuda somewhere in between this gen lol.

Assuming these rumours are accurate. I'd be surprised to see an xx90 with full memory bus/die. 4090 was quite cut down. I don't think they're going to jump straight to 600 W from 450 W.

My guess 500 W.

20,000 cores.
My bet - closer to 13k cores for a 5080 Ti or Super.
Also, I bet the 21k-core 5090 is meant for the more defective dies available now. Later we will see higher-grade dies with slightly more cores enabled.

BTW - they should also differentiate supply voltages between these two SKUs imho. 24V would fit a 600W power envelope better.
#8
gffermari
£1,099 for the 5080, £1,799 for the 5090.

So if you want a high-end card, it will be the same sxxt again.
Only the x90 variant will be worth it.
#9
64K
gffermari: £1,099 for the 5080, £1,799 for the 5090.

So if you want a high-end card, it will be the same sxxt again.
Only the x90 variant will be worth it.
Might even go as high as $3,000 for the 5090 if it isn't cut down from the possible specs, or not by much, and they call it a Titan. Nvidia has done a Titan for $2,500 and $3,000 in the past. :eek:
#10
pk67
gffermari: £1,099 for the 5080, £1,799 for the 5090.

So if you want a high-end card, it will be the same sxxt again.
Only the x90 variant will be worth it.
If the 5090 is done on a 4nm node, I would wait for the 3nm line. 4nm parts will be obsolete pretty soon imho.
#11
Scircura
Ravenmaster: Take everything Kimi says with a massive grain of salt. He changes his predictions every couple of weeks.
Yes, his past predictions could be significantly off. To his credit, he doesn't delete past incorrect predictions, so you can check his track record for yourself.
#12
Vayra86
N3utro: 600W and 400W, that's a 150W (33%) and 80W (25%) increase compared to the 4090 and 4080. They'd better offer a performance uplift significantly higher than those percentages, or it will be a flop.
A 400W x80 makes absolutely no sense to me, to be honest. There's no way this perf gap and this wattage add up. I can get into the 21k cores vs 10k cores, but not with these wattages.
pk67: If the 5090 is done on a 4nm node, I would wait for the 3nm line. 4nm parts will be obsolete pretty soon imho.
If it's a big node advancement, it will be delayed and/or capacity will be scarce. It's best to just look at what's in front of you right now, or you'll find yourself waiting a long time. And whatever node it's built on, you can still deploy lots of shitty products on it nonetheless: the price and the specs make the product, in the end.
#13
pk67
Vayra86: A 400W x80 makes absolutely no sense to me, to be honest. There's no way this perf gap and this wattage add up. I can get into the 21k cores vs 10k cores, but not with these wattages.
You can always undervolt and trim clocks down a bit
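Undervolting helps disproportionately because dynamic power scales roughly with clock times voltage squared; a rough sketch of that first-order model (the scaling factors below are made up purely for illustration):

```python
# First-order CMOS dynamic power model: P is roughly proportional to f * V^2
# (leakage and memory power ignored).
def relative_dynamic_power(freq_scale: float, volt_scale: float) -> float:
    return freq_scale * volt_scale ** 2

# Hypothetical example: -5% clocks combined with a -7% undervolt.
print(f"{relative_dynamic_power(0.95, 0.93):.0%} of stock dynamic power")  # ~82%
```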
#14
TheinsanegamerN
pk67: My bet - closer to 13k cores for a 5080 Ti or Super.
Also, I bet the 21k-core 5090 is meant for the more defective dies available now. Later we will see higher-grade dies with slightly more cores enabled.

BTW - they should also differentiate supply voltages between these two SKUs imho. 24V would fit a 600W power envelope better.
24V would require an entirely different motherboard and PSU design. Not happening. EDIT: also, the 3090 Ti already did 600W on 12V. It worked fine.
#15
Vayra86
pk67: You can always undervolt and trim clocks down a bit
That's not the point I was trying to make. It's the overall wattage jump compared to the core count I'm looking at - between these two GPUs, but also compared to Ada.

The 4080 already runs nearly 10k shaders, at 320W.
The 4090 runs 16k right now, at 450W.

And then there's also a 4080S with 10240 with the same 320W.

So what's Blackwell's 5080 then? A massively OC'd 4080S? And why does the ratio of power gap to shader gap look so different between these two generations? Are these some special shaders that want much more juice on a 5080 than they do on a 5090? We haven't seen a big frequency gap between same-stack GPUs from Nvidia lately, so what's all that power doing there - VRAM...? And if that's true, we come back to the earlier point: what's that 5080 doing with 400W on the same 16GB as its Ada sibling?
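For what it's worth, the watts-per-shader arithmetic behind that argument can be laid out explicitly; a rough sketch that ignores clocks, VRAM, and architectural changes (Ada entries are released specs, Blackwell entries are the rumored ones):

```python
# Crude watts-per-shader comparison: (FP32 shaders, board power in W).
cards = {
    "RTX 4080":       (9728, 320),
    "RTX 4080 SUPER": (10240, 320),
    "RTX 4090":       (16384, 450),
    "RTX 5080":       (10752, 400),  # rumored
    "RTX 5090":       (21760, 600),  # rumored
}

for name, (shaders, board_power_w) in cards.items():
    print(f"{name}: {board_power_w / shaders * 1000:.1f} W per 1k shaders")
# On this metric the rumored 5080 sits well above every Ada part,
# while the rumored 5090 lands close to the 4090.
```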
#16
Nater
Really hoping for a CAD/gaming flagship combo card. I sold my 5700 XT and RTX 3080 right before Bitcoin crashed and have been stuck on an A2000 since. I still do a lot of CAD work, but it barely cracks 60 fps in Warzone on low settings.
#17
dgianstefani
TPU Proofreader
pk67: My bet - closer to 13k cores for a 5080 Ti or Super.
Also, I bet the 21k-core 5090 is meant for the more defective dies available now. Later we will see higher-grade dies with slightly more cores enabled.

BTW - they should also differentiate supply voltages between these two SKUs imho. 24V would fit a 600W power envelope better.
I wasn't clear, I mean I expect closer to 20K cores, not 22k for 5090, with 5080 Ti somewhere in between.
I don't think there's much chance of a full die/full bus 5090.

Maybe 5090 Ti/Titan for 22k full die/full bus.

Yes, 5080 Ti somewhere in 15k range.
#18
pk67
Vayra86: That's not the point I was trying to make. It's the overall wattage jump compared to the core count I'm looking at - between these two GPUs, but also compared to Ada.

The 4080 already runs nearly 10k shaders, at 320W.
The 4090 runs 16k right now, at 450W.

And then there's also a 4080S with 10240 with the same 320W.

So what's Blackwell's 5080 then? A massively OC'd 4080S? And why does the ratio of power gap to shader gap look so different between these two generations? Are these some special shaders that want much more juice on a 5080 than they do on a 5090? We haven't seen a big frequency gap between same-stack GPUs from Nvidia lately, so what's all that power doing there - VRAM?
Frankly speaking I'm not surprised they are trying to get as much juice as they can from these dies. As a result you get insane wattages.
#19
Vayra86
pk67: Frankly speaking I'm not surprised they are trying to get as much juice as they can from these dies. As a result you get insane wattages.
Still very unconvinced we'll see a 400W x80 though. 21k at 600W I can get into, but 10k at 400W then makes no sense, especially not given the halved VRAM and bus. It's literally half the GPU. If there's so much to be gained from more power, the 5090 would've definitely been given more room to breathe.
#20
64K
Vayra86: That's not the point I was trying to make. It's the overall wattage jump compared to the core count I'm looking at - between these two GPUs, but also compared to Ada.

The 4080 already runs nearly 10k shaders, at 320W.
The 4090 runs 16k right now, at 450W.

And then there's also a 4080S with 10240 with the same 320W.

So what's Blackwell's 5080 then? A massively OC'd 4080S? And why does the ratio of power gap to shader gap look so different between these two generations? Are these some special shaders that want much more juice on a 5080 than they do on a 5090? We haven't seen a big frequency gap between same-stack GPUs from Nvidia lately, so what's all that power doing there - VRAM...? And if that's true, we come back to the earlier point: what's that 5080 doing with 400W on the same 16GB as its Ada sibling?
There's still a lot we can only guess at right now even assuming the rumored specs are true. Number of Tensor and RT cores and how insane the clocks could be.
#21
Vayra86
dgianstefani: I wasn't clear, I mean I expect closer to 20K cores, not 22k for 5090, with 5080 Ti somewhere in between.
I don't think there's much chance of a full die/full bus 5090.

Maybe 5090 Ti/Titan for 22k full die/full bus.

Yes, 5080 Ti somewhere in 15k range.
I wouldn't dismiss the idea of them releasing an x80 Ti with a mere 13k shaders either, to keep the halo product halo; they could still sell it at a price just under a 4090 with performance just over it.
#22
pk67
Vayra86: Still very unconvinced we'll see a 400W x80 though. 21k at 600W I can get into, but 10k at 400W then makes no sense, especially not given the halved VRAM and bus. It's literally half the GPU. If there's so much to be gained from more power, the 5090 would've definitely been given more room to breathe.
Yes, it's half the GPU, but with a similar cooler surface for heat dissipation. So with somewhat higher clocks they can safely push a bit more juice while staying within the 400W limit. That's how I see it.
#23
igormp
32GB? Wew, I wasn't expecting that at all from leatherjacketman.

Heavily considering a couple of those now...
#24
Raiju
Very much like Apple with their M1; the RTX 5080 would be the M1 Max while the 5090 would be the M1 Ultra.
I know there's literally zero chance of that happening, but I wish they would leave the interconnect (NVLink) on the 5080 to bring back SLI.
#25
pk67
TheinsanegamerN: 24V would require an entirely different motherboard and PSU design. Not happening. EDIT: also, the 3090 Ti already did 600W on 12V. It worked fine.
I didn't say it can't work fine at 12V, but 2025 is just around the corner and it would be more eco-friendly at the 24V level with the same power cables.
BTW
Two times no - you don't need a different mobo or a new PSU design, just minor changes in the PSU - that's all.
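The electrical argument here is just Ohm's law: at a fixed power draw, doubling the rail voltage halves the current, and resistive losses in the same cables fall with the square of the current. A minimal sketch with an assumed, purely illustrative cable resistance:

```python
# Cable loss at a fixed power draw: I = P / V, loss = I^2 * R.
def cable_loss_w(power_w: float, rail_v: float, resistance_ohm: float) -> float:
    current_a = power_w / rail_v
    return current_a ** 2 * resistance_ohm

R = 0.01  # assumed total cable + connector resistance in ohms (illustrative only)
for rail in (12, 24):
    print(f"600 W @ {rail} V: {600 / rail:.0f} A, "
          f"{cable_loss_w(600, rail, R):.1f} W lost in the cable")
# 600 W @ 12 V: 50 A, 25.0 W lost in the cable
# 600 W @ 24 V: 25 A, 6.2 W lost in the cable
```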