
NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation

Joined
May 10, 2023
Messages
304 (0.52/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
If it had a 24V output rail, 1200W would be enough. But with the standard 12V rail, I guess 1500-1600W is the safe minimum, plus a pair of thick cables, of course.
You should just accept that 24v won't become a thing anytime soon in PCs lol
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,897 (2.94/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / media-PC
Processor AMD Ryzen 7 5800X / Intel Core i7-6700K
Motherboard Asus ROG Crosshair VII Hero / Asus Z170-K
Cooling Arctic Freezer 50 / Alphacool Eisbaer 240
Memory 32GB DDR4-3466 / 16GB DDR4-3000
Video Card(s) RTX 3080 10GB / RX 6700 XT
Storage 3.3TB of SSDs / several small SSDs
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Creative 2.1
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Logitech MX518 / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
You should just accept that 24v won't become a thing anytime soon in PCs lol
Yeah, at least in the consumer market. Servers may be a different thing (I have no idea whether they already use it).
 
Joined
May 26, 2023
Messages
109 (0.19/day)
You should just accept that 24v won't become a thing anytime soon in PCs lol
Frankly speaking, I don't care right now, since I'm not interested in high-power GPUs at the moment. But if I were, and had to pay more than a grand for a GPU or a high-power GPU/NPU combo, I would stay away until 24V becomes a standard.
So it's not my problem now.

edit
The other solution is simply modding the PSU and adding a 24V-to-12V DC/DC converter right at the GPU's power socket(s).
 
Last edited:
Joined
Sep 4, 2022
Messages
326 (0.39/day)
Does anyone know why performance doesn't scale linearly with core count? The 4090 has about 60% more cores than the 4080, but the performance delta is only 25% at 4K.
 
Joined
Oct 19, 2022
Messages
107 (0.14/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D (+PBO)
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Samsung 990 PRO 2TB w/ Heatsink SSD + Seagate FireCuda 530 SSD 2TB w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz monitor (+ LG OLED C9 55" TV 4K@120Hz)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q with AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
The funniest thing was when it was going to be released as "RTX 4080 12GB" first. :D


x90 is the Titan and x90 Ti is the Titan Black. ;) Remember, the first Titan didn't even have the full die; hell, even the 780 Ti had a full die (but only 3GB VRAM).

Though they did the same milking with Titan X (Pascal) and Titan Xp.


How fortunate that Seasonic just released a new 2200W unit. :rolleyes:

The x90 and x90 Ti are still not TITANs, because a TITAN usually has 2x the amount of VRAM. If they had made a TITAN Ada, it would have had 48GB of GDDR6X.

Regarding Seasonic, they also have a 1600W unit that is 80+ Titanium (the 2200W is surprisingly Platinum, even though there's not much difference), but I think the 1600W is enough! I wish Corsair would release a new AX1600i with 2x 16-pin connectors! I have an AX1500i and love it!

Does anyone know why performance doesn't scale linearly with core count? The 4090 has about 60% more cores than the 4080, but the performance delta is only 25% at 4K.

It is due to a memory bandwidth bottleneck.
FYI, the 4090 has a bandwidth of 1,008GB/s whereas the 4080 has 717GB/s, i.e. only ~40% more bandwidth despite 68% more CUDA cores...
Also, the 4090 has only 72MB of L2 cache (out of the 96MB on a full AD102 die) while the 4080 has 64MB, so only 12.5% more...
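The figures above are easy to sanity-check. A quick sketch, using the bandwidth and cache numbers quoted in this post plus the commonly published Ada CUDA core counts for the two cards (16,384 vs 9,728, assumed here):

```python
# Spec figures for the RTX 4090 vs RTX 4080 comparison discussed above.
# Bandwidth and L2 numbers are the ones quoted in the post; core counts
# are the commonly published Ada figures (assumed here).
specs = {
    "RTX 4090": {"CUDA cores": 16384, "bandwidth (GB/s)": 1008, "L2 (MB)": 72},
    "RTX 4080": {"CUDA cores": 9728,  "bandwidth (GB/s)": 717,  "L2 (MB)": 64},
}

big, small = specs["RTX 4090"], specs["RTX 4080"]
for key in big:
    gain = big[key] / small[key] - 1
    print(f"{key}: +{gain:.1%}")
# CUDA cores +68.4%, bandwidth +40.6%, L2 +12.5%: the bandwidth and cache
# advantages lag far behind the core-count advantage.
```

The gap between +68% cores and only +41% bandwidth / +12.5% L2 is the point being made: the extra shaders have proportionally less memory system to feed them.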
 
Joined
Sep 4, 2022
Messages
326 (0.39/day)
The x90 and x90 Ti are not TITAN yet because the TITAN usually have 2x the amount of VRAM. If they made a TITAN Ada it would have had 48GB

It is due to a Memory Bandwidth bottleneck.
FYI the 4090 has a bandwidth of 1,008GB/s whereas the 4080 has 717GB/s aka ~40% more Bandwidth when it has 68% more CUDA Cores...
Also the 4090 has only 72MB L2 Cache (out of 96MB of a full AD102 die) and the 4080 has 64MB, so only 12.5% more...
68% more cores and 40% more bandwidth, but it yields only a 25% performance delta.
Is the L2 cache really bottlenecking the 4090, and will this plateau affect the 5090 as well?



Update: it's official, anyone postulating Blackwell's high prices for likes is a paid troll!
 
Joined
May 26, 2023
Messages
109 (0.19/day)
68% more cores and 40% more bandwidth, but it yields only a 25% performance delta.
Is the L2 cache really bottlenecking the 4090, and will this plateau affect the 5090 as well?

I bet power delivery and heat dissipation are the real bottlenecks.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,897 (2.94/day)
Location
Jyväskylä, Finland
The x90 and x90 Ti are not TITAN yet because the TITAN usually have 2x the amount of VRAM. If they made a TITAN Ada it would have had 48GB GDDR6X.
What about 3080 12GB and 3080 Ti? :rolleyes:

And the x90 is a Titan replacement, as Nvidia themselves made clear with the 3090's release back then.
 
Joined
Oct 19, 2022
Messages
107 (0.14/day)
Location
Los Angeles, CA
Even if it's $4000 or more?

I hope I'm wrong, but we might be underestimating how much Nvidia doesn't need gaming any more.
GeForce GPUs still bring them a lot of money, even if it's true that AI brings them a lot more thanks to insane margins... their H100s sell for $30K to $40K per chip!
Nvidia is still a gaming brand, and they know that if the AI bubble burst tomorrow, they would have to go back to gaming as their main revenue...

What about 3080 12GB and 3080 Ti? :rolleyes:

And the x90 is a Titan replacement as Nvidia made that clear themselves with 3090's release back then.
There is a reason why the 3090 and 3090 Ti were not called TITAN, and that's because they are not! TITANs also pack FP64 cores and usually have 2x more VRAM: the 780/Ti had 3GB whereas the TITAN had 6GB, and the 2080 Ti had 11GB whereas the TITAN RTX had 24GB.

68% more cores and 40% more bandwidth, but it yields only a 25% performance delta.
Is the L2 cache really bottlenecking the 4090, and will this plateau affect the 5090 as well?



Update it's official anyone postulating Blackwells high prices for likes is a paid troll!
Performance never scales linearly, and yes, the L2 cache plays a big role in the Lovelace architecture, hence the "only" 28% more performance at 4K Ultra, but sometimes closer to 40% in ray tracing/path tracing, because that relies on RT core performance.

PS: we don't know how much L2 cache the 5090 will have, but it could be 96MB this time... the full GB202 has 128MB, so it might still create a bottleneck somewhere, even though the memory bandwidth should be much higher than the 4090's (almost 1.8TB/s vs 1TB/s).

I bet power supply and heat dissipation are really bottlenecking.
Power is not the limiting factor, because even with the 600W BIOS you don't get a lot more performance!
Overclocking the GDDR6X memory without raising the power limit can sometimes bring a lot more fps than core overclocking!
God of War: Ragnarök, for example, is very memory-bandwidth bound! I OC'd the GDDR6X on my 4090 to 25Gbps and it gave me 7% more performance without any core OC, for example.
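The memory-OC claim above can be put in rough numbers. A back-of-envelope sketch, assuming the 4090's stock GDDR6X speed of 21Gbps (the stock speed is not stated in the post; the 25Gbps OC and the 7% fps gain are):

```python
# Memory-OC back-of-envelope for the 4090 example above.
# 21Gbps stock is an assumption; 25Gbps and the 7% fps gain are from the post.
stock_gbps = 21.0
oc_gbps = 25.0
fps_gain = 0.07

bw_gain = oc_gbps / stock_gbps - 1   # raw bandwidth increase, ~19%
realized = fps_gain / bw_gain        # share of the bandwidth gain seen as fps
print(f"bandwidth +{bw_gain:.0%}, fps +{fps_gain:.0%} (~{realized:.0%} realized)")
```

A ~19% bandwidth bump turning into ~7% more frames suggests the game is partly, but not wholly, bandwidth-bound at those settings.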
 
Last edited:
Joined
Apr 14, 2022
Messages
749 (0.77/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
There are benchmarks where the Titan RTX is faster than the 3090. Nvidia explained back then that the Titan class gets some of the Quadro features while the GeForce lineup does not.
So no, the x90s are not Titans.
 
Joined
Apr 13, 2023
Messages
43 (0.07/day)
Nvidia keeps repeating the 3090 as the 4080 and 5080. Granted, the 4nm node brings more L2$ and double the clock speed, but that's a given every other gen. 400W is strictly water-cooling territory. Too bad it's not 3nm.

The memory bandwidth increase alone can give a double-digit performance increase over the previous gen (15-20%) even if the rest of the specs are similar. High power consumption might mean either crazy-high GPU clocks or a die packed with Tensor cores and RT cores, since CUDA core and SM counts are mostly the same.

The performance gap between the 4080 and 4090 is enormous, and the missing 4080 Ti design is obvious there. So the 5080 will fill that gap pretty nicely. The MSRP might match or be slightly lower than the 4090's, though. And retailers will surely price the new gen based on raster performance, not on MSRPs.
 
Joined
May 26, 2023
Messages
109 (0.19/day)
Power is not the limiting factor, because even with the 600W BIOS you don't get a lot more performance!
Overclocking the GDDR6X memory without raising the power limit can sometimes bring a lot more fps than core overclocking!
God of War: Ragnarök, for example, is very memory-bandwidth bound! I OC'd the GDDR6X on my 4090 to 25Gbps and it gave me 7% more performance without any core OC, for example.
Yes, you are right; I see that my statement was misleading. Let me explain what I had in mind.

If they put more resources on the silicon, they would have a lot more problems supplying power to them properly, and dissipating the heat as well.

So power delivery and the thermal envelope were limiting factors at the design stage, I'm betting, and that's what I should have written in the previous sentence.
 
Joined
Oct 19, 2022
Messages
107 (0.14/day)
Location
Los Angeles, CA
Yes, you are right; I see that my statement was misleading. Let me explain what I had in mind.

If they put more resources on the silicon, they would have a lot more problems supplying power to them properly, and dissipating the heat as well.

So power delivery and the thermal envelope were limiting factors at the design stage, I'm betting, and that's what I should have written in the previous sentence.
Well, the 4090/4080 cooler was already made to sustain 500W easily, and up to 600W too! The 4090 almost never even reaches 450W, so the cooler is almost overkill already.
The 4090 Ti was supposed to be a 600W GPU with a 4-slot cooler... but even a 4090 with a 600W BIOS, fully overclocked, doesn't get very high temperatures, so I'm not worried about the 5090. Blackwell is supposed to be a brand-new architecture, whereas Lovelace was more of an Ampere+ architecture. The biggest change was the process node: going from Samsung 8nm (enhanced 10nm) to TSMC 4N (enhanced 5nm) was a big jump!
 
Joined
Dec 31, 2020
Messages
989 (0.69/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
Clearly some 5090s can go up to 800W, and the dual 12V-2x6 connectors are a must, as the 5090 doubles the 5080 in every possible way. Otherwise it gets stuck at 2.4GHz while the 5080 can average up to 3GHz.
 
Joined
May 26, 2023
Messages
109 (0.19/day)
Well, the 4090/4080 cooler was already made to sustain 500W easily, and up to 600W too! The 4090 almost never even reaches 450W, so the cooler is almost overkill already.
The 4090 Ti was supposed to be a 600W GPU with a 4-slot cooler... but even a 4090 with a 600W BIOS, fully overclocked, doesn't get very high temperatures, so I'm not worried about the 5090. Blackwell is supposed to be a brand-new architecture, whereas Lovelace was more of an Ampere+ architecture. The biggest change was the process node: going from Samsung 8nm (enhanced 10nm) to TSMC 4N (enhanced 5nm) was a big jump!
I'm not worried either. I'm just trying to point out why you can't get significant extra performance without risking substantially shortening the lifespan of the chip, imho.
 
Joined
Sep 4, 2022
Messages
326 (0.39/day)
I bet power delivery and heat dissipation are the real bottlenecks.
With ideal linear scaling, the performance delta would be close to the delta in cores/bandwidth, or somewhere in between, at similar clock speeds. Although my 4090 Suprim Liquid at 3GHz with +100MHz on the VRAM gets about 10-15% gains in RT titles over factory settings.
Reminds me of the SLI scaling BS, where 2 GPUs didn't scale 100%, haha; "the monolithic GPU is superior to 2 GPUs" was only half true.
One would hope that scaling would be linear, or close to it, especially at an almost 100% premium, outside a few outliers, just like SLI. Hopefully that 512-bit bus improves scaling for Blackwell.


Update: then again, if power were the issue, the 4080 at 3GHz with a memory OC also shows a significant performance delta, so you have to compare at factory settings. Tweaking is not part of the equation, because both sides improve.
 
Last edited:
Joined
Dec 26, 2006
Messages
3,858 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
Pretty much the same, maybe $350. As much as these cards cost, I can buy 2 decent gaming laptops.
Yeah, $350 is probably acceptable. I'd pay more if there were BIOS support like motherboards have, and if it were possible to have a SODIMM-style slot to easily upgrade the memory capacity. Reality is, all the games I have could run on an RX 6600; the other part is fewer friends gaming, and less time and desire to game.
 
Joined
Jul 13, 2016
Messages
3,307 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
68% more cores and 40% more bandwidth, but it yields only a 25% performance delta.
Is the L2 cache really bottlenecking the 4090, and will this plateau affect the 5090 as well?



Update: it's official, anyone postulating Blackwell's high prices for likes is a paid troll!

The performance scaling is poor because of changes to the SM. From the 3000 series onward, each SM partition has one FP32 datapath and one combined FP32/INT datapath, which Nvidia counts as 2 cores that share resources. The 2000 series instead had one FP32 core and one INT core per partition, but technically only counted as 1 core. In both versions of the SM, the two datapaths can issue simultaneously. What this means is that for a workload with a 50/50 mix of INT and FP32, both datapaths will be equally occupied (assuming no bottlenecks in other parts of the pipeline).

That said, games do not run 50/50 INT/FP32. They run roughly 23/77, which lines up almost perfectly with the expected performance uplift of adding FP32 capability to the INT datapath (assuming no other bottlenecks). The INT datapath, which would otherwise have sat idle much of the time, can now absorb roughly 27 FP32 instructions out of every 100, which increases your performance in gaming workloads.

Nvidia has a whitepaper on the 3000 series here: https://www.nvidia.com/content/PDF/nvidia-ampere-ga-102-gpu-architecture-whitepaper-v2.pdf
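The 23/77 mix above can be turned into a toy issue-slot model. A minimal sketch (these are not Nvidia's numbers, just the post's instruction mix applied to the two SM layouts described):

```python
# Toy issue model of the SM change described above. Per partition, Turing has
# one FP32 pipe and one INT pipe; Ampere has one FP32 pipe and one pipe that
# can do either FP32 or INT. One instruction per pipe per cycle is assumed.

def cycles_turing(fp32: float, integer: float) -> float:
    # Each pipe only accepts its own type, so the busier pipe sets the time.
    return max(fp32, integer)

def cycles_ampere(fp32: float, integer: float) -> float:
    # INT must run on the shared pipe; its leftover slots can absorb FP32
    # work, so the best case is a perfectly balanced split across both pipes.
    return max((fp32 + integer) / 2, integer)

fp32, integer = 77.0, 23.0  # the post's gaming mix, per 100 instructions
t_turing = cycles_turing(fp32, integer)   # 77 cycles
t_ampere = cycles_ampere(fp32, integer)   # 50 cycles
offloaded = t_ampere - integer            # FP32 absorbed by the shared pipe: 27
print(f"speedup {t_turing / t_ampere:.2f}x, "
      f"{offloaded:.0f} FP32 per 100 instructions on the shared pipe")
```

The 27 FP32 instructions per 100 that land on the otherwise-idle INT pipe are exactly the figure in the post; the resulting ~1.54x is an upper bound that assumes no bottleneck elsewhere in the pipeline.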
 
Joined
May 10, 2023
Messages
304 (0.52/day)
Location
Brazil
Yeah, at least in consumer market. Servers may be a different thing (I have no idea that do they already use it?)
Some stuff like Nvidia's SXM and its OAM counterpart use 48V to power those 700W+ accelerators.

This is way easier to pull off on a platform where you don't need to care that much about standards and can make your own (such as SXM itself). SXM3 even hinted to manufacturers that they could use a 12V-to-48V booster in their designs to update legacy projects.

TITAN also pack FP64 cores
FP64 on consumer GPUs hasn't been a thing since Kepler. FP64 cores are only a thing on x100 chips now.
The Titan V had them since it used the V100 chip, but the later Titan RTX did not.
the 2080 Ti had 11GB whereas the TITAN RTX had 24GB.
The 3080 Ti had 12GB whereas the 3090 had 24GB.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,897 (2.94/day)
Location
Jyväskylä, Finland
There is a reason why the 3090 and 3090 Ti were not called TITAN, and that's because they are not! TITANs also pack FP64 cores and usually have 2x more VRAM: the 780/Ti had 3GB whereas the TITAN had 6GB, and the 2080 Ti had 11GB whereas the TITAN RTX had 24GB.
Did any Titan after the GK110-based ones have any special FP64 performance? Nope. ;)

They've just been glamorized halo-tier cards with an (almost) full die, full memory bandwidth, and a larger VRAM amount. That's why the x90 is the Titan these days, just branded for gamers.

FP64 on consumer GPUs haven't been a thing since Kepler. FP64 cores are only a thing on x100 chips now.
You were faster, looks that we said the same things.
 
Joined
Sep 17, 2014
Messages
22,566 (6.04/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
The x90 and x90 Ti are still not TITANs, because a TITAN usually has 2x the amount of VRAM. If they had made a TITAN Ada, it would have had 48GB of GDDR6X.

Regarding Seasonic, they also have a 1600W unit that is 80+ Titanium (the 2200W is surprisingly Platinum, even though there's not much difference), but I think the 1600W is enough! I wish Corsair would release a new AX1600i with 2x 16-pin connectors! I have an AX1500i and love it!



It is due to a memory bandwidth bottleneck.
FYI, the 4090 has a bandwidth of 1,008GB/s whereas the 4080 has 717GB/s, i.e. only ~40% more bandwidth despite 68% more CUDA cores...
Also, the 4090 has only 72MB of L2 cache (out of the 96MB on a full AD102 die) while the 4080 has 64MB, so only 12.5% more...
I wouldn't dare to try and find some sort of reason or logic within Nvidia's naming schemes.

The first, foremost, and dare I say only aspect that determines what Nvidia calls A, B, or C is marketing strategy. Every single Titan was created with that express purpose: marketing. GTX and RTX were created for marketing purposes, too. They call it whatever they want to sell you. It's not necessarily a different product; it's just whatever's deemed popular.
 
Joined
Sep 4, 2022
Messages
326 (0.39/day)
Intel: here is a CPU that needs 300W
Nvidia: here is a GPU that needs 600W
Intel: challenge accepted
Me on the sidelines with a power-limited 4090 Suprim Liquid and a 7800X3D at PBO offset -25: 95% of the performance at half the power. :cool:
 
Joined
Oct 19, 2022
Messages
107 (0.14/day)
Location
Los Angeles, CA
Clearly some 5090s can go up to 800W, and the dual 12V-2x6 connectors are a must, as the 5090 doubles the 5080 in every possible way. Otherwise it gets stuck at 2.4GHz while the 5080 can average up to 3GHz.
If the leak from kopite7kimi is true, the 5090 is a dual-slot GPU, and therefore probably liquid-cooled like the MSI 4090 SUPRIM LIQUID X! If so, AIBs are going to struggle even more to make buyers want theirs. I guess people who want air cooling will go for AIBs, but water cooling is definitely going to become a standard sooner or later if GPUs start pulling 600W+.

I wouldn't dare to try and find some sort of reason or logic within Nvidia's naming schemes.

The first, foremost, and dare I say only aspect that determines what Nvidia calls A, B, or C is marketing strategy. Every single Titan was created with that express purpose: marketing. GTX and RTX were created for marketing purposes, too. They call it whatever they want to sell you. It's not necessarily a different product; it's just whatever's deemed popular.
As much as the naming doesn't mean anything, the TITAN line is definitely aimed at professionals. They have some FP64 cores that consumer GPUs do not have, and they usually have 2x the amount of VRAM for professional workloads too.
 