
NVIDIA GeForce RTX 5090 Founders Edition

Joined
Jan 17, 2022
Messages
78 (0.07/day)
I do wonder if there's any credence to the rumors that some AIBs expressed surprise at NVIDIA's final pricing structure revealed during CES.
Makes me think that the initial price was supposed to be 2499 USD instead of 1999 USD. It'd be the only way these insane AIB markups make any sense.
My guess is the AIBs are surprised at the pricing because it gives them little room for profit over the price Nvidia charges them for each chip.

Nvidia has pretty obviously been trying to close the market off to AIBs and sell direct only (cutting out the middleman). It's a primary reason EVGA quit the market at the start of the 4000 series, and I'm sure the 5000 series puts the squeeze on even harder than the 4000 series did.
 
Joined
Jun 26, 2023
Messages
46 (0.08/day)
Processor 7800X3D @ Curve Optimizer: All Core: -25
Motherboard TUF Gaming B650-Plus
Memory 2xKSM48E40BD8KM-32HM ECC RAM
Video Card(s) 4070 @ 110W
Display(s) SAMSUNG S95B 55" QD-OLED TV
Power Supply RM850x
I am really impressed by the cooling efficiency. Incredible results. Good job, Nvidia. I wonder how much the liquid metal contributes to this. Hopefully someone will test it eventually.

Now please, other manufacturers, kindly take inspiration and never again release GPUs bigger than 3 slots, okay?

As for performance, I used results from this review to calculate an all-resolution average:

[attachment 381175: all-resolution average chart]

The RTX 4090 has around 18% fewer transistors and 25% fewer compute units. I used the all-games average FPS to calculate efficiency (not just Cyberpunk), and the RTX 5090 is in fact less efficient than the RTX 4090, since it consumes roughly 20% more power per frame. Maybe the price increase (+$500) is, let's say, justified, but performance-wise and efficiency-wise this is far from special. In other words, all the performance increase now comes from scaling: the more compute units, the more performance. The more you buy, the more you have, but definitely not save. Were the RTX 5090 priced at $1,700, you'd also get more for less. We'll see when they jump to a 3 nm or 2 nm node.
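Here is a minimal Python sketch of that power-per-frame math; the FPS and wattage values are placeholders picked to illustrate the roughly 20% figure, not the review's measured numbers:

```python
# Power-per-frame comparison, as described above.
# FPS and wattage are illustrative placeholders, not measured data.
cards = {
    # name: (all-games average FPS, average gaming power draw in watts)
    "RTX 4090": (100.0, 420.0),
    "RTX 5090": (128.0, 645.0),
}

for name, (fps, watts) in cards.items():
    joules_per_frame = watts / fps  # W / (frames/s) = joules per frame
    print(f"{name}: {joules_per_frame:.2f} J/frame")

j4090 = cards["RTX 4090"][1] / cards["RTX 4090"][0]
j5090 = cards["RTX 5090"][1] / cards["RTX 5090"][0]
print(f"5090 vs 4090 energy per frame: {j5090 / j4090:.2f}x")  # ~1.20x
```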

What indeed is special, as I already mentioned, is the cooling efficiency of the RTX 5090's cooler. Really damn impressive. My doubts were unjustified, I must say.
Do you see a reason why the 5090 FE got a 2-slot cooler design with over 40 dB(A) of fan noise, which unfortunately crosses the threshold where it disturbs concentration, and higher temperatures too? It's not like this was a slim-cooler design contest. Workstation PCs are big, and consumer PCs have no problem fitting a 2.5 to 3-slot cooler either.

Supposedly next-gen GeForce "Rubin" is going to use TSMC N3 (not N2 so far, unfortunately, but maybe), which would increase performance by 10-15% at the same power:
https://en.wikipedia.org/wiki/3_nm_process said:
TSMC has stated that its 3 nm FinFET chips will reduce power consumption by 25–30% at the same speed, increase speed by 10–15% at the same amount of power and increase transistor density by about 33% compared to its previous 5 nm FinFET chips.
Not great. Using N2 would give the expected average ~30% power-efficiency improvement gen-over-gen, but NV would have to jump two full nodes, from N4 (which is basically N5) to N3 to N2, and I'm not sure N2 will be ready by then.
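A back-of-the-envelope check on those node claims; treating TSMC's quoted per-step reductions as independent multipliers is my own simplification:

```python
# Compounding TSMC's quoted 25-30% per-node power reductions at the
# same speed. Stacking the steps multiplicatively is a simplification,
# not TSMC guidance.
step = (0.25, 0.30)  # quoted power reduction per full node jump

one_jump_lo, one_jump_hi = step
two_jumps_lo = 1 - (1 - step[0]) ** 2  # N4-class -> N3 -> N2
two_jumps_hi = 1 - (1 - step[1]) ** 2

print(f"One node jump:  {one_jump_lo:.0%} to {one_jump_hi:.0%} less power")
print(f"Two node jumps: {two_jumps_lo:.0%} to {two_jumps_hi:.0%} less power")
# Two jumps: ~44% to ~51% -- why N2 would be the much bigger win
```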
 
Joined
Oct 19, 2022
Messages
235 (0.28/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DisplayPort 2.1)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q with AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
And reviewers, while noting the high price, don't bother to call out this "price creep", even though they have much better insight into all of this than regular customers.

I wonder if the base models (the ones that should be at MSRP, but will also end up more expensive) will be rarer than hen's teeth, so that people eventually cave in and buy up all these overpriced models?



I doubt it. AIBs hold really detailed price negotiations a year in advance, and even small changes require re-negotiation. Maybe they just know how little volume this "Gaming" line will get, so instead of generating revenue from a large number of cards sold, they have to generate it from a smaller number - basically pre-scalping the buyers.
I kinda knew the 5090 would sell for between $1,800 and $2,000; $2,500 was not an option unless they went for a TSMC N3P- or N3X-class node with a full GB202 and 2 TB/s of bandwidth.
But $2,000 is still a lot, and adding $400, $500, or even $800 for the ASUS is insane! I wonder how many they're going to sell...
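For perspective, the markup math on those premiums (tier labels are mine; the dollar figures are the ones quoted above):

```python
# AIB premiums over the $1,999 FE MSRP mentioned in this thread.
msrp = 1999
premiums = {"base AIB": 400, "mid-tier AIB": 500, "ASUS top card": 800}

for tier, extra in premiums.items():
    print(f"{tier}: ${msrp + extra} ({extra / msrp:.0%} over FE MSRP)")
# 20%, 25% and 40% markups on an already-$2,000 card
```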

Worst part of this is that the 5090 looks to be the best offering in terms of gen-to-gen performance in the 5000 series.
It was the same with Lovelace too. The 4090 was definitely the best value... it was $1,600 but was a lot more powerful, with a 384-bit bus and 24 GB of VRAM.

Do you see a reason why the 5090 FE got a 2-slot cooler design with over 40 dB(A) of fan noise, which unfortunately crosses the threshold where it disturbs concentration, and higher temperatures too? It's not like this was a slim-cooler design contest. Workstation PCs are big, and consumer PCs have no problem fitting a 2.5 to 3-slot cooler either.

Supposedly next-gen GeForce "Rubin" is going to use TSMC N3 (not N2 so far, unfortunately, but maybe), which would increase performance by 10-15% at the same power:

Not great. Using N2 would give the expected average ~30% power-efficiency improvement gen-over-gen, but NV would have to jump two full nodes, from N4 (which is basically N5) to N3 to N2, and I'm not sure N2 will be ready by then.
Because 2-slot coolers sell better! And the 5090 FE is pretty amazing for being able to cool a 600 W GPU! Even a few years ago nobody thought it was even possible...

TSMC 2 nm is probably too expensive as of now; not even Apple is using it right now! They're still on an advanced 3 nm node. I think their M5 chips will be 2 nm for sure. The M3 and M4 haven't been that much better than the M2 without raising core counts on the M4. The M5 should be much better for sure.
 
Joined
Dec 26, 2013
Messages
198 (0.05/day)
Processor Ryzen 7 5800x3D
Motherboard Gigabyte B550 Gaming X v2
Cooling Thermalright Phantom Spirit 120 SE
Memory Corsair Vengeance LPX 2x32GB 3600MHz C18
Video Card(s) XFX RX 6800 XT Merc 319
Storage Kingston KC2500 2TB NVMe + Crucial MX100 256GB + Samsung 860QVO 1TB + Samsung Spinpoint F3 500GB HDD
Display(s) Samsung CJG5 27" 144 Hz QHD
Case Phanteks Eclipse P360A DRGB Black + 3x Thermalright TL-C12C-S ARGB
Audio Device(s) Logitech X530 5.1 + Logitech G35 7.1 Headset
Power Supply Cougar GEX850 80+ Gold
Mouse Razer Viper 8K
Keyboard Logitech G105
It was the same with Lovelace too. The 4090 was definitely the best value... it was $1,600 but was a lot more powerful, with a 384-bit bus and 24 GB of VRAM.
At least they stacked up well against higher-class GPUs from the 3000 series. The 4080 comfortably beat both the 3090 and the 3090 Ti; heck, even the 4070 Ti matched the 3090 Ti. The 4070 was the weakest of the bunch, yet it matched the 3080. None of the 5000 cards seem able to accomplish similar feats. The 5080 and 5070 obviously won't be able to match the 4090 and 4080. The 5070 Ti has the potential to match the 4080S, but that's not really the same thing, as we never got a 4080 Ti and the 4080S was only a minor improvement over the 4080 anyway.
And this is a deliberate move from Nvidia. They used to scale their 70-class cards to half of the 90-class cards in terms of cores (3070 vs. 3090), then they shifted that ratio to the 4070 Ti vs. 4090 for the 4000 series, and now it sits at the 5080 vs. 5090. So they finally managed to give a 70-class card an 80-class name and charge accordingly... which automatically trickles down to the lower-class cards as well...
Well, holding onto the promise of "the more you buy the more you save" is easier when you force it...
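The public CUDA core counts bear this out; a quick check with the spec-sheet numbers:

```python
# x70-class (or current x80) core count as a share of the flagship,
# using public spec-sheet CUDA core counts.
pairs = [
    ("RTX 3070 / RTX 3090",    5888, 10496),
    ("RTX 4070 Ti / RTX 4090", 7680, 16384),
    ("RTX 5080 / RTX 5090",   10752, 21760),
]

for label, small, big in pairs:
    print(f"{label}: {small / big:.0%} of the flagship's cores")
# ~56%, ~47%, ~49% -- the "half the flagship" slot moved up one tier name
```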

Anyway, as long as they have no competition, I can't really blame them. Anyone in their shoes would do the same.
 
Joined
Oct 19, 2022
Messages
235 (0.28/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DisplayPort 2.1)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q with AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
At least they stacked up well against higher-class GPUs from the 3000 series. The 4080 comfortably beat both the 3090 and the 3090 Ti; heck, even the 4070 Ti matched the 3090 Ti. The 4070 was the weakest of the bunch, yet it matched the 3080. None of the 5000 cards seem able to accomplish similar feats. The 5080 and 5070 obviously won't be able to match the 4090 and 4080. The 5070 Ti has the potential to match the 4080S, but that's not really the same thing, as we never got a 4080 Ti and the 4080S was only a minor improvement over the 4080 anyway.
And this is a deliberate move from Nvidia. They used to scale their 70-class cards to half of the 90-class cards in terms of cores (3070 vs. 3090), then they shifted that ratio to the 4070 Ti vs. 4090 for the 4000 series, and now it sits at the 5080 vs. 5090. So they finally managed to give a 70-class card an 80-class name and charge accordingly... which automatically trickles down to the lower-class cards as well...
Well, holding onto the promise of "the more you buy the more you save" is easier when you force it...

Anyway, as long as they have no competition, I can't really blame them. Anyone in their shoes would do the same.
The 3090 was a terrible value compared to the 3080 but had a lot more VRAM; the 4090, though, was definitely the best upgrade, and even the 5090 is a better upgrade than its RTX 50-series counterparts! 33% more CUDA cores, 32 GB of VRAM, and a lot more bandwidth, which is great for pros at least.
 
Joined
Dec 26, 2013
Messages
198 (0.05/day)
Processor Ryzen 7 5800x3D
Motherboard Gigabyte B550 Gaming X v2
Cooling Thermalright Phantom Spirit 120 SE
Memory Corsair Vengeance LPX 2x32GB 3600MHz C18
Video Card(s) XFX RX 6800 XT Merc 319
Storage Kingston KC2500 2TB NVMe + Crucial MX100 256GB + Samsung 860QVO 1TB + Samsung Spinpoint F3 500GB HDD
Display(s) Samsung CJG5 27" 144 Hz QHD
Case Phanteks Eclipse P360A DRGB Black + 3x Thermalright TL-C12C-S ARGB
Audio Device(s) Logitech X530 5.1 + Logitech G35 7.1 Headset
Power Supply Cougar GEX850 80+ Gold
Mouse Razer Viper 8K
Keyboard Logitech G105
The 3090 was a terrible value compared to the 3080 but had a lot more VRAM; the 4090, though, was definitely the best upgrade, and even the 5090 is a better upgrade than its RTX 50-series counterparts! 33% more CUDA cores, 32 GB of VRAM, and a lot more bandwidth, which is great for pros at least.
The value of the 3000 series was pretty irrelevant because of the crypto boom, really. Those were bad times. But they at least brought performance to the table. The 3070 matched the 2080 Ti. 33% more CUDA cores is also nothing, really; the 3080 tripled the 2080's cores, and the 4090 had 60% more than the 3090. Either way, what matters is performance.
VRAM beyond 24 GB (16, even) is just for AI. As a gamer, I couldn't care less. But someone tested CP2077 at 16K resolution, and apparently it requires 32 GB of VRAM, so the 5090 can run it. With horrible glitches and at 20 FPS, at least...
Anyway, I don't really care for this generation and am just waiting for the next actual leap (hopefully with some hardware breakthrough like 3D V-Cache on CPUs, and not a software one).
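For reference, the gen-over-gen core jumps mentioned in this exchange, checked against the public spec-sheet counts:

```python
# Gen-over-gen CUDA core jumps cited above (public spec-sheet counts).
jumps = [
    ("RTX 2080 -> RTX 3080",  2944,  8704),   # "tripled"
    ("RTX 3090 -> RTX 4090", 10496, 16384),   # "60% more"
    ("RTX 4090 -> RTX 5090", 16384, 21760),   # "33% more"
]

for label, old, new in jumps:
    print(f"{label}: {new / old:.2f}x cores ({new / old - 1:.0%} more)")
# ~2.96x, ~1.56x, ~1.33x -- the 50-series jump is the smallest of the three
```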
 
Joined
Oct 19, 2022
Messages
235 (0.28/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DisplayPort 2.1)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q with AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
The value of the 3000 series was pretty irrelevant because of the crypto boom, really. Those were bad times. But they at least brought performance to the table. The 3070 matched the 2080 Ti. 33% more CUDA cores is also nothing, really; the 3080 tripled the 2080's cores, and the 4090 had 60% more than the 3090. Either way, what matters is performance.
VRAM beyond 24 GB (16, even) is just for AI. As a gamer, I couldn't care less. But someone tested CP2077 at 16K resolution, and apparently it requires 32 GB of VRAM, so the 5090 can run it. With horrible glitches and at 20 FPS, at least...
Anyway, I don't really care for this generation and am just waiting for the next actual leap (hopefully with some hardware breakthrough like 3D V-Cache on CPUs, and not a software one).
The RTX 30s had a lot more CUDA cores due to the hybrid architecture, but a game's instruction mix puts roughly 35% INT32 work on those cores, so a lot of FP32 throughput is left on the table! FYI, Turing's CUDA cores were all FP32 (the INT32 units were independent).
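A toy model of that dual-issue argument (the ~35% INT32 share is the figure from this post, treated as an assumption):

```python
# Toy model of an Ampere SM: 64 dedicated FP32 units plus 64 shared
# units that execute either FP32 or INT32 on a given clock. The 35%
# INT32 instruction share is the assumption from the post above.
DEDICATED_FP32 = 64
SHARED = 64
total_units = DEDICATED_FP32 + SHARED   # the "128 CUDA cores" figure

int32_share = 0.35
int_ops = int32_share * total_units     # issue slots eaten by INT32 work
fp32_effective = total_units - int_ops  # slots left doing FP32

print(f"Effective FP32 units per SM: {fp32_effective:.0f} of {total_units}")
print(f"Share of the doubled FP32 count actually used: "
      f"{fp32_effective / total_units:.0%}")
# ~83 of 128 (~65%) -- why doubling CUDA cores didn't double frame rates
```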
 

FreedomEclipse

~Technological Technocrat~
Joined
Apr 20, 2007
Messages
24,360 (3.75/day)
Location
London, UK
System Name WorkInProgress
Processor AMD 7800X3D
Motherboard MSI X670E GAMING PLUS
Cooling Thermalright AM5 Contact Frame + Phantom Spirit 120SE
Memory 2x32GB G.Skill Trident Z5 NEO DDR5 6000 CL32
Video Card(s) Asus Dual Radeon™ RX 6700 XT OC Edition
Storage WD SN770 1TB (Boot)|1x WD SN850X 8TB (Gaming)| 2x2TB WD SN770| 2x2TB+2x4TB Crucial BX500
Display(s) LG GP850-B
Case Corsair 760T (White) {1xCorsair ML120 Pro|5xML140 Pro}
Audio Device(s) Yamaha RX-V573|Speakers: JBL Control One|Auna 300-CN|Wharfedale Diamond SW150
Power Supply Seasonic Focus GX-850 80+ GOLD
Mouse Logitech G502 X
Keyboard Duckyshine Dead LED(s) III
Software Windows 11 Home
Benchmark Scores ლ(ಠ益ಠ)ლ
GPU Database needs a slight modification...

May I present:

[image attachment]
 

jnv11

New Member
Joined
Jan 25, 2025
Messages
12 (4.00/day)
Location
Morrisville, NC, USA
I have a suggestion for an additional test due to people complaining about melted 12VHPWR/12V-2x6 connectors and sockets: place a thermocouple or thermometer on the 12V-2x6 socket.

I was inspired by Guru3D's reviews, which include FLIR infrared camera heat maps. While FLIR heat maps are sometimes said to be problematic, seeing the maps at https://www.guru3d.com/review/review-nvidia-geforce-rtx-5090-reference-edition/page-7/ , https://www.guru3d.com/review/review-palit-geforce-rtx-5090-gamerock/page-7/ , and https://www.guru3d.com/review/review-asus-rog-geforce-rtx-5090-astral-oc-gaming/page-7/ for the Founders Edition, Palit GameRock, and Asus ROG Astral versions of the RTX 5090, respectively, makes me wonder how much of the video card's heat ends up in the 12V-2x6 socket or plug, which could contribute to damaging or melting it. I suspect the heat is a combination of the many amps being pulled through the 12V-2x6 connection and heat soaking in from the card itself.
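To put rough numbers on the "so many amps" part, a back-of-the-envelope ohmic-heating estimate (the per-contact resistance is an assumed illustrative value, not a measured one):

```python
# I^2*R estimate for the 12V-2x6 connector under RTX 5090 load.
# The contact resistance is an assumed illustrative value.
board_power = 575.0         # W, RTX 5090 total board power
voltage = 12.0              # V
live_pins = 6               # six 12 V current-carrying pins
contact_resistance = 0.005  # ohm per mated contact (assumption)

total_current = board_power / voltage     # ~48 A
per_pin = total_current / live_pins       # ~8 A if shared evenly
heat_per_pin = per_pin ** 2 * contact_resistance

print(f"Total current: {total_current:.1f} A ({per_pin:.1f} A per pin)")
print(f"Ohmic heat: {heat_per_pin:.2f} W per pin, "
      f"{live_pins * heat_per_pin:.1f} W across the connector")
# Heat scales with current *squared*, so one pin carrying a double
# share dissipates 4x the heat -- on top of heat soak from the card.
```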
 