
NVIDIA GeForce RTX 5090 Founders Edition

Joined
Mar 13, 2021
Messages
489 (0.35/day)
Processor AMD 7600x
Motherboard Asrock x670e Steel Legend
Cooling Silver Arrow Extreme IBe Rev B with 2x 120 Gentle Typhoons
Memory 4x16GB Patriot Viper Non-RGB @ 6000 30-36-36-36-40
Video Card(s) XFX 6950XT MERC 319
Storage 2x Crucial P5 Plus 1TB NVMe
Display(s) 3x Dell Ultrasharp U2414h
Case Coolermaster Stacker 832
Power Supply Thermaltake Toughpower PF3 850 watt
Mouse Logitech G502 (OG)
Keyboard Logitech G512
Mid to end of a gen is generally the best time to buy, yeah.
Oh I know it is, I got a 6950XT near the end of its run for just over £500, which is ~50% of its RRP.

The fact that people are going around saying "Hey, I got this GPU for a steal near the end of its life at 10% below RRP" just makes me sad that this is what we have to deal with now. I just had a look at the Asus ROG STRIX GAMING OC details.

(attached: price history screenshots)


The LOWEST it ever went was maybe 30% below its INSANE RRP, and even at its lowest point, 18 months after release, it was still well above the RRP of a Founders Edition. This seems to be fairly standard across everything. If what we have to expect is that a card's RRP will effectively be its price for its entire life, then I can see why high-end gaming is going to be limited to a select few, because I cannot see many people willing to shell out £2-3k on a GPU every 4-6 years on top of everything else in a PC.
 
Joined
Oct 19, 2022
Messages
201 (0.24/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D (+PBO)
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DisplayPort 2.1)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q with AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
The 5090 is a great GPU, just (too) expensive.
The 4090 was just $100 more (~6.7% more expensive) but ~70% faster than the 3090.
The 5090 is $400 more (25% more expensive) but around 30-40% faster than the 4090.

If the 5090 were $1,600, nobody would be complaining. But Nvidia is marketing DLSS 4 (MFG using AI) to set new "metrics" and blur the lines of performance.
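To put rough numbers on that comparison, here's a minimal sketch; the MSRPs are launch prices and the uplift figures are the approximate percentages above (35% taken as the midpoint of the 30-40% range), so treat the output as an estimate:

```python
# Hedged sketch: compare generational price vs. performance jumps.
gens = [
    # (old card, new card, old MSRP $, new MSRP $, assumed perf uplift)
    ("3090", "4090", 1499, 1599, 0.70),
    ("4090", "5090", 1599, 1999, 0.35),
]

for old, new, price_old, price_new, perf_up in gens:
    price_up = price_new / price_old - 1
    # A ratio above 1.0 means the new card delivers more performance per dollar.
    value_ratio = (1 + perf_up) / (1 + price_up)
    print(f"{old} -> {new}: price +{price_up:.1%}, perf +{perf_up:.0%}, "
          f"perf per dollar x{value_ratio:.2f}")
```

On those assumptions, the 4090 improved performance per dollar by roughly 60% over the 3090, while the 5090 improves it by less than 10% over the 4090.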
 
Joined
Aug 12, 2019
Messages
2,271 (1.14/day)
Location
LV-426
System Name Custom
Processor i9 9900k
Motherboard Gigabyte Z390 Aorus Master
Cooling Corsair H150i
Memory 4x8GB 3200MHz Corsair
Video Card(s) Galax RTX 3090 EX Gamer White OC
Storage 500GB Samsung 970 Evo Plus
Display(s) MSi MAG341CQ
Case Lian Li Pc-011 Dynamic
Audio Device(s) Arctis Pro Wireless
Power Supply 850w Seasonic Focus Platinum
Mouse Logitech G403
Keyboard Logitech G110
What's crazy is remembering the 3090 Ti launch in early '22, with the 40 series expected later that year. The 3090 Ti went for $2K and was less than 10% faster than the 3090. I believe the consensus was that the Ti version was just down to binning. What a terrible deal that was.
I thought the 3090 Ti got released because of the RX 6950 XT.
 
Joined
May 11, 2018
Messages
1,332 (0.54/day)
I thought the 3090 Ti got released because of the RX 6950 XT.

The Nvidia RTX 3090 Ti was released in March 2022 at an MSRP of $2,000, but it was barely faster than the RTX 3090 from September 2020, which launched at an MSRP of $1,500. Crypto had fallen from its December 2021 high by then, but it kept bouncing around - though not for long: soon after the card's release it crashed completely, ruining the ROI calculations of the poor miners.



But it enabled Nvidia to release the RTX 4080 at $1,200 and the RTX 4090 at $1,600; even here on TechPowerUp they wrote about how the RTX 4080 was good value compared to the slower RTX 3090 Ti at $2,000, or to the scalped prices the RTX 3080 was going for at the height of the crypto boom...
 

Joined
Jul 13, 2016
Messages
3,435 (1.10/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB, 4x 15.36TB Micron 9300 Pro, 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
The Nvidia RTX 3090 Ti was released in March 2022 at an MSRP of $2,000, but it was barely faster than the RTX 3090 from September 2020, which launched at an MSRP of $1,500. Crypto had fallen from its December 2021 high by then, but it kept bouncing around - though not for long: soon after the card's release it crashed completely, ruining the ROI calculations of the poor miners.

But it enabled Nvidia to release the RTX 4080 at $1,200 and the RTX 4090 at $1,600; even here on TechPowerUp they wrote about how the RTX 4080 was good value compared to the slower RTX 3090 Ti at $2,000, or to the scalped prices the RTX 3080 was going for at the height of the crypto boom...

Nvidia has gotten very good at controlling the perception of its products. They stopped 4000-series production ahead of the 5000 series and drained the channel of stock precisely to make the 5000 series appear better value.
 
Joined
Sep 17, 2014
Messages
22,998 (6.08/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Exactly. 'Muh fantastic FE'
Lmao. Gonna be nice for the lifetime of this product.

I keep harping on about the wall that silicon lithography has hit and people keep ignoring me, and then make surprised Pikachu faces when there is no efficiency gain generation-on-generation. Efficiency, by and large, comes from the node size, and that isn't getting appreciably smaller. If y'all are crying this hard about the lack of generational performance uplift now, you're gonna be drowning in your tears for a long time, because there aren't any good solutions in sight for the next half-decade at best. Physics is a harsh mistress.
Zen X3D has convincingly shown you could not be more wrong about that.

And we know that RT on current GPUs has literally created an extra consumption hog on top of the power needed to reach X FPS. Similarly, there will likely still be a gap between RDNA4, Intel Xe and GeForce even if they are baked on the same node.

You are being ignored on that point because it's only 'right' in the moment where the monopolist stops truly innovating its hardware. Moore's Law is similarly dead until someone proves Huang wrong.
 
Joined
Apr 30, 2020
Messages
1,033 (0.60/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Capellix
Memory Corsair Vengeance Pro RGB 3200MHz 32GB
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western Digital SATA 6.0 SSD 500GB + Fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Corsair K55 Pro RGB
Just noticed this.
That's… some interesting behavior from the Swarm Engine. Doesn't show up at higher resolutions.
This looks like too small a workload to put enough load on the GPU. The 4090 was probably already at the limit of the amount of data the game engine puts out; more data isn't available, and the 5090 wants more but gets nothing. It doesn't seem to be a CPU problem, as the same thing is common in some other games with the 4090 too below 1080p. It's like when people run 720p on a 4090 and it ends up below cards like the 4080 or 4070.
 
Joined
Sep 17, 2014
Messages
22,998 (6.08/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Oh I know it is, I got a 6950XT near the end of its run for just over £500, which is ~50% of its RRP.

The fact that people are going around saying "Hey, I got this GPU for a steal near the end of its life at 10% below RRP" just makes me sad that this is what we have to deal with now. I just had a look at the Asus ROG STRIX GAMING OC details.

The LOWEST it ever went was maybe 30% below its INSANE RRP, and even at its lowest point, 18 months after release, it was still well above the RRP of a Founders Edition. This seems to be fairly standard across everything. If what we have to expect is that a card's RRP will effectively be its price for its entire life, then I can see why high-end gaming is going to be limited to a select few, because I cannot see many people willing to shell out £2-3k on a GPU every 4-6 years on top of everything else in a PC.
'The cost of RT'

Next chapter:

'The lacking RT adoption rate'
 
Joined
May 31, 2016
Messages
4,473 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
HWUB's Steve was calling it a 4090 Ti on purpose, and I see where he was coming from. A 25% price bump for 26% average performance? Different sites report different performance increases, but it's still not much of an uplift given the price bump, and the power draw is way higher as well. I must say, this does not look good to me: very average, on the lower side of average, and I'm being generous here.
I think this one is pretty bad, and considering the 5090's performance, I doubt we can expect big gen-to-gen differences from the lower-tier NV 5000-series cards either. I'm kind of disappointed.
AMD, you're up. Let's see what you've got. Odds are nothing groundbreaking, really, but let's just wait and see.
 
Joined
May 11, 2018
Messages
1,332 (0.54/day)
But it's perfectly in line with Jensen's Law:

"Moore's Law is dead … It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past."
 
Joined
Nov 13, 2024
Messages
162 (2.22/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVMe & 2 x 4 TB SATA SSD in RAID 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
I feel this is a fallacy; it should totally be compared to other products, and depending on the usage, a choice must be made. Purely for gaming? You must have money to burn. For AI, game development, video editing, modeling and rendering, plus gaming? Here's your best option.
Then why are we comparing a B580 (MSRP $250) to a 5090 (MSRP $2,000)?

It's the best price-to-performance card (if you can get it for that price) against the strongest card currently on the market.

Who thinks:

"Hmmm... I can buy this card from Intel for $250 (with performance between the 4060 Ti 8GB and 16GB versions), or the 5090 for $2,000"?

The comparison is just nonsensical.
 
Joined
May 24, 2023
Messages
1,001 (1.64/day)
Power consumption during video playback? Multi-monitor? Idle?! IT'S HORRIBLE!
People literally hated AMD and the XTX because of this, but I guess everyone and their mom will love Nvidia now. For the record, the XTX was high too. I watch a lot of movies; can you imagine draining 50-60W for no reason at all, while the 10W CPU in my laptop can play movies without any issue, and I can even output to my 4K TV? What's going on with Nvidia now? Are they becoming AMD?
For productive uses such as research computation, or making money with this card by using it for rendering, etc., this idle/low-usage power draw is not very relevant.

Rich people buying this card only for occasional gaming will not care either.

The 5080 and below, for normal people, will hopefully have much lower power draw thanks to their lower RAM capacity and smaller silicon area. But I believe that even my 4070, a tiny chip with 12 GB of RAM, draws at least 13W while doing very little.

I just checked TPU numbers for video playback: the 4070 draws 15W and the 5090 54W, but these cards are massively different in silicon area, RAM capacity and power circuitry. I do not expect them to draw exactly the same power in those three low-usage scenarios.
 
Joined
Sep 17, 2014
Messages
22,998 (6.08/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Then why are we comparing a B580 (MSRP $250) to a 5090 (MSRP $2,000)?

It's the best price-to-performance card (if you can get it for that price) against the strongest card currently on the market.

Who thinks:

"Hmmm... I can buy this card from Intel for $250 (with performance between the 4060 Ti 8GB and 16GB versions), or the 5090 for $2,000"?

The comparison is just nonsensical.
Of course it isn't. The 5090 will eventually be a similar class of paperweight; it just takes more time. The comparisons are great to make: they provide much-needed perspective on how silly it is to spend $2K on a GPU, or how useful, given your use case. Fact is, the B580 also offers 16GB, so if it's just VRAM you need...
 
Joined
Nov 13, 2024
Messages
162 (2.22/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVMe & 2 x 4 TB SATA SSD in RAID 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
Of course it isn't. The 5090 will eventually be a similar class of paperweight; it just takes more time. The comparisons are great to make: they provide much-needed perspective on how silly it is to spend $2K on a GPU, or how useful, given your use case.
Yes, I also think the price is stupid for the 50 series, and the availability of the B580 at MSRP is quite bad (but the same goes for the 50 series).
Fact is, the B580 also offers 16GB, so if it's just VRAM you need...
There are 16 GB models of the B580? I thought they were all 12 GB.
 
Joined
Sep 17, 2014
Messages
22,998 (6.08/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Yes, I also think the price is stupid for the 50 series, and the availability of the B580 at MSRP is quite bad (but the same goes for the 50 series).

There are 16 GB models of the B580? I thought they were all 12 GB.
You're right! My bad, that was the A770.
 
Joined
Oct 28, 2012
Messages
1,246 (0.28/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory Crucial Ballistix 32GB DDR4
Video Card(s) RTX 3070 FE
Storage WD SN550 1TB / WD SATA SSD 1TB / WD Black SN750 1TB / Seagate 2TB / WD Book 4TB backup
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
'The cost of RT'

Next chapter:

'The lacking RT adoption rate'
or "The Rise of the PS6 for upscaled path traced AAA gaming." The PS5 Pro has been selling pretty well despite being expensive. Console players are not that fussy; low framerate and (bad) upscaling have been their standards.

And seriously, PC gaming playtime (and a big chunk of the revenue) is being carried by competitive/online games that could run on a potato. I have a hunch that the "PC GAMING MASTERRACE premium experience" becoming increasingly expensive isn't going to make that much of a dent overall.
 
Joined
May 11, 2018
Messages
1,332 (0.54/day)
I just checked TPU numbers for video playback: the 4070 draws 15W and the 5090 54W, but these cards are massively different in silicon area, RAM capacity and power circuitry. I do not expect them to draw exactly the same power in those three low-usage scenarios.

A bit more apples to apples:

4090 idle: 22W
5090 idle: 30W, +36%

4090 multi monitor: 27W
5090 multi monitor: 39W, +44%

4090 video playback: 26W
5090 video playback: 54W, +108%

It's quite horrible. AMD "We'll fix it in drivers (but doesn't)" horrible.

But making excuses for Nvidia, saying this card isn't meant for gamers and home users, is silly. Nvidia spent quite a big chunk of its RTX 5090 presentation on how good the card is for gaming, since it's apparently the only card that will see any significant performance uplift over its Lovelace equivalent without using "frame quadrupling". Relegate this card to the "Quadro" lineup, or a "home and small business AI accelerator" lineup, and what are you left with? Cards within 10-15% of their predecessors? That's within overclocking margin, as measly as that is now.
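For what it's worth, those percentages fall straight out of the watt figures quoted above; a quick sketch to reproduce them:

```python
# Recompute the gen-on-gen low-load power deltas quoted above
# (watt figures are the TPU numbers cited in the post).
scenarios = {
    "idle":           (22, 30),
    "multi monitor":  (27, 39),
    "video playback": (26, 54),
}

for name, (w_4090, w_5090) in scenarios.items():
    delta = w_5090 / w_4090 - 1
    print(f"{name}: 4090 {w_4090}W -> 5090 {w_5090}W, +{delta:.0%}")
```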
 
Joined
Jun 14, 2020
Messages
4,130 (2.45/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
A bit more apples to apples:

4090 idle: 22W
5090 idle: 30W, +36%

4090 multi monitor: 27W
5090 multi monitor: 39W, +44%

4090 video playback: 26W
5090 video playback: 54W, +108%

It's quite horrible. AMD "We'll fix it in drivers (but doesn't)" horrible.

But making excuses for Nvidia, saying this card isn't meant for gamers and home users, is silly. Nvidia spent quite a big chunk of its RTX 5090 presentation on how good the card is for gaming, since it's apparently the only card that will see any significant performance uplift over its Lovelace equivalent without using "frame quadrupling". Relegate this card to the "Quadro" lineup, or a "home and small business AI accelerator" lineup, and what are you left with? Cards within 10-15% of their predecessors? That's within overclocking margin, as measly as that is now.
High idle power is generally horrible, but... it's a trade-off for having the fastest. I didn't particularly like my 4090's idle draw either (it was hitting 27 watts), but again, I was willing to pay that price for the fastest. Comparing it to AMD GPUs ain't fair, because there you are not really making a trade-off: you aren't getting the fastest card, you are just getting a card with high power draw.
 
Joined
May 24, 2023
Messages
1,001 (1.64/day)
A bit more apples to apples:

4090 idle: 22W
5090 idle: 30W, +36%

4090 multi monitor: 27W
5090 multi monitor: 39W, +44%

4090 video playback: 26W
5090 video playback: 54W, +108%

It's quite horrible. AMD "We'll fix it in drivers (but doesn't)" horrible.

But making excuses for Nvidia, saying this card isn't meant for gamers and home users, is silly. Nvidia spent quite a big chunk of its RTX 5090 presentation on how good the card is for gaming, since it's apparently the only card that will see any significant performance uplift over its Lovelace equivalent without using "frame quadrupling". Relegate this card to the "Quadro" lineup, or a "home and small business AI accelerator" lineup, and what are you left with? Cards within 10-15% of their predecessors? That's within overclocking margin, as measly as that is now.
I do not think this low-usage power draw is good, but the 5090 draws almost 600W in 4K, and that is the real concern. I do not think this card should be considered a common consumer product. This power draw for a "normal home PC" is INSANE.
 
Joined
Nov 13, 2024
Messages
162 (2.22/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVMe & 2 x 4 TB SATA SSD in RAID 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
I do not think this low-usage power draw is good, but the 5090 draws almost 600W in 4K, and that is the real concern. I do not think this card should be considered a common consumer product. This power draw for a "normal home PC" is INSANE.
Don't worry, they won't be common; there will be 20-30 of them at launch. And they will heat your room in the meantime (mostly kidding).
 
Joined
Dec 12, 2012
Messages
794 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
If it's 1% better than the most efficient card on earth, then it is, by definition, the new most efficient card on earth.

Pulling 600W on its own does not make something inefficient.

To clarify, the 4090 was not the most efficient card on Earth. The 4080 was at the time, and now it's the 4080 Super, at least according to TPU charts.

The issue is that if a new generation product has the same efficiency as the previous gen, it's not a good thing. Obviously it's mainly because of the 4N node, but it shows that the architecture doesn't actually improve anything, at least for existing titles. There are some features that have to be implemented in games (neural rendering, mega geometry), but current games don't really benefit from any of the architectural changes.
It's always worth bringing up Maxwell. That was a crazy new architecture which improved efficiency by 50% on the same node. It was unbelievable. It can be done.

Raising power targets like this is a problem for the following generation. What's going to happen with the 60 series? A new node will improve efficiency, but to get a significant performance improvement they'll need to keep the same power levels. If they lower the power to 40 series levels, there will be a small performance improvement, or none at all. Do you really want 70-class cards at 250-300 W? I certainly don't, at least not in the summer.
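Since "efficient" here just means performance per watt, a 600W card can still top or trail the chart depending on how fast it is. A minimal illustrative sketch (the relative performance and wattage figures below are placeholders I've assumed, not TPU measurements):

```python
# Efficiency = relative performance / board power (perf per watt).
# The perf and watt numbers below are illustrative placeholders only.
cards = {
    "4080 Super": (1.00, 320),
    "4090":       (1.25, 430),
    "5090":       (1.60, 575),
}

base = cards["4080 Super"][0] / cards["4080 Super"][1]
for name, (perf, watts) in cards.items():
    eff = perf / watts
    print(f"{name}: {eff / base:.2f}x the 4080 Super's perf/W")
```

With numbers like these, the slower 4080 Super stays the most efficient card even though the 5090 is the fastest, which is exactly the distinction being made above.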
 
Joined
Nov 13, 2024
Messages
162 (2.22/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
Raising power targets like this is a problem for the following generation. What's going to happen with the 60 series? A new node will improve efficiency, but to get a significant performance improvement they'll need to keep the same power levels. If they lower the power to 40 series levels, there will be a small performance improvement, or none at all. Do you really want 70-class cards at 250-300 W? I certainly don't, at least not in the summer.
Well, they could lower prices a bit (a 60-series card for $1,000 with almost-4090 performance wouldn't be bad), maybe something similar to the 20-series to 30-series transition.

Or they'll just lock really cool software features behind the new cards...
 
Joined
Jan 19, 2023
Messages
320 (0.43/day)
To clarify, the 4090 was not the most efficient card on Earth. The 4080 was at the time, and now it's the 4080 Super, at least according to TPU charts.

The issue is that if a new generation product has the same efficiency as the previous gen, it's not a good thing. Obviously it's mainly because of the 4N node, but it shows that the architecture doesn't actually improve anything, at least for existing titles. There are some features that have to be implemented in games (neural rendering, mega geometry), but current games don't really benefit from any of the architectural changes.
It's always worth bringing up Maxwell. That was a crazy new architecture which improved efficiency by 50% on the same node. It was unbelievable. It can be done.

Raising power targets like this is a problem for the following generation. What's going to happen with the 60 series? A new node will improve efficiency, but to get a significant performance improvement they'll need to keep the same power levels. If they lower the power to 40 series levels, there will be a small performance improvement, or none at all. Do you really want 70-class cards at 250-300 W? I certainly don't, at least not in the summer.
Of course it can be done, but that would mean spending R&D effort on improving shader performance rather than on their Tensor cores/AI, and they won't do that, as it won't benefit them in the enterprise AI craze. From RT benchmarks it seems like they have also stopped caring that much about it. It's AI all the way.
Ada also didn't practically improve shaders at all; the performance gains came simply from the better node and the ability to run at 2.7-2.8 GHz, rather than under 2 GHz like Ampere.
If there is a new development in that market, or the bubble bursts, they will be in a bad place, but that probably won't happen soon.
Now all you get in GeForce are leftovers from their enterprise architecture. Radeon will go the same way with UDNA.
 
Joined
Jun 26, 2023
Messages
44 (0.08/day)
Now I want the same perf for 75W :)
Hoping for a better power-efficiency improvement with Blackwell, I did a calculation and got a 33% power-efficiency improvement on average over the last 4 generations. If we assume this trend continues (I think TSMC's own data for 3N and 2N suggests it does not?), we'd be there in about 7 generations (ln(575/75)/ln(1.33) ≈ 7.14). If the 5090 had a 75W TDP, I'd consider getting it.
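That extrapolation is easy to check; a tiny sketch under the same assumptions (575W today, a 75W slot-power target, +33% efficiency per generation):

```python
import math

current_tdp = 575    # 5090 board power (W)
target_tdp = 75      # slot-power-only target (W)
gain_per_gen = 1.33  # assumed average efficiency gain per generation

# Solve current_tdp / gain_per_gen**n == target_tdp for n:
n = math.log(current_tdp / target_tdp) / math.log(gain_per_gen)
print(f"~{n:.2f} generations at +33% per gen")  # ~7.14
```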
 