
NVIDIA GeForce RTX 5090 Founders Edition

Joined
Nov 27, 2023
Messages
2,693 (6.34/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
Just noticed this.

That’s… some interesting behavior from the Swarm Engine. Doesn’t show up on higher resolutions.
 
Joined
Mar 7, 2010
Messages
998 (0.18/day)
Location
Michigan
System Name Daves
Processor AMD Ryzen 3900x
Motherboard AsRock X570 Taichi
Cooling Enermax LIQMAX III 360
Memory 32 GiG Team Group B Die 3600
Video Card(s) Powercolor 5700 xt Red Devil
Storage Crucial MX 500 SSD and Intel P660 NVME 2TB for games
Display(s) Acer 144htz 27in. 2560x1440
Case Phanteks P600S
Audio Device(s) N/A
Power Supply Corsair RM 750
Mouse EVGA
Keyboard Corsair Strafe
Software Windows 10 Pro
Mommies, keep an eye on your CC; the little ones will be snatching it from your purse to buy this nonsense.
 
Joined
Dec 26, 2006
Messages
3,907 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
@W1zzard

I see the avg fps and fps/$, just curious if an fps vs transistor or fps/1k transistors chart could be considered?
 
Joined
Mar 5, 2024
Messages
128 (0.39/day)
Power consumption during video playback? Multi-monitor? Idle?! IT'S HORRIBLE!
People literally hated AMD and the XTX because of this, but I guess everyone and their mom will love Nvidia now. For the record, the XTX was high too. I watch a lot of movies; can you imagine draining 50-60 W for no reason at all, while my 10 W laptop CPU can play movies without any issue, and I can even output to my 4K TV? What's going on with Nvidia now? Are they becoming AMD?

Also, multi-monitor draw is super high, and I've got 3 monitors plus a TV, ouch. This is not a card for me, because I keep my stuff on 24/7 during work and gaming, and I even fall asleep sometimes. Useless power drain and higher energy bills. Yes, someone who spends 2,000 bucks like me still cares about that. Trust me, when you've got a ton of lights, heating (water and air), fridges, and plenty of other things, every bit helps. Not that long ago, people hated 50-80 W bulbs for a reason. It's not peanuts. Anyway, this card will most likely be 3k in Europe, so I probably won't get it. If it were closer to 2k I might consider it, but probably not. I don't buy defective wastes of energy. I did skip the XTX after all, even though it was better than the RTX 4080 price-wise and even performance-wise (minus RT).
 
Joined
Jun 19, 2024
Messages
309 (1.40/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
Welcome to the days of being CPU limited at 4K.

Still think I’ll get one though. Seems like fun.
 
Joined
Jun 12, 2023
Messages
86 (0.15/day)
Anyone who got an MSRP 4090 won the proverbial lottery.

Turns out the 4090 WAS the next 1080 Ti...
I still gloat about how I got a 4090, 13900KF, 32GB, 1TB SSD, 4TB HDD prebuilt from CLX (originally through Best Buy) off OfferUp, because the guy selling it "got an extra one" when he ordered his own setup, for $2,250 back in April of '23. I've been riding high.

I saw that graph, but I seriously disagree with it.

Gaming power use: the 4090 sits at 411 W, the 5090 at 587 W. That's a 43% increase.

Then on the performance charts, we see a 26% average increase at 4K.

So something doesn't add up.
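The disagreement is easy to check with a few lines. A quick sketch using the figures quoted above (the FPS baseline is an arbitrary placeholder; only the ratios matter):

```python
# Rough perf-per-watt comparison using the power and performance
# figures quoted above. Absolute FPS values are placeholders.
power_4090, power_5090 = 411.0, 587.0   # gaming power draw, watts
perf_4090 = 100.0                       # arbitrary baseline
perf_5090 = perf_4090 * 1.26            # ~26% faster at 4K

eff_4090 = perf_4090 / power_4090       # fps per watt
eff_5090 = perf_5090 / power_5090

print(f"power increase:    {power_5090 / power_4090 - 1:+.1%}")
print(f"efficiency change: {eff_5090 / eff_4090 - 1:+.1%}")
```

On these numbers the 5090 comes out roughly 12% worse in perf-per-watt, so which power and FPS figures a chart pairs together matters a great deal.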

Context is a difficult thing to grasp.

For a 2-slot cooler to be managing a card pulling damn close to 600 W is nothing short of a miracle.
Agreed, it's quite the amazing feat. It still feels wrong to have any component of a video card running over 70°C, but maybe I've just gotten used to seeing that kind of thing over the last 6 years.

How is efficiency calculated? Which wattage and FPS data are used?




edit: I made a mistake in the original post; there are things about the efficiency calculation that are not clear to me.
It's mighty big of you to admit a mistake. Well done, mate.

Here's some fun with numbers. The 5090 is four times faster than the B580 for eight times the MSRP. I feel like Nvidia customers are paying half of the price for hardware and half of the price for software.
Yes, Nvidia is baking the cost of training the DLSS models into the video cards. Tom's Hardware has an article about Nvidia running practically a supercomputer continuously for the past 6 years to develop and improve DLSS.
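For what it's worth, the "half the price for hardware" framing above checks out as straight division. A small sketch, assuming a $249 B580 MSRP (the post only says "eight times"):

```python
# Relative perf-per-dollar implied by the post's figures.
# The $249 B580 MSRP is an assumption; the 4x perf figure is from the post.
b580_price, rtx5090_price = 249, 1999

relative_perf = 4.0                           # 5090 ~4x the B580
relative_price = rtx5090_price / b580_price   # ~8x
perf_per_dollar_ratio = relative_perf / relative_price

print(f"5090 perf/$ vs B580: {perf_per_dollar_ratio:.2f}x")
```

Roughly 0.5x the perf-per-dollar, i.e. about half the value, matching the "half for hardware, half for software" intuition.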
 
Joined
Aug 20, 2007
Messages
21,639 (3.40/day)
Location
Olympia, WA
System Name Pioneer
Processor Ryzen 9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon, Phanteks and Corsair Maglev blower fans...
Memory 64GB (2x 32GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 5800X Optane 800GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Joined
Jun 12, 2023
Messages
86 (0.15/day)
it's a halo product, you should not compare its performance-per-dollar to other cards...

But the 5080 and below might be pretty ugly ^^"
I feel this is a fallacy; it should totally be compared to other products, and a choice made depending on the usage. Purely for gaming? You'd need money to burn. AI, game development, video editing, modeling and rendering, plus gaming? Here's your best option.

Also I wouldn't be me if I hadn't done that.
View attachment 381162
View attachment 381164

You would need to buy approximately 170 thousand RTX 5090s, i.e. invest at least $340 million USD plus VAT, to cover one football field with RTX 5090s.
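A quick sanity check of that figure, assuming the FE's roughly 304 mm x 137 mm footprint and a standard 105 m x 68 m pitch (both dimensions are my assumptions, not from the post):

```python
# Back-of-the-envelope check of the football-field estimate.
card_area  = 0.304 * 0.137   # assumed RTX 5090 FE footprint, m^2
pitch_area = 105 * 68        # assumed standard football pitch, m^2

cards = pitch_area / card_area
cost  = cards * 1999         # at the $1,999 MSRP, before VAT

print(f"{cards:,.0f} cards, ~${cost / 1e6:.0f} million")
```

That lands right around the ~170 thousand cards and ~$340 million quoted.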

Now that the RTX 5090 has been measured in football fields, we can go back to actual gaming.
I honestly thought only people from the USA did these kinds of random measurements. Like "it weighs as much as 8 elephants!"

Yes, this is just the 4090 Ti, especially if Nvidia brings DLSS 4 to the 4000 series.
They most likely won't, though. It benefits them to keep this software locked to their newest generation; there hasn't even been a rumor of bringing DLSS 3 to 30-series cards. And it's kinda been nagging me that there's a mismatch between the DLSS version and the series number: DLSS 4 for the 5000 series, DLSS 3 for the 4000 series.

They should have kept the cooler at 3 slots. Not liking the temps, considering it's already using liquid metal. This was never meant for me; the power consumption is insane.
I like that there's an extreme offering aimed at the smaller form factors. I figure none of the AIB partners would do it, since I haven't seen any 4090s or 4080s like that.

Ah, missed the latencies, thank you. I still want to see some image comparisons and a short video with MFG enabled.
TPU will definitely have an article evaluating DLSS 4 and all its updated permutations.
 
Joined
Mar 11, 2008
Messages
1,031 (0.17/day)
Location
Hungary / Budapest
System Name Kincsem
Processor AMD Ryzen 9 9950X
Motherboard ASUS ProArt X870E-CREATOR WIFI
Cooling Be Quiet Dark Rock Pro 5
Memory Kingston Fury KF560C32RSK2-96 (2×48GB 6GHz)
Video Card(s) Sapphire AMD RX 7900 XT Pulse
Storage Samsung 970PRO 500GB + Samsung 980PRO 2TB + FURY Renegade 2TB+ Adata 2TB + WD Ultrastar HC550 16TB
Display(s) Acer QHD 27"@144Hz 1ms + UHD 27"@60Hz
Case Cooler Master CM 690 III
Power Supply Seasonic 1300W 80+ Gold Prime
Mouse Logitech G502 Hero
Keyboard HyperX Alloy Elite RGB
Software Windows 10-64
Benchmark Scores https://valid.x86.fr/9qw7iq https://valid.x86.fr/4d8n02 X570 https://www.techpowerup.com/gpuz/g46uc
Damn AMD really is awful at ML isn't it?
Well, I have no comment on that, since I've never been able to compare them directly.
This is why I asked @W1zzard to use tokens/s instead of that arbitrary time metric.
The time metric lets you compare the cards they tested against each other, but it's no good if you want to compare your own rig's performance.
Generally I like what I have: no issues, and 20 GB of VRAM is still a lot; only the 4090/7900 XTX, now the 5090, and the professional cards have more.
But when it comes to complex LLMs, there's no such thing as enough :roll:

Here is the new DeepSeek Llama 8B at F16: 44.12 tokens/s is quite good!

And Microsoft's Phi-4 14B at Q8 with 42.83 tokens/s.

Yeah, it's an equation :slap::D
So it's not bad, but I would LOVE it if TPU had a section for this in the GPU reviews, and maybe even in the GPU database.
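The tokens/s suggestion is easy to motivate: normalizing by output length is what makes numbers comparable across rigs, while raw wall-clock times depend on how much each test happens to generate. A minimal sketch (the `generate` callable is a hypothetical stand-in for whatever inference API is in use):

```python
import time

def tokens_per_second(generate, prompt, max_tokens=256):
    """Time a generation call and normalize by the number of output tokens."""
    start = time.perf_counter()
    output_tokens = generate(prompt, max_tokens)  # expected to return a token list
    elapsed = time.perf_counter() - start
    return len(output_tokens) / elapsed

# Dummy generator standing in for a real inference API:
# pretends to produce 100 tokens in about 0.1 s.
def dummy_generate(prompt, max_tokens):
    time.sleep(0.1)
    return list(range(100))

rate = tokens_per_second(dummy_generate, "hello")
print(f"{rate:.1f} tokens/s")
```

Two rigs reporting tokens/s on the same model and quantization can be compared directly, which is exactly what the arbitrary-time chart doesn't allow.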
 
Joined
Jun 12, 2023
Messages
86 (0.15/day)
I mean, yeah, it's an impressive GPU, but the price is unimpressive... I will ride it out with my 3090 for another 2 years, for the 60 series or whatever AMD offering they have by then.

But folks on the 10 series should upgrade, and the 20 series can consider it... 30-series folks, I think, can skip this 50 series.
What's crazy is remembering the 3090 Ti launch around January of '22, with the 40 series expected later that year. The 3090 Ti went for $2k and was less than 10% faster than the 3090. I believe the consensus was that the Ti was just a better bin. What a terrible deal that was.
 
Joined
Jun 19, 2020
Messages
111 (0.07/day)
How can you write in the conclusion that the 5090 is highly energy efficient, when its efficiency is only about 1-2% better than the previous 4090?! And lower models of the previous generation are more efficient too! That's no advancement at all!
 
Joined
Jun 19, 2024
Messages
309 (1.40/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
How can you write in the conclusion that the 5090 is highly energy efficient, when its efficiency is only about 1-2% better than the previous 4090?! And lower models of the previous generation are more efficient too! That's no advancement at all!

It's more efficient than the previous top-efficiency card.

What would you call it, less efficient?
 
Joined
Jun 12, 2023
Messages
86 (0.15/day)
I keep harping on about the wall that silicon lithography has hit and people keep ignoring me, and then do surprised Pikachu faces when there is no efficiency gain generation-on-generation. Because efficiency, by and large, comes from the node size and that isn't getting appreciably smaller. If y'all are crying this hard about lack of generational performance uplift now, you're gonna be drowning in your tears for a long time, because there ain't any good solutions in sight in the next half-decade at best. Physics is a harsh mistress.
Totally agree, and it's why Nvidia has been developing tensor cores, ray tracing, DLSS, and Frame Gen. They knew that pure hardware improvements weren't going to continue at a steady pace and so they went laterally to make improvements
 
Joined
Feb 1, 2019
Messages
3,753 (1.72/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC

Check @ min. 16:08 for a comparison with the fps locked.
A weird review, but I kept watching, so his presentation works.

So the 4090 is more efficient than the 5090. The dude loves his frame rates; the review suggests he's a bit of a shooter gamer, since he called 30 fps unplayable and considered visual artefacts worth it for higher frame rates. His capped-rate testing was still at really high frame rates. :) Yes, I consider 144 really high.

I did similar testing when comparing my 4080 to my 3080: the 30 series was more efficient than the 40 series in low-end games, because the cards can run at lower voltage and clocks. I think this case has some similarities, but ultimately the extra cores hinder the 5090, and it can't make up for them by dropping clocks and voltage due to a high floor set in the BIOS/driver.

Something of note in the TPU data as well: look at how high the power draw is for this thing when playing videos.
 
Joined
Jun 12, 2023
Messages
86 (0.15/day)
How about if we put the 5090 at higher than 4K resolution? I agree with you re: the wall. 600 W power consumption for a 27% increase at 4K shows it.

My guess is that at 8K, the 5090 with its massive memory bandwidth will look much more efficient, even if unplayable.
On a mass scale, how much 8K adoption do you really think there is? I highly doubt it's even 5% of gamers. W1zz is tired and is hitting that 80% of coverage as best he can.
 

SRS

New Member
Joined
Oct 12, 2024
Messages
11 (0.10/day)
"at the SKU's baseline price of USD $1,999"

Ostensible price of $1999.

Anyone who paid attention to how pricing for the 4090 went knows how this will go. A tiny number of FEs will be available, sporadically, and everyone else will have to buy third-party cards with tiny overclocks at quite a bit higher prices. The threads and trackers of "hope" for getting an FE 4090 are something gamers (and home AI enthusiasts) shouldn't forget or tolerate.
 
Joined
Jul 13, 2016
Messages
3,435 (1.10/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
It's more efficient than the previous top-efficiency card.

What would you call it, less efficient?

You'd call it a draw. If the result is within the margin of error, it will change with run-to-run variance, test setup, software version, and other considerations that vary from review to review depending on methodology. Other large factors, like the test suite used, will heavily influence the results as well.

Totally agree, and it's why Nvidia has been developing tensor cores, ray tracing, DLSS, and Frame Gen. They knew that pure hardware improvements weren't going to continue at a steady pace and so they went laterally to make improvements

Nvidia pushed those technologies to enable AI and Real-time ray tracing, not because we are hitting the limits of how much we can shrink chips.

This is a tock generation; performance uplifts were expected to be small. People need to stop running around like the sky is falling every time there's a tock generation, only for performance gains to return to normal as they always do.

Scaling chips down is getting harder, but the amount of investment has been exploding. This equilibrium has been the balancing force in chip manufacturing since its inception.
 
Joined
Jun 12, 2023
Messages
86 (0.15/day)
You'd call it a draw. If the result is within the margin of error, it will change with run-to-run variance, test setup, software version, and other considerations that vary from review to review depending on methodology. Other large factors, like the test suite used, will heavily influence the results as well.



Nvidia pushed those technologies to enable AI and Real-time ray tracing, not because we are hitting the limits of how much we can shrink chips.

This is a tock generation; performance uplifts were expected to be small. People need to stop running around like the sky is falling every time there's a tock generation, only for performance gains to return to normal as they always do.

Scaling chips down is getting harder, but the amount of investment has been exploding. This equilibrium has been the balancing force in chip manufacturing since its inception.
I'm not saying we've hit the limit yet, but it is coming soon. I can't recall where I read it, but electrons stop behaving predictably and can tunnel across barriers at roughly the 1-nanometer transistor scale. I'm saying that Nvidia is prepping us with all these software and lateral hardware developments alongside the traditional architecture and node improvements.
 
Joined
Jun 19, 2024
Messages
309 (1.40/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
"at the SKU's baseline price of USD $1,999"

Ostensible price of $1999.

Anyone who paid attention to how pricing for the 4090 went knows how this will go. A tiny number of FEs will be available, sporadically, and everyone else will have to buy third-party cards with tiny overclocks at quite a bit higher prices. The threads and trackers of "hope" for getting an FE 4090 are something gamers (and home AI enthusiasts) shouldn't forget or tolerate.

I bought a 4090 in December of 23 for below MSRP.

 
Joined
Jun 12, 2023
Messages
86 (0.15/day)
How can you write in the conclusion that the 5090 is highly energy efficient, when its efficiency is only about 1-2% better than the previous 4090?! And lower models of the previous generation are more efficient too! That's no advancement at all!
Look through pages 6-9 or so to see this being discussed. The summary is that power consumption went up, but performance went up enough to keep pace with it.
 

192kbps

New Member
Joined
Sep 22, 2024
Messages
14 (0.11/day)
For CS2, the most commonly used resolution is 960p. Would you be willing to add this test specifically for CS2?
 
Joined
Mar 13, 2021
Messages
489 (0.35/day)
Processor AMD 7600x
Motherboard Asrock x670e Steel Legend
Cooling Silver Arrow Extreme IBe Rev B with 2x 120 Gentle Typhoons
Memory 4x16Gb Patriot Viper Non RGB @ 6000 30-36-36-36-40
Video Card(s) XFX 6950XT MERC 319
Storage 2x Crucial P5 Plus 1Tb NVME
Display(s) 3x Dell Ultrasharp U2414h
Case Coolermaster Stacker 832
Power Supply Thermaltake Toughpower PF3 850 watt
Mouse Logitech G502 (OG)
Keyboard Logitech G512
I bought a 4090 in December of 23 for below MSRP.

View attachment 381337
Congrats on getting a 4090 at basically MSRP 14 months after its release... so everyone should just hold their horses for 12+ months and they MAY get it at MSRP, following previous trends, is what you're saying?

Have people missed the fact that this is on the SAME manufacturing node as the 40 series? Why were people expecting MASSIVE efficiency/power-draw gains from this part?

It's like when Intel went from 12th gen to 13th to 14th gen. You can't beat the laws of physics. If anything, the fact that they increased die size and transistor count with such small drops in clock speed is pretty damn impressive, and that they refined the design enough to eke out a few percentage points of efficiency in the heavily loaded areas is pretty good. Look at Intel in the 14nm+++++++++++++ era, or the 12th/13th/14th gen era.
 
Joined
Feb 1, 2019
Messages
3,753 (1.72/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Congrats on getting a 4090 at basically MSRP 14 months after its release... so everyone should just hold their horses for 12+ months and they MAY get it at MSRP, following previous trends, is what you're saying?

Have people missed the fact that this is on the SAME manufacturing node as the 40 series? Why were people expecting MASSIVE efficiency/power-draw gains from this part?

It's like when Intel went from 12th gen to 13th to 14th gen. You can't beat the laws of physics. If anything, the fact that they increased die size and transistor count with such small drops in clock speed is pretty damn impressive, and that they refined the design enough to eke out a few percentage points of efficiency in the heavily loaded areas is pretty good. Look at Intel in the 14nm+++++++++++++ era, or the 12th/13th/14th gen era.
Mid to late gen is generally the best time to buy, yeah.
 
Joined
Apr 30, 2020
Messages
1,033 (0.60/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
If it's 1% better than the most efficient card on earth, then it is, by definition, the new most efficient card on earth.

Pulling 600w on its own does not make something inefficient.
No, it does not; when the power usage itself goes up by 27%, that totally counteracts any improvements made at all.
 
Joined
Dec 31, 2020
Messages
1,069 (0.72/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
Mid to late gen is generally the best time to buy, yeah.
Get a 3090 Ti at $1,999 and watch it crash to $999 six months later. What a great idea. Same deal with the 5090 when the 6080 comes out; you never know. Now, with 32 GB, that will be a little harder to beat. But if the 5080 gets a 24 GB Super refresh, or even a Ti, that's encroaching on 90% of the 4090's territory.
 