
Best time to sell your used 4090s is now.

Joined
Sep 4, 2022
Messages
361 (0.42/day)

alfablac

New Member
Joined
Nov 16, 2023
Messages
7 (0.02/day)
The 5090 will sell out in seconds for the first 6 months. I mean, good freaking luck trying to buy one.

Looking back a year to when I upgraded to LGA1700, I'd say the 1-year mark from the release date was perfect, at least for the GPU.
I narrowly missed Intel's debacle by a few weeks, but my 14900KF is fine anyway, lol.

- Few low-stock issues
- Mid demand (scalpers already scalped, day-oners already day-oned)
- And, more importantly, the hardware has already passed the consumer market test.
I feel companies are selling us unfinished products and we end up being the beta testers.
So I ended up getting all the proper cables and PSU for my setup.
 
Joined
Sep 4, 2022
Messages
361 (0.42/day)
For reference, a prebuilt HP PC with an i9-13900K and a 4090 is going for $2,799, just in case some are eyeing the Acer prebuilt placeholder with the 5080:
https://videocardz.com/newz/retaile...3499-rtx-5080-acer-gaming-pcs-ahead-of-launch
 
Joined
Jan 9, 2017
Messages
208 (0.07/day)
The 4090 will be faster than the 5080. I would keep it and not sell it if I were you. You will have the second fastest card in the world even when the 5000 series drops. Do NOT SELL YOUR 4090!
 
Joined
Sep 5, 2023
Messages
463 (0.94/day)
Location
USA
System Name Dark Palimpsest
Processor Intel i9 13900k with Optimus Foundation Block
Motherboard EVGA z690 Classified
Cooling MO-RA3 420mm Custom Loop
Memory G.Skill 6000CL30, 64GB
Video Card(s) Nvidia 4090 FE with Heatkiller Block
Storage 3 NVMe SSDs, 2TB-each, plus a SATA SSD
Display(s) Gigabyte FO32U2P (32" QD-OLED) , Asus ProArt PA248QV (24")
Case Be quiet! Dark Base Pro 900
Audio Device(s) Logitech G Pro X
Power Supply Be quiet! Straight Power 12 1200W
Mouse Logitech G502 X
Keyboard GMMK Pro + Numpad
IF that's true, it won't be by much. However, that is very likely not true.
According to the rumors (yeah, I know, lol, but they do all seem to agree at this point), the 5080 has less bandwidth, considerably fewer cores and a lower power limit (and on the same TSMC node?)...none of that points to it even being as fast as the 4090. I'd be surprised if there's that large of an IPC gain that it can make up for all the gimping they did to it.
 
Joined
Jul 5, 2013
Messages
28,455 (6.77/day)
According to the rumors (yeah, I know, lol, but they do all seem to agree at this point), the 5080 has less bandwidth, considerably fewer cores and a lower power limit (and on the same TSMC node?)...none of that points to it even being as fast as the 4090. I'd be surprised if there's that large of an IPC gain that it can make up for all the gimping they did to it.
It's possible but unlikely. I think the 5070ti will be on par with the 4090, but that's just my guess based on historical generational jumps forward.
 
Joined
Sep 17, 2014
Messages
22,808 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
It's possible but unlikely. I think the 5070ti will be on par with the 4090, but that's just my guess based on historical generational jumps forward.
This is no longer accurate. Look at the Ampere vs Ada stack. Everything but the x90 is left in the dust; you could play games on a 4080 or a 4070 and not notice much of a difference. The only real generational advancement there is the 4090; the rest just nudged up slightly, often barely even beating the previous gen's one-tier-up GPU, and this is especially true if you look at x60 (Ti) territory. It's sometimes even worse: a performance regression on the same tier.

If the shader counts are true, the 5080 will just be a slightly faster 4080 Super; and if the IPC has truly improved a lot... then why would it require such a bonkers TDP increase again? Doesn't make sense at all. And IF the IPC has truly improved a lot... the 5090 will be over twice that, resulting in a similar gap to what we're seeing today in the Ada stack on its own.

Without real competition you can safely leave the historical comparisons behind; the game has changed. Nvidia wants to stall the top-end performance level as much as possible so they can ride it longer. The 5080 looks, for all intents and purposes, like a complete standstill; I bet if we match the power between it and the 4080 S, they'll be within spitting distance of one another.
 
Joined
Dec 31, 2020
Messages
1,024 (0.70/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
CUDA cores +33%, and +33% bandwidth from the extra 128 bits of bus width, resulting in 60-70% more performance based on rumors; so the whole series is roughly 20% faster than you'd expect from the specs alone. If that's anything to go by, the 5080 gets the same uplift and jumps to slightly faster than the 4090.
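As a back-of-the-envelope check on that reasoning, here's a minimal sketch (my own illustration; the helper names and the 65% midpoint are assumptions, the +33% figures are the rumored deltas above) of how the rumored uplift would imply roughly 20-25% more performance per core than a straight spec extrapolation:

```python
# Back-of-the-envelope sketch (my own illustration, not from the post).
# Rumored 5090-vs-4090 deltas: +33% CUDA cores, +33% memory bandwidth.

def naive_uplift(core_gain: float, bandwidth_gain: float) -> float:
    """Crude spec extrapolation: performance follows the smaller spec gain."""
    return min(core_gain, bandwidth_gain)

def implied_per_core_gain(rumored_uplift: float, core_gain: float) -> float:
    """Extra per-core performance needed for the rumored uplift to hold."""
    return (1 + rumored_uplift) / (1 + core_gain) - 1

print(f"Naive spec estimate:   +{naive_uplift(0.33, 0.33):.0%}")           # +33%
print(f"Implied per-core gain: +{implied_per_core_gain(0.65, 0.33):.0%}")  # ~+24% at a 65% midpoint
```

Which lands right around the "20% faster than you'd expect" figure quoted above.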
 
Joined
Oct 1, 2021
Messages
118 (0.10/day)
System Name Phenomenal1
Processor Ryzen 7 5800x3d
Motherboard MSI X570 Gaming Plus
Cooling Noctua NH-D15s with added NF-A12x25 fan on front
Memory Predator Vesta 32GB (2 x 16GB) DDR4 3600 BL.9BWWR.300 B-Die @3733 CL14
Video Card(s) PNY XLR8 RTX 4090 VERTO EPIC-X Triple Fan
Storage Boot SSD: SATA 500GB - 1tb + 4tb pcie3 nvme / Spinning Drives: 2tb + 6 tb
Display(s) Gigabyte M27Q 27" 1440P 170 Hz IPS BGR monitor
Case Fractal Torrent - Black
Audio Device(s) Realtek ALC1220P
Power Supply Super Flower Leadex VII XG 1000W
Mouse Razer Viper
Keyboard Razer Huntsman V2 Optical
Software Windows 11
Benchmark Scores Time Spy - 28594 https://www.3dmark.com/spy/51010148
CUDA cores +33%, and +33% bandwidth from the extra 128 bits of bus width, resulting in 60-70% more performance based on rumors; so the whole series is roughly 20% faster than you'd expect from the specs alone. If that's anything to go by, the 5080 gets the same uplift and jumps to slightly faster than the 4090.
With 16 GB of VRAM.
 
Joined
Aug 9, 2024
Messages
168 (1.11/day)
Location
Michigan, United States
Processor Intel i7-13700K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling NZXT Kraken Elite 360
Memory G.Skill Trident Z DDR5-6400
Video Card(s) MSI RTX 4090 Suprim X Liquid
Storage Western Digital SN850X 4Tb x 4
Case Corsair 5000D Airflow
Audio Device(s) Creative AE-5 Plus
Power Supply Corsair HX-1200
Software Windows 11 Pro 23H2
Joined
Mar 7, 2023
Messages
943 (1.40/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte ud850gm pg5
Exactly. My buddy IRL also thinks he's going to get one on day 1; I keep telling him he won't even be able to get one in month 12.

The factories are focused on AI; this will be a paper launch given the supply/demand ratio.

I agree. The 32 GB of VRAM was a dead giveaway. 24 GB was already kind of unnecessarily high for video games; 32 GB is clearly trying to attract other customers, which I don't really understand since there's the enterprise lineup. Maybe they're trying to capture the middle market, like small businesses and whatnot that can't afford enterprise but could afford a ~$2000 consumer card. But whatever the reason, it's not going to be good for gamers.
 
Joined
Jul 5, 2013
Messages
28,455 (6.77/day)
This is no longer accurate. Look at the Ampere vs Ada stack.
That could be debated. I had a 2080 and upgraded to a 3080. The increase was impressive and dramatic. I have seen the differences between the 3080 and 4080, and those numbers are good. Not as dramatic as the 2000-to-3000 jump, but still significant. We should reasonably expect that the 4000-to-5000 jump will provide a similar increase in performance, and that's what I'm expecting. My only reservation is the wattage cost. I'm really not interested in a space heater for a GPU. The 2080's TDP is 215W, the 3080's is 320W, and the 4080's is the same 320W. If the 5080 is the same or higher, it's going to be a tough sell for me personally.

EDIT: I did look up the general performance percentages on TPU's specs pages.
[Attachments: TPU relative performance, 2080 → 3080 and 3080 → 4080]
The jump from the 2080 to the 3080 is 63%. The jump from the 3080 to the 4080 is 49%. Not as pronounced, but still a big jump.
Without real competition you can safely leave the historical comparisons behind, the game has changed.
While I would normally agree with you, NVidia is actually competing against themselves, generation on generation. They need to make advances that move performance forward enough that customers are motivated to invest in the new product lineup.
 
Joined
Sep 17, 2014
Messages
22,808 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
That could be debated. I had a 2080 and upgraded to a 3080. The increase was impressive and dramatic. I have seen the differences between the 3080 and 4080, and those numbers are good. Not as dramatic as the 2000-to-3000 jump, but still significant. We should reasonably expect that the 4000-to-5000 jump will provide a similar increase in performance, and that's what I'm expecting. My only reservation is the wattage cost. I'm really not interested in a space heater for a GPU. The 2080's TDP is 215W, the 3080's is 320W, and the 4080's is the same 320W. If the 5080 is the same or higher, it's going to be a tough sell for me personally.

EDIT: I did look up the general performance percentages on TPU's specs pages.
[Attachments: TPU relative performance, 2080 → 3080 and 3080 → 4080]
The jump from the 2080 to the 3080 is 63%. The jump from the 3080 to the 4080 is 49%. Not as pronounced, but still a big jump.

While I would normally agree with you, NVidia is actually competing against themselves, generation on generation. They need to make advances that move performance forward enough that customers are motivated to invest in the new product lineup.
Depends how you look at it

[Attachments: TPU relative performance comparisons]

It's the same as going from a 4060 to a 4080 over the course of three generations and three purchases: a 4060, then a 4070, and now a 4080. Put differently, Nvidia just moved two tiers up between Turing and Ada. But the 3080 had an MSRP of $699, and the 4080 went for $1,199 :)

Also, Nvidia is indeed competing against itself; that is why it is pricing each consecutive x80 higher. You're not simply getting more performance, you're (also) paying more for that performance. And you correctly noted the TDP increase too, and it ain't gonna stop. If you want to keep getting more performance like we used to, you're paying with more power, heat, MSRP, etc., and you'll still get the smallest possible piece of silicon for your money, souped up even further.

There's a lot of smoke and mirrors happening lately; it's not really fair anymore to compare the x80 with a past gen's x80, especially not between Turing and Ada. Every gen since, we've seen the stack change, as well as the performance delta within the stack, to hide the stagnation in the midrange up to and including the x80. And if you get down to the x60, it's often even worse.
 
Joined
Jul 5, 2013
Messages
28,455 (6.77/day)
Depends how you look at it
I prefer to look at the gain from the perspective of the original card, as that is where we're starting from. However, even if we look at it from the retrospective point of view, 39% and 59% are not a terrible shout.
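For what it's worth, the two framings are just the same ratio read from opposite ends. A minimal sketch of the conversion (my own illustration; the helper names are mine, and only the 2080 → 3080 figure from above is used):

```python
# Minimal sketch (my own illustration) of why one gap reads as two different
# percentages depending on which card is treated as the baseline.

def forward_gain(old_perf: float, new_perf: float) -> float:
    """How much faster the new card is, measured from the old card."""
    return new_perf / old_perf - 1

def retrospective_deficit(old_perf: float, new_perf: float) -> float:
    """How much slower the old card is, measured from the new card."""
    return 1 - old_perf / new_perf

# TPU-style relative performance: 3080 at ~163% of a 2080.
old, new = 100, 163
print(f"Forward (baseline 2080):       +{forward_gain(old, new):.0%}")           # +63%
print(f"Retrospective (baseline 3080): -{retrospective_deficit(old, new):.0%}")  # -39%
```

The same +63% forward jump reads as roughly -39% looking back, which is why the forward and retrospective numbers never line up directly.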
For perspective, the jump from the 9800GTX to the GTX280;
[Attachment: TPU relative performance, 9800 GTX → GTX 280]
That jump forward was 40%. We could keep going this way, but the numbers don't go back much further. This is the perspective I draw conclusions from, as it is the most relevant in a historical context. The metric is very similar for the Radeon side of things, more or less.
Of course, Intel is blowing that curve out of the water with its gen-on-gen progress:
[Attachment: TPU relative performance, Arc A380 → Arc B580]
207% jump? Forget about it! Intel's coming up! NVidia and AMD have no choice but to compete and in one more generation they will have to compete on a whole new level.
Its the same as going from a 4060 to a 4080 over the course of three generations and three purchases. Ergo, a 4060, a 4070, and now 4080. Put differently, Nvidia just moved two tiers up between Turing and Ada. But the 3080 had an MSRP of 699,-, and the 4080 went for 1199,- US :)
I'm not discussing price at all, that's a different subject altogether. I'm also only comparing like-tier GPUs gen on gen as comparing to other model tiers would not be logical or a fair comparison.

All that said, the RTX 5000 series is going to be an expected jump up again, one that will align with historical averages regardless of whether we look at the 3070/4070/5070, the 3080/4080/5080 or the 3090/4090/5090.
 
Joined
Apr 14, 2016
Messages
111 (0.03/day)
Also, Nvidia is indeed competing against itself, that is why it is pricing each consecutive x80 higher. You're not getting more performance, you're (also) paying more for more performance.
But what about the 4080 Super, which dropped the MSRP by $200?

If they weren't competing, they would not have taken $200 off the price. The 4080 Super has also surpassed the 4080 in usage on the Steam Hardware Survey, which suggests the $200 difference does matter, even at the $1,000 tier.
 
Joined
Oct 5, 2024
Messages
147 (1.55/day)
Location
United States of America
Just in case anyone missed it, Microsoft is expecting to spend $80 billion (yikes!) just on AI data centers in 2025. A substantial portion of that will be on GPUs.

Yeah, GPU supply for consumers will be constrained all year if a single company is spending 3-4 fabs' worth of money just on its data centers in a single year. :(
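Rough math on that "fabs' worth" comparison, as a minimal sketch (my own figures; the $20-28 billion cost per leading-edge fab is my assumption, not from the post):

```python
# Sanity check (per-fab cost is my own assumption): is $80B really 3-4 fabs' worth?
capex = 80e9                              # Microsoft's reported 2025 AI data center spend
fab_cost_low, fab_cost_high = 20e9, 28e9  # assumed cost range for one leading-edge fab
print(f"{capex / fab_cost_high:.1f} to {capex / fab_cost_low:.1f} fabs")  # ~2.9 to 4.0
```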
 
Joined
Sep 17, 2014
Messages
22,808 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
But what about the 4080 Super, which dropped MSRP by $200.

If they weren't competing they would not have taken $200 off the price. 4080 Super has also surpassed 4080 in use on Steam Hardware Survey which suggests the $200 difference does matter, even at the $1000-tier.
It was clearly far too expensive at launch pricing, just like the 4080; it had the worst perf/$ of the stack. The vanilla 4080 was barely bought at its initial MSRP. What you've seen here is just Nvidia and its partners not wanting stock lying around too long, because keeping stock too long costs money.
 
Joined
Sep 17, 2014
Messages
22,808 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
If 4090 didn't exist the 4080 would have been seen differently.

4080 could have been the World's fastest GPU at $1200.

Nvidia's mistake was launching the 4090 first, and pricing 4090 at $1600, when it has regularly been sold at a higher price...
If the 4090 didn't exist, AMD wouldn't be going midrange this time around, so yes, it would indeed have been seen differently. But not the way you think it would. It would have been fighting a $1k 7900 XTX with the same performance, so not quite so halo, and still 200 bucks too expensive.
 
Joined
Apr 14, 2016
Messages
111 (0.03/day)
If the 4090 didn't exist, AMD wouldn't be going midrange this time around, so yes, it would indeed have been seen differently. But not the way you think it would.
If the 4080 had launched before the 4090, it would have been received better, I think.

It could have had the performance crown for a few weeks and been seen better, even at $1200.

The problem was the 4090 was already out and only 33% more expensive for a similar bump in performance.

If the 4090 had instead been $2000 or so, the 4080 would also have been seen better.

Many people paid $2000+ for a 4090...
 
Joined
Sep 17, 2014
Messages
22,808 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
If the 4080 had launched before the 4090, it would have been received better, I think.

It could have had the performance crown for a few weeks and been seen better, even at $1200.

The problem was the 4090 was already out and only 33% more expensive for a similar bump in performance.

If the 4090 had instead been $2000 or so, the 4080 would also have been seen better.

Many people paid $2000+ for a 4090...
Oh sure, it could have, but it's also questionable whether people would fall for that once again. It was clear there was going to be an x90, because the gap between Ada's x80 and the 3090 (Ti) is just too small. Without the x90, Ada would have barely moved performance forward at the top end of the stack.

You can't sell last gen's top performance as the new halo card.

I think your fantasy is incorrect. We have also seen how Nvidia struggled with the x80, where the now-x70 Ti was initially called the x80 12GB, and people took a giant shit over that. They never really positioned either the x70 Ti or the x80 right with Ada, and the reason is as described above: barely anything of note moved forward compared to Ampere. It just consumed less power and has DLSS 3 / FG. Even Nvidia was stuck trying to figure out the best way to sell complete stagnation (and shit VRAM on the x70 Ti, which is why they figured hiding it under an x80 moniker was better, or something odd).
 
Joined
Jul 13, 2016
Messages
3,388 (1.09/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
If the 4080 had launched before the 4090, it would have been received better, I think.

It could have had the performance crown for a few weeks and been seen better, even at $1200.

The problem was the 4090 was already out and only 33% more expensive for a similar bump in performance.

If the 4090 had instead been $2000 or so, the 4080 would also have been seen better.

Many people paid $2000+ for a 4090...

The 4080 was marginally faster than the 3090 while being more expensive and having less VRAM. Nothing could make that card look better save for a substantial price drop. Plus, who is dropping $1,300+ for the second-best card? It's kind of a joke considering just how much more powerful the 4090 is.

I wouldn't say "people" as in gamers are paying $2,000+ for a 4090. A huge chunk of 4090 sales are due to AI and AI hobbyists (myself being the latter). I think the 4000 series has caused people to overestimate the price hikes even ultra-enthusiasts are able to tolerate. The higher the price, the fewer people will be willing to pay; at some point you pare away everyone but the wealthy. The 5000 series is likely to sell gangbusters regardless of whether gamers buy it or not. It just seems rather miserable to be a gamer: the only GPU designed to be good is the flagship, at an ever-increasing cost, while games are putting in more and more RT, sometimes without a rasterization fallback, crushing framerates for anyone who doesn't want to pay those increasing prices. That reads an awful lot like extortion to me.

I hope that AMD finally catches a clue in regard to its pricing this gen and prices aggressively. An affordable 7900 XTX-class card would be massively appealing to gamers.
 
Joined
Apr 14, 2016
Messages
111 (0.03/day)
You can't sell last gen's top performance as the new halo card.
Plus, who is dropping $1,300+ for the second-best card? It's kind of a joke considering just how much more powerful the 4090 is.
I think we'll see the answer to that when the 5080 launches (rumored to be around 4090-tier).

This time they won't make the mistake of pricing the x90 anywhere near the x80... and the 5090 won't be out at the same time.
 