
Best time to sell your used 4090s is now.

The 5090 will sell out in seconds for the first 6 months. I mean, good freaking luck if you can buy one.

Looking back a year to when I upgraded to LGA1700, I'd say the 1-year mark from the release date was perfect, at least for the GPU.
I narrowly missed Intel's debacle by a few weeks, but my 14900KF is fine anyway, lol.

- Few stock issues
- Mid demand (scalpers already scalped, day-oners already day-oned)
- And, more importantly, the hardware already passed the consumer market test.
I feel like companies are selling us unfinished products and we end up being the beta testers.
So I ended up getting all the proper cables and PSU for my setup.
 
For reference, a prebuilt HP PC with an i9-13900K and a 4090 is going for $2,799, just in case anyone is eyeing the Acer prebuilt placeholder with the 5080:
https://videocardz.com/newz/retaile...3499-rtx-5080-acer-gaming-pcs-ahead-of-launch
 
The 4090 will be faster than the 5080. I would keep it and not sell if I were you. You will have the second-fastest card in the world even when the 5000 series drops. Do NOT SELL YOUR 4090!
 
IF that's true, it won't be by much. However, that is very likely not true.
According to the rumors (yeah, I know, lol, but they do all seem to agree at this point), the 5080 has less bandwidth, considerably fewer cores, and a lower power limit (and on the same TSMC node?)... none of that points to it even being as fast as the 4090. I'd be surprised if there's an IPC gain large enough to make up for all the gimping they did to it.
 
According to the rumors (yeah, I know, lol, but they do all seem to agree at this point), the 5080 has less bandwidth, considerably fewer cores, and a lower power limit (and on the same TSMC node?)... none of that points to it even being as fast as the 4090. I'd be surprised if there's an IPC gain large enough to make up for all the gimping they did to it.
It's possible but unlikely. I think the 5070 Ti will be on par with the 4090, but that's just my guess based on historical generational jumps.
 
It's possible but unlikely. I think the 5070 Ti will be on par with the 4090, but that's just my guess based on historical generational jumps.
That's no longer accurate. Look at the Ampere vs Ada stack. Everything but the x90 is left in the dust; you could play games on a 4080 or a 4070 and not notice much. The only real generational advancement there is the 4090; the rest just nudged up slightly, often barely even beating the previous-one-tier-up GPU, and this is especially true in x60 (Ti) territory. It's sometimes even worse: a performance regression on the same tier.

If the shader counts are true, the 5080 will just be a slightly faster 4080 Super; and if the IPC has truly improved a lot... then why would it require such a bonkers TDP increase again? Doesn't make sense at all. And IF the IPC has truly improved a lot... the 5090 will be over twice that, resulting in a similar gap to what we're seeing today in the Ada stack on its own.

Without real competition, you can safely leave the historical comparisons behind; the game has changed. Nvidia wants to stall the top-end performance level as much as possible so they can ride it longer. The 5080 looks, for all intents and purposes, like a complete standstill. I bet if we match the power between it and the 4080 Super, they'll be within spitting distance of one another.
 
For the 5090: CUDA cores +33%, bandwidth +33% (an extra 128 bits of bus width), yet rumors put it at 60-70% faster, so the whole series would be ~20% faster than the raw specs suggest. If that's anything to go by, the same applies to the 5080, and it jumps to slightly faster than the 4090.
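If you want to sanity-check that, here's some napkin math in Python. Every input is a rumor or leak, not a confirmed spec, and the 10752-vs-10240 shader counts are the leaked figures floating around, so treat this as a sketch:

```python
# Napkin math on the rumored Blackwell scaling. All inputs are rumors
# or leaked figures, not confirmed specs.

cores_gain_5090 = 1.33   # rumored +33% CUDA cores, 5090 vs 4090
rumored_uplift = 1.60    # low end of the rumored 60-70% gain

# If performance scaled with core count alone, you'd expect +33%; the
# rumored +60% implies roughly +20% on top from IPC/clocks/bandwidth.
per_core_gain = rumored_uplift / cores_gain_5090
print(f"Gain beyond raw core scaling: {per_core_gain - 1:.0%}")  # ~20%

# Apply that same per-core gain to the rumored 5080 shader count
# (10752, vs the 4080 Super's 10240 -- only ~5% more cores):
implied_5080 = (10752 / 10240) * per_core_gain
print(f"Implied 5080 vs 4080 Super: {implied_5080 - 1:.0%}")     # ~26%
```

TPU's relative-performance pages put the 4090 roughly 25% ahead of the 4080 Super at 4K, so that ~26% would indeed land the 5080 a hair past the 4090 — if (big if) the rumored uplift holds.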
 
For the 5090: CUDA cores +33%, bandwidth +33% (an extra 128 bits of bus width), yet rumors put it at 60-70% faster, so the whole series would be ~20% faster than the raw specs suggest. If that's anything to go by, the same applies to the 5080, and it jumps to slightly faster than the 4090.
With 16 GB of VRAM.
 
Exactly. My buddy IRL also thinks he's going to get one on day 1; I keep telling him he won't even be able to get one in month 12.

The factories are focused on AI; this will be a paper launch given the supply/demand ratio.

I agree. The 32 GB of VRAM was a dead giveaway. 24 GB was already kind of unnecessarily high for video games; 32 GB is clearly trying to attract other customers. Which I don't really understand, since there's the enterprise lineup. Maybe they're trying to capture the middle market, like small businesses and whatnot that can't afford enterprise but could afford a ~$2,000 consumer card. But whatever the reason, it's not going to be good for gamers.
 
That's no longer accurate. Look at the Ampere vs Ada stack.
That could be debated. I had a 2080 and upgraded to a 3080. The increase was impressive and dramatic. I have seen the differences between the 3080 and the 4080, and those numbers are good. Not as dramatic as the 2000-to-3000 jump, but still significant. We should reasonably expect the 4000-to-5000 jump to provide a similar bump in performance, and that's what I'm expecting. My only reservation is the wattage cost. I'm really not interested in a space heater for a GPU. The 2080's TDP is 215 W, the 3080's is 320 W, and the 4080's is the same 320 W. If the 5080 is the same or higher, it's going to be a tough sell for me personally.

EDIT: I did look up the general performance percentages on TPU's specs pages.
[TPU relative performance charts: 2080 → 3080 and 3080 → 4080]
The jump from the 2080 to the 3080 is 63%. The jump from the 3080 to the 4080 is 49%. Not as pronounced, but still a big jump.
Without real competition you can safely leave the historical comparisons behind, the game has changed.
While I would normally agree with you, NVidia is actually competing against itself, generation on generation. They need to make advances that move performance forward enough that customers are motivated to invest in the new product lineup.
 
The jump from the 2080 to the 3080 is 63%. The jump from the 3080 to the 4080 is 49%. Not as pronounced, but still a big jump.

While I would normally agree with you, NVidia is actually competing against itself, generation on generation.
Depends how you look at it

[TPU relative performance charts]

It's the same as going from a 4060 to a 4080 over the course of three generations and three purchases: a 4060, then a 4070, and now a 4080. Put differently, Nvidia just moved two tiers up between Turing and Ada. But the 3080 had an MSRP of $699, and the 4080 went for $1,199 :)

Also, Nvidia is indeed competing against itself; that is why it is pricing each consecutive x80 higher. You're not just getting more performance, you're (also) paying more for more performance. And you correctly noted the TDP increase too, and it ain't gonna stop. If you want to keep getting more performance like we used to, you're paying with more power, heat, MSRP, etc., and you'll still get the smallest possible piece of silicon for your money, souped up even further.

There's a lot of smoke and mirrors happening lately; it's not really fair anymore to compare the x80 with a past gen's x80, especially not between Turing and Ada. Every gen since, we've seen the stack change, as well as the performance delta within the stack, to hide the stagnation in the midrange up to and including the x80. And once you get down to the x60, it's often even worse.
 
Depends how you look at it
I prefer to look at the gain from the perspective of the original card, as that is where we're starting from. However, even if we look at it from the retrospective point of view, 39% and 59% is not a terrible shout.
For perspective, the jump from the 9800GTX to the GTX280;
[TPU relative performance chart: 9800 GTX → GTX 280]
That jump forward was 40%. We could keep going this way, but the numbers don't go back much further. This is the perspective I draw conclusions from, as it is the most relevant in a historical context. The metric is very similar on the Radeon side of things, more or less.
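To make the perspective point concrete, here's a quick Python sketch of how the same gap reads depending on which card you treat as the baseline (the 163 and 149 figures are just the TPU relative-performance numbers behind the 63% and 49% quoted above):

```python
# The same performance gap reads differently depending on direction.

def forward_gain(old, new):
    """'The new card is X% faster than the old one.'"""
    return new / old - 1

def looking_back(old, new):
    """'The old card is X% slower than the new one.'"""
    return 1 - old / new

for name, old, new in [("2080 -> 3080", 100, 163), ("3080 -> 4080", 100, 149)]:
    print(f"{name}: +{forward_gain(old, new):.0%} forward, "
          f"-{looking_back(old, new):.0%} looking back")
# 2080 -> 3080: +63% forward, -39% looking back
# 3080 -> 4080: +49% forward, -33% looking back
```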
Of course Intel is blowing that curve out of the water with the gen on gen progress;
[TPU relative performance chart: Arc A380 → Arc B580]
A 207% jump? Forget about it! Intel's coming up! NVidia and AMD have no choice but to compete, and in one more generation they will have to compete on a whole new level.
It's the same as going from a 4060 to a 4080 over the course of three generations and three purchases: a 4060, then a 4070, and now a 4080. Put differently, Nvidia just moved two tiers up between Turing and Ada. But the 3080 had an MSRP of $699, and the 4080 went for $1,199 :)
I'm not discussing price at all; that's a different subject altogether. I'm also only comparing like-tier GPUs gen on gen, as comparing across model tiers would not be logical or fair.

All that said, the RTX 5000 series is going to be an expected jump up again, one that will align with historical averages regardless of whether we look at the 3070/4070/5070, the 3080/4080/5080, or the 3090/4090/5090.
 
Also, Nvidia is indeed competing against itself; that is why it is pricing each consecutive x80 higher. You're not just getting more performance, you're (also) paying more for more performance.
But what about the 4080 Super, which dropped the MSRP by $200?

If they weren't competing, they would not have taken $200 off the price. The 4080 Super has also surpassed the 4080 in the Steam Hardware Survey, which suggests the $200 difference does matter, even at the $1,000 tier.
 
Just in case anyone missed it, Microsoft is expecting to spend $80 billion (yikes!) on AI data centers in 2025 alone. A substantial portion of that will be on GPUs.

Yeah, GPU supply for consumers will be constrained all year if a single company is spending 3-4 fabs' worth of money on its data centers in a single year. :(
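Napkin math on that fab comparison; the ~$20 billion per fab is my own assumption (roughly what new leading-edge fabs have been announced at), while the $80 billion is Microsoft's stated figure:

```python
# Rough check: how many leading-edge fabs is $80B?
msft_ai_capex = 80e9  # Microsoft's stated 2025 AI datacenter spend
cost_per_fab = 20e9   # assumed cost of one leading-edge fab (approximate)

print(f"Equivalent fabs: {msft_ai_capex / cost_per_fab:.1f}")  # 4.0
```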
 
But what about the 4080 Super, which dropped the MSRP by $200?

If they weren't competing, they would not have taken $200 off the price. The 4080 Super has also surpassed the 4080 in the Steam Hardware Survey, which suggests the $200 difference does matter, even at the $1,000 tier.
It was clearly far too expensive at launch pricing, just like the 4080; it had the worst perf/$ of the stack. The vanilla 4080 was barely bought at its initial MSRP. What you've seen here is just Nvidia and partners not wanting to have stock lying around too long; keeping stock too long costs money.
 
If the 4090 didn't exist, the 4080 would have been seen differently.

The 4080 could have been the world's fastest GPU at $1,200.

Nvidia's mistake was launching the 4090 first and pricing it at $1,600, when it has regularly been sold at a higher price...
If the 4090 didn't exist, AMD wouldn't be going midrange this time around, so yeah, indeed differently. But not the way you think. It would have been fighting a $1K 7900 XTX with the same performance, so not quite so halo, and still 200 bucks too expensive.
 
If the 4090 didn't exist, AMD wouldn't be going midrange this time around, so yeah, indeed differently. But not the way you think.
If the 4080 had launched before the 4090, it would have been received better, I think.

It could have held the performance crown for a few weeks and been seen in a better light, even at $1,200.

The problem was the 4090 was already out and only 33% more money for a similar bump in performance (napkin math below).

If the 4090 was instead $2,000 or so, the 4080 also would have been seen better.

Many people paid $2,000+ for a 4090...
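That middle point, using the post's own numbers ($1,200 vs $1,600 launch MSRPs, and treating the 4090 as roughly 1.33x the 4080, which is about what TPU shows at 4K; both ratios are approximations):

```python
# Price vs performance, 4080 vs 4090, at launch MSRPs.
msrp_4080, msrp_4090 = 1200, 1600
perf_4080, perf_4090 = 1.00, 1.33  # rough 4K performance ratio

print(f"4090 price premium: +{msrp_4090 / msrp_4080 - 1:.0%}")        # +33%
print(f"4090 performance premium: +{perf_4090 / perf_4080 - 1:.0%}")  # +33%
# Roughly the same perf/$ at the top of the stack, which is exactly why
# the 4080 looked so bad next to it: no value penalty for going halo.
```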
 
If the 4080 had launched before the 4090, it would have been received better, I think.
Oh sure, it could have, but it's also questionable whether people would fall for that once again. It was clear there was going to be an x90, because the gap between Ada's x80 and the 3090 (Ti) is just too small. Without the x90, Ada would have barely moved performance forward at the top end of the stack.

You can't sell last gen's top performance as the new halo card.

I think your fantasy is incorrect. We have also seen how Nvidia struggled with the x80, where the now-x70 Ti was called an 'x80 12GB', and people took a giant shit over that. They never really positioned either the x70 Ti or the x80 right with Ada, and the reason is as described above: barely anything of note moved forward compared to Ampere. It just consumed less power and has DLSS 3 / FG. Even Nvidia was stuck trying to figure out the best way to sell complete stagnation (and shit VRAM on the x70 Ti, which is why they figured hiding that under an x80 moniker was better, or something odd).
 
The problem was the 4090 was already out and only 33% more money for a similar bump in performance.

Many people paid $2,000+ for a 4090...

The 4080 was marginally faster than the 3090 while being more expensive with less VRAM. Nothing could make that card look better save for an extensive price drop. Plus, who is dropping $1,300+ for the second-best card? It's kind of a joke considering just how much more powerful the 4090 is.

I wouldn't say "people" as in gamers are paying $2,000+ for a 4090. A huge chunk of 4090 sales are due to AI and AI hobbyists (myself being the latter). I think the 4000 series has caused people to overestimate the price hikes even ultra-enthusiasts are able to tolerate. The higher the price, the fewer people will be willing to pay it; at some point you pare away everyone but the wealthy. The 5000 series is likely to sell gangbusters regardless of whether gamers buy them or not. It just seems rather miserable to be a gamer: the only GPU designed to be good is the flagship at an ever-increasing cost, while games are putting in more and more RT, sometimes without a rasterization fallback, crushing framerates for anyone who doesn't want to pay those increasing prices. That reads an awful lot like extortion to me.

I hope that AMD finally catches a clue in regard to its pricing this gen and prices aggressively. An affordable 7900 XTX-class card would be massively appealing to gamers.
 
You can't sell last gen's top performance as the new halo card.
Plus who is dropping $1,300+ for the second best card? It's kind of a joke considering just how much more powerful the 4090 is.
I think we'll see the answer to that when the 5080 launches (rumored at around 4090 tier).

This time they won't make the mistake of pricing the x90 anywhere near the x80... and the 5090 won't be out at the same time.
 
The 40 series was noticeably not nearly as big of a jump initially, before the Super versions of everything besides the xx60 and xx90 classes came out and fixed this.
Sorry, but benchmarks pretty much everywhere show a very different result than what you suggest.
 