
How much would you pay for RTX 4080?

How much would you pay for RTX 4080 at most?

  • Less than $400

    Votes: 1,533 12.9%
  • $400

    Votes: 522 4.4%
  • $500

    Votes: 1,082 9.1%
  • $600

    Votes: 1,715 14.5%
  • $700

    Votes: 2,615 22.1%
  • $800

    Votes: 2,569 21.7%
  • $900

    Votes: 869 7.3%
  • $1000

    Votes: 639 5.4%
  • $1100

    Votes: 48 0.4%
  • $1200

    Votes: 102 0.9%
  • $1300

    Votes: 23 0.2%
  • $1400

    Votes: 32 0.3%
  • More than $1400

    Votes: 110 0.9%

  • Total voters
    11,859
  • Poll closed.
This goes to show that RTX 4080 is not a card meant for 1080p :D
But it also shows that it's a bad value even for 4K. ;)

I expected to pay no more than $1,000 for a good AIB card. $1,400+ is insane; thanks very much, but I'll wait, especially now that I've got Intel's toy to play around with for a while.
Please post your experiences somewhere! :D
 
Please post your experiences somewhere! :D
Sure, I just need to check out more games and emulators; I didn't have much time this week. I have only played the new Call of Duty, and with XeSS enabled it manages to stay comfortably above 60 fps at 4K with a mix of custom settings. XeSS looks okay; small objects (birds, flags, distant trees) exhibit weird ghosting that feels worse than DLSS 2, but aside from that there are no artifacts or problems.

Oh yeah, and jumping into water tanks the framerate by more than half; I don't know if it's an Intel-specific problem or a general one.
 
I would like $699 compared to Ampere, but realistically $799 seems acceptable given the current situation with the extra Ampere stock.
But if I think about 7900 XT performance: on the current TPU testbed they are going to be really close in raster (probably up to -10% from the 7900 XT in the future), and in recent games with RT the 4080 should be +20% faster on average, which justifies even $899 compared to the 7900 XT.
 
Last edited:
All the less than $400 votes are from trolls, because you know damn well that's not a realistic price.

And no, i'm not defending Nvidia here.

I voted less than $900 myself.
It may not be "realistic", but the poll is asking at what price point an individual would purchase the card (i.e. "how much would you pay"), not what they think is a realistic price. I'm not specifically in the market for a 4080, but if it was priced at $500 I might be... You place a greater value on the card than I do. No judgement. We are likely just at different places in our lives with different monetary priorities.
 
Depends on the use case. The cheapest 3080 currently costs £739 in the UK (MSi Ventus 3X Plus at Scan UK). Its performance is 81% of that of the 4080 at 1080p, 73% at 1440p and 67% at 4K. So the realistic price for the 4080 is either £912, £1,012 or £1,102, respectively. Despite that, the cheapest one is currently selling for £1,354 (Palit GameRock OmniBlack at Scan UK), which should cost £442 (48%) less for 1080p, £342 (33%) less for 1440p, or £252 (22%) less for 4K gaming. All in all, I think it should cost no more than $1,000 (retail - not MSRP) before I could recommend it to anyone.
I forgot to add, these prices are valid if we assume a 1:1 price-to-performance ratio in the generational leap, which shouldn't be the case. Ideally, Nvidia should have waited for 3080 inventory to clear, and then release the 4080 for the same price as a 3080.
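For what it's worth, those implied prices follow from a single division. A minimal sketch, assuming the strict 1:1 price-to-performance scaling that the post itself flags as an idealisation:

```python
# Implied "fair" 4080 price from the cheapest 3080 and relative performance.
# Assumes a strict 1:1 price-to-performance ratio (an idealisation, as noted).
cheapest_3080 = 739  # GBP, MSI Ventus 3X Plus at Scan UK

# 3080 performance as a fraction of the 4080's at each resolution
relative_perf = {"1080p": 0.81, "1440p": 0.73, "4K": 0.67}

for res, frac in relative_perf.items():
    print(f"{res}: implied 4080 price = {cheapest_3080 / frac:.0f} GBP")
```

(Straight rounding puts the 4K figure at £1,103; the £1,102 above comes from truncation, but the conclusion is the same.)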
 
I voted $600. The 4080 die is 379 mm² and the 3080's is 628 mm². Even considering cost increases, I don't see the 4080 die at launch costing as much to manufacture as the 3080 die did at launch.

In essence it's a 4070, and $600 gives Nvidia ample room above the $500 3070 price tag to keep their large margin. Mind you, we as end consumers should not have to look out for Nvidia's margins. The average person is getting crushed by inflation, and all I see is a bunch of companies jacking up prices and laying people off, which will only further fuel the fire while they seek to maximize profits. The burden of a recession should not be borne only by the people, the very people who sustained these companies during the pandemic with public money and private spending.
 
- Raise prices, wait for the effects, and sell GPUs to cryptocurrency miners
- "Mining is over": show the new series and raise prices, even twice over
- Suddenly players consider the same price range expensive, but they are ready to pay as much as 50% more, because despite the huge increase, $800 still seems better than $1,200 when the product used to cost $600 (very weak, limited market competition).
Consumerism wins.

The GTX 1080 offered ~60 fps at 1440p on very high settings in many of the most demanding games, like Crysis 3 (~50) and The Witcher 3 (~70). The RTX 4080 offers ~60 fps at 4K in the most demanding games, like Cyberpunk 2077 (~50) and Dying Light 2 (~75). Quite fair after about 6 years and three generations, as long as the price were similar.
Performance at 4K with RT is 30-70 fps in games like Cyberpunk 2077, Metro, and Control. Without DLSS 3.0 it's below or on the verge of playable, and this is the newest card in the most expensive series, so is it a card for years? Only with DLSS 3.0, because without it the performance increase, both in general and with RT, is good but not overwhelming.
The GTX 1080 was $200 more expensive at launch than the GTX 980, but the performance increase was 40 to 60%, which can be considered a big boost, though quite an expensive one (reviews spoke of good performance but a high price).

The RTX 4080 costs $1,199, and there are no good arguments for why its price should be this high (cryptocurrencies, the pandemic, targeting markets other than home PC gamers): almost double. Did we get a pure performance boost, with and without RT, in the form of double the frame rate? And what if in two years, maybe not with the next series but the one after, DLSS 4.0 appears and half of all new games require the new version? DLSS 3.0 is not available for the GTX series, and why have two of the three RTX series, the 20xx and 30xx, not received 3.0 support? Will this not happen again with the 50xx or 60xx series? And if so, what's the point of paying twice the price right now? Since the $200 increase for more than 50% extra performance on the GTX 1080 was considered a big price hike, what would you call the RTX 4080's price being twice as high - a "robbery":pimp:?
Yep, great points about DLSS support; it's already showing the flaws of being proprietary. Nvidia's plaything.

It's already rapidly moving into G-Sync territory. FSR is making strides and provides more uniform performance advantages.

Also, I have not forgotten that FSR allowed me to play Cyberpunk 2077 at 50 avg FPS at 3440x1440 on pretty high settings while running an Nvidia GTX 1080. That experience on its own underlines the whole problem: DLSS is an exercise in futility unless you like being Huang's lap dog.

But just the idea of using competitor technology on your green card, because team green never thought of you, its very own (pretty loyal, 5th GeForce in a row for me) customer. Makes you wonder why they haven't even bothered making an agnostic version of DLSS, even if they run better versions on top. Arrogance is what I call that.
 
Last edited:
$700, same as the 3080's MSRP, even though those were never actually available at that price.
 
Remember EVGA

My opinion: Nvidia doesn't care much for its consumer AIB "partners" (which are really competitors) when it can sell its cards to datacenters for even more profit.
So it has no problem competing with its own AIB partners and forcing them to quit, just like EVGA did.

Even at $600 I would pass because of the high power usage and fire hazard.
 
Remember EVGA

My opinion: Nvidia doesn't care much for its consumer AIB "partners" (which are really competitors) when it can sell its cards to datacenters for even more profit.
So it has no problem competing with its own AIB partners and forcing them to quit, just like EVGA did.

Even at $600 I would pass because of the high power usage and fire hazard.

Good logic, which leads us to believe that Nvidia is also free to quit the gaming market and focus only on profits.
 
This goes to show that RTX 4080 is not a card meant for 1080p :D

Every card is meant for 1080p, if 1080p is your target! :)

Playing Crysis 2 Remastered with very high ray tracing on my 3090, 1080p 60 is buttery smooth, but judging from the GPU utilization figures, it would only just be enough for 120 Hz and would definitely miss the mark for 144 Hz... and a 4080 would be just about powerful enough to meet that performance figure.
 
I voted less than $400 because I would never willingly give my money to an anti-competitive, anti-consumer, anti-capitalist corporation that very clearly has no respect for anyone or anything.
All the less than $400 votes are from trolls
And how many Nvidia fanboys hope that AMD will win at the very top just so Nvidia will lower their prices? That is trolling.
 
2014 – GTX 980: 549 €
2016 – GTX 1080: 599 €
2018 – RTX 2080: 699 €
2020 – RTX 3080: 799 €
2022 – RTX 4080: 1,499 €

This clearly shows the problem with the card.

$100 or $150 more than the 3080 and everyone would be praising the card IMO.
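A quick sketch of the generational jumps in that list makes the outlier obvious (using the MSRPs listed above):

```python
# Per-generation MSRP increase for the x80-class cards listed above (EUR).
msrp = [("GTX 980", 549), ("GTX 1080", 599), ("RTX 2080", 699),
        ("RTX 3080", 799), ("RTX 4080", 1499)]

for (prev_name, prev_price), (name, price) in zip(msrp, msrp[1:]):
    pct = (price - prev_price) / prev_price * 100
    print(f"{prev_name} -> {name}: +{pct:.0f}%")
```

Every step before Ada lands between +9% and +17%; the 3080 to 4080 step is +88%.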
 
Great idea for a thread - I hope you send the poll results to Nvidia. For me, I wouldn't pay more than $600 for the 4080 or $800 for the 4090. And I'm happy to wait several years if needed for prices to come down. Always vote with your wallet.
 
2014 – GTX 980: 549 €
2016 – GTX 1080: 599 €
2018 – RTX 2080: 699 €
2020 – RTX 3080: 799 €
2022 – RTX 4080: 1,499 €

~173% increase, almost 3× the original MSRP now: a pure recession in price/performance, and retail pricing was even worse from the RTX 2080 onwards and still is!
Hi,
Are all those in current euros?
The euro is in the toilet ATM, so I'm just wondering if you're accounting for the current euro versus where it stood at the prior releases' MSRPs?
 
Hi,
Think 900 would be more in tune, 6 or so months after release that is, which is all I'd pay.

Why would you pay 900 in 6 months when you can pay for and enjoy better performance with the 7900 XTX next month?

None of those are good prices for me, but at least that's the honest buy, I think.
 
Why would you pay 900 in 6 months when you can pay for and enjoy better performance with the 7900 XTX next month?

None of those are good prices for me, but at least that's the honest buy, I think.
Hi,
Wasn't thinking of the AMD release, but 6 or so months is usually plenty of time to see what they are releasing, so yeah, I never get in a hurry, was my main point :cool:

Hopefully 900 would be a good ballpark for AMD too.
 
6 bills... in a few years, if I feel the card is competitive. My guess is a 7900 XTX or XT will likely be the better value now, as well as in my personal time frame. The bleeding edge is nice to chase if you have the time. I generally don't :(
 
After Huang cut off support for all RTX 3000 owners, including me, reserving the new features for the RTX 4000 series alone, I will never again pay for products from this ***** company. And now the 4080 trash with a 48% uplift for 70% more money, and the $900 RTX 4060 now called the RTX 4070 Ti... Go to hell, Jacket
 
After Huang cut off support for all RTX 3000 owners, including me, reserving the new features for the RTX 4000 series alone, I will never again pay for products from this ***** company. And now the 4080 trash with a 48% uplift for 70% more money, and the $900 RTX 4060 now called the RTX 4070 Ti... Go to hell, Jacket
If you mean DLSS 3.0 and frame generation, I'm sure Nvidia will announce support on 2000 and 3000 series cards in a year or two, when all their fans already have a 4000 series card in their PCs. They said the hardware is there, it's just too slow; whatever they mean by that, who knows.
 
2014 – GTX 980: 549 €
2016 – GTX 1080: 599 €
2018 – RTX 2080: 699 €
2020 – RTX 3080: 799 €
2022 – RTX 4080: 1,499 €

~173% increase, almost 3× the original MSRP now: a pure recession in price/performance, and retail pricing was even worse from the RTX 2080 onwards and still is!
When you look at the rate at which silicon wafer costs are increasing, you will understand why this increase is not only 'NV greed' based.
In the past there was a linear increase in silicon cost gen to gen, which correlated with the linear increase in GPU cost.
Now it's of a more exponential order, and so are GPU prices. The pic I added is from a 2019 analysis, before the +20-30% cost increase at 5/4 nm plus the COVID-19 effect.
There is a degree of added cost on top of that from NV/AMD, where NV asks for a bigger premium for the best absolute performance, but not as much as you might think, and not 300% more out of thin air.
If you are into bashing NV for their high prices, bash TSMC as well. They have a major role in the situation we are in.

Still, I voted $700, as that will translate to $1,000-1,100 in my country after shipping and tax.
f1.jpg
 
When you look at the rate at which silicon wafer costs are increasing, you will understand why this increase is not only 'NV greed' based.
In the past there was a linear increase in silicon cost gen to gen, which correlated with the linear increase in GPU cost.
Now it's of a more exponential order, and so are GPU prices. The pic I added is from a 2020 analysis, before the +20% cost increase at 5/4 nm.
There is a degree of added cost on top of that from NV/AMD, but not as much as you might think.
If you are into bashing NV for their high prices, bash TSMC as well. They have a major role in the situation we are in.

Still, I voted $700, as that will translate to $1,000-1,100 in my country after shipping and tax.
View attachment 270697
If AMD can do a node shrink from 7 nm to 5, introduce chiplets, and redesign the GPU core structure among other changes, and still set the 7900 XTX to the same MSRP the 6900 XT had at launch, then why does Nvidia have to charge double for a simple node shrink? That's the fishy part.
 
When you look at the rate at which silicon wafer costs are increasing, you will understand why this increase is not only 'NV greed' based.
In the past there was a linear increase in silicon cost gen to gen, which correlated with the linear increase in GPU cost.
Now it's of a more exponential order, and so are GPU prices. The pic I added is from a 2019 analysis, before the +20-30% cost increase at 5/4 nm plus the COVID-19 effect.
There is a degree of added cost on top of that from NV/AMD, where NV asks for a bigger premium for the best absolute performance, but not as much as you might think, and not 300% more out of thin air.
If you are into bashing NV for their high prices, bash TSMC as well. They have a major role in the situation we are in.

Still, I voted $700, as that will translate to $1,000-1,100 in my country after shipping and tax.
View attachment 270697
You're missing something: the RTX 3080 is 628 mm² while the 4080 is 379 mm², and the graph gives prices for the same die size, so the 4080, with its die nearly half the size, should be even cheaper than the 3080, because the price of bigger dies grows exponentially.
 
If AMD can do a node shrink from 7 nm to 5, introduce chiplets and redesign the GPU core structure among other changes, and still set the 7900 XTX to the same MSRP as the 6900 XT had at launch, then why does Nvidia have to charge double for a simple node shrink? That's the fishy part.
Absolute performance advantage, DLSS 3, superior RT capabilities, much bigger market penetration and generally better acceptance worldwide (i.e. more demand), and CUDA cores for professional use.
AMD's chiplet/MCM design and GPU core structure give you as an end user nothing, just as the node it is manufactured on gives you nothing. It is just a technical approach (which you can admire and praise in itself; I do) to get more performance, but as a user you "don't see it" and can't "play" with it, so it's a given. On the other hand, DLSS 3 gives you a lot, and so do the tensor cores that give you strong RT performance. Whether you need it or not is not relevant to the discussion; the point is that those are things the end user actually sees and whose impact can be measured when on or off.

You're missing something: the RTX 3080 is 628 mm² while the 4080 is 379 mm², and the graph gives prices for the same die size, so the 4080, with its die nearly half the size, should be even cheaper than the 3080, because the price of bigger dies grows exponentially.
It's not quite half the size (314 mm² would be half), and as I said, those are 2019 prices (see bottom right); add 20-30% (or even more) for today's 4 nm.
NV does use a smaller die area, and that's how they make more money, plus they charge extra on top of that. No denying that.
All I said is that when you factor in the silicon price increase, which is not NV's doing in any way, you see that the claimed 300% extra is actually less.
Also, that is why you can't expect the gradual linear increases we saw in the past. With the next gen at 2/3 nm we will see yet another exponential increase, maybe of a bigger order than today (a power of 3 instead of 2, for example).
So bringing up the non-relevant past (linear increases) as an argument about today is null.

To sum it up: the rate of increase is getting bigger, and that rate itself also increases (see "jerk": the rate of change of acceleration).
The past is getting less and less relevant to the present when you try to understand current prices.
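The die-size back-and-forth above can be made concrete with the classic gross-dies-per-wafer approximation. The wafer prices here are placeholder assumptions for illustration only (not actual TSMC quotes); the die areas are the ones quoted in the thread:

```python
import math

WAFER_DIAMETER_MM = 300  # standard wafer size for these nodes

def dies_per_wafer(die_area_mm2: float) -> int:
    """Classic gross dies-per-wafer approximation (ignores yield and scribe lines)."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Die areas from the thread; wafer costs are made-up placeholders chosen only
# to illustrate "pricier node, but many more dies per wafer".
for name, area_mm2, wafer_cost in [("3080-class die (628 mm2)", 628, 9_000),
                                   ("4080-class die (379 mm2)", 379, 16_000)]:
    n = dies_per_wafer(area_mm2)
    print(f"{name}: ~{n} gross dies/wafer, ~${wafer_cost / n:.0f} of wafer per die")
```

Under these placeholder numbers the per-die silicon cost roughly cancels out, which is why the argument really hinges on the actual wafer quotes rather than on die area alone.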
 
Last edited: