Tuesday, November 29th 2022

$700-800 Ideal Price for GeForce RTX 4080: TechPowerUp Poll Surveying 11,000 Respondents

The ideal price for the NVIDIA GeForce RTX 4080 "Ada" graphics card is around USD $700 to $800, according to the results of a recent TechPowerUp front-page poll surveying our readers. Our poll "How much would you pay for RTX 4080 at most?" received over 11,000 responses. In the number 1 spot with 22% of the vote is $800, closely followed by $700. Together, this range represents 44% of the voters. 14% of our readers think $600 is an ideal price, followed by "less than $400" at 13%. 9% think $500 seems fair, followed by 7% willing to spend as much as $900. 5% are happy to spend $1,100. 2% or less feel that the current $1,200 MSRP is justified or are willing to spend more than MSRP. There's more to why a majority finds sanity in the $700 to $800 price range.

With NVIDIA cancelling the RTX 4080 12 GB, the RTX 4080 16 GB became the only SKU to bear the name "RTX 4080." This $1,200 MSRP GeForce RTX 4080 is the successor to the RTX 3080, which debuted at $700, marking a $500 MSRP increase generation-over-generation (or +71%). You begin to see why most readers consider the $700-800 range the ideal MSRP, and are willing to tolerate at most a $100 increase. For even more context, the RTX 3080 "Ampere" launched at the same $700 MSRP as its predecessor, the RTX 2080 "Turing." The GTX 1080 "Pascal" came out at $600 ($700 for the Founders Edition), which explains the interest in $600 in our poll.
And then there's a sizable chunk of our readers who simply seem disillusioned with GPU pricing, and feel that either $400 to $500, or something lower, is the most they would be willing to pay for the RTX 4080. Can NVIDIA even break even at such prices? NVIDIA's own quarterly financial results reference overall margins as high as 60% (not specific to any product, but as a general rule, margins tend to be proportionate to MSRP, with higher-priced products generally carrying a fatter margin). At 50% to 60% margins on its $1,200 MSRP, the break-even point would be in the neighborhood of $500 to $600. We've seen examples in the past of NVIDIA cutting its prices sharply in response to competitive AMD products, with both brands fiercely locked in price wars and their products selling at less than half their MSRPs. So a $500 to $600 price for the RTX 4080 still seems possible on paper, and cannot be easily dismissed as "impossible."
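
Here is a minimal back-of-the-envelope sketch of that calculation; it assumes the quoted margin is taken as a fraction of the card's MSRP, which is a simplification and not NVIDIA's actual cost structure:

```python
# Minimal sketch: implied break-even price if a gross margin is taken off the MSRP.
# The 50-60% margin range is the article's assumption, not a per-product NVIDIA figure.

MSRP = 1200  # USD, GeForce RTX 4080 launch price

for margin in (0.50, 0.60):
    break_even = MSRP * (1 - margin)
    print(f"{margin:.0%} margin -> implied break-even of about ${break_even:.0f}")

# Prints roughly $600 at a 50% margin and $480 at a 60% margin,
# i.e. the "$500 to $600 neighborhood" mentioned above.
```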

On the other hand, prices have been going up everywhere: we've got inflation, higher prices for gas and power, and no doubt, TSMC is charging more for a 4 nm wafer than what Samsung has been charging for their 8 nm technology. NVIDIA was also Samsung's biggest customer—today there's plenty of competition for allocation on TSMC's latest and greatest nodes. Apple, Qualcomm, AMD, everybody wants their chips made on the best process in the world, so prices will end up higher for that reason, too.
A tiny fraction of our readers thinks that the $1,200 MSRP is fair, or is willing to pay more than $1,400. This probably aligns with the demographic that is actually buying the RTX 4080 at its current prices, or is willing to spend top dollar for any other high-end graphics card. The poll results indicate that NVIDIA would be able to push more volume by lowering the price, but given the current inventory levels of GeForce 30-series cards, it could be that the company is content selling the RTX 4080 at ≥$1,200 and high margins to a small fraction of buyers.

140 Comments on $700-800 Ideal Price for GeForce RTX 4080: TechPowerUp Poll Surveying 11,000 Respondents

#101
Fluffmeister
Clearly a PERFECT opportunity for AMD to pull the rug from under Nvidia's feet and crush them and steal tons of market share with their $650 RX 7900 XTX and $575 Radeon RX 7900 XT.
Posted on Reply
#102
Minus Infinity
hsew4N must be really expensive
Actually, that's part of the problem Nvidia got themselves into: paying way over the odds for a bleeding-edge node, unlike AMD sticking to the proven and less expensive N5. Nvidia is screwed on pricing; they simply cannot compete. RDNA3 cost AMD a lot less to manufacture, and even at $899 for the 7900 XT they are making a nice healthy margin. I would not pay more than $799 for a 4080, but even at that price I would most likely still get a 7900 XT(X). Nvidia doesn't deserve a red cent of mine.
Posted on Reply
#103
Bwaze
Being OK with a 70% price increase will inevitably lead to this (flat compounding, sketched right after the list):

2020, RTX 3080 - $700
2022, RTX 4080 - $1200 <- WE ARE HERE
2024, RTX 5080 - $2040
2026, RTX 6080 - $3468
2028, RTX 7080 - $5896
2030, RTX 8080 - $10022
2032, RTX 9080 - $17038
2034, GTX 1080 - $28965
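
The list above is nothing more than a flat +70% applied every two-year generation, starting from the 4080's actual $1,200 MSRP; a minimal sketch of that compounding (an illustrative extrapolation, not a forecast):

```python
# Compound the RTX 4080's $1,200 MSRP by +70% per two-year generation.
# Illustrative only; reproduces the figures above to within a dollar of rounding.

price = 1200.0
for year in range(2022, 2036, 2):
    print(f"{year}: ${round(price):,}")
    price *= 1.7
```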

And remember, before commenting about higher wafer and new-process costs, that this time around an X080 card doesn't even come with a cut-down top-of-the-line processor (the RTX 4090 uses AD102, 608 mm²), as used to be normal, but is built around the much smaller AD103, 378.6 mm². Compare this to the Ampere architecture cards:

GeForce RTX 3080 (GA102, 628 mm²)
GeForce RTX 3080 12GB (GA102)
GeForce RTX 3080 Ti (GA102)
GeForce RTX 3090 (GA102)
GeForce RTX 3090 Ti (GA102)

So not only are we seeing unprecedented price inflation within a single generation (+70%), Nvidia is also selling us a lower-tier die, which until now was reserved for X070 cards.

I can only imagine this was all planned during crypto high, and the coke still hasn't run out at Nvidia headquarters.
Posted on Reply
#104
Prima.Vera
I think 700 Euros or Dollars, including taxes and VAT, is a fair price for a custom 4080 card. nGreedia's own card shouldn't be more than $649 MSRP...
Posted on Reply
#105
hat
Enthusiast
The 8800 Ultra was less than $850 at launch. The 8800 Ultra was the top end card. The 4090 is $1600 MSRP, and there's still room for a Ti variant.
Posted on Reply
#106
Chomiq
Why_MeThat's on Europe. Nobody forced them to have insane VAT such as Poland's 25%.
23 actually.
Posted on Reply
#107
Vayra86
gffermariShould I have made it clear that I was referring to best single gaming card of the gen?
No I shouldn't...
You've been making perfect sense. The model number in the tier/stack of products does tell us a lot about its positioning, intent, target audience, etc. It's what the whole marketing game is about; denial has no place here, even if there are some outliers to what is an established norm.

In the same vein, die size is an absolute factor too, but the performance relative to the generation before or after it is up for interpretation, and thát is what makes or breaks a price. That's why a 314 mm² 1080 sold well while the current x80s are a harder sell with a múch larger die.

That's the space Nvidia is exploring here, really, trying to stretch things further to justify higher pricing. Except this is like shock therapy; this isn't just stagnation or a baby step forward, the 4080 is a massive step backwards. Something is gonna give though, and I think this time it's Nvidia, unless they are content with losing market share.
FluffmeisterClearly a PERFECT opportunity for AMD to pull the rug from under Nvidia's feet an crush them and steal tons of market share with their $650 RX 7900 XTX and $575 Radeon RX 7900 XT.
I see what you did there :D
Posted on Reply
#108
JustBenching
Vayra86Absolutely the die was smaller! But that's entirely the beauty of Pascal. It did so much more with so little. That's a shrink The Way It's Meant to be Played. Part of that is also that Nvidia had been stuck on 28nm for só long.

Today, a shrink enables an immediate maxing out of the silicon and then it is still not enough, so we need retarded power targets.
I agree that pascal was a thing of beauty, but honestly, do we REALLY care about how much nvidia is charging us per mm of die? I don't frankly. I check the performance and the price, whether they use a huge or small die is kinda irrelevant to me.

I also don't understand the outrage with the prices. You think it's expensive? Great, don't buy it, the prices will drop, profit.
Posted on Reply
#109
Vayra86
Minus InfinityActually that's part of the problem Nvidia got themselves into. Paying way over the money for bleeding edge node unlike AMD sticking to proven and less expensive N5. Nvidia is screwed on pricing, they simply cannot compete, RDNA3 cost AMD a lot less money to manufacture and even at $899 for 7900XT they are making a nice healthy margin. I would not pay more than $799 for 4080, but even if it were I would most likely still get 7900XT(X). Nvidia doesn't deserve a red cent of mine.
You see it too?

Team green needs to secure the top spot in the epeen ladder for 2024 by...
- maxing out the silicon directly on the top end node of the moment
- using increased power targets gen-to-gen since Turing, and despite a shrink, yet another.
- enabling 450W~600W power target on the top end to even extract meaningful OC results... if we call 6% meaningful, which is a stretch.
- pushing even an x80 to a price point beyond what used to be absolute top end

One does start to wonder how they'll proceed from here. But a reduction in ridiculousness is going to surprise me. What roads do they have left? Surely they're not going to follow suit on AMD's technology leadership now, are they? :laugh:

The longer you look at recent developments, the more clearly they separate winning from losing technology. Clearly the chiplet approach is the way, and Nvidia and Intel might have stuck too long to tried-and-tested designs. Both companies keep launching product stacks that only show us 'further escalation' of proven approaches, except they're constantly running in the red zone, stacking band-aid upon band-aid to keep it all afloat. DLSS 3 is for Nvidia what PL1=PL2 is for Intel. The similarities are striking, and the trade-offs for that last snippet of performance (or for not destroying latency...) are similar too.

Those ships have most definitely hit the iceberg, but apparently the captain still feels fine and the band keeps playing.
fevgatosI agree that pascal was a thing of beauty, but honestly, do we REALLY care about how much nvidia is charging us per mm of die? I don't frankly. I check the performance and the price, whether they use a huge or small die is kinda irrelevant to me.

I also don't understand the outrage with the prices. You think its expensive? Great, don't buy it, the prices will drop, profit.
Absolutely agreed; the one caveat is that part of the consumer 'battle' is outrage over prices. These things matter, and we should applaud that, not criticise it altogether. What also matters is that people put their money where their mouth is. (Or rather, don't put the money :)) Don't ever underestimate the power of peer pressure here. It matters, and we need it.

To me, die size / price is an indicator of how much wiggle room there is left for a company to flesh out a product stack further. And it can then also tell us a lot about pricing, what's 'fair' and what's not, etc.
Posted on Reply
#110
Bwaze
fevgatosI also don't understand the outrage with the prices. You think its expensive? Great, don't buy it, the prices will drop, profit.
Outrage? It's a discussion on a forum, not a violent protest in the street. But people are clearly "not buying it". Proof? Just look at the availability of an X080 card, released two weeks ago. When has this happened before?

RTX 4080 at Geizhals.eu

Posted on Reply
#112
b1k3rdude
Bomby569You guys are in the EU just buy it in any other country. Some even do free shipping (not sure in those eastern parts but should be the same). Not that you can find much cheaper for what i see
This is pointless for a number of reasons, but the obvious ones are import and VAT charges and no international warranty. Those two issues alone render your suggestion useless.
Posted on Reply
#113
JustBenching
BwazeOutrage? Its a discussion on a forum, not violent protest in a street. But people are clearly "not buying it". Proof? Just look at the availability of an X080 card, released two weeks ago. When has this happened before?

RTX 4080 at Geizhals.eu

I'm not specifically talking about this forum; multiple posters on multiple forums are going bonkers. People are not buying the 4080? Great, prices will drop.
Posted on Reply
#114
b1k3rdude
trog100nvidia got rid of sli on mid range cards for a reason..
Yeah, it was because it wasn't a viable solution. Off the top of my head: micro-stutter, power usage that didn't match the performance, and, what I suspect was the main reason, nVidia dumping the responsibility for making games compatible onto game devs. Said devs clearly saw through that B$, which resulted in fewer and fewer games supporting the standard. For nVidia as a for-profit corp, it was then about the profit and where best to extract it from, which is why you only see NVLink in the professional space.
Posted on Reply
#115
Bwaze
fevgatosIm not specifically talking about this forum. Multiple posters in multiple forums going bonkers. People are not buying the 4080? Great, prices will drop.
And people are going bonkers about people complaining about the price. Which is, in my opinion, even sillier.

Will the prices drop? When Nvidia released the RTX 2080 with a price increase and almost no performance increase over the GTX 1080 Ti, just the promise of RTX and DLSS games in the future, people weren't very enthusiastic either. But Nvidia persevered, waiting out quite a few quarters in the red in the gaming sector, and tried to improve things at the Ampere launch, which looked so inviting on paper. If only the 2020 crypto bull run hadn't happened.

So I wouldn't count on a quick reaction by Nvidia. Right now their coffers are full of dirty crypto money.
Posted on Reply
#116
Unregistered
Vayra86Absolutely the die was smaller! But that's entirely the beauty of Pascal. It did so much more with so little. That's a shrink The Way It's Meant to be Played. Part of that is also that Nvidia had been stuck on 28nm for só long.

Today, a shrink enables an immediate maxing out of the silicon and then it is still not enough, so we need retarded power targets.
The beauty of Pascal was due to 16nm being such a massive leap on its own along with some extra transistor work to make it clock faster. The 16nm process was probably the biggest leap from purely process work since the turn of the century.
ARFGTX 1080 started at 599$, how much of that is the cost of the chip itself? 20%? 30%?
RTX 4080 is a 20% larger chip at 100% higher cost per chip.

So, its contribution to the bill of materials in that asking 1200$+ is mere 20-30%, too.
Yea, probably around the same. I will say the R&D and initial process investment have skyrocketed too. Not enough to cover the cost difference but it is something.

What makes me mad though is when people compare AMD’s 500 series prices to now when AMD was not making anything on their GPUs. They were literally giving GPUs away just to keep a presence in the market. Crypto might be the only reason AMD could be net positive on their discrete GPUs since they bought ATI. The rest of the time they either lost money or broke even.

But still I can’t stand how many whiners there are for the high end luxury market toys. I just have to let this out somewhere once… As long as the low GPUs that can play anything at 1080p are reasonably priced, the market is fine by me. We’ll see how that shakes out with crypto finally where it belongs.
#117
Bomby569
b1k3rdudeThis pointless for a number of reasons, but the obvious are import and vat charges and no international warranty. With just those 2 issues, it renders your suggestion useless.
There are no import charges inside the EU, the VAT only differs by the difference in rates, and the warranty is exactly the same: 2 years, and it can be claimed anywhere inside the EU.
Posted on Reply
#118
Vayra86
HaserathBut still I can’t stand how many whiners there are for the high end luxury market toys. I just have to let this out somewhere once… As long as the low GPUs that can play anything at 1080p are reasonably priced, the market is fine by me. We’ll see how that shakes out with crypto finally where it belongs.
The last good 1080p GPU with strong perf/$ was the 1060; we've only regressed since. The price increases trickle down through the stack, and that's why the whine happens, because it does damage the market. Another aspect is that if people can't 'move up' the stack because it's priced beyond the comfort of the larger part of the target audience, you get stagnation in gaming as well. Devs cater to majority groups, not minorities. It will inevitably hurt the adoption rates of new technology in gaming, such as RT.

Fine Whine, indeed ;) You should expand your view a little bit, I think. In this sense, Nvidia is clearly pushing for short-term gain with little care for long-term market conditions. Do they know something we don't about the intermediate future? One might start to think so. They're already pushing harder on markets outside GeForce.

'This is fine', you say. It's clearly not. It's like all things in life: a world of haves and have-nots is a world in perpetual conflict. That's what you see here; when the gap between these two groups becomes 'unfair' is when things spiral out of control.
Posted on Reply
#119
Bwaze
Bomby569there is no import charges inside the EU, vat changes just by the difference, and the warranty is exactly the same, 2 years and can be claimed anywhere inside the EU
But the "common EU market" is less and less common. Large differences in VAT have forced legislation that stores that sell to other countruies MUST calculate the VAT of the recipient country, so for instance someone from Poland or Hungary (27% VAT!) can't order a product from Germany and expect a German VAT (19%).

Several large tech stores have even stopped shipping items abroad. Mindfactory, the largest one, for instance. You can order through packet forwarding company like Mailboxde (taxes are still calculated from the card holder's country), but some of the stores have forbidden even that (Notebooksbilliger, the only seller of Nvidia Founders Edition cards in EU).
Posted on Reply
#120
TheoneandonlyMrK
Why_MeThat's on Europe. Nobody forced them to have insane VAT such as Poland's 25%.
Yes, I signed up to pay extra VAT when I was born where I was; it's automagic. We have as much say in it as we do in the weather.

Meanwhile I laugh at your pricing Nvidia, the masses have spoken, stick it up your ass until it's price cut to all hell.
Posted on Reply
#121
Chrispy_
shovenoseI’m strongly considering either a 7900xt or 7900xtx once the reviews come out. I need something better than my rx6600 now that I have a 4k monitor. I agree purely based on specs and the numbers we have so far the XT at $899 is a bad deal compared to the XTX for $999. However, given that I’m pairing it with an i5-11th gen, I wonder if I’m better off with the XT as I’ll probably be wasting an XTX?
The problem with the 6600 at 4K is memory bandwidth and an Infinity Cache that's too small for higher resolutions. The 6600 tanks hard, even when running FSR Performance (1080p internal render), because it simply lacks the bandwidth to handle the upscaled frame data. You might find that for the games you're playing a 6800 XT on clearance will be more than enough. You can pick them up right now for about $520 new, or they regularly sell used for $450 on eBay if you browse by "sold items". Get $175 back from your RX 6600 and that solves your 4K problem for under $300 while avoiding the early-adopter tax and flagship pricing.

By mid 2023 we should have the 4070/4070Ti and the RX 7800XT. Presumably these will shake up the pricing a lot - in a way that these flagship launches have no real hope of doing.
Posted on Reply
#122
shovenose
Chrispy_The problem with the 6600 at 4K is memory bandwidth and an infinitycache that's too small for higher resolutions. The 6600 tanks hard, even when running FSR performance (1080p native) because it simply lacks the bandwidth to handle the upscaled frame data. You might find that for the games you're playing a 6800XT on clearance will be more than enough. You can pick them up right now for about $520 new, or they regularly sell used for $450 on ebay if you browse by "sold items". Get $175 back from your RX 6600 and that solves your 4K problem for under $300 while avoiding the early-adopter tax and flagship pricing model.

By mid 2023 we should have the 4070/4070Ti and the RX 7800XT. Presumably these will shake up the pricing a lot - in a way that these flagship launches have no real hope of doing.
Hmm, that is probably the path I should take, but I'd just be irritated if I spent $500 on something that didn't perform as well as I hoped. Regardless, I'd be keeping my RX 6600 for my secondary PC.
Posted on Reply
#123
Chrispy_
Vayra86You see it too?

Team green needs to secure the top spot in the epeen ladder for 2024 by...
- maxing out the silicon directly on the top end node of the moment
- using increased power targets gen-to-gen since Turing, and despite a shrink, yet another.
- enabling 450W~600W power target on the top end to even extract meaningful OC results... if we call 6% meaningful, which is a stretch.
- pushing even an x80 to a price point beyond what used to be absolute top end

One does start to wonder how they'll proceed from here.
I was having a similar discussion on the CPU front the other day too - specifically that desktop CPUs are not the only CPUs being made. There's no room for a 350W CPU in a laptop and laptops have been steadily overtaking desktops for half a decade at this point. Even on desktop, very few people actually want to spend on crazy CPUs with all the flagship platform costs of a high-end cooler/board/DDR5.

Progress cannot be made much longer the way Nvidia are currently doing it:
  • The 3090 started the trend of tripping OCP on high-end, 1000W PSUs.
  • The partner-model 4090s don't physically fit in several enthusiast cases that were previously thought to have plenty of space for huge GPUs.
  • The 4090 can't be powered safely, apparently as we go through the second flawed iteration of a dumb-idea connector.
  • How big is the gulf going to be between a 450W 4090FE and a laptop 4080(M)? Laptop OEMs were really pushing their luck with 130W variants last generation, there's only so much physical space you can dedicate to cooling in something that has to be about an inch thick and water-cooling is, realistically, not an option.
Vayra86The last good 1080p GPU with strong perf/$ was the 1060. We only regressed since. The price increases trickle down through the stack, that's why whine happens, because it does damage the market. Another aspect of that is that if people can't 'move up' the stack because its just priced out of the larger part of the target audience's comfort, you get stagnation in gaming as well. Devs cater to majority groups, not minorities. It will inevitably hurt the adoption rates of new technology in gaming, such as RT.
RX6600 might change your opinion on that?

It's built for 1080p in terms of ROPs, bandwidth, and cache.
It's priced about the same as the 1060 if you adjust for inflation and added US-China tariffs that didn't exist in 2016 when the 1060 was made.
Partner models use ~130W, which is marginally less than the typical 140-150W that most non-FE 1060 cards used. There's a fair bit of undervolting headroom, too - just like the 1060.
It's twice the performance of the 1060, five years later, or roughly equivalent to two generations of 40% uplift each.
It was the clear performance/Watt leader of its generation.

Unfortunately it was an underrated card because it was unexciting, and suffered from low-availability and scalping, but now that those issues are gone it's my go-to recommendation for someone looking to upgrade from a 1060 or lower. You can pick them up used for $175 and if you look for deals you can find them new for less than the 1060's original MSRP.
Posted on Reply
#124
Dirt Chip
One way out of the loop is to make the lower-tier GPUs on an older, more mature node. It will hurt peak performance but can lead to lower costs.
Only the x8xx/x9xx parts need the very edge, as they are after absolute performance, with value being secondary, if a consideration at all.
x6xx/x5xx and the equivalent AMD counterparts can do just fine on 7/6 nm and get their improvement from the arch uplift, new features (DLSS 3) and a more refined node. The top tier can pay for the port to the 'older' node.

I don't really know if it's a practical solution, but as a mid-range shopper I wouldn't mind, as long as the price/perf is there.
Posted on Reply
#125
Chrispy_
Dirt ChipOne way out of the loop is to make the lower tier GPU`s on older, more mature node. It will hurt possible performance but can lead to lower costs.
Only the x8xx/x9xx need the very edge as it after absolute performance with value being secondary, if any.
x6xx/x5xx and the equivalent AMD counterpart can do just fine with 7/6nm and get the improvement from the arch uplift, new features (DLSS3) and more refined node. The top tier can pay for the port for the 'older' node.

I don`t really know if it`s a practical solution but as a mid-range shopper I wouldn't mind, as long as the price/pref is there.
Not really practical as there's a lot of work involved with porting a design to a different node. AMD touched on that in a couple of interviews after the 7900-series launch event.
I'm pretty sure GN's Sam Naffziger interview covers it, but don't have time to rewatch it and check.
Posted on Reply