Tuesday, November 29th 2022
$700-800 Ideal Price for GeForce RTX 4080: TechPowerUp Poll Surveying 11,000 Respondents
The ideal price for the NVIDIA GeForce RTX 4080 "Ada" graphics card is around USD $700 to $800, according to results from a recent TechPowerUp front-page poll surveying our readers. Our poll "How much would you pay for RTX 4080 at most?" received over 11,000 responses. In the number 1 spot with 22% of the vote is $800, closely followed by $700; together, this range accounts for 44% of voters. 14% of our readers think $600 is an ideal price, followed by "less than $400" at 13%. 9% think $500 seems fair, followed by 7% willing to spend as much as $900, and 5% happy to spend $1,100. 2% or fewer feel that the current $1,200 MSRP is justified, or are willing to spend more than MSRP. There's more to why so many readers find sanity in the $700 to $800 price range.
With NVIDIA cancelling the RTX 4080 12 GB, the RTX 4080 16 GB became the only SKU to bear the name "RTX 4080." This $1,200 MSRP GeForce RTX 4080 is the successor to the RTX 3080, which debuted at $700, marking a $500 MSRP increase generation-over-generation (or +71%). You begin to see why most readers consider the $700-800 range the ideal MSRP, and are willing to tolerate a $100 increase. For even more context, the RTX 3080 "Ampere" launched at the same $700 MSRP as its predecessor, the RTX 2080 "Turing." The GTX 1080 "Pascal" came out at $600 ($700 for the Founders Edition), which explains the interest in $600 in our poll.

And then there's a sizable chunk of our readers who simply seem disillusioned with GPU pricing, and feel that $400 to $500, or something lower, is the most they would be willing to pay for the RTX 4080. Can NVIDIA even break even at such prices? NVIDIA's own quarterly financial results reference margins as high as 60% (not specific to any product, but as a general rule, margins tend to scale with MSRP, with higher-priced products generally carrying fatter margins). At 50% to 60% margins on the $1,200 MSRP, the implied cost would be in the neighborhood of $500 to $600. We've seen examples in the past of NVIDIA cutting its prices sharply in response to competitive AMD products, with both brands fiercely locked in price wars and their products selling at less than half their MSRPs. So a $500 to $600 price for the RTX 4080 still seems possible on paper, and cannot be easily dismissed as "impossible."
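The break-even arithmetic above can be sketched quickly. A minimal sketch, assuming (as a simplification) that the reported company-wide gross margin applies directly to the $1,200 MSRP:

```python
# Hypothetical illustration: implied cost floor if a gross margin
# applies directly to the RTX 4080's $1,200 MSRP (a simplification;
# NVIDIA's reported margins are company-wide, not per-product).
msrp = 1200

for margin in (0.50, 0.60):
    # cost = MSRP minus the margin's share of MSRP
    cost = msrp * (1 - margin)
    print(f"at {margin:.0%} margin, implied cost ~ ${cost:.0f}")
```

At 50% the implied cost is $600, and at 60% it is $480, which is where the "neighborhood of $500 to $600" figure comes from.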
On the other hand, prices have been going up everywhere: we've got inflation, higher prices for gas and power, and no doubt TSMC is charging more for a 4 nm wafer than Samsung has been charging for its 8 nm technology. NVIDIA was also Samsung's biggest customer—today there's plenty of competition for allocation on TSMC's latest and greatest nodes. Apple, Qualcomm, AMD: everybody wants their chips made on the best process in the world, so prices will end up higher for that reason, too.

A tiny fraction of our readers thinks that the $1,200 MSRP is fair, or is willing to pay more than $1,400. This probably aligns with the demographic that is actually buying the RTX 4080 at its current prices—or is willing to spend top dollar for any other high-end graphics card. The poll results indicate that NVIDIA could push more volume by lowering the price, but given the current inventory levels of GeForce 30 cards, it could be that they'd rather be content selling the RTX 4080 at ≥$1,200 at high margins to a small fraction of buyers.
140 Comments on $700-800 Ideal Price for GeForce RTX 4080: TechPowerUp Poll Surveying 11,000 Respondents
2020, RTX 3080 - $700
2022, RTX 4080 - $1200 <- WE ARE HERE
2024, RTX 5080 - $2040
2026, RTX 6080 - $3468
2028, RTX 7080 - $5896
2030, RTX 8080 - $10022
2032, RTX 9080 - $17038
2034, GTX 1080 - $28965
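The tongue-in-cheek list above just compounds the +70% generational jump; a quick sketch of that arithmetic (the figures match the list to within a dollar or two, depending on rounding):

```python
# Tongue-in-cheek extrapolation: compound the ~70% generational
# price jump, starting from the RTX 4080's $1,200 MSRP.
price = 1200
for year in range(2024, 2035, 2):  # one "generation" every two years
    price = round(price * 1.7)
    print(f"{year}: ${price}")
```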
And remember, before commenting about higher wafer and new-process costs, that this time around an x080 card doesn't even come with a cut-down top-of-the-line processor (the RTX 4090 has AD102, 608 mm²), as was the norm, but is built around the much smaller AD103, 378.6 mm². Compare this to the Ampere architecture cards:
GeForce RTX 3080 (GA102, 628 mm²)
GeForce RTX 3080 12GB (GA102)
GeForce RTX 3080 Ti (GA102)
GeForce RTX 3090 (GA102)
GeForce RTX 3090 Ti (GA102)
So, not only are we seeing unprecedented price inflation in a single generation (+70%), Nvidia is also selling us a lower-tier die, which until now was reserved for x070 cards.
I can only imagine this was all planned during crypto high, and the coke still hasn't run out at Nvidia headquarters.
In the same vein, die size is an absolute factor too, but the performance relative to the generation before or after it is up for interpretation, and thát is what makes or breaks a price. That's why a 314 mm² 1080 sold well while current x80s are a harder sell with a múch larger die.
That's the space Nvidia is exploring here really, trying to stretch things further to justify higher pricing. Except this is like shock therapy; this isn't just stagnation or a baby step forward, the 4080 is a massive step backwards. Something is gonna give though, and I think this time it's Nvidia, unless they are content with losing market share. I see what you did there :D
I also don't understand the outrage over the prices. You think it's expensive? Great, don't buy it, the prices will drop, profit.
Team green needs to secure the top spot in the epeen ladder for 2024 by...
- maxing out the silicon directly on the top end node of the moment
- raising power targets gen-to-gen ever since Turing, and, despite a shrink, raising them yet again
- enabling 450W~600W power target on the top end to even extract meaningful OC results... if we call 6% meaningful, which is a stretch.
- pushing even an x80 to a price point beyond what used to be absolute top end
One does start to wonder how they'll proceed from here. But a reduction in ridiculousness would surprise me. What roads do they have left? Surely they're not going to follow suit on AMD's technology leadership now, are they? :laugh:
The longer you look at recent developments, the more clearly winning and losing technologies emerge. Clearly the chiplet approach is the way forward, and Nvidia + Intel might have stuck too long with tried-and-tested stuff. Both companies keep launching product stacks that only show us 'further escalation' of proven approaches, except they're constantly running in the red zone, stacking band-aid upon band-aid to keep it all afloat. DLSS 3 is for Nvidia what PL1=PL2 is for Intel. The similarities are striking, and the trade-offs for that last snippet of performance (or for not destroying latency...) are similar too.
Those ships have most definitely hit the iceberg, but apparently the captain still feels fine and the band keeps playing. Absolutely agreed, the one caveat is part of the consumer 'battle' is outrage over prices. These things matter, we should applaud that, not criticise it altogether. What also matters, is that people put their money where their mouth is. (Or, don't put money :)) Don't ever underestimate the power of peer pressure here. It matters, and we need it.
To me, die size / price is an indicator of how much wiggle room there is left for a company to flesh out a product stack further. And it can then also tell us a lot about pricing, what's 'fair' and what's not, etc.
RTX 4080 at Geizhals.eu
Will the prices drop? When Nvidia released the RTX 2080 with a price increase and almost no performance increase compared to the GTX 1080 Ti, just the promise of RTX and DLSS games in the future, people weren't very enthusiastic either. But Nvidia persevered, and sat through quite a few quarters in the red in the gaming sector. It tried to improve on this with the Ampere launch, which looked so inviting on paper. If only the 2020 crypto bull run hadn't happened.
So I wouldn't count on a quick reaction by Nvidia. Right now their coffers are full of dirty crypto money.
What makes me mad though is when people compare AMD’s 500 series prices to now when AMD was not making anything on their GPUs. They were literally giving GPUs away just to keep a presence in the market. Crypto might be the only reason AMD could be net positive on their discrete GPUs since they bought ATI. The rest of the time they either lost money or broke even.
But still I can’t stand how many whiners there are for the high end luxury market toys. I just have to let this out somewhere once… As long as the low GPUs that can play anything at 1080p are reasonably priced, the market is fine by me. We’ll see how that shakes out with crypto finally where it belongs.
Fine Whine, indeed ;) You should expand your view a little bit I think. In this sense, Nvidia is clearly pushing for short term gain with little care for the long term market conditions. Do they know something we don't about the intermediate future? One might start to think so. They're already pushing harder on markets outside Geforce.
'This is fine', you say. It's clearly not. It's like all things in life: a world of haves and have-nots is a world in perpetual conflict. That's what you see here; when the gap between these two groups becomes 'unfair' is when things spiral out of control.
Several large tech stores have even stopped shipping items abroad—Mindfactory, the largest one, for instance. You can order through a packet-forwarding company like Mailboxde (taxes are still calculated from the card holder's country), but some stores have forbidden even that (Notebooksbilliger, the only seller of Nvidia Founders Edition cards in the EU).
Meanwhile I laugh at your pricing Nvidia, the masses have spoken, stick it up your ass until it's price cut to all hell.
By mid 2023 we should have the 4070/4070Ti and the RX 7800XT. Presumably these will shake up the pricing a lot - in a way that these flagship launches have no real hope of doing.
Progress cannot be made much longer the way Nvidia are currently doing it:
- The 3090 started the trend of tripping OCP on high-end, 1000W PSUs.
- The partner-model 4090s don't physically fit in several enthusiast cases that were previously thought to have plenty of space for huge GPUs.
- The 4090 can't be powered safely, apparently as we go through the second flawed iteration of a dumb-idea connector.
- How big is the gulf going to be between a 450W 4090FE and a laptop 4080(M)? Laptop OEMs were really pushing their luck with 130W variants last generation, there's only so much physical space you can dedicate to cooling in something that has to be about an inch thick and water-cooling is, realistically, not an option.
The RX 6600 might change your opinion on that? It's built for 1080p in terms of ROPs, bandwidth, and cache.
It's priced about the same as the 1060 if you adjust for inflation and added US-China tariffs that didn't exist in 2016 when the 1060 was made.
Partner models use ~130W, which is marginally less than the typical 140-150W that most non-FE 1060 cards used. There's a fair bit of undervolting headroom, too - just like the 1060.
It's twice the performance of the 1060, five years later, or roughly equivalent to two generations of 40% uplift each.
It was the clear performance/Watt leader of its generation.
Unfortunately it was an underrated card because it was unexciting, and suffered from low-availability and scalping, but now that those issues are gone it's my go-to recommendation for someone looking to upgrade from a 1060 or lower. You can pick them up used for $175 and if you look for deals you can find them new for less than the 1060's original MSRP.
Only the x8xx/x9xx need the very edge, as they're after absolute performance, with value being secondary, if a consideration at all.
x6xx/x5xx and the equivalent AMD counterparts can do just fine with 7/6 nm, getting their improvement from the arch uplift, new features (DLSS 3), and a more refined node. The top tier can pay for the port to the newer node.
I don't really know if it's a practical solution, but as a mid-range shopper I wouldn't mind, as long as the price/perf is there.
I'm pretty sure GN's Sam Naffziger interview covers it, but don't have time to rewatch it and check.