Tuesday, November 29th 2022
$700-800 Ideal Price for GeForce RTX 4080: TechPowerUp Poll Surveying 11,000 Respondents
The ideal price for the NVIDIA GeForce RTX 4080 "Ada" graphics card is around USD $700 to $800, according to the results of a recent TechPowerUp front-page poll surveying our readers. Our poll "How much would you pay for RTX 4080 at most?" received over 11,000 responses. In the number-one spot with 22% of the vote is $800, closely followed by $700; together, this range accounts for 44% of voters. 14% of our readers think $600 is an ideal price, followed by "less than $400" at 13%. 9% think $500 seems fair, followed by 7% willing to spend as much as $900, and 5% happy to spend $1,100. 2% or fewer feel that the current $1,200 MSRP is justified or are willing to spend more than MSRP. But there's more to this than a majority finding sanity in the $700 to $800 price range.
With NVIDIA cancelling the RTX 4080 12 GB, the RTX 4080 16 GB became the only SKU to bear the name "RTX 4080." This $1,200 MSRP GeForce RTX 4080 is the successor to the RTX 3080, which debuted at $700, marking a $500 MSRP increase generation-over-generation (or +71%). You begin to see why most readers consider the $700-800 range the ideal MSRP, and are willing to tolerate at most a $100 increase. For even more context, the RTX 3080 "Ampere" launched at the same $700 MSRP as its predecessor, the RTX 2080 "Turing." The GTX 1080 "Pascal" came out at $600 ($700 for the Founders Edition), which explains the interest in $600 in our poll. And then there's a sizable chunk of our readers who simply seem disillusioned with GPU pricing, and feel that $400 to $500, or something lower, is the maximum they would be willing to pay for the RTX 4080. Can NVIDIA even break even at such prices? NVIDIA's own quarterly financial results reference margins as high as 60% (not specific to any product, but as a general rule, margins tend to scale with MSRP, with higher-priced products generally carrying fatter margins). At 50% to 60% margins on the $1,200 MSRP, the implied cost would be in the neighborhood of $500 to $600. We've seen examples in the past of NVIDIA cutting its prices sharply in response to competitive AMD products, with both brands fiercely locked in price wars and their products selling at less than half their MSRPs. So a $500 to $600 price for the RTX 4080 still seems possible on paper, and cannot be easily dismissed as "impossible."
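The back-of-the-envelope arithmetic above can be made explicit. This is only a sketch under the assumption that the quoted figure is gross margin, i.e. margin = (price − cost) / price; the actual per-product cost structure is not public.

```python
# Rough break-even sketch for the margin figures quoted above.
# Assumption (not from NVIDIA): "margin" means gross margin,
# i.e. margin = (price - cost) / price.

def implied_cost(price: float, margin: float) -> float:
    """Return the unit cost implied by a selling price and gross margin."""
    return price * (1.0 - margin)

msrp = 1200.0
for margin in (0.50, 0.60):
    cost = implied_cost(msrp, margin)
    print(f"{margin:.0%} margin on ${msrp:.0f} implies a cost near ${cost:.0f}")
```

At 50% margin the implied cost is $600, and at 60% it is $480, which is why a $500 to $600 selling price sits roughly at break-even under this assumption.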
On the other hand, prices have been going up everywhere: we've got inflation, higher prices for gas and power, and no doubt TSMC is charging more for a 4 nm wafer than Samsung has been charging for its 8 nm technology. NVIDIA was also Samsung's biggest customer—today there's plenty of competition for allocation on TSMC's latest and greatest nodes. Apple, Qualcomm, AMD, everybody wants their chips made on the best process in the world, so prices will end up higher for that reason, too. A tiny fraction of our readers think that the $1,200 MSRP is fair, or are willing to pay more than $1,400. This probably aligns with the demographic that is actually buying the RTX 4080 at its current price—or is willing to spend top dollar for any other high-end graphics card. The poll results indicate that NVIDIA would be able to push more volume by lowering the price, but given the current inventory levels of GeForce 30-series cards, it may rather be content selling the RTX 4080 at ≥$1,200 at high margins to a tiny fraction of buyers.
140 Comments on $700-800 Ideal Price for GeForce RTX 4080: TechPowerUp Poll Surveying 11,000 Respondents
So all the answers below $500 should be removed entirely.
An x80 card should perform at ~75-90% of the best card of the lineup, as it always has, and be priced according to the x80 tier, taking into account a few factors like inflation, manufacturing cost on the specific node, etc.
The 4080 16GB is a true x80 card but it's ridiculously priced. NVIDIA can price a 4080 Ti and a 4090 at $2-3K. That's fine. They are halo products that target specific people. But the x80 is just a high-end card and should never exceed $499/599/699/799, taking into account all the factors.
The 780 cost $649 nearly 10 years ago. You must be crazy if you expect that the latest x80 card should cost the same 10 years later.
And I remind you: GTX 280, $649 in 2008.
It's nonsense. GTX 280 was the top single GPU card. Today RTX 4090 is the top single GPU card.
Wow 1gb memory :eek: just shows back then "in memory lane" nv prices were way worse :laugh:
NVIDIA GeForce GTX 295 Specs | TechPowerUp GPU Database
No I shouldn't...
Thinking back 12+ years ago, I bought a 1 GB little bitty single-fan GT 640 for like $99 US for a socket 775 Q9550 system, just for Photoshop work lol
8800 GTS - $400-450
GTX 285 - $380
AMD HD 6950 - $299
AMD HD 5850 - $259
Apart from the fact that I do like the RT tech and would never buy an alternative with lower RT performance, the CUDA acceleration for the CAD/Civil Engineering apps cannot be ignored.
If only AMD had something similar that would be useful to most of us.
Because right now we have no alternative.
There is no justification for that price. Not even 899$.
Obviously the ~90% qualifier doesn't really apply if we're talking about dual-GPU cards, but those are not single GPUs. The xx90/Titan-tier cards replaced the dual-GPU monsters of the past.
Note: In 2016 I purchased an EVGA 1070, and that was an excellent card that I still have in my backup computer. So please don't tell me I'm an AMD fanboi. I'm the guy who looks for the best bang for the buck when it comes to tech and nothing more.
AMD neglected this space for far too long now, and that hurts me deeply. The crazy things I had to do in OpenCL just to stay away from proprietary solutions... Man!