
$700-800 Ideal Price for GeForce RTX 4080: TechPowerUp Poll Surveying 11,000 Respondents

I would give them... tree-fiddy! Nothing more, nothing less! :D
Well, if anything, three-fiddy is the amount you should never give!


My personal problem is that I cannot get rid of Nvidia GPUs.
Apart from the fact that I do like the RT tech and would never buy an alternative with lower RT performance, the CUDA acceleration for CAD/civil engineering apps cannot be ignored.

If only AMD had something similar that would be useful to most of us.
Because right now we have no alternative.
AMD has neglected this space for far too long, and that hurts me deeply. The crazy things I had to do in OpenCL just to stay away from proprietary solutions... Man!
 
Clearly a PERFECT opportunity for AMD to pull the rug from under Nvidia's feet and crush them, stealing tons of market share with their $650 RX 7900 XTX and $575 Radeon RX 7900 XT.
 
4N must be really expensive
Actually, that's part of the problem Nvidia got themselves into: paying way over the odds for a bleeding-edge node, unlike AMD sticking to the proven and less expensive N5. Nvidia is screwed on pricing; they simply cannot compete. RDNA3 cost AMD a lot less to manufacture, and even at $899 for the 7900 XT they are making a nice healthy margin. I would not pay more than $799 for a 4080, but even at that price I would most likely still get a 7900 XT(X). Nvidia doesn't deserve a red cent of mine.
 
Being OK with 70% price increase will inevitably lead to this:

2020, RTX 3080 - $700
2022, RTX 4080 - $1200 <- WE ARE HERE
2024, RTX 5080 - $2040
2026, RTX 6080 - $3468
2028, RTX 7080 - $5896
2030, RTX 8080 - $10023
2032, RTX 9080 - $17038
2034, GTX 1080 - $28965
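For what it's worth, the joke table above is just straight 70% compounding from the 4080's $1200 MSRP every two years - a quick sketch, obviously not a real forecast:

```python
# Compound the 4080's $1200 MSRP by 70% per generation (every two years).
base = 1200
for gen, year in enumerate(range(2024, 2035, 2), start=1):
    print(f"{year}: ${round(base * 1.7 ** gen)}")
# 2024: $2040 ... 2034: $28965
```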

And remember, before commenting about higher wafer and new process costs, that this time around an X080 card doesn't even come with a cut-down top-of-the-line processor (the RTX 4090 has AD102, 608 mm²), as was normal, but is built around the much smaller AD103, 378.6 mm². Compare this to the Ampere architecture cards:

GeForce RTX 3080 (GA102, 628 mm²)
GeForce RTX 3080 12GB (GA102)
GeForce RTX 3080 Ti (GA102)
GeForce RTX 3090 (GA102)
GeForce RTX 3090 Ti (GA102)

So, not only are we seeing unprecedented price inflation in a single generation (+70%), Nvidia is also selling us a lower-tier die, which until now was reserved for X070 cards.

I can only imagine this was all planned during crypto high, and the coke still hasn't run out at Nvidia headquarters.
 
I think 700 euros or dollars, including taxes and VAT, is a fair price for a custom 4080 card. nGreedia's own card shouldn't be more than $649 MSRP...
 
The 8800 Ultra was less than $850 at launch. The 8800 Ultra was the top end card. The 4090 is $1600 MSRP, and there's still room for a Ti variant.
 
Should I have made it clear that I was referring to the best single gaming card of the gen?
No, I shouldn't...
You've been making perfect sense. The model number in the tier/stack of products tells us a lot about its positioning, intent, target audience, etc. It's what the whole marketing game is about; denial has no place here, even if there are some outliers to the established norm.

In the same vein, die size is an absolute factor too, but the performance relative to the generation before or after it is up for interpretation, and thát is what makes or breaks a price. That's why a 314 mm² 1080 sold well while current x80s are a harder sell with a múch larger die.

That's the space Nvidia is exploring here, really: trying to stretch things further to justify higher pricing. Except this is like shock therapy; this isn't just stagnation or a baby step forward, the 4080 is a massive step backwards. Something is gonna give though, and I think this time it's Nvidia, unless they are content with losing market share.

Clearly a PERFECT opportunity for AMD to pull the rug from under Nvidia's feet and crush them, stealing tons of market share with their $650 RX 7900 XTX and $575 Radeon RX 7900 XT.
I see what you did there :D
 
Absolutely the die was smaller! But that's entirely the beauty of Pascal. It did so much more with so little. That's a shrink The Way It's Meant to be Played. Part of that is also that Nvidia had been stuck on 28nm for só long.

Today, a shrink enables an immediate maxing out of the silicon and then it is still not enough, so we need retarded power targets.
I agree that Pascal was a thing of beauty, but honestly, do we REALLY care how much Nvidia is charging us per mm² of die? I don't, frankly. I check the performance and the price; whether they use a huge or small die is kind of irrelevant to me.

I also don't understand the outrage over the prices. You think it's expensive? Great, don't buy it, the prices will drop, profit.
 
Actually, that's part of the problem Nvidia got themselves into: paying way over the odds for a bleeding-edge node, unlike AMD sticking to the proven and less expensive N5. Nvidia is screwed on pricing; they simply cannot compete. RDNA3 cost AMD a lot less to manufacture, and even at $899 for the 7900 XT they are making a nice healthy margin. I would not pay more than $799 for a 4080, but even at that price I would most likely still get a 7900 XT(X). Nvidia doesn't deserve a red cent of mine.
You see it too?

Team green needs to secure the top spot in the epeen ladder for 2024 by...
- maxing out the silicon directly on the top end node of the moment
- increasing power targets gen over gen since Turing and, despite a shrink, doing it yet again
- enabling 450W~600W power target on the top end to even extract meaningful OC results... if we call 6% meaningful, which is a stretch.
- pushing even an x80 to a price point beyond what used to be absolute top end

One does start to wonder how they'll proceed from here, but a reduction in ridiculousness would surprise me. What roads do they have left? Surely they're not going to follow suit on AMD's technology leadership now, are they? :laugh:

The longer you look at recent developments, the more clearly they identify winning and losing technology. Clearly the chiplet approach is the way, and Nvidia and Intel might have stuck too long to tried-and-tested stuff. Both companies keep launching product stacks that only show us 'further escalation' of proven approaches, except they're constantly running in the red zone, stacking band-aid upon band-aid to keep it all afloat. DLSS 3 is for Nvidia what PL1=PL2 is for Intel. The similarities are striking, and the trade-offs for that last snippet of performance (or for not destroying latency...) are similar too.

Those ships have most definitely hit the iceberg, but apparently the captain still feels fine and the band keeps playing.

I agree that pascal was a thing of beauty, but honestly, do we REALLY care about how much nvidia is charging us per mm of die? I don't frankly. I check the performance and the price, whether they use a huge or small die is kinda irrelevant to me.

I also don't understand the outrage with the prices. You think its expensive? Great, don't buy it, the prices will drop, profit.
Absolutely agreed; the one caveat is that outrage over prices is part of the consumer 'battle'. These things matter, and we should applaud that, not criticise it altogether. What also matters is that people put their money where their mouth is. (Or rather, don't put money :)) Don't ever underestimate the power of peer pressure here. It matters, and we need it.

To me, die size / price is an indicator of how much wiggle room there is left for a company to flesh out a product stack further. And it can then also tell us a lot about pricing, what's 'fair' and what's not, etc.
 
I also don't understand the outrage over the prices. You think it's expensive? Great, don't buy it, the prices will drop, profit.

Outrage? It's a discussion on a forum, not a violent protest in the street. But people are clearly "not buying it". Proof? Just look at the availability of an X080 card released two weeks ago. When has this happened before?

RTX 4080 at Geizhals.eu

 
You guys are in the EU; just buy it in any other country. Some even do free shipping (not sure about those eastern parts, but it should be the same). Not that you can find it much cheaper, from what I see.
This is pointless for a number of reasons, but the obvious ones are import and VAT charges and no international warranty. Just those two issues render your suggestion useless.
 
Outrage? It's a discussion on a forum, not a violent protest in the street. But people are clearly "not buying it". Proof? Just look at the availability of an X080 card released two weeks ago. When has this happened before?

RTX 4080 at Geizhals.eu

I'm not specifically talking about this forum. Multiple posters on multiple forums are going bonkers. People are not buying the 4080? Great, prices will drop.
 
Nvidia got rid of SLI on mid-range cards for a reason..
Yeah, that was because it wasn't a viable solution. Off the top of my head: micro-stutter, power usage that didn't match the performance, and what I suspect was the main reason - Nvidia dumping the responsibility for making games compatible onto game devs. Said devs clearly saw through that BS, which resulted in fewer and fewer games supporting the standard. Nvidia being a for-profit corp, it then became about profit and where best to extract it from, which is why you only see NVLink in the professional space.
 
I'm not specifically talking about this forum. Multiple posters on multiple forums are going bonkers. People are not buying the 4080? Great, prices will drop.

And people are going bonkers about people complaining about the price. Which is, in my opinion, even sillier.

Will the prices drop? When Nvidia released the RTX 2080 with a price increase and almost no performance increase over the GTX 1080 Ti, just the promise of RTX and DLSS games in the future, people weren't very enthusiastic either. But Nvidia persevered and waited through quite a few quarters in the red in the gaming sector. And they tried to improve on this at the Ampere launch, which was so inviting on paper. If only the 2020 crypto bull run hadn't happened.

So I wouldn't count on a quick reaction by Nvidia. Right now their coffers are full of dirty crypto money.
 
Absolutely the die was smaller! But that's entirely the beauty of Pascal. It did so much more with so little. That's a shrink The Way It's Meant to be Played. Part of that is also that Nvidia had been stuck on 28nm for só long.

Today, a shrink enables an immediate maxing out of the silicon and then it is still not enough, so we need retarded power targets.
The beauty of Pascal was due to 16nm being such a massive leap on its own, along with some extra transistor work to make it clock faster. The 16nm process was probably the biggest leap from pure process work since the turn of the century.
The GTX 1080 started at $599; how much of that is the cost of the chip itself? 20%? 30%?
The RTX 4080 is a 20% larger chip at 100% higher cost per chip.

So its contribution to the bill of materials in that $1,200+ asking price is a mere 20-30%, too.
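The arithmetic behind that claim, with the chip-cost share treated as an assumed 20-30% of MSRP (the exact figures are illustrative guesses, not published BOM data):

```python
# Illustrative only: assume the GPU die was ~25% of the GTX 1080's $599 MSRP.
gtx1080_chip = 599 * 0.25        # ~$150 assumed chip cost
rtx4080_chip = gtx1080_chip * 2  # "100% higher cost per chip"
share = rtx4080_chip / 1200      # share of the 4080's asking price
print(f"~${rtx4080_chip:.0f} chip -> {share:.0%} of a $1200 card")  # ~25%
```

So even doubling the chip cost leaves its share of the sticker price in roughly the same 20-30% band.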
Yea, probably around the same. I will say that R&D and initial process investment have skyrocketed too. Not enough to cover the cost difference, but it is something.

What makes me mad, though, is when people compare AMD's 500-series prices to today's, when back then AMD was not making anything on their GPUs. They were literally giving GPUs away just to keep a presence in the market. Crypto might be the only reason AMD could be net positive on their discrete GPUs since they bought ATI. The rest of the time they either lost money or broke even.

But still, I can't stand how many whiners there are over high-end luxury-market toys. I just have to let this out somewhere once... As long as the low-end GPUs that can play anything at 1080p are reasonably priced, the market is fine by me. We'll see how that shakes out with crypto finally where it belongs.
 
This is pointless for a number of reasons, but the obvious ones are import and VAT charges and no international warranty. Just those two issues render your suggestion useless.

There are no import charges inside the EU, VAT changes only by the difference between rates, and the warranty is exactly the same: two years, claimable anywhere inside the EU.
 
But still, I can't stand how many whiners there are over high-end luxury-market toys. I just have to let this out somewhere once... As long as the low-end GPUs that can play anything at 1080p are reasonably priced, the market is fine by me. We'll see how that shakes out with crypto finally where it belongs.
The last good 1080p GPU with strong perf/$ was the 1060. We have only regressed since. Price increases trickle down through the stack; that's why the whine happens, because it does damage the market. Another aspect is that if people can't 'move up' the stack because it's priced beyond the comfort of the larger part of the target audience, you get stagnation in gaming as well. Devs cater to majority groups, not minorities. It will inevitably hurt the adoption rates of new technology in gaming, such as RT.

Fine Whine, indeed ;) You should expand your view a little, I think. In this sense, Nvidia is clearly pushing for short-term gain with little care for long-term market conditions. Do they know something we don't about the intermediate future? One might start to think so. They're already pushing harder on markets outside GeForce.

'This is fine,' you say. It's clearly not. It's like all things in life: a world of haves and have-nots is a world in perpetual conflict. That's what you see here; when the gap between these two groups becomes 'unfair' is when things spiral out of control.
 
There are no import charges inside the EU, VAT changes only by the difference between rates, and the warranty is exactly the same: two years, claimable anywhere inside the EU.

But the "common EU market" is less and less common. Large differences in VAT have forced legislation so that stores selling to other countries MUST charge the VAT of the recipient country; so, for instance, someone from Poland or Hungary (27% VAT!) can't order a product from Germany and expect German VAT (19%).

Several large tech stores have even stopped shipping items abroad (Mindfactory, the largest one, for instance). You can order through a packet-forwarding company like Mailboxde (taxes are still calculated from the cardholder's country), but some stores have forbidden even that (Notebooksbilliger, the only seller of Nvidia Founders Edition cards in the EU).
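A sketch of the destination-VAT rule described above: the same net price gets the buyer's country rate, not the store's (the €1,000 net price is made up; the rates are the standard ones for those countries):

```python
# Same net price, but VAT is charged at the buyer's country rate.
net = 1000.0  # net (pre-VAT) price in EUR, illustrative
for country, vat in [("Germany", 0.19), ("Poland", 0.23), ("Hungary", 0.27)]:
    print(f"{country}: {net * (1 + vat):.2f} EUR")
# Germany: 1190.00 EUR, Poland: 1230.00 EUR, Hungary: 1270.00 EUR
```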
 
That's on Europe. Nobody forced them to have insane VAT rates such as Poland's 23%.
Yes, I signed up to pay extra VAT when I was born where I was; it's automagic. We have as much say in it as we do in the weather.

Meanwhile, I laugh at your pricing, Nvidia. The masses have spoken; stick it up your ass until it's price-cut to all hell.
 
I'm strongly considering either a 7900 XT or 7900 XTX once the reviews come out. I need something better than my RX 6600 now that I have a 4K monitor. I agree that, purely based on the specs and numbers we have so far, the XT at $899 is a bad deal compared to the XTX at $999. However, given that I'm pairing it with an 11th-gen i5, I wonder if I'm better off with the XT, as I'd probably be wasting an XTX?
The problem with the 6600 at 4K is memory bandwidth and an Infinity Cache that's too small for higher resolutions. The 6600 tanks hard even when running FSR Performance (1080p native render) because it simply lacks the bandwidth to handle the upscaled frame data. You might find that for the games you're playing, a 6800 XT on clearance will be more than enough. You can pick one up right now for about $520 new, or they regularly sell used for $450 on eBay if you browse by "sold items". Get $175 back for your RX 6600 and that solves your 4K problem for under $300, while avoiding the early-adopter tax and flagship pricing.

By mid-2023 we should have the 4070/4070 Ti and the RX 7800 XT. Presumably these will shake up pricing a lot, in a way these flagship launches have no real hope of doing.
 
The problem with the 6600 at 4K is memory bandwidth and an Infinity Cache that's too small for higher resolutions. The 6600 tanks hard even when running FSR Performance (1080p native render) because it simply lacks the bandwidth to handle the upscaled frame data. You might find that for the games you're playing, a 6800 XT on clearance will be more than enough. You can pick one up right now for about $520 new, or they regularly sell used for $450 on eBay if you browse by "sold items". Get $175 back for your RX 6600 and that solves your 4K problem for under $300, while avoiding the early-adopter tax and flagship pricing.

By mid-2023 we should have the 4070/4070 Ti and the RX 7800 XT. Presumably these will shake up pricing a lot, in a way these flagship launches have no real hope of doing.
Hmm, that is probably the path I should take, but I'd just be irritated if I spent $500 on something that didn't perform as well as I hoped. Regardless, I'd keep my RX 6600 for my secondary PC.
 
You see it too?

Team green needs to secure the top spot in the epeen ladder for 2024 by...
- maxing out the silicon directly on the top end node of the moment
- increasing power targets gen over gen since Turing and, despite a shrink, doing it yet again
- enabling 450W~600W power target on the top end to even extract meaningful OC results... if we call 6% meaningful, which is a stretch.
- pushing even an x80 to a price point beyond what used to be absolute top end

One does start to wonder how they'll proceed from here.
I was having a similar discussion on the CPU front the other day too, specifically that desktop CPUs are not the only CPUs being made. There's no room for a 350W CPU in a laptop, and laptops have been steadily overtaking desktops for half a decade at this point. Even on desktop, very few people actually want to spend on crazy CPUs with all the flagship platform costs of a high-end cooler, board, and DDR5.

Progress cannot be made much longer the way Nvidia are currently doing it:
  • The 3090 started the trend of tripping OCP on high-end, 1000W PSUs.
  • The partner-model 4090s don't physically fit in several enthusiast cases that were previously thought to have plenty of space for huge GPUs.
  • The 4090 apparently can't be powered safely, as we go through the second flawed iteration of a dumb-idea connector.
  • How big is the gulf going to be between a 450W 4090FE and a laptop 4080(M)? Laptop OEMs were really pushing their luck with 130W variants last generation, there's only so much physical space you can dedicate to cooling in something that has to be about an inch thick and water-cooling is, realistically, not an option.

The last good 1080p GPU with strong perf/$ was the 1060. We have only regressed since. Price increases trickle down through the stack; that's why the whine happens, because it does damage the market. Another aspect is that if people can't 'move up' the stack because it's priced beyond the comfort of the larger part of the target audience, you get stagnation in gaming as well. Devs cater to majority groups, not minorities. It will inevitably hurt the adoption rates of new technology in gaming, such as RT.
The RX 6600 might change your opinion on that?

- It's built for 1080p in terms of ROPs, bandwidth, and cache.
- It's priced about the same as the 1060 if you adjust for inflation and the added US-China tariffs that didn't exist in 2016 when the 1060 was made.
- Partner models use ~130W, marginally less than the typical 140-150W that most non-FE 1060 cards used. There's a fair bit of undervolting headroom, too - just like the 1060.
- It's twice the performance of the 1060, five years later, or roughly equivalent to two generations of 40% uplift each.
- It was the clear performance-per-watt leader of its generation.

Unfortunately, it was underrated because it was unexciting, and it suffered from low availability and scalping, but now that those issues are gone it's my go-to recommendation for anyone upgrading from a 1060 or lower. You can pick them up used for $175, and if you look for deals you can find them new for less than the 1060's original MSRP.
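Two of the quantitative claims above are easy to sanity-check. Note that the $249 launch MSRP for the 1060 6GB and the ~25% cumulative 2016-2022 US inflation figure are my own assumptions for illustration, not numbers from the post:

```python
# Two 40% generational uplifts compound to roughly 2x performance.
uplift = 1.40 ** 2
print(f"1.4 x 1.4 = {uplift:.2f}x")  # ~1.96x, i.e. "twice the 1060"

# Inflation-adjust an assumed $249 1060 MSRP by an assumed ~25% cumulative CPI.
adjusted = 249 * 1.25
print(f"1060 MSRP in 2022 dollars: ~${adjusted:.0f}")  # before added tariffs
```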
 