
How much would you pay for RTX 4080?

How much would you pay for RTX 4080 at most?

  • Less than $400

    Votes: 1,533 12.9%
  • $400

    Votes: 522 4.4%
  • $500

    Votes: 1,082 9.1%
  • $600

    Votes: 1,715 14.5%
  • $700

    Votes: 2,615 22.1%
  • $800

    Votes: 2,569 21.7%
  • $900

    Votes: 869 7.3%
  • $1000

    Votes: 639 5.4%
  • $1100

    Votes: 48 0.4%
  • $1200

    Votes: 102 0.9%
  • $1300

    Votes: 23 0.2%
  • $1400

    Votes: 32 0.3%
  • More than $1400

    Votes: 110 0.9%

  • Total voters
    11,859
  • Poll closed.
Absolute performance advantage, DLSS 3, superior RT capabilities, much bigger market penetration and generally better acceptance worldwide (i.e. more demand), and CUDA cores for professional uses.
AMD's chiplet/MCM and GPU core structure give you, as an end user, nothing, just as the node it is manufactured on gives you nothing. It is just an approach to getting more performance. On the other hand, DLSS 3 gives you a lot, and so do the tensor cores, which give you strong RT performance. Whether you need it or not is irrelevant to the discussion; the point is that these are things the end user can actually see, and whose impact can be measured when they are on or off.
I don't disagree with that - I'm just trying to see the equation from the perspective of cost. Market penetration and acceptance are not reasons to increase prices at this rate, nor is RT capability or DLSS, both of which have been there since Turing.

You only increase product prices if either your costs increase, or you want to increase profit margins. Cost, AFAIK, comes mainly from R&D, manufacturing and logistics. As I said, Nvidia has done very little that is visible to me in their R&D of Ada, so I see no increase in cost there. Manufacturing, as you pointed out, is more expensive on a smaller node for a die of equal size, but AD103 is much smaller than GA102, so it shouldn't be much more expensive to manufacture. Logistics is the same as ever, and fuel prices are nearly back to pre-Covid levels, so I see no cost increase there, either. Besides, if there were any, it would affect other companies, too.

All in all, there is nothing about the 4080 itself that justifies its price.

It's smaller, but by less than a factor of two (314 mm² would be half), and as I said, those are 2019 prices (see bottom right). Add 20-30% (or even more) for today's 4 nm.
NV does use a smaller die area, and that's how they make more money, plus they charge extra on top of that. No denying that.
All I said is that when you factor in the silicon price increase, which is not NV's doing in any way, you see that the claimed 300% increase is actually less.
Also, that's why you can't expect the gradual linear increase we saw in the past. With the next gen at 2/3 nm, we will see yet another exponential increase, maybe of a bigger order than today (a power of 3 instead of 2, for example).
So bringing up the irrelevant past (that is, linear increases) as an argument about today is null.

To sum it up: the rate of increase is getting bigger, and that rate itself also increases (see "jerk", the rate of change of acceleration).
The past becomes less and less relevant to the present when you try to understand current prices.
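The compounding-cost argument above can be put in rough numbers. A quick Python sketch; the per-node multipliers are made-up assumptions chosen only to show the shape of the curve, not real foundry data:

```python
# Illustrative only: assumed wafer-cost multipliers per node transition.
# The claim: the multiplier itself grows each generation, so extrapolating
# linearly from older nodes underestimates current and future costs.
nodes       = ["28nm", "16nm", "7nm", "5nm", "3nm"]
multipliers = [1.0,    1.3,    1.6,   2.0,   2.5]  # assumed step-ups

cost_index = 100.0  # arbitrary baseline wafer-cost index at 28nm
for node, m in zip(nodes, multipliers):
    cost_index *= m
    print(f"{node:>5}: cost index {cost_index:7.1f} (step x{m})")
```

Under these assumed multipliers, the index more than quadruples over the last two steps alone - the "rate of increase itself increasing" point.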
Then how can AMD launch the 7900 XTX at the same MSRP as the 6900 XT?
 
I don't disagree with that - I'm just trying to see the equation from the perspective of cost. Market penetration and acceptance are not reasons to increase prices at this rate, nor is RT capability or DLSS, both of which have been there since Turing.

You only increase product prices if either your costs increase, or you want to increase profit margins. Cost, AFAIK, comes mainly from R&D, manufacturing and logistics. As I said, Nvidia has done very little that is visible to me in their R&D of Ada, so I see no increase in cost there. Manufacturing, as you pointed out, is more expensive on a smaller node for a die of equal size, but AD103 is much smaller than GA102, so it shouldn't be much more expensive to manufacture. Logistics is the same as ever, and fuel prices are nearly back to pre-Covid levels, so I see no cost increase there, either. Besides, if there were any, it would affect other companies, too.

All in all, there is nothing about the 4080 itself that justifies its price.


Then how can AMD launch the 7900 XTX at the same MSRP as the 6900 XT?
I don't see the problem: NV chooses to charge more than AMD because they think they can, and nothing so far shows that they are wrong in thinking that way. They make the most money they can, and then some, out of this situation. They exploit the market as much as they can.
You might think they are wrong to do so, but excuse me: so what?
You don't see the added value of stronger RT cores, or have no need for DLSS 3? Then go with AMD. Simple as that.
But please don't choose AMD because they made the effort to invent MCM and you want to support them for that, and/or because you are angry with NV over their prices and how they exploit the market.
That kind of behaviour is futile with giant companies like these. AMD and NV both squeeze every penny they can out of you, on the basis of that supportive behaviour, in the most cold, calculated way there is. AMD exploits the market as well, only in a different way, because they are in a different position: they exploit it by being the 'budget option' and the 'protest purchase against NV'.
AMD doesn't charge more because they don't think they can sell for more with NV as a competitor. That will change: the weaker NV gets, the more AMD will increase its prices. Expect nothing more from them.
 
Last edited:
I don't see the problem: NV chooses to charge more than AMD because they think they can, and nothing so far shows that they are wrong in thinking that way. They make the most money they can, and then some, out of this situation. They exploit the market as much as they can.
You might think they are wrong to do so, but excuse me: so what?
You don't see the added value of stronger RT cores, or have no need for DLSS 3? Then go with AMD. Simple as that.
But please don't choose AMD because they made the effort to invent MCM and you want to support them for that, and/or because you are angry with NV over their prices and how they exploit the market.
That kind of behaviour is futile with giant companies like these. AMD and NV both squeeze every penny they can out of you, on the basis of that supportive behaviour, in the most cold, calculated way there is. AMD exploits the market as well, only in a different way, because they are in a different position: they exploit it by being the 'budget option' and the 'protest purchase against NV'.
AMD doesn't charge more because they don't think they can sell for more with NV as a competitor. That will change: the weaker NV gets, the more AMD will increase its prices. Expect nothing more from them.
Again, you're not wrong. I only wanted to speculate what Nvidia's price increase is about from a company point of view, besides the sad and obvious "because we can".

You're right about AMD, too. I never said one should choose the 7900 XTX because of the MCM design. I only said that the new design could have given them a reason to increase prices, but they didn't.
 
Hi,
lol, does that include tax and other related regional charges? If so, that's whack.



I did, at $900 US. I think its performance is pretty good, and it's not a 12 GB card :laugh:
 
Every card is meant for 1080p, if 1080p is your target! :)

Playing Crysis 2 Remastered with very high raytracing on my 3090, 1080p 60 is buttery smooth but judging from the GPU utilization figures, it would just barely be enough for 120 Hz and definitely miss the mark for 144 Hz... and a 4080 would be just about powerful enough to meet that performance figure.

I don't really believe in "ray tracing". I am fine with normal lighting. And I think Nvidia ("Ngreedia") is a scam; they make fake marketing "technologies".

Meh.. :rolleyes:

Nobody Wants NVIDIA's $1199 US GeForce RTX 4080: Despite Lower Shipments, Retailers & Stores Are Stocked With Cards (wccftech.com)
 
All in all, there is nothing about the 4080 itself that justifies its price.
To you, that's true. To me as well. I wouldn't even consider the 4080 unless it were $400 below the FE MSRP.
NV will not see my money with this product, despite the fact that I must have CUDA for my uses and the linear benefit I get from having more of those cores.

Btw, nothing about that, to my perception, contradicts what I wrote above.
 
To you, that's true. To me as well. I wouldn't even consider the 4080 unless it were $400 below the FE MSRP.
NV will not see my money with this product, despite the fact that I must have CUDA for my uses and the linear benefit I get from having more of those cores.

Btw, nothing about that, to my perception, contradicts what I wrote above.
I agree. Even if you need CUDA, there are better price-conscious options available from the 30-series.

I forgot to comment on this:
You don't see the added value of stronger RT cores, or have no need for DLSS 3?
I don't see stronger RT cores compared to previous Nvidia products. Everything I see in the entire 40-series is the result of more cores, higher frequencies and a node shrink. I'm not convinced that anything has changed in Ada's architecture compared to Ampere. I also suspect DLSS 3 support will be released for Ampere and Turing later, once Nvidia is satisfied with 40-series sales, but that's just my own thinking.
 
Hi,
lol, does that include tax and other related regional charges? If so, that's whack.



I did, at $900 US. I think its performance is pretty good, and it's not a 12 GB card :laugh:

Jensen may be lurking on TPU.......



Be smart and bid low.....
 
but they didn't.
I can speculate why they chose not to: they aren't strong enough compared to NV to do so.
Think of a poker game with two players.
A has 1,000,000 chips; B has 10 times more.
Who will be the more dominant/aggressive player at the table?
Who will be able to 'bluff' and win more with nothing in his hand?
Who do you bet will win in the end?
Now, which player is NV and which is AMD?
What will happen if A manages to overcome B, so that the situation is reversed and A now has 10,000,000 chips? Do you expect the new chip leader (A) to act differently: to be less aggressive, not to bluff, and generally play more nicely?

This metaphor is an oversimplification, of course, of the duopoly GPU market we have right now, but I think it holds up quite well for its purpose (and it does not involve a car metaphor, so that's a bonus, while involving chips, which is a big bonus imo).
 
Last edited:
I don't really believe in "ray tracing". I am fine with normal lighting. And I think Nvidia ("Ngreedia") is a scam; they make fake marketing "technologies".

Meh.. :rolleyes:

Nobody Wants NVIDIA's $1199 US GeForce RTX 4080: Despite Lower Shipments, Retailers & Stores Are Stocked With Cards (wccftech.com)

Ray tracing is an amazing technology. Like DirectX 11, it will take some time for it to go completely mainstream. Ada is a great step in that direction, and AMD also believes in it - they seem to have worked quite hard on RDNA 3's RT capabilities.

There are many reasons to not want a 4080... but chiefly, price.
 
You don't see the added value of stronger RT cores, or have no need for DLSS 3? Then go with AMD. Simple as that.
But please don't choose AMD because they made the effort to invent MCM and you want to support them for that, and/or because you are angry with NV over their prices and how they exploit the market.

I don't think the RT cores on Ada are actually stronger, or else those improvements are not translating to games. In some games, the 4090 takes a larger hit to performance, relative to its base rasterization performance, than the 3090 does. Yes, you have higher RT FPS overall, but as a percentage you are losing about the same amount of your total performance as on Ampere. We need to see more games for comparison, but for the most part it flips between single-digit gains and single-digit losses.

At the current rate of improvement, it will take decades before enabling RT doesn't drop FPS, and that's only with the extremely light RT effects we use now. Nvidia certainly likes to tout RT performance, but Ada improved far more in raster for gaming than it did in RT.

I don't really see the problem with people choosing to buy an AMD card because they are mad at Nvidia. They are getting a good video card either way and it's not like Nvidia doesn't have the money to weather the storm or can't alter their prices to be more appealing. Just don't go out and buy a card if you don't really need one. Both AMD and Nvidia need to realize that the pandemic is over and people won't pay bonkers prices anymore.
 
Ray tracing is an amazing technology. Like DirectX 11, it will take some time for it to go completely mainstream. Ada is a great step in that direction, and AMD also believes in it - they seem to have worked quite hard on RDNA 3's RT capabilities.

There are many reasons to not want a 4080... but chiefly, price.

AMD doesn't believe in ray tracing. They said it will be fully supported and available only in the cloud.
So, not mainstream, and not for you.

[attached slide image]
 
AMD doesn't believe in ray tracing. They said it will be fully supported and available only in the cloud.
So, not mainstream, and not for you.

[attached slide image]

RDNA2 has Ray Accelerators built into the compute units...

Think you are misunderstanding that slide. AMD is simply stating that it'll have cards that accelerate RT workloads, and then those cards' time will be sold for full-scene RT through cloud services. That's why the slide says "Scaling from Local Devices to the Cloud", which indicates AMD wants RT acceleration at all levels.
 
Last edited:
I don't really see the problem with people choosing to buy an AMD card because they are mad at Nvidia. They are getting a good video card either way and it's not like Nvidia doesn't have the money to weather the storm or can't alter their prices to be more appealing. Just don't go out and buy a card if you don't really need one. Both AMD and Nvidia need to realize that the pandemic is over and people won't pay bonkers prices anymore.
If you are in the market for a $300-400 GPU, and being mad at NV over the 4080's price makes you choose AMD only, despite the possibility that NV has the better all-around product in your price range, so that you are "forcing" yourself to go AMD - then you are a sucker imo.
That is what I call a "protest purchase": a null effort to retaliate against NV, and besides costing you more and giving you less, you achieve nothing, despite what you might feel.
Any effort to support any of these companies (NV, AMD, Intel) by buying from the competitor, even though the competitor has a better product for you, will only leave you with a lesser product and nothing more. You help no one and only perpetuate the existing situation, because the company you 'supported', if anything, will be able to keep producing lesser products for you and still sell them to you. You will buy from them anyway, won't you?

"Protest purchases" or support buying are very welcome, and can actually make a difference, when favouring small-to-medium companies over giant ones. Not with a global, multi-billion-dollar company listed on a stock exchange in a capitalist-oriented market.

Support your small local chocolate manufacturer over Ferrero or Mars Wrigley. Don't try to support one of two duopoly global companies if their product isn't the best for you (that is, choose by being totally brand-agnostic).
 
AMD believes in RT, but they think that by powering the two most important consoles, they can slow down Nvidia's advantage in RT.
Actually, that's happening. The bigger AAA titles, although they support RT, tend to limit it to RT reflections and shadows because the consoles are too weak.

I still remember the YouTubers saying that the next gen of consoles (Xbox Series X and PS5) would be better than a 3700X + 2080 Ti system.....
 
Last edited:
When you look at the rate at which silicon wafer costs have increased, you will understand better why this price increase is not only 'NV greed' based.

[chart: silicon cost per good die by process node]

Also, check their profits in the same years and compare.
 
Last edited:
This is misleading because it doesn't show the rough number of good dies per, say, 10,000 wafers: yes, 45 nm was a lot cheaper and costs got higher from there on, but there were also A LOT fewer dies in that number of wafers, so it's not as easy to compare as this chart makes it seem.

Also, check their profits in the same years and compare.
I don't follow your logic: the graph shows the cost per good, functional 250 mm² die, all things considered.
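The "cost per good die" idea the chart is based on can be sketched with the textbook Poisson yield model. All numbers below (wafer cost, defect density) are illustrative assumptions, not the chart's actual data:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Gross dies per wafer, with a standard edge-loss correction term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Fraction of dies expected to have zero killer defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

def cost_per_good_die(wafer_cost, die_area_mm2, defects_per_cm2=0.1):
    good = dies_per_wafer(die_area_mm2) * poisson_yield(die_area_mm2, defects_per_cm2)
    return wafer_cost / good

# Example: a 250 mm^2 die on an (assumed) $10,000 wafer
print(f"${cost_per_good_die(10_000, 250.0):.0f} per good die")
```

Note how the model captures the double penalty of bigger dies: fewer candidates per wafer, and a lower fraction of them defect-free, so cost per good die grows faster than die area.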
 
  • Like
Reactions: HTC
I don't follow your logic: the graph shows the cost per good, functional 250 mm² die, all things considered.

So this refers to 250 mm² at every process node? Then I misunderstood it, and I'll remove that bit from the post.

Still, check nVidia's profits in those years and compare that chart to the one you posted.
 
Still, check nVidia's profits in those years and compare that chart to the one you posted.
What about them? I don't see your point.
 
What about them? I don't see your point.

nVidia's profits rose "more sharply" than the costs you referred to did.

While it's certainly true that manufacturing costs increased, the prices nVidia sells their cards at have increased A LOT MORE, which is how nVidia manages to make such "obscene" profits.
 
So many people are so complacent with Apple-like prices on GPUs. That's why companies do it. There is always someone "cool" enough to pay, because he can, to show off.

:banghead:
 
If you are in the market for a $300-400 GPU, and being mad at NV over the 4080's price makes you choose AMD only, despite the possibility that NV has the better all-around product in your price range, so that you are "forcing" yourself to go AMD - then you are a sucker imo.
That is what I call a "protest purchase": a null effort to retaliate against NV, and besides costing you more and giving you less, you achieve nothing, despite what you might feel.
Any effort to support any of these companies (NV, AMD, Intel) by buying from the competitor, even though the competitor has a better product for you, will only leave you with a lesser product and nothing more. You help no one and only perpetuate the existing situation, because the company you 'supported', if anything, will be able to keep producing lesser products for you and still sell them to you. You will buy from them anyway, won't you?

"Protest purchases" or support buying are very welcome, and can actually make a difference, when favouring small-to-medium companies over giant ones. Not with a global, multi-billion-dollar company listed on a stock exchange in a capitalist-oriented market.

Support your small local chocolate manufacturer over Ferrero or Mars Wrigley. Don't try to support one of two duopoly global companies if their product isn't the best for you (that is, choose by being totally brand-agnostic).
I also disagree with protest buying, though I have to add that AMD has pretty competitive products at reasonable prices across their entire stack, so I don't believe you end up with a lesser product right now if you choose AMD for whatever reason. With AMD paving the way for the 7900 series, there are some pretty sweet deals to be had on an RDNA 2 GPU. I've just recently bought a 6750 XT for £470. The competing 3070 starts at least £100 higher. Some models even touch the £700 mark, which is insane. The 6800 and 6900 XT are heavily discounted as well, not to mention the 6600 series, which has been a best buy ever since its release. The only Nvidia card I consider worth buying at current retail prices is the 3080 for £700-750. Unfortunately, it's out of my price range (although I'm happy with the 6750 XT, so I don't really need it).
 
AMD has pretty competitive products at reasonable prices across their entire stack

What? :confused:

AMD is cheaper in all possible tiers. There is no such thing as "protest buying" (there is "protest NOT buying"); there is the sane, reasonable purchase of better Radeons, which offer more quality in just about everything - better software and drivers, better image quality, better connection technologies, lower power consumption, safe PCIe connectors, stable relationships with partners like Sapphire and XFX (which is actually a former nvidia partner), and more.
 
I see people talking about cost per mm² for production costs. Yep, sure, that's increased. But did you even look at the mm² used by each GPU?

RTX 3080: 628.4 mm²
RTX 4080: 379 mm²

Even if the per-mm² costs have increased, the die size has drastically decreased, by 40%. It definitely does NOT justify the massive price increase of the cards. Nvidia is being greedy; it's a corporation after all, and we expect that. The problem is that AMD isn't being competitive, and neither is Intel (in this high-end space). Nvidia has the market by the balls: you don't buy AMD because they aren't very future-proof, and you don't buy Nvidia (but you will) because they're too expensive.
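To put rough numbers on the die-size point: a quick dies-per-wafer estimate for the two dies above. Only the die areas come from the post; the wafer prices are purely assumed for illustration (actual Samsung 8N and TSMC 4N pricing is not public):

```python
import math

def gross_dies(die_mm2, wafer_d_mm=300.0):
    # Gross dies per wafer, with a standard edge-loss correction
    r = wafer_d_mm / 2
    return int(math.pi * r * r / die_mm2 - math.pi * wafer_d_mm / math.sqrt(2 * die_mm2))

ga102, ad103 = 628.4, 379.0  # die areas (mm^2) quoted above

# Assumed wafer prices, illustrative only:
for name, area, wafer_cost in [("GA102 (3080)", ga102, 6_000),
                               ("AD103 (4080)", ad103, 14_000)]:
    dies = gross_dies(area)
    print(f"{name}: {dies} gross dies/wafer, ~${wafer_cost / dies:.0f} per die")
```

Even if the newer wafer costs more than twice as much (again, an assumption), the ~40% smaller die claws back most of that per die - which supports the argument that wafer cost alone cannot explain the price jump.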

To those people who say they don't believe in ray tracing: go live on an Intel integrated GPU and tell me you're still fine with it for gaming. Graphics moves forward. Ray tracing solves problems that typical shader-based raster programs find difficult to scale, and that we've been finding difficult to remedy for a decade now without dedicated hardware. Traditional triangle-based rasterization is at its limit of efficiency, and you might not think it, but taking the ray-tracing route is about making certain effects MORE efficient, because otherwise you have to brute-force them with traditional shader programs, which end up slower (grab a GTX 1080 and use it to run Quake 2 RTX: there, the entire ray-tracing stack runs in shaders).
 