
NVIDIA GeForce RTX Series Prices Up To 71% Higher Than Previous Gen

Low quality post by bug
Attacking someone won't justify your lame beliefs. Looking at your posts and the number of them only shows that you are probably one sad (most probably fat) man/lady who spends most of his time arguing with people on the forum ;) If you think I am dumb, I respect your opinion... now you know what I think about you... remember that you started this ;)
Do you think signing up on a tech forum to whine about tech is a smart thing to do?
 
Low quality post by Liviu Cojocaru
Do you think signing up on a tech forum to whine about tech is a smart thing to do?
Looking at your situation...that is exactly what you do ;)
 
A few people arguing against a mass. Conclusion = so many shills. Seems about right. /sarcasm

$1,200 is entry-level Quadro pricing; a high-end Quadro goes for $10,000. Compared to that, the 2080 Ti looks OK.

The bigger the die, the fewer dies can be manufactured on a single wafer, and a single defect can ruin a big chunk of it. This, for example, is why Intel's HEDT chips are so expensive compared to AMD's small ones. GPUs are also among the most complex things to produce on any given node, much less on cutting-edge 7 nm, which is fine for small low-power mobile SoCs right now but not yet viable for mass-producing an 18+ billion transistor behemoth. Next year, probably.

TU102 appears to be about 32 mm x 25 mm, so we can calculate how many dies an industry-standard 300 mm (12-inch) wafer can fit. Given a defect density of only 0.05/cm², which is unrealistically low for a chip this size, that works out to:

Max dies per wafer (without defect): 58
Good dies: 39
Defective dies: 19
Partial dies: 8
Yield: ~68%

^ This is the absolute best case scenario. Ever.

More realistically, we are looking at a defect density of 0.15/cm², which gives drastically worse numbers:

Max dies per wafer (without defect): 58
Good dies: 20
Defective dies: 38
Partial dies: 8
Yield: ~34%

Calculator: http://caly-technologies.com/en/die-yield-calculator/
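If you want to sanity-check those numbers without the calculator, here is a rough sketch in Python, using the standard first-order dies-per-wafer approximation and a simple Poisson yield model. The 32 mm x 25 mm die estimate and the two defect densities are the assumptions from above; the linked calculator also models edge exclusion and scribe lines, so its raw counts come out a bit different:

```python
import math

def dies_per_wafer(wafer_d_mm: float, die_area_mm2: float) -> int:
    """First-order estimate; ignores edge exclusion and scribe lines."""
    usable = math.pi * (wafer_d_mm / 2) ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_d_mm / math.sqrt(2 * die_area_mm2)
    return int(usable - edge_loss)

def poisson_yield(defect_density_per_cm2: float, die_area_mm2: float) -> float:
    """Poisson yield model: Y = exp(-D * A), with A converted to cm^2."""
    return math.exp(-defect_density_per_cm2 * die_area_mm2 / 100)

die_area = 32 * 25  # ~800 mm^2, the rough TU102 estimate above
candidates = dies_per_wafer(300, die_area)
for d in (0.05, 0.15):  # best-case and "realistic" defect densities
    y = poisson_yield(d, die_area)
    print(f"D={d}/cm2: ~{candidates} candidates, yield ~{y:.0%}, "
          f"good dies ~{candidates * y:.0f}")
```

This lands in the same ballpark as the calculator (roughly 40 good dies best case, around 20 in the realistic case), which is all the argument needs.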

Assume each wafer costs about $25,000 (it can't be much lower; the Quadro RTX 8000 goes for $10,000 by itself, so the wafer is plausibly at least 2x more costly).

$25,000 / 20 = $1,250. Well, surprise surprise: if we get 20 good dies from a $25,000 wafer, the chip alone costs exactly what the 2080 Ti sells for now. And while the chip may be the biggest single cost per card, other components make up the rest of the BoM (Bill of Materials).

OK, so assuming the best-case scenario: $25,000 / 39 = $641 per die + other components = retail price.
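The same arithmetic in one place (the $25,000 wafer cost and the good-die counts are the assumptions above, nothing measured):

```python
WAFER_COST = 25_000  # assumed wafer cost from above
for label, good_dies in (("best case", 39), ("realistic", 20)):
    print(f"{label}: ${WAFER_COST / good_dies:,.0f} per good die")
```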

So, due to the manufacturing cost of the die alone, it is already almost as expensive as the 1080 Ti's entire $699 MSRP.
Still think Nvidia is robbing us?

Yeah you probably do. Facts are never an obstacle for a crowd with pitchforks screaming bloody murder.
Did I hit a nerve?
We want lower prices, no excuses.
 
Looking at your situation...that is exactly what you do ;)
WTF? What situation? When did I complain? Are you completely incapable of following a conversation?
 
WTF? What situation? When did I complain? Are you completely incapable of following a conversation?
Nope, it's you; you're incapable of accepting other people's opinions ;) ... I did not reply first, you replied to me... as you do with, I think, most of the people on this forum... you feel the need for attention and reassurance that your opinion is the best, like you have all the answers and, most importantly, all the facts. You did not counter my opinion with hard facts, just assumptions... I said that the price is really high (as everyone says) and that Nvidia did not show a lot to justify it... do you want me to go further?
 
I'm not arguing for or against. Technically against, since I'm upgrading a gen behind, to a 1080 Ti.

Yes, the prices are high. But I would like everyone to stop and take a few minutes. Why are they high? Sure, some of it can be blamed on the lack of decent high-end competition. However:

Has anyone considered the cost of 10 years of R&D? Does anyone know the cost of fabricating this large die? Does anyone know what it costs to produce this card compared to previous generations?

These are all factors that go into pricing. Pricing is not made up in a vacuum. Until you know these things, or are willing to take them into account, complaining is just that: complaining.

The goal of any manufacturer is to sell their product. Price it too high and your inventory doesn't sell. The market will determine if these are priced too high to move the stock. If the price is too high, Nvidia and the AIBs will adjust.

I personally have found it too high, and have opted for Pascal. Thanks for reading! :)
I know that it takes time and a lot of effort to develop something, but I think the point (cough!) is delivering an accessible mid-to-high-end price to gamers. Newegg has the Asus RTX 2080 listed for $839.99.
That's too much, lol, way too much. Like you, I find the GTX 1080 Ti really appealing now. I'm going to sit and wait for the real numbers on those new cards.
 
Assume each wafer costs about $25,000 (it can't be much lower; the Quadro RTX 8000 goes for $10,000 by itself, so the wafer is plausibly at least 2x more costly).
Assuming? So you're assuming that the 12 nm process is just a slightly improved 16 nm FinFET, which is almost 4 years old. Don't you think the yield gains are higher by now? And do you think Nvidia pays the same amount per mm² as it did for 16 nm when that was cutting edge? 12 nm is the current standard and already looks dated; even 10 nm is soon going to be replaced by 7 nm, so in my opinion 12 nm is outdated. So if you want, we can keep playing this game of assumptions about what they pay per mm², but with high probability I can tell you they pay less than they did for 16 nm. Maybe I'll assess that they pay half the 16 nm rate, since 12 nm is not cutting edge anymore. See? All assumptions.

For me it just seems like you want to justify that ridiculous price, and I don't even know why you want to do that, since to me it seems so irrational. The point is that these prices are sick. Even if we assume the RTX 2080 Ti is 50 percent faster than the GTX 1080 Ti in games, which I doubt it will be, the FE-to-FE price is 72% higher. So in conclusion, after almost 3 years we would pay more per frame. You call that progress? I call it being greedy as fuck, practically a rip-off, just because there is no real competition. That price hike is not justified at all. Compare the GTX 1080 Ti to the Maxwell GTX 980 Ti: the price difference was $50, the memory doubled, and the performance increase was 50%, so less than a 10% price difference. Now FE to FE is 72%? How can anyone justify that?
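To put that per-frame argument in numbers, here is a quick sketch; the $699 and $1,199 FE prices are the figures this thread keeps citing, and the 50% speedup is the deliberately generous assumption above, not a benchmark:

```python
old_price, new_price = 699, 1_199  # 1080 Ti vs. 2080 Ti FE, per this thread
assumed_speedup = 1.50             # generous assumption, not a measurement

price_ratio = new_price / old_price
print(f"price increase: {price_ratio - 1:.0%}")                            # ~72%
print(f"cost-per-frame change: {price_ratio / assumed_speedup - 1:+.0%}")  # ~+14%
```

So even under the optimistic speedup, each frame costs about 14% more than last generation.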
 
Yes, assuming. No one ever publicly states defect rates. However, the die size and wafer size are real, and there are facts to back those numbers up.
I just told you why it's so expensive, yet you fail to understand the basics of chip production. They can't produce the 2080 Ti for the same price as the 1080 Ti; it's technically impossible, unless you think they should sell at a loss, because that's the only way they could do it. Forget margins: even if they sold these cards at break-even prices, they would still be more expensive than 16 nm Pascal.
 
All assumptions? If I assume the price per mm² is half what 16 nm cost at its debut, since 12 nm is basically a slightly improved, 4-year-old 16 nm, then production costs would be similar; or slightly higher, since yields could be lower on a die this big; or, considering the process is mature and perfected, maybe lower still. You see? All assumptions. The point is that after such a long wait, and considering the price increase, it's a big disappointment. As I stated, FE to FE is a 72% price increase. So even if it's 50% faster, with the same amount of memory, then, as I've said many times, we would end up paying more per frame than when Pascal debuted. Isn't that sick?
 
In my logical mind, for the FE to be 70% more expensive requires a far higher performance increase than the previous generation delivered. As for the post about die size and wafer yields: that, my friend, is irrelevant. If you produce a product that does not improve more than the established trend, yet increase its price by a far higher margin, it is not justified. Likewise, R&D costs are void (consider it a 'Bulldozer-esque' moment) if the product doesn't shit diamonds.
If the 2080 Ti isn't a huge leap above Pascal's best in all games, then the cost is not justified. Development and production costs can only be reflected in the price if the product performs accordingly. Arguments about wafer size are simplifications of a development failure if the new design does not have a huge perf increase over last gen.
Reviews will be very telling, one way or another.

Edit: put simply, if a company pisses $10 billion down the drain on research for an underperforming product, said company usually has a stock price implosion.
 
To summarize all of that: are you an expert in foundry production? Do you know the current cost per mm²? Don't you think that once a process is older, the price per mm² is lower, since newer cutting-edge processes are on the way and the old one is perfected with higher yields? The 1080 Ti was also a big chip. The point is that after waiting such a long time for a new generation, the likely performance increase is even lower than the price increase. The price-to-performance ratio seems way off, and it looks like taking advantage of no real competition: cashing in, milking customers, and making much higher margins.

In my logical mind, for the FE to be 70% more expensive requires a far higher performance increase than the previous generation delivered.
Yes, exactly my point. I don't understand why some people try so hard to justify these prices; they are really something out of this world.
 
The only assumptions I made are the defect rate and the wafer price, and I'm pretty confident in those numbers. No one can refute them with facts, either, unless they work on TSMC's production line. No, I'm not an expert; very few people on the internet are. At best there are analysts.
The maturity of the process affects yields. I gave you numbers based on best-case and realistic expectations for the defect rate. TL;DR: best case = $641 per chip, realistic = $1,250 per chip.
The way you talk, you expect the 12 nm process to have a 0.00% defect rate because it is "old". All process nodes, no matter how old or new, have some number of defects; that's unavoidable. You not only expect Nvidia to sell their products at a loss, you also expect them to defy the laws of physics.

Fine, I get that you think it's overpriced. I've told you there are reasons besides greed, but you keep playing the same record over and over.
So let me ask you: what price for these cards would be reasonable to you?
 

You need to understand that production cost is not the sole determinant of retail price. Nor is R&D. The product itself must perform. While we have been told it is awesome at ray tracing, it has not been benchmarked against available games. It's a little telling that Nvidia did not show ANY non-RT performance figures.

I get that they are pushing a new direction, and I applaud that, but it looks like a cash grab to me and to a lot of other observers, including web reviewers.

@W1zzard is the one who will tell us, assuming he gets a 2080 Ti.

And please understand, my purchase history is GTX Titan, GTX 780 Ti (x2), Kingpin GTX 980 Ti, GTX 1080 Ti. So I'm not a hater; I'm very much a supporter of the fastest consumer graphics cards.
 
You need to understand that production cost is not the sole determinant of retail price. Nor is R&D.
I understand that perfectly. Yet some people fail to realize that if you add GDDR6 memory and other material costs to a chip that costs $641 or more to make, it is not very surprising that you end up with an expensive graphics card.

Sure, Nvidia has margins, especially on FE models. There's little doubt the FE cooler does not actually cost the extra $200 separating the 2080 Ti FE from the SEP MSRP. But some people seem to believe Nvidia's margin is 50% or more on each 2080 Ti card. That's just funny.
 
Four years ago the 970 launched at $330; two years later the 1070 launched at $450; and after another two years the 2070 is launching at $600. I dread to think where the price will be in another two years.

I'm really interested in seeing the performance of these cards in RT and standard titles. I hope the 2060 is more powerful than a 1080 Ti or Titan Xp.
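For what it's worth, a naive compound-growth extrapolation of those three launch prices (pure arithmetic on the numbers above, not a prediction):

```python
prices = [330, 450, 600]  # 970, 1070, 2070 launch prices from above
growth = (prices[-1] / prices[0]) ** (1 / (len(prices) - 1))  # per-generation factor
print(f"average growth per generation: {growth - 1:.0%}")           # ~35%
print(f"naive next x70 launch price: ${prices[-1] * growth:,.0f}")  # ~$809
```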
 
So let me ask you: what price for these cards would be reasonable to you?
I think a reasonable price would be in the $750-800 range, since paying a bigger price increase than the performance increase just seems like a cash grab to me. And going back to TSMC 12 nm: I understand there will always be defective chips on a wafer, but the percentage should be much lower on a mature process. And do you really think they pay the same per mm² as they did for 16 nm? I think they pay a lot less, since 12 nm is older tech, not cutting edge. So let's roll with the game of assumptions you started: if they pay half the 16 nm rate, since 12 nm is in principle, as I said, a slightly improved 16 nm with better yields, then the chip would come out at the same price as the 1080 Ti's, or even lower. That's why I want to stop the assumptions. One solid fact is that the FE-to-FE price is 71% higher, and even if the performance increase were 50%, it would still be more expensive per frame. Isn't that ridiculous? Usually, each generation made the price per frame much cheaper; now it's getting more expensive. How can you justify that? Doesn't it defy common sense? Not to mention the previous generation doubled the memory, from the GTX 980 Ti to the 1080 Ti, while now it's the same as the previous generation. Not to mention the low clocks and higher TDP; the power-efficiency gain won't be groundbreaking, especially compared to previous generational gains like Maxwell to Pascal.
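Since both sides keep trading assumptions, it may help to lay out how sensitive the per-die cost is to the two contested numbers; the wafer prices and good-die counts below are just the figures thrown around in this thread:

```python
# Per-die cost under each combination of contested assumptions.
for wafer_cost in (25_000, 12_500):  # earlier estimate vs. the halved rate
    for label, good_dies in (("best case", 39), ("realistic", 20)):
        print(f"${wafer_cost:,} wafer, {label} ({good_dies} good dies): "
              f"${wafer_cost / good_dies:,.0f} per die")
```

Depending on which pair you believe, the chip costs anywhere from about $320 to $1,250, which is exactly why the assumptions matter.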

Besides, how did you come up with that $641 number, out of the moon? I could just as well say the chip costs $200. Let's stop with the assumptions. It just seems way overpriced.

Just check Nvidia's financial results. Gross margin in Q1 2018 was 59.4%, and that's a fact.
 
Everyone get along or move along.
 
With no competition from RTG, Nvidia has free rein to price its GPUs accordingly.
 
Exactly. Well, I think I will stick with my GTX 1070 for much longer than I imagined; for 1080p it's still more than enough for some time. I hope there will be some competition in the future, otherwise as consumers we are going to be screwed. I still remember when I got my 970 for $329, then my 1070 for $400; now the RTX 2070 is going to be $500, about 150% of the Maxwell MSRP. What's even more concerning is that they didn't disclose any real game performance numbers or TFLOPS. I doubt it will be in GTX 1080 Ti range; probably GTX 1080 territory, like Maxwell to Pascal, when the GTX 1070 landed in GTX 980 Ti performance range. So I ask: where is the progress, considering the price increase? Is the progress called "more money"? More money per frame, when the price per frame usually dropped with each generation? As a consumer who was thinking about upgrading, I will now vote with my wallet and stick to my GTX 1070, since that price for midrange is sick, and the RTX 2080 Ti is just insanely priced. I hope people will complain about this pricing, otherwise it will become the norm.
 
I think a reasonable price would be in the $750-800 range
Besides, how did you come up with that $641 number, out of the moon? I could just as well say the chip costs $200. Let's stop with the assumptions. It just seems way overpriced
Just check Nvidia's financial results. Gross margin in Q1 2018 was 59.4%, and that's a fact.
So what's stopping you from getting the RTX 2080 then? It's roughly in that price range. That is why different tiers exist; no one is forcing you to buy the Ti. You're complaining about the price of the halo product. There are always diminishing performance returns the more expensive the hardware you buy; that's always been the case and always will be. Vega 64 is worse perf per $ than Vega 56, the 2700X is worse perf per $ than the 2600X, and so on.

The $641 figure was explained earlier; it is a reasonable estimate of TU102's cost based on available data. Asking for TU102 in the $750-800 range means asking Nvidia to sell it with zero margin. The only way they would ever do that is if AMD were breathing down their necks, which unfortunately is not the case.

Would I like cheaper high-end cards? Sure, who wouldn't. But I also understand the cost of the chips themselves, and that there is no competition.
 
So what's stopping you from getting the RTX 2080 then? It's roughly in that price range. That is why different tiers exist.
And that is exactly why you don't do what you are suggesting. Keep buying a tier down each gen, and a few years from now someone goes from an xx80 Ti to an xx30, and the only thing they can max out is games 5-10 years old.

Pricing will cause that, and it shouldn't come to that. Once you get to a tier you like... stay there. If you keep slipping down, it becomes prohibitively expensive to get back to the tier you were always used to, whatever that is.
 
Don't get too attached. You have a point, but an RTX 2080 or even a 2070 should handle everything below 4K just fine, so by buying cards in that price range you still get performance increases. Used cards are another matter; new cards can never compete with used cards in perf per $.
 
I think this discussion has gone on too long.
The conclusion is that Nvidia is trying to rip us off; the price increase is huge and, in my opinion, not justified. As for the wafer price, I found you overestimated the cost by at least $10,000: $20,000 per wafer was the figure a year ago, and now it's probably a lot less, so maybe the RTX 2080 Ti chip costs $350. Anyway, those are the assumptions, and the price hike is way too much to be justified. Just look at Nvidia's financial results. They realize they have no real competition in high-end cards; that's why they are ripping us off. I think they could easily sell the RTX 2080 Ti for $800, which would be $100 more than the GTX 1080 Ti, and still make a lot of profit. The point is that they are being greedy.

What is really concerning, in my opinion, is your mindset of trying to justify that incredibly big price increase. If we follow your thinking, then soon we'll be buying low-end graphics cards for $500 and we should be OK with that? A price increase bigger than the performance gain, when it should be the other way around? It's insane, isn't it? No real performance numbers during the conference? Isn't that sketchy as fuck? That's what happens when there is no competition. Soon Nvidia could have a 100% margin, and everybody should be happy and give them as much as they want? So let's raise taxes too, why not; our government can do that. Let's give the government 80% of our earnings and be happy about it. What is most concerning is justifying a greedy company and trying every possible way to defend that price. Why not; the next generation could cost $2,000, and then, yeah sure, new technology, so we should gladly pay more? That defies common logic and the price-performance ratio. That's the consequence of no competition and a company taking advantage of it.

Imagine a situation where there is one high-speed internet provider, and after three years he wants 72% more for the same connection, just because he has no competition. Even the Xbox One X can play some games at 4K 30 FPS, while the RTX 2070, which will be over $500, the price of the whole console, and that's the GPU alone, probably couldn't handle 60 FPS. Where is the logic in that? So every year I should go a tier down? Where is the progress in that? What about the price-to-performance improvement we got every generation? It defies logic.
 