Monday, August 20th 2018

NVIDIA GeForce RTX Series Prices Up To 71% Higher Than Previous Gen

NVIDIA revealed the SEP prices of its GeForce RTX 20-series, and it's a bloodbath in the absence of competition from AMD. The SEP price is the lowest price you'll be able to find a custom-design card at. NVIDIA is pricing its reference design cards, dubbed "Founders Edition," at a premium of 10-15 percent. These cards don't just have a better (looking) cooler, but also slightly higher clock speeds.

The GeForce RTX 2070 is where the lineup begins, for now. This card has an SEP pricing of USD $499. Its Founders Edition variant is priced at $599, or a staggering 20% premium. You'll recall that the previous-generation GTX 1070 launched at $379, with its Founders Edition at $449. The GeForce RTX 2080, which is the posterboy of this series, starts at $699, with its Founders Edition card at $799. The GTX 1080 launched at $599, with $699 for the Founders Edition. Leading the pack is the RTX 2080 Ti, launched at $999, with its Founders Edition variant at $1,199. The GTX 1080 Ti launched at $699, for the Founders Edition no less.
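The headline figure can be sanity-checked directly from the launch prices above; a quick sketch:

```python
# Launch prices cited in the article (USD); FE = Founders Edition.
pairs = {
    "RTX 2070 vs GTX 1070 (SEP)":       (379, 499),
    "RTX 2070 FE vs GTX 1070 FE":       (449, 599),
    "RTX 2080 vs GTX 1080 (SEP)":       (599, 699),
    "RTX 2080 Ti FE vs GTX 1080 Ti FE": (699, 1199),
}
for label, (old, new) in pairs.items():
    print(f"{label}: +{(new / old - 1) * 100:.1f}%")
# The 2080 Ti FE row comes out to +71.5%, the headline's "up to 71%".
```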

225 Comments on NVIDIA GeForce RTX Series Prices Up To 71% Higher Than Previous Gen

#151
jormungand
rtwjunkieI’m not arguing for or against. Technically against since I’m upgrading a gen behind to 1080 Ti.

Yes, the prices are expensive. But I would like everyone to stop and take a few minutes. Why are they high? Sure, some can be blamed on lack of decent high end competition. However:

Has anyone considered the cost of 10 years of R & D? Does anyone know the cost of fabricating this large die? Does anyone know what it costs to produce this card compared to previous generations?

These are all factors which go into pricing. Pricing is not made up in a vacuum. Until you know these things, or are willing to take them into account, then complaining is just that: complaining.

The goal of any manufacturer is to sell their product. Price it too high and your inventory doesn’t sell. The market will determine if these are priced too high to sell all the stock. If it’s too high, they and AIB’s will adjust.

I personally have found it too high, and have opted for Pascal. Thanks for reading! :)
I know it takes time and a lot of effort to develop something, but I think the point (cough!) is delivering gamers an accessible mid-to-high-end price. Newegg has the Asus RTX 2080 listed for $839.99.
That's too much, lol, way too much. Like you, I find the GTX 1080 Ti really appealing now. I'm going to sit back and wait for the real numbers on those new cards.
#152
Somethingnew
TomorrowA few people arguing against a mass. Conclusion = so many shills. Seems about right. /sarcasm

$1,200 is entry-level Quadro pricing; a high-end Quadro goes for $10,000. Compared to that, the 2080 Ti is OK.

The bigger the die, the fewer dies can be manufactured on a single wafer, and a single defect can ruin a big chunk of the wafer. This, for example, is why Intel's HEDT chips are so damn expensive compared to AMD's small ones. GPUs are also among the most complex things to produce on any given node, much less on cutting-edge 7 nm, which is fine for small low-power mobile SoCs at this time but not yet viable for mass-producing an 18+ billion transistor behemoth. Next year, probably.

TU102 appears to be about 32 mm x 25 mm in size. From that we can calculate how many dies an industry-standard 300 mm (12-inch) wafer can fit. Given a defect density of only 0.05 defects/cm² (practically impossible for a chip this size), that would amount to:

Max dies per wafer (without defect): 58
Good dies: 39
Defective dies: 19
Partial dies: 8
Yield: ~68%

^ This is the absolute best case scenario. Ever.

More realistically we are looking at a defect rate of 0.15 which would give drastically worse numbers:

Max dies per wafer (without defect): 58
Good dies: 20
Defective dies: 38
Partial dies: 8
Yield: ~34%

Calculator: caly-technologies.com/en/die-yield-calculator/
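For reference, the ~68% and ~34% figures above are reproducible with Murphy's classic yield model, which the linked calculator appears to use (an assumption on my part); a minimal Python sketch with the post's die size and defect densities in defects/cm²:

```python
import math

def murphy_yield(die_area_cm2: float, defect_density: float) -> float:
    """Murphy's yield model: ((1 - e^(-A*D0)) / (A*D0))**2."""
    ad = die_area_cm2 * defect_density
    return ((1 - math.exp(-ad)) / ad) ** 2

die_area = 3.2 * 2.5  # 32 mm x 25 mm TU102 estimate = 8.0 cm^2
print(f"{murphy_yield(die_area, 0.05):.0%}")  # ~68%, the best case above
print(f"{murphy_yield(die_area, 0.15):.0%}")  # ~34%, the realistic case
```

The 58-dies-per-wafer count additionally depends on edge-exclusion and scribe-line settings, which vary by calculator, so it is not reproduced here.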

Assume each wafer costs about $25,000 (it can't be much lower: the Quadro RTX 8000 sells for $10,000 by itself, so the wafer is presumably at least twice as costly).

25000/20 = $1,250. Well, surprise surprise: if we get 20 good dies from a $25,000 wafer, the price is exactly what the 2080 Ti costs now. And while the chip itself may be the biggest cost per card, there are other component costs that make up the BoM (Bill of Materials).

OK, so assuming the best-case scenario: 25000/39 = $641 + other components = retail price.

So already, due to the manufacturing cost of the die alone, the chip is almost as expensive as the 1080 Ti's entire $699 MSRP.
Still think Nvidia is robbing us?
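Written out, the per-die arithmetic above (the $25,000 wafer cost is the post's own assumption):

```python
wafer_cost = 25_000  # USD; assumed above, argued from the Quadro RTX 8000 price

for scenario, good_dies in [("best case", 39), ("realistic", 20)]:
    print(f"{scenario}: ${wafer_cost / good_dies:,.0f} per good die")
# best case: $641; realistic: $1,250. Memory, board, cooler,
# assembly, and margin all come on top of this.
```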

Yeah you probably do. Facts are never an obstacle for a crowd with pitchforks screaming bloody murder.
Assuming? So you're assuming, even though the 12 nm process is just slightly improved 16 nm FinFET, which is almost four years old. Don't you think the yield gains are higher than that? And do you think Nvidia pays the same amount per mm² as it did for 16 nm when it was cutting edge? By current standards 12 nm seems dated; even 10 nm will soon be replaced by 7 nm, so in my opinion 12 nm is outdated. If you want, we can keep playing this game of assumptions about what they pay per mm², but with high probability I can tell you they pay less than they did for 16 nm. For me it just seems like you want to justify that ridiculous price, and I don't even know why you want to do that, since it seems so irrational. Of course Nvidia is paying a lot less per mm² on 12 nm. Maybe I'll assess that they pay half the 16 nm rate, since 12 nm is no longer cutting edge. See? All assumptions. The point is that these prices are sick. Even if we assume the RTX 2080 Ti is 50 percent faster than the GTX 1080 Ti in games, which I doubt it will be, FE to FE it is 72% more expensive. So in conclusion we would pay more per frame after almost three years. You call that progress? I call it greedy as fuck, almost a rip-off, just because there is no real competition. That price hike is not justified at all. Compare the GTX 1080 Ti to the Maxwell GTX 980 Ti: the price difference was $50 with double the amount of memory, while the performance increase was 50%, so less than a 10% price difference. Now FE to FE is 72%? How can anyone justify that?
#153
Tomorrow
SomethingnewAssuming? So you're assuming, even though the 12 nm process is just slightly improved 16 nm FinFET, which is almost four years old... Of course Nvidia is paying a lot less per mm² on 12 nm... The point is that these prices are sick. Even if we assume the RTX 2080 Ti is 50 percent faster than the GTX 1080 Ti in games, FE to FE it is 72% more expensive, so we would pay more per frame after almost three years. You call that progress?... Now FE to FE is 72%? How can anyone justify that?
Yes, assuming. No one ever publicly states defect rates. However, the die size and wafer size are real, and there are facts to back those numbers up.
I just told you why it's so expensive, yet you fail to understand the basics of chip production. They can't produce the 2080 Ti for the same price as the 1080 Ti; it's technically impossible. Unless you think they should sell at a loss? Because that's the only way they could do it. I mean, forget revenue: even if they sold these cards at break-even prices, they would still be more expensive than 16 nm Pascal.
#154
Somethingnew
TomorrowYes, assuming. No one ever publicly states defect rates. However, the die size and wafer size are real, and there are facts to back those numbers up.
I just told you why it's so expensive, yet you fail to understand the basics of chip production. They can't produce the 2080 Ti for the same price as the 1080 Ti; it's technically impossible. Unless you think they should sell at a loss? Because that's the only way they could do it. I mean, forget revenue: even if they sold these cards at break-even prices, they would still be more expensive than 16 nm Pascal.
All assumptions? If I assume the price per mm² is half of what 16 nm cost at its debut, since 12 nm is basically a slightly improved four-year-old 16 nm, then production cost would be similar or only slightly higher, and with a mature, perfected process it could even come out lower. You see? All assumptions. The point is that after such a long wait, considering the price increase, it's a big disappointment. As I stated, FE to FE is a 72% price increase. So even if it's 50% faster, with the same amount of memory, we would, as I've said many times, end up paying even more per frame than when Pascal debuted. Isn't that sick?
#155
the54thvoid
Super Intoxicated Moderator
In my logical mind, for the FE to be 70% more expensive requires a far higher performance increase than the previous generation delivered. As for the post that talked about die size and wafer yields: that, my friend, is irrelevant. If you produce a product that does not improve more than a given trend, yet increase its price by a far higher margin, it is not justified. Likewise, R&D costs are void (consider it a 'Bulldozer-esque' moment) if the product doesn't shit diamonds.
If the 2080 Ti isn't a huge leap in 'all games' above Pascal's best, then the cost is not justified. The development cost of a product can only be reflected in its price if the product delivers. Arguments about wafer size are simplifications of a development failure if the new design does not bring a huge performance increase over last gen.
Reviews will be very telling, one way or another.

Edit: put simply, if a company pisses $10 billion down the drain in research for an underperforming product, said company usually has a stock-price implosion.
#156
Somethingnew
To summarize all of that: are you an expert in foundry production? Do you know the current cost per mm²? Don't you think that once a process is older, the price per mm² drops, since newer cutting-edge processes are on the way and the older process is perfected with higher yields? The 1080 Ti was also a big chip. The point is that after waiting so long for a new generation, the likely performance increase is smaller than the price increase. The price/performance ratio looks way off; it looks like taking advantage of the lack of real competition to cash in, milk customers, and make much higher margins.
the54thvoidIn my logical mind, for the FE to be 70% more expensive requires a far higher performance increase than previous generation. For the post that talked about die size and wafer yields, that my friend is irrelevant. If you produce a product that does not improve more than a given trend, yet increase its price by a far higher margin, it is not justified. Likewise, R&D costs are void (consider it a 'Bulldozer-esque' moment) if the product doesn't shit diamonds.
If the 2080ti isn't a huge leap in 'all games' above Pascal's best, then the cost is not justified. The development and cost of a product is required to be reflected by its price. Arguments on wafer size are simplifications of a development failure if said new design does not have huge perf increase over last gen.
Reviews will be very telling, one way or another.
Yes, exactly my point. I don't understand why some people try so hard to justify these prices; they are really something out of this world.
#157
Tomorrow
The only assumptions I made are the defect rate and the wafer price, and I'm pretty confident in those numbers. Unless someone can refute them with facts (you can't, unless you work on TSMC's production line). No, I'm not an expert; very few people on the internet are. At best there are analysts.
The maturity of the process affects yields. I gave you numbers based on best-case and realistic expectations for the defect rate. TL;DR: realistic = $1,250 per chip; best case = $641 per chip.
The way you talk, you expect the 12 nm process to have a 0.00% defect rate because it is "old". All process nodes, no matter how old or new, will have some amount of defects. That's unavoidable. You not only expect Nvidia to sell their products at a loss, you also expect them to defy the laws of physics.

Fine, I get that you think it's overpriced. I told you there are reasons besides greed, but you keep playing the same record over and over.
So let me ask you: what price for these cards would be reasonable to you?
#158
the54thvoid
Super Intoxicated Moderator
TomorrowThe only assumptions I made are the defect rate and the wafer price, and I'm pretty confident in those numbers. Unless someone can refute them with facts (you can't, unless you work on TSMC's production line). No, I'm not an expert; very few people on the internet are. At best there are analysts.
The maturity of the process affects yields. I gave you numbers based on best-case and realistic expectations for the defect rate. TL;DR: realistic = $1,250 per chip; best case = $641 per chip.
The way you talk, you expect the 12 nm process to have a 0.00% defect rate because it is "old". All process nodes, no matter how old or new, will have some amount of defects. That's unavoidable. You not only expect Nvidia to sell their products at a loss, you also expect them to defy the laws of physics.

Fine, I get that you think it's overpriced. I told you there are reasons besides greed, but you keep playing the same record over and over.
So let me ask you: what price for these cards would be reasonable to you?
You need to understand production cost is not the sole determinant of retail cost. Nor is R&D. The product itself must perform. Now, while we have been told it is awesome at ray tracing, it has not been benchmarked against available games. It's rather telling that Nvidia did not show ANY non-RT performance figures.

I get they are pushing a new direction and I applaud that but it looks like a cash grab to me, and a lot of other observers, including web reviewers.

@W1zzard is the one that will tell us, assuming he gets a 2080ti.

And please, understand, my purchase history is GTX Titan, GTX 780ti (X2), Kingpin GTX 980ti, GTX 1080ti. So, not a hater, very much a supporter of the fastest consumer gfx cards.
#159
Tomorrow
the54thvoidYou need to understand production cost is not the sole determinant of retail cost. Nor is R&D.
I understand that perfectly. Yet some people fail to realize that if you add GDDR6 memory cost and other material costs to a chip that costs $641 or more to make, it is not very surprising that you end up with an expensive graphics card.

Sure, Nvidia has margins, especially on FE models. There's little doubt the FE cooler does not actually cost the extra $200 in the case of the 2080 Ti FE vs the SEP MSRP. But some people seem to believe Nvidia's margin is 50% or more on each 2080 Ti card. That's just funny.
#160
TheOne
Four years ago the 970 launched for $330; two years later the 1070 launched for $450; and after another two years the 2070 is launching for $600. I dread to think where the price will be in another two years.

I'm really interested in seeing the performance of these cards in RT and standard titles, I hope the 2060 is more powerful than a 1080 Ti or Titan Xp.
#161
Somethingnew
TomorrowThe only assumptions I made are the defect rate and the wafer price, and I'm pretty confident in those numbers. Unless someone can refute them with facts (you can't, unless you work on TSMC's production line). No, I'm not an expert; very few people on the internet are. At best there are analysts.
The maturity of the process affects yields. I gave you numbers based on best-case and realistic expectations for the defect rate. TL;DR: realistic = $1,250 per chip; best case = $641 per chip.
The way you talk, you expect the 12 nm process to have a 0.00% defect rate because it is "old". All process nodes, no matter how old or new, will have some amount of defects. That's unavoidable. You not only expect Nvidia to sell their products at a loss, you also expect them to defy the laws of physics.

Fine, I get that you think it's overpriced. I told you there are reasons besides greed, but you keep playing the same record over and over.
So let me ask you: what price for these cards would be reasonable to you?
I think a reasonable price would be in the $750-800 range, since paying more than the performance increase warrants just seems like a cash grab to me. And going back to TSMC 12 nm: I understand there will always be defective chips on a wafer, but the percentage should be much lower on a mature process. And do you really think they pay the same per mm² as they did for 16 nm? I think they pay a lot less, since 12 nm is older tech, not cutting edge. So let's roll with the game of assumptions you started: if they pay half per mm² of what 16 nm cost when it was cutting edge, since 12 nm is, as I said, in principle slightly improved 16 nm with better yields, the chip would end up at the same price as the 1080 Ti's, or even lower. That's why I want to stop the assumptions. One solid fact is that the FE-to-FE price is 71% higher. And even if the performance increase were 50%, it would still be more expensive per frame. Isn't that ridiculous? Usually each generation got much cheaper per frame, and now it's getting more expensive. How can anyone justify that? Doesn't it defy common sense? Not to mention that previous generations doubled the memory size (GTX 980 Ti to 1080 Ti was double the amount), while now it's the same as the previous generation. Not to mention the low clocks and higher TDP; the power-efficiency gains won't be groundbreaking, especially compared to previous generational gains like Maxwell to Pascal.
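The "more expensive per frame" point can be made concrete; a sketch using the thread's own numbers (72% FE-to-FE price increase, and the hypothetical 50% performance gain conceded above):

```python
old_price, new_price = 699, 1199   # GTX 1080 Ti FE vs RTX 2080 Ti FE launch prices (USD)
perf_gain = 1.50                   # hypothetical +50% performance, the post's generous assumption

cost_per_frame_change = (new_price / old_price) / perf_gain - 1
print(f"{cost_per_frame_change:+.0%}")  # about +14%: still pricier per frame even at +50% perf
```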
TomorrowI understand that perfectly. Yet some people fail to realize that if you add GDDR6 memory cost and other material costs to a chip that costs $641 or more to make, it is not very surprising that you end up with an expensive graphics card.

Sure, Nvidia has margins, especially on FE models. There's little doubt the FE cooler does not actually cost the extra $200 in the case of the 2080 Ti FE vs the SEP MSRP. But some people seem to believe Nvidia's margin is 50% or more on each 2080 Ti card. That's just funny.
Besides, how did you come up with that $641 number? Out of thin air? I could just as well say the chip costs $200. Let's stop with the assumptions. It just seems way overpriced.
Just check Nvidia's financial results: gross margin in Q1 2018 was 59.4%, and that's a fact.
#162
Xzibit
SomethingnewJust check Nvidia's financial results: gross margin in Q1 2018 was 59.4%, and that's a fact.
I think you're looking at last year's numbers.

63.3% Q2 2018/FY 2019
64.5% Q1 2018/FY 2019
#163
Mindweaver
Moderato®™
Everyone get along or move along.
#164
Somethingnew
XzibitI think you're looking at last year's numbers.

63.3% Q2 2018/FY 2019
64.5% Q1 2018/FY 2019
Yes, you are right, thanks for the correction. I was looking at Q1 FY2018, my mistake. The actual number is 64.5% for Q1 FY2019.
#165
ValenOne
No competition from RTG, hence Nvidia has free rein to price its GPUs however it likes.
#166
Somethingnew
Exactly. Well, I think I will stick with my GTX 1070 for much longer than I imagined, but for 1080p it's still more than enough for some time. I hope there will be some competition in the future, otherwise as consumers we are going to be screwed. I still remember getting my 970 for $329, then the 1070 for $400; now the RTX 2070 is going to be $500+, more than 150% of the Maxwell MSRP. What's even more concerning is that they didn't disclose any real game performance numbers or TFLOPS. I doubt it will be in GTX 1080 Ti range, probably GTX 1080 territory, like Maxwell to Pascal, when the GTX 1070 landed in GTX 980 Ti performance range. So where is the progress, considering the price increase? Is the progress called "more money", or more money per frame, when price per frame was usually much lower after each generation? As a consumer who was thinking about upgrading, I will now vote with my wallet and stick with my GTX 1070, since that price for midrange is sick and the RTX 2080 Ti is just insanely priced. I hope people complain about this pricing, otherwise it will become the norm.
#167
Tomorrow
SomethingnewI think a reasonable price would be in the $750-800 range
SomethingnewBesides, how did you come up with that $641 number? Out of thin air? I could just as well say the chip costs $200. Let's stop with the assumptions. It just seems way overpriced
SomethingnewJust check Nvidia's financial results: gross margin in Q1 2018 was 59.4%, and that's a fact.
So what's stopping you from getting an RTX 2080 then? It's roughly in that price range. That is why different tiers exist. No one is forcing you to buy the Ti; you're complaining about the price of the halo product. There are always diminishing performance returns the more expensive the hardware you buy. That's always been the case and always will be. Vega 64 is worse perf per $ than Vega 56, the 2700X is worse perf per $ than the 2600X, and so on.

The $641 was explained earlier. It is a reasonable estimate of TU102's cost based on available data. Asking for TU102 in the $750-800 range means asking Nvidia to sell it with zero margin. The only way they would ever do that is if AMD were breathing down their necks, which unfortunately is not the case.

Would I like cheaper high-end cards? Sure, who wouldn't. But I also understand the cost of the chips themselves, and that there is no competition.
#168
rtwjunkie
PC Gaming Enthusiast
TomorrowSo what's stopping you from getting RTX 2080 then? It's roughly in that price range. That is why different tiers exist.
And that is exactly why you don’t do what you are suggesting. Keep buying down each gen, and a few years from now someone goes from a xx80Ti to a xx30, and the only thing they can max out are games 5-10 years old.

Pricing will cause that, and it shouldn’t come to having to do that. Once you get to a tier you like....stay there. If you keep slipping down, it will be prohibitively more expensive to buy the tier you were always used to, whatever that is.
#169
Tomorrow
rtwjunkieAnd that is exactly why you don’t do what you are suggesting. Keep buying down each gen, and a few years from now someone goes from a xx80Ti to a xx30, and the only thing they can max out are games 5-10 years old.

Pricing will cause that, and it shouldn’t come to having to do that. Once you get to a tier you like....stay there. If you keep slipping down, it will be prohibitively more expensive to buy the tier you were always used to, whatever that is.
Don't get too attached. You have a point, but an RTX 2080 or even a 2070 should handle everything below 4K just fine, so by buying cards in that price range you still get performance increases. Used cards are another matter; new cards can never compete with used cards in terms of perf per $.
#170
Somethingnew
TomorrowDon't get too attached. You have a point but a RTX 2080 or even 2070 should handle everything below 4K just fine. So by buying cards in that price range you still get performance increases. Used cards are another matter. New cards can never compete with used cards in terms of perf per $.
I think this discussion is too long .
The conclusion is that Nvidia is trying to rip us off , and the price I crease is huge and in my opinion not justified . As for wafer price I found you overestimated the cost by at least 10000 $ , 20000 $ per wafer was a year ago , now it's probably a lot less . So if the rtx 2080 ti chip cost 350 $ maybe . Anyway that's the assumptions . And the price hike is way to much to be justified . Just look at financial results of Nvidia . They realise that they have no real competition in high end cards , that's why they are ripping us off . I think they could easily sell rtx 2080 ti for 800 $ which would be a 100$ more than gtx 1080 ti , and still make a lot of profit . But the point is that they are being greedy . What is really co concerning in my opinion is your mindset , trying to justyfi that incredibly big price increase . If we follow your thinking then soon for 500$ we will buy low end graphics , and we should be ok with that ? Price Increase bigger than performance gain , when it should be the other way around ? Its insane isn't it ? No real performance number during conference ? Isn't that sketchy as fuck ? That's what happens when there is no competition , soon Nvidia could have 100 % margin , and everybody should be happy and give them as much as they want ? . So let's raise taxes , why not , our government can do that ,let's give government 80 % of our earnings to them , and we should be happy about that. What is most concerning is justifing the greedy company , and trying every possible way to justify that price , why not . Next generation could cost 2000$ , and then , yeah sure , new technology , so we should gladly pay more etc ? That defies common logic , price performance ratio . That's the consequence of no competition , and company that is taking advantage of that . 
Imagine a situation , when there is one high speed internet provider , and if you want high speed connection , and after three years he wants 72 % more , just because he doesn't have competition . Yeah sure , when even Xbox one X , can playy some games at 4k 30 fps , while the rtx 2070 which will be over 500 $ the price of whole console , probably couldn't handle 60 FPS , where there is logic in that, and that's the price of GPU only . So every year , I should go tier down ? Where is progress in that . What about price to performance increase , like it was every generation . It defies logic .
#171
Prima.Vera
bugbut neither was the first Voodoo graphics accelerator
You're joking, right? Voodoo was, and still is, THE BEST 3D accelerator ever produced in terms of its impact on the gaming market. It literally revolutionized the gaming industry and put it on the path it is on today. No other card has impacted game visual quality the way Voodoo did.
rtwjunkieThe market will determine if these are priced too high to sell all the stock. If it’s too high, they and AIB’s will adjust.
This is going to be another bullshit fest from nVidia and its partners. Due to very low yields and availability, most of the cards will be sold out, but they will claim that sales are booming and they cannot cope with the orders, thereby increasing their share value and making them even richer on false premises... GREEDY unscrupulous capitalism at its finest!
#172
N3M3515
Holy shit, what's the matter with those sky high prices???
#173
GoldenX
N3M3515Holy shit, what's the matter with those sky high prices???
The higher your market share is, the higher your prices get, the dumber the "loyal clients" get. Look at Apple. Thank god Intel is no longer an example here.
#174
coolernoob
guys, why do you keep doing this:

this is not what Jensen wanted you to think about right now (with that full month of preorder time before the first real reviews; that is the true innovation here :D)... think about all the GigaRays and RTX-OPS and preorder now, for just "499" (which might actually be $1,299 for the card he demoed)... think about the technology and ten years of development, think about die size (size DOES matter!), think about Jensen's enthusiasm and press that preorder button. Don't be a downer and don't think about those +20% real-world performance gains for over +50% price gains (in 2.5 years vs. previous-gen Pascal); that kind of negativity is for losers, downers, and AMD fanboys, and you are not like that. Buy now, think later. Don't think about ray tracing as a performance hog, aka 30-50 fps at 1920x1080 on a $1,299 GPU :D; think about GIGA and Rays at the speed of light, think about those slideshow gameplays (with RTX ON) Jensen showed us; those "just work!". Most of you downers got this wrong, but some enlightened folks here (enlightened by the giga Rays of pure bliss :D) get Jensen's vision of the future and announced their preorder plans like champs *stands up, claps with tears in his eyes, and salutes the green flag*. You are da real MVPs. May the Jensen Rays be with you.
#175
Tomorrow
SomethingnewI think this discussion has gone on too long
Yes, it has.
SomethingnewThe conclusion is that Nvidia is trying to rip us off
Your conclusion is.
Somethingnewand in my opinion not justified
Fair enough.
SomethingnewAs for the wafer price, I found you overestimated the cost by at least $10,000: $20,000 per wafer was the figure a year ago, and now it's probably a lot less
It can't be ~$10,000, because the Quadro RTX 8000 (also a TU102, albeit a fully enabled one) is itself $10,000: www.techpowerup.com/gpudb/3306/quadro-rtx-8000

This would contradict your claim that Nvidia is massively increasing margins. Surely they would not swallow half the wafer cost just to keep the RTX 8000 at $10,000?

So if the chip is that expensive, simple logic dictates that the wafer must be more expensive still. $25,000 is not an unreasonable cost, even if they only get a handful of fully enabled TU102s from each wafer.
SomethingnewSo maybe the RTX 2080 Ti chip costs $350
In your dreams, maybe. Here in the real world things are less rosy...
SomethingnewI think they could easily sell the RTX 2080 Ti for $800, which would be $100 more than the GTX 1080 Ti, and still make a lot of profit
They could, but as we both know, they have no incentive to do so.
SomethingnewWhat really concerns me is your mindset, trying to justify that incredibly big price increase. If we follow your thinking, soon we'll pay $500 for low-end graphics, and we should be OK with that?
Don't put words in my mouth; that is your interpretation. I'm not justifying anyone. I'm trying to see why it's so expensive, instead of jumping to the most obvious and easy answer like most people.
SomethingnewNo real performance numbers at the conference? Isn't that sketchy as fuck?
Yeah, that was strange.
SomethingnewImagine a situation where there is one high-speed internet provider, and after three years he wants 72% more, just because he has no competition. Yeah sure
Do they also increase your speed by 20-30% before asking for that? Does the price per Mbit/s stay the same? Not the best comparison.
Somethingnewwhen even an Xbox One X can play some games at 4K 30 fps, while the RTX 2070, at over $500, the price of a whole console, probably couldn't handle 60 fps; where is the logic in that, and that's the price of the GPU alone
Consoles run bare-metal APIs and games are heavily optimized for them. Plus checkerboard 4K rendering, meaning not real 4K. That is why they can run what looks like 4K 30 on much slower hardware.