# NVIDIA GeForce RTX 2060 Founders Edition Pictured, Tested



## btarunr (Dec 31, 2018)

Here are some of the first pictures of NVIDIA's upcoming GeForce RTX 2060 Founders Edition graphics card. You'll know from our older report that there could be as many as six variants of the RTX 2060 based on memory size and type. The Founders Edition is based on the top-spec one with 6 GB of GDDR6 memory. The card looks similar in design to the RTX 2070 Founders Edition, which is probably because NVIDIA is reusing the reference-design PCB and cooling solution, minus two of the eight memory chips. The card continues to pull power from a single 8-pin PCIe power connector.

According to VideoCardz, NVIDIA could launch the RTX 2060 on January 15, 2019. It could get an earlier unveiling by CEO Jen-Hsun Huang at NVIDIA's CES 2019 event, slated for January 7th. The top-spec RTX 2060 trim is based on the TU106-300 ASIC, configured with 1,920 CUDA cores, 120 TMUs, 48 ROPs, 240 tensor cores, and 30 RT cores. With an estimated FP32 compute performance of 6.5 TFLOP/s, the card is expected to perform on par with the previous generation's GTX 1070 Ti in workloads that lack DXR. VideoCardz also posted performance numbers obtained from NVIDIA's Reviewer's Guide that point to the same possibility.
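As a sanity check, the 6.5 TFLOP/s estimate follows directly from the CUDA core count: each core can retire one fused multiply-add (two FLOPs) per clock. A quick sketch, assuming a boost clock of roughly 1.68 GHz (final clocks were unconfirmed at the time of writing):

```python
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    """Peak FP32 throughput in TFLOP/s: 2 FLOPs (one FMA) per core per clock."""
    return 2 * cuda_cores * boost_ghz / 1000.0

# 1,920 CUDA cores at an assumed ~1.68 GHz boost clock
print(round(fp32_tflops(1920, 1.68), 2))  # ~6.45 TFLOP/s
```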







In its Reviewer's Guide document, NVIDIA tested the RTX 2060 Founders Edition on a machine powered by a Core i9-7900X processor and 16 GB of memory. The card was tested at 1920 x 1080 and 2560 x 1440, the resolutions of its target consumer segment. Performance numbers obtained at both resolutions point to the card performing within ±5% of the GTX 1070 Ti (and possibly the RX Vega 56 from the AMD camp). The guide also mentions an SEP (suggested e-tail price) of USD $349.99 for the RTX 2060 6 GB.






*View at TechPowerUp Main Site*


----------



## Paganstomp (Dec 31, 2018)

They also have some benchmarks.  If it matters to anywho.


----------



## eidairaman1 (Dec 31, 2018)

Triple slot?


----------



## TheOne (Dec 31, 2018)

Apparently it's also supposed to cost $350 and be bundled with an EA title.


----------



## Gorstak (Dec 31, 2018)

I'm waiting for the RTX 3030, if it's ever released. I may buy a passive one.


----------



## eidairaman1 (Dec 31, 2018)

So much for low end being a viable price choice. My vaporX 290 was $460 in 2014.


----------



## Berfs1 (Dec 31, 2018)

Is it just me or is the 8 pin on the side of the card?


----------



## lexluthermiester (Dec 31, 2018)

eidairaman1 said:


> So much for low end being a viable price choice. My vaporX 290 was $460 in 2014.


Everyone is complaining about price. Is it really that expensive? And if so, is it really that difficult to save money for an extra month or two?


Berfs1 said:


> Is it just me or is the 8 pin on the side of the card?


Seems to be at the end of the card.


----------



## Berfs1 (Dec 31, 2018)

lexluthermiester said:


> Everyone is complaining about price. Is it really that expensive? And if so, is it really that difficult to save money for an extra month or two?
> 
> Seems to be at the end of the card.


Side as in on the right side of the GPU. Like, for better cable management


----------



## eidairaman1 (Dec 31, 2018)

lexluthermiester said:


> Everyone is complaining about price. Is it really that expensive? And if so, is it really that difficult to save money for an extra month or two?
> 
> Seems to be at the end of the card.



It truly isn't worth 300 usd or more


----------



## Xzibit (Dec 31, 2018)

lexluthermiester said:


> *Everyone is complaining about price. Is it really that expensive?* And if so, is it really that difficult to save money for an extra month or two?
> 
> Seems to be at the end of the card.



All you have to do is look at that Steam Hardware Survey you like to quote and get an idea of what's too much for the Steam majority. If it wasn't too difficult, those SHS stats would be different.


----------



## lexluthermiester (Dec 31, 2018)

Berfs1 said:


> Side as in on the right side of the GPU. Like, for better cable management


I know what you said; it doesn't look that way. It looks like it's on the end of the card.


eidairaman1 said:


> It truly isn't worth 300 usd or more


Vote with your wallet and don't get one.


Xzibit said:


> All you have to do is look at that Steam Hardware Survey *you like to quote* and get an idea of what's too much for the Steam majority. If it wasn't too difficult, those SHS stats would be different.


I've quoted it once. And you're going to throw it in my face like it's a thing? Grow up a bit.

BTW, you failed to invalidate my point.


----------



## Vayra86 (Dec 31, 2018)

lexluthermiester said:


> BTW, you failed to invalidate my point.



What do you want to hear? Something that goes entirely against the current sentiment regarding Turing? You're getting ripped off, it's that simple. Artificial price changes don't change the fact that you're paying full price for 3-year-old performance. Yes, even your 2070, or whatever it was, is overpriced: yesteryear's performance.

It's not a positive for Turing that the price is equal to Pascal's. Not. At. All. Anyone explaining it like that is doing it to justify a bad deal. And it's not a positive that this 2060 is even a Turing card to begin with, because its DXR performance is abysmal.

Oh yeah, and to top it off, NVIDIA gives you six 2060s to pick from. Because it's such a well-rounded card, why not use a myriad of different buses and VRAM chips? 350 bucks for scraps and leftovers. 'Good deal'.


----------



## Renald (Dec 31, 2018)

lexluthermiester said:


> BTW, you failed to invalidate my point.



$400 for a high-end card (meaning it's the best a company can do), like the 290 for the record, is understandable.
$400 for a mid-range card (see what a 2080 Ti can do) is not good.

NVIDIA doubled their prices because AMD is waiting for next year to propose something; they are focusing on CPUs and don't have NVIDIA's or Intel's firepower. They can't do both GPUs and CPUs.
So you are bending over, waiting "a month or two" for the theft.

Here is your invalidation.


----------



## lexluthermiester (Dec 31, 2018)

Vayra86 said:


> You're getting ripped off, its that simple.


That's an opinion, not a merit based fact.


Vayra86 said:


> Yes, even your 2070 or what was it, is overpriced, yesteryears performance.


Also an opinion not based on merit. Yes, I paid a premium, but I also got a card which performs 50-60% better than its previous-gen counterpart, which I traded up from. That is an improvement and certainly not "yesteryear's" performance.


Vayra86 said:


> Its not a positive for Turing that the price is equal to Pascal.


But it is positive that there has been a performance improvement.


Vayra86 said:


> Not. At. All.


Your. Misguided. Opinion.


Vayra86 said:


> Anyone explaining it like that is doing it to justify a bad deal.


No, anyone explaining it that way appreciates the advancements and performance increase.


Renald said:


> $400 for a high-end card (meaning it's the best a company can do), like the 290 for the record, is understandable.


High-end cards haven't been $400 in almost 10 years. Your point is baseless.


Renald said:


> $400 for a mid-range card (see what a 2080 Ti can do) is not good.


Your opinion. You're not the maker/manufacturer. You don't set prices. Vote with your wallet.


Renald said:


> Here is your invalidation.


My arguments are based upon merit and fact. Yours were based on feelings. Invalidation rejected.

Anyone complaining about the price needs to learn how to budget their money better and save up for a bit longer to get the best, whether it's AMD or NVidia.


----------



## Mistral (Dec 31, 2018)

This actually doesn't seem too horrible... curious placement for the ports, though.


----------



## M2B (Dec 31, 2018)

Vayra86 said:


> What do you want to hear? Something that goes entirely against the current sentiment regarding Turing? You're getting ripped off, it's that simple. Artificial price changes don't change the fact that you're paying full price for 3-year-old performance. Yes, even your 2070, or whatever it was, is overpriced: yesteryear's performance.
> 
> It's not a positive for Turing that the price is equal to Pascal's. Not. At. All. Anyone explaining it like that is doing it to justify a bad deal. And it's not a positive that this 2060 is even a Turing card to begin with, because its DXR performance is abysmal.
> 
> Oh yeah, and to top it off, NVIDIA gives you six 2060s to pick from. Because it's such a well-rounded card, why not use a myriad of different buses and VRAM chips? 350 bucks for scraps and leftovers. 'Good deal'.



"3-year-old performance"
So you were expecting a $350 GPU with 1080 Ti performance on the same node?
Come on, I know you know better than that.


----------



## Robcostyle (Dec 31, 2018)

Ignore that fanboy, guys, he does nothing except trolling. ~$1,500 for the 2080 Ti is ridiculous, and nothing changes that fact.

As for the 2060, I bet prices are gonna be up to $500 for this one.


----------



## lexluthermiester (Dec 31, 2018)

Robcostyle said:


> Ignore that fanboy, guys, he does nothing *except* trolling


Says the user who's currently trolling.


Robcostyle said:


> 1500$ for 2080 ti is ridiculous, and nothing changes that fact.


Except that I didn't buy a 2080 Ti. I bought a 2080 and spent less than $800 for it. What was that about facts?

Here's a set of facts:
1. Every new generation of GPUs gets a price increase.
2. Every generation of GPUs offers a performance increase.
3. People always complain about said price increase while minimizing or ignoring the increase in performance.


----------



## Vayra86 (Dec 31, 2018)

lexluthermiester said:


> That's an opinion, not a merit based fact.
> 
> Also an opinion not based on merit. Yes, I paid a premium, but I also got a card which performs 50-60% better than its previous-gen counterpart, which I traded up from. That is an improvement and certainly not "yesteryear's" performance.
> 
> ...



It's an improvement, you've got a great deal, Turing is awesome. Happy? Opinions are like assholes, eh?



lexluthermiester said:


> Everyone is complaining about price. Is it really that expensive? And if so, is it really that difficult to save money for an extra month or two?



This is called baiting, by the way, so be careful about calling out the trolls here, that might backfire on you.



M2B said:


> "3-year-old performance"
> So you were expecting a $350 GPU with 1080 Ti performance on the same node?
> Come on, I know you know better than that.



Did I ever say that? I'm just saying we've been standing still for years now, so paying the same price for the same performance isn't a good deal. You could have done that 3 years ago, and in the meantime, games and the performance they demand have been going up, not down. Therefore, it's simply a worse deal at this point in time. Nothing more, nothing less. You can sugar-coat that with all sorts of nonsense, but it's still what it is, and the vast majority can see that too. Pulling it out of context doesn't change it either.


----------



## lexluthermiester (Dec 31, 2018)

Vayra86 said:


> Its an improvement, you've got a great deal


Yes, it is an improvement, but I didn't say I got a "great deal". I just said I'm ok with it and that everyone complaining about the prices is whining needlessly and fruitlessly.


Vayra86 said:


> Opinions are like assholes, ey


Except that in this debate, my "opinions" are based on merit and fact.


----------



## Tsukiyomi91 (Dec 31, 2018)

$350. That's the supposed price tag for the reference card, eh? AIBs will have no issues marking that price up with their own versions of the GPU, north of $400 or higher, depending on whether they got their hands on the TU106-300-A1 variant, bump up the core/memory clocks, slap on 2.5- or triple-slot coolers with RGB all over, etc. With that said, the specs are decent, but I've never once seen a mid-range GPU with that many tensor cores and RT cores.


----------



## Vayra86 (Dec 31, 2018)

lexluthermiester said:


> Yes, it is an improvement, but I didn't say I got a "great deal". I just said I'm ok with it and that everyone complaining about the prices is whining needlessly and fruitlessly.
> 
> Except that in this debate, my "opinions" are based on merit and fact.



It is as needless and fruitless as you saying 'save a few more months' or asking rhetorical questions.

And no, your opinions are just opinions like every other. When it comes to facts, you've paid full price for 3-year-old performance, which makes it relatively expensive and therefore logical for _other_ people to feel ripped off, which is why many say they won't pay this price. That is all. Move on.


----------



## lexluthermiester (Dec 31, 2018)

Tsukiyomi91 said:


> AIBs will have no issues marking up that price with their own version of the GPU, up north of $400 or higher


Very likely. It's also equally likely that many AIBs will have offerings that come in lower than the MSRP. That happened with the 2070/2080/2080 Ti, so it seems reasonable that it will happen with the 2060/2050(?).



Vayra86 said:


> you've paid full price for 3 year old performance


Wrong, and every benchmark showing performance numbers bears that out as fact. The 2080 cleanly beats the 1080, and beats the 1080 Ti where it doesn't match it. RTX also offers advancements Pascal cannot. The 2080/2080 Ti and RTX Titan are the best on the market. NVidia knows this and demands a premium price for it. If you don't want to pay that price, ok, don't buy one. Settle for less.


Vayra86 said:


> which makes it relatively expensive and therefore logical for _other_ people to *feel* ripped off


Key word there..


----------



## Tsukiyomi91 (Dec 31, 2018)

It's possible. One thing that may happen is that AIBs such as Zotac, MSI, etc. sell the upcoming 2060 at slightly lower than the expected MSRP, for those who want to get their hands on one without caring much about extras such as RGB, over-the-top heavy air coolers, and whatnot.


----------



## Vayra86 (Dec 31, 2018)

lexluthermiester said:


> Very likely. It's also equally likely that many AIB's will have offering that come in lower than $350. That happened with 2070/2080/2080ti, so it seems reasonable that it will happen with the 2060/2050(?).
> 
> 
> Wrong, and every benchmark showing performance numbers bare that out as fact. The 2080 cleanly beats out the 1080 and beats out 1080ti if it doesn't match it. Also RTX offers advancements Pascal can not. The 2080/2080ti and RTX Titan are the best on the market. NVidia knows this and demands a premium price for it. If you don't want to pay that price, ok, don't buy one. Settle for less.
> ...



Try again with those facts of yours. The cards are equal, and the 2080 never cleanly beats a 1080 Ti. So the fact remains: it's yesterday's performance at full price, or more than full price, because 1080 Tis were cheaper at multiple points in the recent past. The only situations where it 'wins' are those with 20 FPS minimums or sub-30 FPS averages @ 4K.


Said it before: your comparison is pitting an FE 1080 Ti against AIB 2080s. NVIDIA played that very well. It would be good to realize it. Note how the 2080 OC gains literally zero FPS.

It's alright though, cognitive dissonance is a bitch, isn't it?


----------



## Nxodus (Dec 31, 2018)

You can't expect every new generation to be 100% faster than the last one; technology slows down naturally after hitting a peak. I really don't understand why people think the 20 series is a rip-off. You have high expectations for the upcoming AMD cards, but it will be the same thing, a minor improvement, and die shrinks won't change that. Better get used to the fact that we've hit a peak and new generations will only bring a "minor" improvement over the last ones.


----------



## Manu_PT (Dec 31, 2018)

lexluthermiester said:


> Very likely. It's also equally likely that many AIB's will have offering that come in lower than $350. That happened with 2070/2080/2080ti, so it seems reasonable that it will happen with the 2060/2050(?).
> 
> 
> Wrong, and every benchmark showing performance numbers bare that out as fact. The 2080 cleanly beats out the 1080 and beats out 1080ti if it doesn't match it. Also RTX offers advancements Pascal can not. The 2080/2080ti and RTX Titan are the best on the market. NVidia knows this and demands a premium price for it. If you don't want to pay that price, ok, don't buy one. Settle for less.
> ...



The RTX 2080 doesn't beat the GTX 1080 Ti, and while it is better than the 1080, it is also way more expensive. Yeah, it has ray tracing, which 0.1% of games use.

The price-vs-performance ratio of this 2xxx generation is worse than the last four gens'. Facts.


----------



## TheOne (Dec 31, 2018)

With so many rumored variants it's hard to guess where the prices on these GPU's will fall.


----------



## lexluthermiester (Dec 31, 2018)

Manu_PT said:


> The RTX 2080 doesn't beat the GTX 1080 Ti


Yes, it does, in many games. See below..


Manu_PT said:


> is also way more expensive.


When I looked at buying mine, the 1080 Tis were all $50 to $100 more expensive. Soooo, no.


Vayra86 said:


> Try again, with those facts of yours.


Try these for facts from TPU's own reviews;
https://www.techpowerup.com/reviews/ASUS/GeForce_RTX_2070_Strix_OC/31.html





Or here;
https://www.techpowerup.com/reviews/Zotac/GeForce_RTX_2080_AMP_Extreme/31.html





My EVGA 2080 performs similarly to this Zotac.
These are provable performance facts from the reviewers on this very site. You were saying?


----------



## Vayra86 (Dec 31, 2018)

lexluthermiester said:


> Yes, it does, in many games. See below..
> 
> When I looked at buying mine the 1080ti's were all $50 to $100 more expensive. Soooo, no.
> 
> ...







Move on. Please. You're making a fool of yourself.


----------



## kastriot (Dec 31, 2018)

Well, I guess at $350 or 300 euros it would be a good card, lacking competition from AMD atm.


----------



## lexluthermiester (Dec 31, 2018)

Vayra86 said:


> Move on. Please. You're making a fool of yourself.


I just proved you wrong, with fact based merit and I'm making a fool of *myself*? Keep telling yourself that..


----------



## Vayra86 (Dec 31, 2018)

lexluthermiester said:


> I just proved you wrong, with fact based merit and I'm making a fool of *myself*? Keep telling yourself that..



Do you not read? Do you not interpret the data you are presented and present yourself? TPU reviews a stock/FE 1080 Ti versus a nearly maxed-out 2080. I said this when you bought your 2080. I said it here. I showed you sources that underline it.

Again: cognitive dissonance is at full force with you; you even fail to read when I highlighted and repeated the text right in front of you. Instead you happily ignore that and proceed to present the same data as 'fact-based merit'. Yes, those numbers are facts; now check what they're based on. Hell, even *TPU's testing shows every RTX card within 2-3% stock vs OC.*

My oh my. Remember that topic where you completely miscalculated millisecond delays? You needed four posts for the penny to drop. Just go back, take a deep breath, and read the whole thing start to finish. Try harder, you can do it!

Another example of your cognitive dissonance is saying 'when I bought my 2080, the 1080 Ti was more expensive'. That just means your timing sucks. Not that you suddenly had a good deal because you waited too long. Yesterday's performance.


----------



## FYFI13 (Dec 31, 2018)

Ignorance is hard with this one. Ignored for good.


----------



## Manu_PT (Dec 31, 2018)

Vayra86 said:


> Do you not read? Do you not interpret the data you get presented and present yourself? TPU reviews a stock/FE 1080ti versus a nearly maxed out 2080. I said this when you bought your 2080. I said it here. I showed you sources that underline that.
> 
> Again: cognitive dissonance is at full force with you, you even fail to read when I highlighted and repeated the text right in front of you. Instead you happily ignore that, and proceed to present the same data as 'fact based merit'. Yes those numbers are facts, now check what they're based on. Hell - even *TPUs testing show every RTX card within 2-3% stock vs OC.*
> 
> ...



Also, according to his own logic, no one knows why he didn't buy an RTX 2080 Ti instead, because after all:



lexluthermiester said:


> is it really that difficult to save money for an extra month or two?



Why, as a consumer, complain about product pricing? Is it that difficult to save money for 2 extra months? Logic of the day.


----------



## Vya Domus (Dec 31, 2018)

lexluthermiester said:


> Everyone is complaining about price. Is it really that expensive? And if so, is it really that difficult to save money for an extra month or two?



You know the answer to that question all too well: $350 for an x60 card is nuts, and no, not everybody can do that. If it were that simple, this wouldn't be a matter of discussion. I have never seen a weaker, more devoid-of-logic argument than this one. I am sorry, but I can't help noticing how you always manage to find an excuse or explanation every time the value of this new lineup of cards is questioned. Incidentally, you own one of those cards too, hmm.


----------



## lexluthermiester (Dec 31, 2018)

Vayra86 said:


> Again: cognitive dissonance is at full force with you; you even fail to read when I highlighted and repeated the text right in front of you. Instead you happily ignore that and proceed to present the same data as 'fact-based merit'. Yes, those numbers are facts; now check what they're based on. Hell, even *TPU's testing shows every RTX card within 2-3% stock vs OC.*
> 
> My oh my. Remember that topic where you completely miscalculated millisecond delays? You needed four posts for the penny to drop. Just go back, take a deep breath, and read the whole thing start to finish. Try harder, you can do it!
> 
> Another example of your cognitive dissonance is saying 'when I bought my 2080, the 1080 Ti was more expensive'. That just means your timing sucks. Not that you suddenly had a good deal because you waited too long. Yesterday's performance.


And like that conversation, you're ignoring and/or overlooking context. Or perhaps it went completely over your head, like a Concorde at Mach 2. Try re-reading and paying close attention to the choices of vocabulary.


FYFI13 said:


> Ignorance is hard with this one. Ignored for good.


Ok then, Bye bye.


Vya Domus said:


> You know the answer to that question all too well, $350 for an x60 card is nuts and no, not everyone can do that.


I'm not saying it's a great price. I'm saying that the performance offered by the entire RTX line justifies the price in NVidia's eyes. As they set the prices and call the shots, the choice is to save up and get one, or not. Everyone can save money. If a person can't afford it, perhaps that person needs to re-examine their priorities.


Vya Domus said:


> Incidentally you own one of those cards too, hmm.


That's right. And I have no problem with the price I paid. The performance increase justifies it to me.


----------



## Vayra86 (Dec 31, 2018)

lexluthermiester said:


> And like that conversation, you're ignoring and/or overlooking context. Or perhaps it went completely over your head, like a Concorde at Mach 2. Try re-reading and paying close attention to the choices of vocabulary.
> 
> Ok then, Bye bye.
> 
> ...



Cool. So we've established it's not a great price. So it's also not a great deal. You're happy with that. The performance of yesterday is justifiable to you at that price, even though you could've had it much cheaper if you'd timed it right. Thank you. Let's move on.

*Note: you needed a full page this time instead of four posts.


----------



## Vya Domus (Dec 31, 2018)

lexluthermiester said:


> I'm saying that the performance offered by the entire RTX line justifies the price in NVidia's eyes.



That's not even worth mentioning; this matters to absolutely *no one*. No one buys something thinking to themselves, "Yeah, this pile of cash is totally justified because Nvidia thinks so".




lexluthermiester said:


> If a person can't afford it, perhaps that person needs to re-examine their priorities.



Shifting the entire weight of the argument onto the buyer shows you have no footing in your logic. Give it up; you have nothing to back your opinion up with. I don't understand why you are being so adamant about sticking by it: no one has a problem with what you buy and how great a value it was, but you are trying to turn this into a fact of sorts.


----------



## Vayra86 (Dec 31, 2018)

Vya Domus said:


> That's not even worth mentioning, this matters to absolutely *no one*. No one buys something thinking to themselves, "Yeah this pile of cash is totally justified because Nvidia thinks so".
> 
> 
> 
> ...



You know I'd drop this if it wasn't so much fun. Now it's NVIDIA's price that is justifying the buyer's purchase, and I'm not reading context.

What's next? Inflation?


----------



## Turmania (Dec 31, 2018)

1070 Ti performance, not bad at all. But it depends on the price, I suppose. If $349 is for the Founders Edition, we can expect $250 to $300 for standard models. The 8-pin leads me to believe this will be higher than the 1060's consumption, which was around 120 W; maybe this will consume 150 W.


----------



## lexluthermiester (Dec 31, 2018)

Vya Domus said:


> No one buys something thinking to themselves, "Yeah this pile of cash is totally justified because Nvidia thinks so".


You're right. They buy it thinking: "Yeah, I want this level of performance and I'm willing to spend this money to get it."


Vya Domus said:


> Shifting the entire weight of the argument on the buyer shows you have no footing in your logic.


Your opinion. My logic is not only that of a buyer but also that of a retailer. And I'm building systems almost daily for people who want premium performance and are willing to pay for it. And most of the time it's for people who want the 2080 Ti. People arguing that these prices are too high are clearly out of touch with the buying public.



Vayra86 said:


> You know I'd drop this if it wasn't too much fun.


Right?


----------



## Vayra86 (Dec 31, 2018)

lexluthermiester said:


> People arguing that these prices are too high are clearly out of touch with the buying public.







Please, go play a game or something. I might die over here

*Note how the curve has already plateaued and is no longer rising as it did at launch. Perhaps by 2020 it'll be 2%.


----------



## lexluthermiester (Dec 31, 2018)

Vayra86 said:


> Please, go play a game or something. I might die over here
> 
> *Note how the curve has already plateaued and is no longer rising as it did at launch. Perhaps by 2020 it'll be 2%.


Oh please. You can not be serious with that...


----------



## Vya Domus (Dec 31, 2018)

lexluthermiester said:


> You can not be serious with that...



It's as serious as your claim that you cannot complain about pricing.


----------



## lexluthermiester (Dec 31, 2018)

Vya Domus said:


> It's as serious as your claim that you cannot complain about pricing.


Oh, we can complain 'til the cows come home, but what good is it going to do? You think NVidia will change their minds?
So again, if someone wants the performance, they have to pay the price. If not, get something else.


----------



## Vya Domus (Dec 31, 2018)

lexluthermiester said:


> So again, if someone wants the performance, they have to pay the price. If not get something else..



Not again, they can just save up, remember ?


----------



## lexluthermiester (Dec 31, 2018)

Vya Domus said:


> Not again, they can just save up, remember ?


Of course..


----------



## Pumper (Dec 31, 2018)

kastriot said:


> Well, I guess at $350 or 300 euros it would be a good card, lacking competition from AMD atm.



On what planet was a $350 USD GPU ever cheaper in euros? It will be 350€ at the very least, and more likely closer to 375€.
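The arithmetic backs this up: US MSRPs exclude sales tax, while EU shelf prices include VAT. A rough sketch; the exchange rate and the 19% VAT figure are illustrative assumptions, not official pricing:

```python
usd_msrp = 349.99      # US MSRP, excludes sales tax
usd_per_eur = 1.14     # assumed late-2018 exchange rate
vat = 1.19             # assumed 19% VAT (e.g. Germany)
eur_street = usd_msrp / usd_per_eur * vat
print(round(eur_street, 2))  # ≈ 365.34
```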


----------



## jabbadap (Dec 31, 2018)

Hmh, this being mainly a Full HD card, 65 FPS in BFV at 1080p with RTX on medium does not sound too bad. Maybe this is the most balanced Turing card after all. I kind of expected $399 for the FE and a little less for the vanilla card. So $349.99 for both, while still a _*lot*_ for an xx60 card, is not overpriced compared to the perf/$ of Pascal... And before anyone starts, I'm not counting sales/rebates or the used-card market.


----------



## cucker tarlson (Dec 31, 2018)

eidairaman1 said:


> It truly isn't worth 300 usd or more


It isn't, but everyone's jacking up prices these days, and midrange NVIDIA cards are no exception. I'd rather have this at $350 than the RX 590, a second refresh of Polaris with GDDR5, at a friggin' $280, no question about that. If the leak is true, the 2060 is 35-40% faster while more efficient at the same time.


----------



## efikkan (Dec 31, 2018)

Let's judge the value when we see the pricing and actual performance.
The performance level suggested in the "leak" puts this close to GTX 1080, and also Vega 64. If this is accurate, and it's priced below Vega 64, it will be a much better buy.


----------



## lexluthermiester (Dec 31, 2018)

efikkan said:


> Let's judge the value when we see the pricing and actual performance.
> The performance level suggested in the "leak" puts this close to GTX 1080, and also Vega 64. If this is accurate, and it's priced below Vega 64, it will be a much better buy.


Agreed.


----------



## Vayra86 (Dec 31, 2018)

Nxodus said:


> You can't expect every new generation to be 100% faster than the last one; technology slows down naturally after hitting a peak. I really don't understand why people think the 20 series is a rip-off. You have high expectations for the upcoming AMD cards, but it will be the same thing, a minor improvement, and die shrinks won't change that. Better get used to the fact that we've hit a peak and new generations will only bring a "minor" improvement over the last ones.



It's quite simple: perf/dollar. It used to *improve* with every generation, and with Turing, it does not. And that is bad, seeing as we've already been looking at Pascal's performance for quite some time now. Turing is relatively late, and advances nothing in terms of absolute performance. It just added a new $1,200 tier on top of the stack.

As an added bonus you get functionality that is currently implemented in ONE game and never really makes a difference to its overall experience, and the majority of the Turing product stack hasn't even got enough RT cores to make proper use of it.

As my sources point out, if you read the numbers right, it's _not even a minor improvement_ at all. GPU performance increases have ground to a complete halt at every price point except $1,200. All to facilitate some new feature that nobody knows whether it will be vaporware in a few years or not.

*Now, the most important part of all this is what you've just said yourself*: it's not realistic to keep expecting massive performance jumps ad infinitum. In that vein, isn't it especially strange for NVIDIA to _spend 30% of the die space *not* on absolute performance_, but on some new feature that only works for a tiny selection of games?

There is only one logic behind all of this, and that is trying to maximize the cash flow generated from already-deployed tech. Turing's RT is no more than repurposed Volta technology, and NVIDIA just threw it at the gaming wall to see if it sticks. So to me, the only real question here is "Do you want to support that business practice?" or not. I don't, because I know it will make future GPUs that much more expensive to make, for questionable benefits.
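The perf/dollar point above can be made concrete with a small helper; the performance index and price figures below are placeholders to show the mechanics, not measured review data:

```python
def perf_per_dollar(perf_index: float, price_usd: float) -> float:
    """Relative performance units bought per dollar."""
    return perf_index / price_usd

# Hypothetical: a new card offering the same performance index at the
# same price as a 3-year-old card yields a flat perf/dollar ratio,
# i.e. no generational improvement.
gain = perf_per_dollar(100, 350) / perf_per_dollar(100, 350)
print(gain)  # 1.0 -> no perf/dollar gain
```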



efikkan said:


> Let's judge the value when we see the pricing and actual performance.
> The performance level suggested in the "leak" puts this close to GTX 1080, and also Vega 64. If this is accurate, and it's priced below Vega 64, it will be a much better buy.



I agree. If it provides 1080/V64 perf at $350 (and not the hamstrung bad-VRAM/bus choice or shitty chip-version option... yeah, that's a lot of caveats Nvidia has built into Turing these days!), then we have an actual perf/dollar improvement. Minor, but present.

We all know, however, that won't be the case, because the 2070 already takes that spot. So instead of fooling ourselves with some weird, illogical 'leaks' that misrepresent actual performance (as I've pointed out with the 1080 Ti stock vs 2080 OC examples), let's just get real.


----------



## lexluthermiester (Dec 31, 2018)

Gonna borrow from another conversation for a moment with this breaking news story:

Some are proceeding with logic based on the idea of comparing a 2060 to a 1070, a 1080 to a 2070, a 1080 Ti to a 2080, and a Titan to a 2080 Ti.

The rest of us are not doing that. We're comparing a 1060 to a 2060, a 1070 to a 2070, a 1080 to a 2080, a 1080 Ti to a 2080 Ti, and a GTX Titan to an RTX Titan.

We now return you to your normally scheduled conversation..


----------



## Easy Rhino (Dec 31, 2018)

yup, looks like a graphics card.


----------



## cucker tarlson (Dec 31, 2018)

2060 is gonna be 50-60% faster than 1060 if the leak is true. That's the biggest perf improvement out of the whole turing line-up,most are around 40%.If this is close to 1080 performance,then consider that 1080 runs 1440p faster than 1060 does 1080p.


----------



## eidairaman1 (Dec 31, 2018)

lexluthermiester said:


> That's an opinion, not a merit based fact.
> 
> Also an opinion not based on merit. Yes, I paid a premium, but I also got a card which performs 50%-60% better than its previous-gen counterpart, which I traded up from. That is an improvement and certainly not "yesteryear's" performance.
> 
> ...



Yeah, I did budget to get the 290 VaporX in 2014 because I was between jobs then. In fact I budgeted that entire year to build the rig that is in my signature, because of bills, life, etc. I am just thinking of other users; $300 for a low-end card is ridiculous when it should be no more than $250. I can see a 2070/Ti being $300-350.


----------



## Vayra86 (Dec 31, 2018)

cucker tarlson said:


> The 2060 is gonna be 50-60% faster than the 1060 if the leak is true. That's the biggest perf improvement out of the whole Turing line-up; most are around 40%. If this is close to 1080 performance, then consider that the 1080 runs 1440p faster than the 1060 does 1080p.



Doesn't make sense though, because that would cannibalize the 2070, and we know that isn't Nvidia's style at all.


----------



## eidairaman1 (Dec 31, 2018)

PS: I vote with my wallet, because I am still on my 290 VaporX. I also don't advocate for a company with the color of greed.


----------



## Easy Rhino (Dec 31, 2018)

Every time a new lineup is released people complain about the pricing/value.


----------



## ensabrenoir (Dec 31, 2018)

...you know that no matter the facts or what we think... Nvidia will sell a ton of these. And yes, of course it's overpriced... it's called business and profit. They seem to have a good handle on what the market will bear. The mythical $350 high-end video card... we'll have to wait and see just how disruptive Intel wants to be. But all in all, don't complain about the price of Porsches when your intent is to only buy a Mustang. There is a model for everyone. Buy what makes you happy.


----------



## Vayra86 (Dec 31, 2018)

Easy Rhino said:


> Every time a new lineup is released people complain about the pricing/value.



And every time sales will determine whether that was justified or not. So far, market share doesn't show RTX to be a smash hit, and neither does Nvidia's stock price.

Compare that to Pascal - we also complained about it, but we all got one. Even under further inflated mining prices.


----------



## lexluthermiester (Dec 31, 2018)

Easy Rhino said:


> Every time a new lineup is released people complain about the pricing/value.


Thank You. 



Vayra86 said:


> So far, market share doesn't show RTX to be a smash hit, and neither does Nvidia's stock price.


And yet they regularly sell out everywhere. Funny, that..


----------



## cucker tarlson (Dec 31, 2018)

Vayra86 said:


> Doesn't make sense though, because that would cannibalize the 2070, and we know that isn't Nvidia's style at all.


The 2070 is still faster than the 1080. TPU says 20%, but I trust the Hardware Unboxed review more, and it shows 7%. The 2060 is gonna be close to 15% slower than the 2070. That's a difference. Even the 1070 Ti and 1080 coexist nicely with just under 10% and $100 between them. Remember the 2070 has that extra 2 GB too.
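Those percentage gaps chain multiplicatively, not additively. A quick sanity check, using the 7% and 15% figures above (thread estimates, not measured benchmarks):

```python
# Thread estimates (assumptions, not measured benchmarks):
#   RTX 2070 ~7% faster than GTX 1080
#   RTX 2060 ~15% slower than RTX 2070
gtx_1080 = 1.00                    # GTX 1080 as the baseline
rtx_2070 = gtx_1080 * 1.07         # +7% over the 1080
rtx_2060 = rtx_2070 * (1 - 0.15)   # -15% off the 2070

# Chaining the deltas puts the 2060 just below a GTX 1080:
print(f"RTX 2060 vs GTX 1080: {rtx_2060 / gtx_1080:.2%}")  # prints "RTX 2060 vs GTX 1080: 90.95%"
```

So under these numbers the 2060 would land at roughly 91% of a GTX 1080, close to the 1070 Ti territory the leaks suggest.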


----------



## lexluthermiester (Dec 31, 2018)

cucker tarlson said:


> Remember 2070 has that extra 2gb too.


That's a good point.


----------



## SIGSEGV (Dec 31, 2018)

Easy Rhino said:


> Every time a new lineup is released people complain about the pricing/value.



Not so much in the past. This time it's worse, and out of control.


----------



## jabbadap (Dec 31, 2018)

cucker tarlson said:


> The 2070 is still faster than the 1080. TPU says 20%, but I trust the Hardware Unboxed review more, and it shows 7%. The 2060 is gonna be close to 15% slower than the 2070. That's a difference. Even the 1070 Ti and 1080 coexist nicely with just under 10% and $100 between them. Remember the 2070 has that extra 2 GB too.



Which means more memory bandwidth and more ROPs too.

Techspot uses an OC'd custom GTX 1080 (MSI Gaming X) while TPU uses a GTX 1080 FE, so in that way I trust TPU more; it's a more apples-to-apples comparison. That EVGA Black review from TPU is a stock-to-stock-clocked review on both sides.


----------



## EarthDog (Dec 31, 2018)

Easy Rhino said:


> Every time a new lineup is released people complain about the pricing/value.


Not every time. Not every time was like this, however. The fact is many are dissatisfied with the price-to-performance ratio these cards put forth. The problem is the value add (RT/TC) isn't there yet. One AAA title with another to come soon (SOTR)... and well over a dozen more listed. NV is being vilified for being first to market with the technology and the corresponding price increase.

The humorous part, to me, is that the same people shooting down RT are the same people that propped up AMD and Vulkan, pleading to give it time to see the value. While an API and HW are different things, I'm sure the astute are able to see the irony there.

People... we can't have RT without cards being in the market. SOMEONE had to be first... it just happened to be NV. I don't think anyone is buying these cards for RT capabilities now, but for performance over last gen. And while it isn't what they were used to (likely due to RT and TC), there are notable differences in each tier (1080 Ti to 2080 Ti, etc.). Sure, it isn't what it was, but that's the price we all pay for innovation. How would we see RT implemented well if not for putting it in the market?

Pricing was a kick in the pants. Because the market adjusted, for better or worse, a lot of these cards are now a no brainer over last gen. It is what it is.


Anyway, looking forward to seeing facts about performance and pricing on the RTX 2060 and deciding then. Three pages of this discussion... jeezus, lol.

Edit: the source in the first post doesn't have those results... where are those from?


----------



## xkm1948 (Dec 31, 2018)

Oh my, so much hate for something yet to be released.

So yeah, we all want RTX Titan performance at no more than $250 and great overclockability. That won't happen unless there is competition.


----------



## Rahnak (Dec 31, 2018)

lexluthermiester said:


> Some are proceeding with logic based on the idea of comparing a 2060 to a 1070, a 1080 to a 2070, a 1080 Ti to a 2080, and a Titan to a 2080 Ti.



Ideally they should match the last gen's next tier up, especially if you're gonna bump up the price.

I recently looked at the prices I paid for my CPU and GPU back when I built my PC in 2011, and I was pretty shocked: 185€ for Intel's top midrange CPU and 170€ for a 560 Ti. Compare that to today's pricing and midrange is considerably more expensive. And then you look over at console pricing and it's a lot tamer. Sometimes it looks like they are trying to kill the desktop market.


----------



## Vayra86 (Dec 31, 2018)

cucker tarlson said:


> The 2070 is still faster than the 1080. TPU says 20%, but I trust the Hardware Unboxed review more, and it shows 7%. The 2060 is gonna be close to 15% slower than the 2070. That's a difference. Even the 1070 Ti and 1080 coexist nicely with just under 10% and $100 between them. Remember the 2070 has that extra 2 GB too.



The 1070 Ti and 1080 only coexist nicely because 1080s suddenly became hard to find at reasonable pricing after the 1070 Ti launched.

Nvidia controls the supply chain very well - that overstock on Pascal cards? It was never an issue, and might have even been a contingency for Turing. They also showed with their FEs that they know exactly what buttons to push to get AIBs to get in line.



EarthDog said:


> Not every time. Not every time was like this, however. The fact is many are dissatisfied with the price-to-performance ratio these cards put forth. The problem is the value add (RT/TC) isn't there yet. One AAA title with another to come soon (SOTR)... and well over a dozen more listed. NV is being vilified for being first to market with the technology and the corresponding price increase.
> 
> The humorous part, to me, is that the same people shooting down RT are the same people that propped up AMD and Vulkan, pleading to give it time to see the value. While an API and HW are different things, I'm sure the astute are able to see the irony there.
> 
> ...



I like this reading of the whole Turing deal to be honest with you. It all hinges on whether DXR will stick or not. So far I'm not convinced...


----------



## lexluthermiester (Dec 31, 2018)

SIGSEGV said:


> Not so much in the past. This time it's worse, and out of control.


I can think of at least three other times where the percentage jump in price was comparable to this one: the GeForce 2 to GeForce 3, the FX 5x00 series to the 6x00 series, and the 9x00 series to the GTX 200 series. And that's just Nvidia. ATI had two instances where the generational price jump was huge. ATI justified it because of vertex shaders and such, as did Nvidia. 3dfx did one as well: the Voodoo 2 was twice the price of the Voodoo 1.


xkm1948 said:


> Oh my, so much hate for something yet to be released.


That's what I was thinking..


----------



## sepheronx (Dec 31, 2018)

EarthDog said:


> Not everytime.  Not everytime was like this, however. The fact is many are dissatisfied with the price to performance ratio these cards put forth. The problem is the value add (RT/TC) isnt there yet. One AAA title with another to come soon (SOTR)...and well over a dozen more listed. NV is being vilified for being first to the market with technology and the corresponding price increase.
> 
> The humerous part, to me, is the same people shooting down RT are the same people that propped up AMD and Vulkan pleading to give it time to see the value. While an API and HW are different things, I'm sure the astute are able to see the irony there.
> 
> ...



I am excited for more games to support Vulkan.

Anyway, I hope this also forces AMD to do something as well.  But we will wait a while.  And I hope prices eventually drop unlike what happened for the last 2 - 3 years.


----------



## lexluthermiester (Dec 31, 2018)

Rahnak said:


> Sometimes it looks like they are trying to kill the desktop market.


Make no mistake, if they could, they would.


----------



## EarthDog (Dec 31, 2018)

sepheronx said:


> I am excited for more games to support Vulkan.


Vulkan reminds me of PhysX... it is there and can be useful in some titles... but isn't a big player.

Vulkan really doesn't seem to have taken hold since its release. Not sure how it can gain traction. Time isn't on its side.


----------



## jabbadap (Dec 31, 2018)

sepheronx said:


> I am excited for more games to support Vulkan.
> 
> Anyway, I hope this also forces AMD to do something as well.  But we will wait a while.  And I hope prices eventually drop unlike what happened for the last 2 - 3 years.



If one uses only Linux for gaming (as I do), every working DX11 Windows-only title uses Vulkan. So yes, it already is a huge success among the Penguin people.

E.g., both oldish AAA games The Witcher 3 and GTA V work very well with DXVK.


----------



## Vayra86 (Dec 31, 2018)

EarthDog said:


> Vulkan reminds me of PhysX... it is there and can be useful in some titles... but isn't a big player.
> 
> Vulkan really doesn't seem to have taken hold since its release. Not sure how it can gain traction. Time isn't on its side.



I think the only saving grace for Vulkan is widespread ARM adoption and it gaining traction there.

(I like where this topic is going now, by the way.)



lexluthermiester said:


> And yet they regularly sellout everywhere. Funny that..



It's not surprising for a new gen to sell out in the early days. You saw my UserBenchmark adoption-rate graph: it peaks, then it plateaus. It's the same with every new product release and doesn't tell us much, because the product is still scarce. There have been multiple press releases about delays and limited stock.


----------



## sepheronx (Dec 31, 2018)

EarthDog said:


> Vulkan reminds me of PhysX... it is there and can be useful in some titles... but isn't a big player.
> 
> Vulkan really doesn't seem to have taken hold since its release. Not sure how it can gain traction. Time isn't on its side.



Vulkan was simply a replacement for DX12, meant to carry a lot of the same functions but work well with Linux. Titles that have it run well and look nice with it. But you are right, it is still highly limited. Although with Valve's backing, I hope it does improve more. I imagine Intel will show more interest too, now that they have decided to join the dGPU market for 2020.


----------



## eidairaman1 (Dec 31, 2018)

EarthDog said:


> Vulkan reminds me of PhysX... it is there and can be useful in some titles... but isn't a big player.
> 
> Vulkan really doesn't seem to have taken hold since its release. Not sure how it can gain traction. Time isn't on its side.



Vulkan is a 3D API akin to D3D; PhysX is physics only.


----------



## EarthDog (Dec 31, 2018)

eidairaman1 said:


> Vulkan is a 3D API akin to D3D; PhysX is physics only.


Ummhmm. Not sure what your point is, however.

My point was to share things that didn't exactly take hold or make as big of waves as they could have in the market. Details aren't really relevant here.


----------



## eidairaman1 (Dec 31, 2018)

EarthDog said:


> Ummhmm. Not sure what your point is, however.



What I'm getting at is that Vulkan is a full-fledged API alternative to DX; PhysX is merely like RTX, not a whole lot added to the graphics of anything.


----------



## EarthDog (Dec 31, 2018)

Ummhmm. See my edit.

Details... not relevant here. But I appreciate the extra info for any who may not know.



....hey look a balloon!!!....drifts away from the subject..........


----------



## lexluthermiester (Dec 31, 2018)

Vayra86 said:


> You saw my userbenchmark adoption rate graph, it peaks, then it plateaus.


Didn't look too closely at it as it was cut off.


----------



## Vayra86 (Dec 31, 2018)

lexluthermiester said:


> Didn't look to closely at it as it was cut off.



It's nice to look at to get insight into what's happening. There is a clear trend even with the lower RTX GPUs: relatively low user ratings compared to Pascal, and an early surge of sales followed by stagnation.






https://gpu.userbenchmark.com/


----------



## jabbadap (Dec 31, 2018)

eidairaman1 said:


> What I'm getting at is that Vulkan is a full-fledged API alternative to DX; PhysX is merely like RTX, not a whole lot added to the graphics of anything.



Of course it should not add a lot to graphics; it's physics middleware. It adds to physics simulations to make them feel more true to life. Audio is audio, graphics are graphics, and physics is physics. Ray tracing is replacing rasterization in the rendering pipeline, so it's a new (very old), more realistic way of doing visuals.


----------



## EarthDog (Dec 31, 2018)

Vayra86 said:


> It's nice to look at to get insight into what's happening. There is a clear trend even with the lower RTX GPUs: relatively low user ratings compared to Pascal, and an early surge of sales followed by stagnation.
> 
> View attachment 113731
> 
> ...


It may not lie, but premature info can be premature here. Though I believe the trend, I'd love to see two things...

1. 9xx-series to 10xx-series adoption over the same length of time after the 10-series launch, to form a base/context. What if the 10 series was similar? We don't know..
2. This same chart a year after it's been on the market.


----------



## Vayra86 (Dec 31, 2018)

EarthDog said:


> It may not lie, but premature info is premature here. Though I believe the trend, I'd love to see two things...
> 
> 1. 9xx-series to 10xx-series adoption over the same length of time after the 10-series launch, to form a base/context.
> 2. This same chart a year after it's been on the market.



It seems the chart only goes back to March '17; still, it seems a bit different. The 1080 Ti was a steady climb until Sept. '18. The 980 Ti only lost 0.75% market share across the whole chart.

But I think the most telling part of it all is the user ratings. The people who reported here also rated their GPUs. Not a single RTX GPU gets past a 71% rating; most Pascals are much higher.


----------



## EarthDog (Dec 31, 2018)

I bet it's a steady climb... most will be. Think about it.

The proof in the pudding is adoption compared to the previous gen. Considering the blown-out-of-proportion issues, I bet it flattens a bit... but we'll see it go back up as time goes on.


It's good information... but it lacks the proper context to be worth much.


----------



## efikkan (Dec 31, 2018)

Vayra86 said:


> Its nice to look at to get insight in what's happening. There is a clear trend even with the lower RTX GPUs. Relatively low user ratings compared to Pascal, and an early surge of sales followed by stagnation.
> https://gpu.userbenchmark.com/


Market share of what? People on that webpage?
Useless statistics are, well, useless.


----------



## Vayra86 (Dec 31, 2018)

efikkan said:


> Market share of what? People on that webpage?
> Useless statistics are, well, useless.



Did you even look at the numbers behind this data? The site is free to use and access, everyone has equal opportunity to contribute, and there are no credible arguments that a specific group of enthusiasts benches more x80s than they do x70s or x80 Tis.

Statistics are always a slice of reality, and these numbers are pretty solid for statistical purposes.


----------



## Vya Domus (Dec 31, 2018)

EarthDog said:


> It may not lie, but premature info can be premature here.



Actually, that's the point. These things haven't been around for long.


----------



## efikkan (Dec 31, 2018)

Vayra86 said:


> Did you even look at the numbers behind this data? The site is free to use and access, everyone has equal opportunity to contribute, and there are no credible arguments that a specific group of enthusiasts benches more x80s than they do x70s or x80 Tis.
> 
> Statistics are always a slice of reality, and these numbers are pretty solid for statistical purposes.


Do you even know how statistics work?
I politely asked what kind of market share these represent. People on that webpage? Statistics from certain stores?

It should be clear that this does not match the real market; as you can see in the Steam Hardware Survey, the GTX 1060's market share is over 11 times greater than that of the RX 580 and RX 480 combined!


----------



## jabbadap (Dec 31, 2018)

Pumper said:


> On what planet was a $350 GPU ever lower in €? It will be 350€ at the very least, and more likely closer to 375€.



Euro prices always include VAT. So in Germany, $350 translates to 363.81€ with 19% VAT, and here in Finland it would translate to 379.09€ with 24% VAT.
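For anyone wanting to redo the math: take the pre-tax USD price, convert to euros, then add the local VAT on top. A small sketch; the 0.8735 USD→EUR rate is an assumption backed out of the figures above, roughly the late-2018 exchange rate:

```python
def eu_price(usd_sep: float, vat_rate: float, usd_to_eur: float = 0.8735) -> float:
    """Convert a pre-tax USD price (SEP/MSRP) to a VAT-inclusive EUR price.

    usd_to_eur is an assumed exchange rate, not an official figure.
    """
    return round(usd_sep * usd_to_eur * (1 + vat_rate), 2)

print(eu_price(350, 0.19))  # Germany, 19% VAT -> 363.81
print(eu_price(350, 0.24))  # Finland, 24% VAT -> 379.1
```

That reproduces the two euro figures above; a different exchange rate on launch day would shift both numbers proportionally.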


----------



## EarthDog (Dec 31, 2018)

Vya Domus said:


> Actually, that's the point. These things haven't been around for long.


Which tells us we need more info, or something to compare it to, before we can look at the data and draw those conclusions from it. Hard to say it's more or less without seeing the previous gen's numbers, right?

Again, no context... so it's hard to say there is a difference in adoption without knowing past info or having more data going forward.

Premature conclusion drawn... but I agree the writing is on the wall.


----------



## bug (Dec 31, 2018)

There's so much toxicity in this thread, I'm not even going to bother reading it all.

This card is about 33% faster than a GTX 1060 6 GB, which is an above-average generational jump. It's supposed to sell for $350 (which is a lot), but the GTX 1060 6 GB FE was already $300 almost two years ago. Non-FE cards could be had for $50 less.

The rumored variant without DXR should be an exact replacement for the GTX 1060, as far as price is concerned.


----------



## Vya Domus (Dec 31, 2018)

EarthDog said:


> Which tells us we need more info or something to compare it to before we can look at the data and draw those conclusions from it.



The information is there and you can compare it right now, a trend is a trend.


----------



## xkm1948 (Dec 31, 2018)

bug said:


> There's so much toxicity in this thread, I'm not even going to bother reading it all.
> 
> This card is about 33% faster than a GTX 1060 6 GB, which is an above-average generational jump. It's supposed to sell for $350 (which is a lot), but the GTX 1060 6 GB FE was already $300 almost two years ago. Non-FE cards could be had for $50 less.
> 
> The rumored variant without DXR should be an exact replacement for the GTX 1060, as far as price is concerned.



Exactly. Kinda bizarre that a tech forum hates tech progression.


----------



## EarthDog (Dec 31, 2018)

Vya Domus said:


> The information is there and you can compare it right now, a trend is a trend.


See edit above...

It's three months' worth of data on the 2xxx series... hard to call it anything with such a small dataset, let alone a trend.

Edit: it also isn't proven by these graphs that adoption is slow, as we do not have the data to compare last gen against this gen. While that theory is believable, extrapolating this data to form that conclusion is patently premature. Again, I can believe it, but we don't know, as the graphs don't show the info needed to reach a conclusion based on facts.


----------



## Vya Domus (Dec 31, 2018)

EarthDog said:


> See edit above...
> 
> It's three months' worth of data on the 2xxx series... hard to call it anything with such a small dataset, let alone a trend.



3 months is plenty, it's enough to compare the initial market response.


----------



## EarthDog (Dec 31, 2018)

Vya Domus said:


> 3 months is plenty, it's enough to compare the initial market response.


Sure... but we can't conclude that it's slower than other cards, because we don't have the information. Just going off what Vayra is trying to take from it...

'Initial sales boom, then stagnation.'

Did other cards show the same, so is it really different? Is it not normal to see a lot of sales out of the gate that then settle? I mean... what's the point here? I'm struggling to take much away from it, for many reasons.


----------



## jabbadap (Dec 31, 2018)

Oh Happy new year to all 









EarthDog said:


> See edit above...
> 
> It's three months' worth of data on the 2xxx series... hard to call it anything with such a small dataset, let alone a trend.



If I remember correctly, Pascal had great availability from the beginning; Turing has been very limited.

PS: nothing against your post, @EarthDog, I'm just tired of the off-topic, and that was my last post on it.


----------



## unikin (Dec 31, 2018)

I said f... it, I'm out after seeing the $350-400 price tag on the RTX 2060, and bought a Sapphire RX 570 for 140€. I won't buy a new GPU until I can get GTX 1080 level performance for 300 bucks or less. Hell, we're entering 2019 and they want to charge us the same for two-year-old performance? No way I'm buying it.


----------



## Vya Domus (Dec 31, 2018)

EarthDog said:


> but we cant conclude that its slower than other cards because we dont have the information.



Jesus Christ man, *the information is already there*; no matter how many more years you wait, the data for these first months won't change. There is absolutely nothing premature about this comparison. Anything beyond that, meaning projecting these numbers into the future, yeah, that's speculation.


----------



## EarthDog (Dec 31, 2018)

I'm not JChrist... lol

A premature conclusion without context is premature, bud. Sorry. We have no idea whether this is abnormal from the data there. I find a trend of initial sales that then slow down to be part of the normal sales life cycle of a new product... only time will tell if this is a 'trend'... the data doesn't.

We'll have to agree to disagree.


----------



## Vayra86 (Dec 31, 2018)

EarthDog said:


> Sure... but we can't conclude that it's slower than other cards, because we don't have the information. Just going off what Vayra is trying to take from it...
> 
> 'Initial sales boom, then stagnation.'
> 
> Did other cards show the same, so is it really different? Is it not normal to see a lot of sales out of the gate that then settle? I mean... what's the point here? I'm struggling to take much away from it, for many reasons.



Go on that website and filter GPUs by user rating. That should give you an impression of, and context for, the stagnation. It can also be compared independently of time, because people rate their GPU only when they bench it.

Sentiment. For Turing it has been as abysmal as DXR performance at BFV's launch. That same sentiment is reflected in Nvidia's stock value, on this forum at large, and on the big bad interwebs too. People ain't stupid; only a small percentage of them is.

This is the paradox of the crystal ball, eh. When you predict it, there is no data, and when you confirm it afterwards, people say it was 20/20 hindsight. So far, with regards to Turing, my ball has been pretty accurate from the moment somebody shouted 10 Gigarays on stage.


----------



## EarthDog (Dec 31, 2018)

Only time will tell, boys. One can try to tie correlative things, like ratings, to it... but only time will tell.

That said, I believe adoption to be a bit slower due to price and the overblown dead-card issue, but I work off facts, and these graphs don't give me (or anyone) enough information to factually make that jump. I'd like to see 9xx-to-10xx-series and 7xx-to-9xx-series rates, as well as that chart in two years, to see how adoption REALLY shakes out. It's simply a premature conclusion (but with writing on the wall).


----------



## Vya Domus (Dec 31, 2018)

EarthDog said:


> only time will tell if this is a 'trend'...the data doesnt.



It does; you really need to open your eyes and realize that a comparison over a span of three months is just as valid as one over a period of three years. No one here concluded that this has to be a trend that will carry on for years; everyone kept the comparisons within their respective time frames. It's you alone who thinks that wasn't the case.


----------



## Slizzo (Dec 31, 2018)

lexluthermiester said:


> Very likely. It's also equally likely that many AIB's will have offerings that come in lower than the MSRP. That happened with 2070/2080/2080ti, so it seems reasonable that it will happen with the 2060/2050(?).



Except MSRP, for the 2080 Ti at least, is "starting at $999". So far, only ONE SINGLE card has hit that price, on offer from EVGA: the 2080 Ti Black Edition.

As for stock being low... yeah, it's selling out, but that doesn't mean that NVIDIA is delivering a lot of GPUs to AIBs or owners. By this time in its product cycle, Pascal was easy to get. Not many RTX cards are truly that easy to get.


----------



## Vayra86 (Dec 31, 2018)

Vya Domus said:


> 3 months is plenty, it's enough to compare the initial market response.



Exactly. This was in response to 'but RTX is out of stock everywhere'. Right, so sales *plateau for a month* and today you still find cards out of stock (I checked local markets, many GPUs on long delivery times and some stores with low availability). That means they simply aren't there to begin with, ie product is scarce.

Why? Nvidia is possibly having some issues with yields; we've already seen several reports of things not being as smooth as you'd want them to be. None of this is surprising, but it is telling when it comes to how feasible Turing is in the long run. We know these dies are flippin' massive. We know that inflates the price, and we see it happening right in front of our eyes. This launch was rushed to be in time for Christmas, and missed the mark.

OR: Nvidia is holding on to their purchased production capacity for other product lines... GTX 1060 versions out of GP104 perhaps? Or the myriad of 2060's we're about to see? A rumored GTX 11xx series?

Either way, you don't do any of this when you've got a sweet, viable product that people are waiting in line for.


----------



## EarthDog (Dec 31, 2018)

Vya Domus said:


> ...a comparison over the span of 3 months is just as valid as one over the period of 3 years.


Just quoting for permanence. 

Have fun guys!

EDIT: Also, plenty of stock here in the States (Newegg/Amazon/brick-and-mortar stores like Best Buy and Micro Center).

What would they have gained by waiting? More stock, when there is currently stock (at least in the US and UK)? I mean, some models are not available, but is that because of how the card partner cut it, or NVIDIA? How does anyone know the allocations to card partners? Why can't the ones not in stock be an issue with the AIB providing the non-reference/FE boards? There can be plenty of reasons some models don't show.


----------



## Vya Domus (Dec 31, 2018)

Vayra86 said:


> That means they simply aren't there to begin with.



And there's a very good reason for that, one that has nothing to do with perf/dollar or anything of the sort. These GPU dies are massive; there is no way Nvidia can pump them out at the same rate as they did the much smaller Pascal chips, even on a mature node. This is a matter of manufacturing limitations. The new GDDR6 doesn't help either.


----------



## eidairaman1 (Dec 31, 2018)

jabbadap said:


> Of course it should not add a lot to graphics, it's physics middleware. It adds to physics simulations to make them feel more real life. Audio as audio, graphics as graphics and physics as physics. Ray tracing is replacing rasterizing on rendering pipeline, so it's a new(very old)more realistic way of doing visuals.



I like the fact that Vulkan exists, as it is not locked to Windows 10 like DX12 is, or DX11 to W8... (W7 here, because of the bugginess with 10 changing all the time).



Slizzo said:


> Except MSRP, for the 2080 Ti at least, is "starting at $999". So far, only ONE SINGLE card has hit that price, on offer from EVGA: the 2080 Ti Black Edition.
> 
> As for stock being low... yeah, it's selling out, but that doesn't mean that NVIDIA is delivering a lot of GPUs to AIBs or owners. By this time in its product cycle, Pascal was easy to get. Not many RTX cards are truly that easy to get.



Yeah, $1,000 USD for a GPU that will be underperforming in two years? No thanks. $450-500 should be the cap.


----------



## kings (Dec 31, 2018)

People compare this to the price of the GTX 1060, but forget the RX 480 was on the market at the same time to keep prices a bit lower! Basically, the closest competition to the RTX 2060 will be the RX Vega 56 (if we believe the leaks), which still costs upwards of $400-450, except for the occasional promotion.

Unless AMD pulls something out of their hat in January, $350 to $400 for the RTX 2060 will be in tune with what AMD also offers! Nvidia, with their dominant position, is not interested in disrupting the market on price/performance.


----------



## eidairaman1 (Dec 31, 2018)

kings said:


> People compare this to the price of the GTX 1060, but forget the RX 480 was on the market at the same time to keep prices a bit lower! Basically, the closest competition to the RTX 2060 will be the RX Vega 56 (if we believe the leaks), which still costs upwards of $400~$450, except for the occasional promotion.
> 
> Unless AMD pulls something out of their hat in January, $350 to $400 for the RTX 2060 will be in tune with what AMD also offers! Nvidia, with their dominant position, is not interested in disrupting the market on price/performance.



They never have


----------



## GoldenX (Dec 31, 2018)

Expensive is expensive; that's a fact and there is no argument against it. You can pay for it? Good for you, but that doesn't make it good value.
The GPU market is in the same situation the CPU market was in before Ryzen, with the difference that Nvidia can design something new like its almost/fake RTRT; it's mostly useless for now, but it's new anyway.
We need competition, be it the Nth iteration of GCN or whatever Intel is doing.

RTX fans sound just like the "hurr durr Vega is not GCN" fans.


----------



## Tsukiyomi91 (Dec 31, 2018)

With nothing from AMD to offer at the table, Nvidia has THE final say. Don't like it, don't buy 'em. Simple as that. I don't think reaching the $350 mark is a problem once you set your financial priorities straight and stop complaining. Don't have the luxury of buying one? That's fine. But don't bash others for what they like just because they have the money for it and you don't.


----------



## GoldenX (Dec 31, 2018)

Tsukiyomi91 said:


> With nothing from AMD to offer at the table, Nvidia has THE final say. Don't like it, don't buy 'em. Simple as that. I don't think reaching the $350 mark is a problem once you set your financial priorities straight and stop complaining. Don't have the luxury of buying one? That's fine. But don't bash others for what they like just because they have the money for it and you don't.


So a GPU at an iPhone price of USD 1,300 is fine, a low-to-mid-end one at 350 is fine, and complaining about rising prices is not. Great, when can I buy a Celeron for USD 1,500? I can't wait.


----------



## unikin (Dec 31, 2018)

Tsukiyomi91 said:


> With nothing from AMD to offer at the table, Nvidia has THE final say. Don't like it, don't buy 'em. Simple as that. I don't think reaching the $350 mark is a problem once you set your financial priorities straight and stop complaining. Don't have the luxury of buying one? That's fine. But don't bash others for what they like just because they have the money for it and you don't.



It's not about being able to afford it or not; it's about allowing them to rip you off or not. I have three Titan Vs (costing me nearly 10K) in my computing system, and I am not willing to buy RTX for my VR gaming PC at these prices. I want price/performance for my money. The Titan V offers good value compared to the Quadro line for my computing needs (that's why NGreedia crippled the Titan RTX now, greedy b..); RTX doesn't do the same for gaming, so I'm staying away from it like the plague and will gladly buy Vega II/Navi if it offers better bang for my buck. I don't want to be milked more than is necessary.


----------



## xkm1948 (Dec 31, 2018)

Nice to see a bunch of couch GPU designers and financial analysts who know better than a multi-million-dollar GPU company about both technology and pricing. It is called capitalism for a reason: no competition means Nvidia has free say over how much they price their cards at. You don't like it, then don't buy; good for you. Someone else likes it, they buy, and it is entirely their own business. NVIDIA is "greedy"? Sure, yeah, they'd better f*ucking be greedy. They are a for-profit company, not a f*ucking charity. How else are they able to develop new GPU designs? Should NVIDIA engineers just work out of pure love instead of feeding their families? So much moaning and whining.


----------



## GoldenX (Dec 31, 2018)

Another capitalist with the "not a charity" meme. A hundred years with the same lame excuse.
Price inflation due to lack of competition/monopoly is a good thing now?
Would you kindly be a slave somewhere else? I think Ryan is calling you.


----------



## xkm1948 (Dec 31, 2018)

GoldenX said:


> Another capitalist with the "not a charity" meme. A hundred years with the same lame excuse.
> Price inflation due to lack of competition/monopoly is a good thing now?
> Would you kindly be a slave somewhere else? I think Ryan is calling you.




I heard Soviet Russia is good for folks like you. All the GPUs you can have for free.

Oh wait, it collapsed.


----------



## GoldenX (Dec 31, 2018)

xkm1948 said:


> I heard Soviet Russia is good for folks like you. All the GPUs you can have for free.
> 
> Oh wait, it collapsed.


Apples to oranges; the Cold War already ended, catch up with the news. Tip: there was a nice crisis in the '30s.


----------



## unikin (Dec 31, 2018)

xkm1948 said:


> Nice to see a bunch of couch GPU designers and financial analysts who know better than a multi-million-dollar GPU company about both technology and pricing. It is called capitalism for a reason: no competition means Nvidia has free say over how much they price their cards at. You don't like it, then don't buy; good for you. Someone else likes it, they buy, and it is entirely their own business. NVIDIA is "greedy"? Sure, yeah, they'd better f*ucking be greedy. They are a for-profit company, not a f*ucking charity. How else are they able to develop new GPU designs? Should NVIDIA engineers just work out of pure love instead of feeding their families? So much moaning and whining.



Yep, we also used to have antitrust laws and people with the spine and balls to enforce them against duo(mono)polies; in the age of apathy we have neither anymore. That's why the West is turning into a shithole.


----------



## Vya Domus (Dec 31, 2018)

xkm1948 said:


> I heard Soviet Russia is good for folks like you. All the GPUs you can have for free.
> 
> Oh wait, it collapsed.



Loving the shitposting, keep up the good work in dumbing down the quality of this forum and lowering the level of discussion.


----------



## GoldenX (Dec 31, 2018)

unikin said:


> Yep, we also used to have antitrust laws and people with the spine and balls to enforce them against duo(mono)polies; in the age of apathy we have neither anymore. That's why the West is turning into a shithole.


Worst thing is the Black and White mentality. Oh no, you criticize my product, I should not pay my employees for their work now.


----------



## Rahnak (Dec 31, 2018)

kings said:


> People compare this to the price of the GTX 1060, but forget the RX 480 was on the market at the same time to keep prices a bit lower! Basically, the closest competition to the RTX 2060 will be the RX Vega 56 (if we believe the leaks), which still costs upwards of $400~$450, except for the occasional promotion.
> 
> Unless AMD pulls something out of their hat in January, $350 to $400 for the RTX 2060 will be in tune with what AMD also offers! Nvidia, with their dominant position, is not interested in disrupting the market on price/performance.



Maybe AMD isn't the only competition anymore..? You have €400 consoles taking their first steps into 4K already. The next console generation is most likely going to make 4K mainstream, while PC is still mostly 1080p. At this rate I don't foresee big changes in the near future, so PC gaming is probably going to lag behind consoles on resolution (considering the highest adoption rates) while costing way more. And that's pretty sad.


----------



## illli (Dec 31, 2018)

This is... quite pathetic, actually. You can buy a 1070 Ti for $350 today. When this comes out, you can buy an RTX 2060 with 1070 Ti performance for... the same price. NV priced their cards way too high this time around. The RTX 2060 should have come in at $250, the RTX 2070 at $350, and so on.


----------



## the54thvoid (Dec 31, 2018)

xkm1948 said:


> Nice to see a bunch of couch GPU designers and financial analysts who know better than a multi-million-dollar GPU company about both technology and pricing. It is called capitalism for a reason: no competition means Nvidia has free say over how much they price their cards at. You don't like it, then don't buy; good for you. Someone else likes it, they buy, and it is entirely their own business. NVIDIA is "greedy"? Sure, yeah, they'd better f*ucking be greedy. They are a for-profit company, not a f*ucking charity. How else are they able to develop new GPU designs? Should NVIDIA engineers just work out of pure love instead of feeding their families? So much moaning and whining.



This is pretty much exactly the way it is. I can't think of a single 'successful' manufacturer that produces cheap items in the absence of comparative competition.  



GoldenX said:


> Another capitalist with the "not a charity" meme. A hundred years with the same lame excuse.
> Price inflation due to lack of competition/monopoly is a good thing now?
> Would you kindly be a slave somewhere else? I think Ryan is calling you.



But they're not a bloody charity. They have shareholders who quite literally demand a high ROI. This is how the business works. Yes, it means awful prices, but that is life. And I'll add to the charity theme that seems to annoy, and say it's a graphics card to enable you to _play games_. Play games. It's a luxury. It's not bread, it's not education - you are not entitled to it.

The sense of injustice is absolutely wonderful. I'm not a fan of wealth or capitalism but I understand it. What I cannot comprehend is the attitude of people that complain so vehemently about pricing. Take a look around. Every successful company is doing it.


----------



## GoldenX (Dec 31, 2018)

the54thvoid said:


> This is pretty much exactly the way it is. I can't think of a single 'successful' manufacturer that produces cheap items in the absence of comparative competition.
> 
> 
> 
> ...


That doesn't make it right. Keep raising prices, keep tolerating it, and see what happens.
"They all do it" is not an argument, it's complacency.
Once they see their sales plummet the way their shares are plummeting now, we'll see what happens.

Complaining about stupid prices is not a "sense of injustice", it's an option we have. What are companies now, dictatorship states? "Just buy it at whatever price they ask".


----------



## Rowsol (Dec 31, 2018)

The performance numbers are surprising, faster than I expected.

As for Nvidia's pricing: they will charge as much as they can, which is currently a lot because they have no competition. If AMD or Intel ends up putting out something competitive, then maybe we'll see some price drops.


----------



## TheOne (Dec 31, 2018)

I have a feeling that if Intel and AMD can get their performance numbers up to NVIDIA's and include RT, they will price alongside them, especially Intel.


----------



## Gutterbanger (Dec 31, 2018)

lexluthermiester said:


> Everyone is complaining about price. Is it really that expensive? And if so, is it really that difficult to save money for an extra month or two?


The price is not the problem, although it is too high for a card with no future. When RT hits full swing, even the 2080 will be underpowered for any of the new games. The 1080 Ti is enough card for the next four years; by then RT will be in full swing, along with many other upgrades the 2080 is not ready for, and better cards will be needed. Buying a card for the future now is not going to work, IMO. I will wait and buy a card in the future that is actually made to play the games of the future, as I think the 2080 in four years will be just like trying to use a GTX 950 to play at 4K now.


----------



## the54thvoid (Dec 31, 2018)

GoldenX said:


> *That doesn't make it right.* Keep raising prices, keep tolerating it, and see what happens.
> "They all do it" is not an argument, it's complacency.





Right and wrong do not apply to capitalism.

Ethical? We can use that - but *not* right or wrong.

FWIW, I did not upgrade my 1080ti to a 2080ti because for me, it's NOT worth it. But I'm not bitching about it. I'm quietly voting with my wallet but as you'll see, that does nothing to change the situation. Yes - the quaint and rather impotent concept that if I don't buy it, they'll fail... well, there are a lot of people with good incomes that will buy it because to them, it has value.

As for the statement "are companies dictatorships?", take a good look at that. Nvidia (NVDA) is a Nasdaq-listed company with shareholders who receive dividends based on the company's performance. If they started selling cheap cards, the share price would tank (that's how it works; it would be seen as a loss of confidence in the business, and confidence is what the share index is all about).

@xkm1948 actually has it exactly right: if you want 'sensibly' priced cards, you need a government-supported chip maker to swamp the market. Not going to happen. AMD is your only hope of Nvidia prices coming back to earth. And don't hold your breath on that one.


----------



## Blueberries (Dec 31, 2018)

1070 Ti to 1080 performance for a $350 MSRP is a bargain, people. When the 1070 (non-Ti) came out, the MSRP was $380, and you couldn't find an AIB card for under $450.

Performance is up and cost is down, and people are still finding a way to complain. Not to mention the lower power draw of this card means lower thermals and a quieter fan profile, and just to add icing on an already sweet cake, you get some RT cores you'll never use.


Hell, I paid $400 for a GTX 670 years ago when Kepler came out; it didn't have HALF the performance of this card and it sounded LIKE A LAWNMOWER. I don't know if you guys are just now getting into computer hardware, but we didn't all have it this easy.


----------



## GoldenX (Dec 31, 2018)

the54thvoid said:


> Right and wrong do not apply to capitalism.
> 
> Ethical? We can use that - but *not* right or wrong.
> 
> ...


There is a difference between silently not buying and saying you can't complain at all about pricing.
The idea that "right and wrong don't apply to capitalism" sounds like communism with better cars.


----------



## Totally (Dec 31, 2018)

lexluthermiester said:


> Everyone is complaining about price. Is it really that expensive? And if so, is it really that difficult to save money for an extra month or two?
> 
> Seems to be at the end of the card.



To reference the SHS, the bulk of cards on there are x50/x60 cards that occupy the $150-250 space. You try to overgeneralize with "And if so, is really that difficult to save money for an extra month or two?" What if it took an individual 10 months just to save up the $250 in the first place? So they're supposed to save for another 7-8 months? They are, and that's why we have "should I buy old gen X or wait for new gen Y" threads: by the time they've finally saved up enough, new cards are around the corner.


----------



## xorbe (Dec 31, 2018)

Blueberries said:


> 1070ti to 1080 performance for a $350 MSRP is a bargain, people. When the 1070 (non ti) came out the MSRP was $380, and you couldn't find an AIB for under $450.



I was going to mention something along this line. Sounds like 1070 Ti/1080 ballpark for $350. Of course, the 1080 came out over 2.5 years ago; the wheels of progress are moving slowly these days...


----------



## moproblems99 (Dec 31, 2018)

lexluthermiester said:


> Gonna borrow from another conversation for a moment with this breaking news story;
> 
> Some are proceeding with the logic that is based on the idea of comparing a 2060 to a 1070, a 1080 to a 2070, a 1080ti to a 2080 and a Titan to a 2080ti.
> 
> ...



It should be looked at like this:  What did $350 get me last gen at launch?  In this case, pretty much exactly what you get this gen with a little extra slightly useful fluff.



Vayra86 said:


> Doesn't make sense though, because that will cannibalize the 2070 and we know that isn't Nvidia's style at all.



I said this in another thread. Either the price is wrong or the performance is wrong in these charts. The numbers are too close to the 2070's.



EarthDog said:


> Vulkan reminds me of Physx... it is there and can be useful in some titles... but isnt a big player.
> 
> Vulkan really doesnt seem to have taken hold since its release. Not sure how it can gain traction. Time isnt on its side.



I wish Vulkan would be used more so I could drop Windows completely.  If Linux had more share, Vulkan would be used.  I think RTX is very much like Vulkan in the sense that the potential user base is so small that adoption will be slow.


----------



## HisDivineOrder (Dec 31, 2018)

the54thvoid said:


> Right and wrong do not apply to capitalism.
> 
> Ethical? We can use that - but *not* right or wrong.
> 
> ...




I don't think it's better to silently accept things being a way you don't want them to be. I think complaining about them is a decent first step toward doing something about it. After all, people could have said the same thing when nvidia released the FX series. But they didn't. They could have said it with the 200 series. But they didn't.

No, nvidia responded to market competition.

https://www.cnet.com/news/nvidia-cuts-prices-on-gtx-260-280-graphics-boards/

No, the problem here is AMD's lack of competition. When this kind of "release the same performance at slightly higher pricing" happened in the past, we'd always have ATI to counterbalance it. Well, ATI or 3dfx. Now, though, AMD's purchase of ATI has left the industry crippled. We're slaves either to Intel's greed or to nvidia's greed. That's why there's an equal amount of blame to go toward AMD, whose incompetence is stagnating GPU progress throughout the industry. Hopefully, Intel will come along and price low with decent tech, support it with good drivers (far from a sure thing), and give us the competition that AMD seemingly is unable to provide against nvidia.


----------



## Gorstak (Dec 31, 2018)

You aren't slaves to anything. The 8800 GT 512 cost 2,000 kn 10 years ago. I know cuz I bought it. And there were cards like the 9800 GTX+ and Ultra that cost a tremendous amount.
Everything has remained the same. You can still buy a modern 8800 GT-type card for the same price. So mainstream cards have stayed at the same price, and a normal, mainstream desktop PC still costs 4,000 kn.


----------



## Eric3988 (Dec 31, 2018)

I love me some capitalism, but a good product can still be at a bad price point. It's too bad we can't have a discussion about this without so much mudslinging everywhere. I, for one, don't buy into any leaks ever and will judge this card based on actual price-performance in games I play, not with artificial benchmarks. If the price is justified on performance then it's a buy, if it's basically an updated 1070 at the same price, then I'll just scratch my head and move on.


----------



## moproblems99 (Dec 31, 2018)

HisDivineOrder said:


> We're either slaves to Intel's greed or nvidia's greed.



Last I checked, no one is required to buy PC components.


----------



## xkm1948 (Dec 31, 2018)

moproblems99 said:


> Last I checked, no one is required to buy PC components.



The f*ucking age of entitlement to anything. Yep, you said it 100% right. I gotta say the education system did a good job training millions of people to think like this.


----------



## Fouquin (Dec 31, 2018)

lexluthermiester said:


> Here's a set of facts;
> 1. Every generation of new GPU's get a price increase.
> 2. Every generation of GPU's offers a performance increase.
> 3. People always complain about said price increase while trying to minimize or ignoring the increase in performance.



*Every* generation gets a price increase? Well, hang on just a second there; that's not a 'fact' I've ever heard or seen presented.






The _average_ MSRP of a flagship nVidia GPU was set at a hair over $525 for nearly a decade, with the majority of new releases being either cheaper or the same price. The GTX 780 was the first card to buck the trend in half a decade, and the GTX 980 looked like a return to that comfortable average before the 10 series and now 20 series have cranked up the heat. Even with the migration of flagship status to 'Ti' series cards the price increase is felt at the launch of the 20 series, with the 2080 Ti's minimum cost set where the first three generations of Titans were.

I have no argument against the fact that performance increased with each generation, except with the 9800 GTX, but that was a very strange special case: the 8800 Ultra was announced _after_ the 9800 series launched, so there's this distinct overlap of product lines that led to the 9800 GTX taking a bit of a back seat while still being sold as the flagship product of a new generation.

So the actual fact is that the 20 series breaks the trend by being the first product series in over a decade to feature an increased MSRP over a previous generation that also had an increased MSRP. It also features the highest increase in average price, as well as the highest MSRP values since the 8800 Ultra 11 years ago. Those facts have helped form the opinion, among a large group of consumers, that the entire product series is overpriced.
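As a rough sketch of the kind of averaging described above - using illustrative launch MSRPs recalled from memory, not the exact figures from the chart (which was posted as an image) - the math is simply:

```python
# Illustrative flagship launch MSRPs (approximate, not the chart's exact data).
flagship_msrp = {
    "8800 GTX": 599,
    "GTX 280": 649,
    "GTX 480": 499,
    "GTX 580": 499,
    "GTX 680": 499,
    "GTX 780": 649,
    "GTX 980": 549,
    "GTX 1080": 599,
}

# Average launch price across these generations.
average = sum(flagship_msrp.values()) / len(flagship_msrp)
print(f"Average launch MSRP: ${average:.2f}")  # Average launch MSRP: $567.75
```

Whatever exact prices you plug in, the point stands that the decade-long average sat in the mid-$500s until the 10 and 20 series pushed it upward.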


----------



## lexluthermiester (Dec 31, 2018)

Slizzo said:


> Except MSRP, for the 2080 Ti at least, is "starting at $999".


That's for the FE direct from NVidia. Most did/do not pay anywhere near that price for an AIB model. For example, mine was under $800. Yes, yes.



Fouquin said:


> *Every* generation gets a price increase? Well, hang on just a second there; that's not a 'fact' I've ever heard or seen presented.


You didn't go back far enough and your own graph is incorrect.


----------



## Fouquin (Jan 1, 2019)

lexluthermiester said:


> You didn't go back far enough and your own graph is incorrect.



I went back over a decade and provided 10 generations of data. Far enough is subjective, but there's plenty of data there to draw a conclusion. Rather than simply say, "You're wrong," please offer up corrections where needed.


----------



## lexluthermiester (Jan 1, 2019)

moproblems99 said:


> It should be looked at like this: What did $350 get me last gen at launch? In this case, pretty much exactly what you get this gen with a little extra slightly useful fluff.


Gotta disagree. It should be looked at like this: "What level of performance can $350 get me today? Does it meet my needs?" While you have a good point, making comparisons to previous-gen cards is fruitless, as that is not going to help with a current purchase unless you're willing to buy used.



Fouquin said:


> I went back over a decade and provided 10 generations of data. Far enough is subjective, but there's plenty of data there to draw a conclusion. Rather than simply say, "You're wrong," please offer up corrections where needed.


I've already done that previously; you seem to have missed it. Hint: look on page 3 of this thread.


----------



## GoldenX (Jan 1, 2019)

xkm1948 said:


> The f*ucking age of entitlement to anything. Yep you said it 100% right. i gotta say the education system did a good job training millions pf people to think like this.


I'm not forced to get a 2060, but inflating prices sets the bar higher for the competition, so enjoy your mid-range at USD 700 five years from now, thanks.


----------



## Xzibit (Jan 1, 2019)

Fouquin said:


> I went back over a decade and provided 10 generations of data. Far enough is subjective, but there's plenty of data there to draw a conclusion. Rather than simply say, "You're wrong," please offer up corrections where needed.



Be prepared to get redirected to one of his ~~facts~~ opinions.


----------



## lexluthermiester (Jan 1, 2019)

Xzibit said:


> Be prepared to get redirected to one of his facts opinions.


As opposed to the mindless drivel that regularly comes out of you?

Research is helpful. Go look up MSRPs from *reliable* sources and you will find all sorts of useful information. You will also discover that the numbers in his graph are just a bit off.


----------



## moproblems99 (Jan 1, 2019)

lexluthermiester said:


> Gotta disagree. It should be looked at like this: "What level of performance can $350 get me today? Does it meet my needs?" While you have a good point, making comparisons to previous-gen cards is fruitless, as that is not going to help with a current purchase unless you're willing to buy used.



That is true when you are considering what you are actually going to buy, but not when you are measuring progress between generations. I don't recall exactly, and I don't care enough to look, but the 1070 was either $349 or $399 MSRP at launch. If the 2060 is really $349 MSRP with 1070 Ti performance (which I don't believe), then the only progress this generation is RTX. People's beliefs about the value of RTX vary widely, but RTX likely won't have much value for a few years, and the 20 series will have a hard time keeping up by then.



lexluthermiester said:


> That's for the FE direct from NVidia. Most did/do not pay anywhere near that price for an AIB model. For example mine was under $800. Yes, yes.



You got a 2080ti for under $800?


----------



## lexluthermiester (Jan 1, 2019)

moproblems99 said:


> People's beliefs about the value of RTX vary widely, but RTX likely won't have much value for a few years, and the 20 series will have a hard time keeping up by then.


See, I'm one of those who like RTX. I didn't at first because there wasn't enough info to make an informed conclusion. Once that info was out there and the potential of RTX's RTRT was explored, even just a little, I saw the value of it. That was the motivation needed to get one and have fun with it.


moproblems99 said:


> You got a 2080ti for under $800?


2080. The 2080FE is MSRP'd at $999. The 2080tiFE is at $1199.


----------



## Fouquin (Jan 1, 2019)

lexluthermiester said:


> As opposed to the mindless drivel that regularly comes out of you?



Please cease the unnecessary ad-hominem attacks. There is no need to introduce personal attacks and toxicity in a debate about computer hardware.



lexluthermiester said:


> I've already done that previously, you seem to have missed it.. Hint, look on page 3 of this thread.



Oh, I did. Is it this?



lexluthermiester said:


> I can think of at least three other times where the percentage jump in price was comparable to this one: the GeForce 2 to GeForce 3, the FX 5X00 series to the 6X00 series, and the 9X00 series to the GTX 2X0 series. And that's just NVidia. ATI had two instances where the generational price jump was huge; ATI justified it because of vertex shaders and such, as did NVidia. 3DFX did one as well: the Voodoo 2 was twice the price of the Voodoo 1.



Yes, that is still a lot of words and not a lot of actual price figures. It also doesn't say anything about how the data within the graph I posted is incorrect, as you insisted it was.

Here are the prices of older generations that I was able to gather from reviews and retail listings:







Those numbers are largely irrelevant, as they show effectively the same general price scale, and they have actually decreased the averaged MSRP to $517.



lexluthermiester said:


> Go look up MSRP's from *reliable* sources and you will find all sorts of useful information. You will also discover that the numbers in his graph are just a bit off.



Please provide these reliable sources. All you have done thus far is act as the contrarian and provide no tangible proof of your facts. "You are wrong" is not an argument. If you have information to contribute, don't feel as though you need to hold it back. Put it out there, share it, let me be proven wrong! That's the point of a debate.


----------



## bug (Jan 1, 2019)

GoldenX said:


> There is a difference between silently not buying and saying you can't complain at all about pricing.


There is also a difference between complaining about pricing and making that the only point in every thread that has Nvidia in its title 


GoldenX said:


> The fact that "right and wrong don't apply to capitalism" sounds like communism with better cars.


They do apply, actually. Making money is good, losing money is bad. That's why the state has the supervisor's job.


----------



## lexluthermiester (Jan 1, 2019)

Fouquin said:


> Please cease the unnecessary ad-hominem attacks. There is no need to introduce personal attacks and toxicity in a debate about computer hardware.


Thanks for the tip. And when you address everyone equally for such "attacks", I'll be all too happy to take it more seriously.


Fouquin said:


> Please provide these reliable sources.


Ok. Here's just a few examples;
https://www.anandtech.com/show/537/27
https://www.anandtech.com/show/742/12
https://www.guru3d.com/articles_pages/geforce_fx_5800_ultra_the_review,24.html
https://www.guru3d.com/articles-pages/albatron-geforce-6800-gt-256-mb-review,1.html

These are only examples of the kind of research I do when I go looking for info. People can say I talk out of my backside all they want; that doesn't make them correct. And *not all* of your numbers jive with the info out there, nor with what I remember from back then. Thing is, I'm not going to hold anyone's hand. If I tell you I know something and that you might be wrong, you can bet your life on the fact that I've gone looking to verify the info or have personal experience (that also gets re-verified).



bug said:


> There is also a difference between complaining about pricing and making that the only point in every thread that has Nvidia in its title


This. Yes.


----------



## moproblems99 (Jan 1, 2019)

lexluthermiester said:


> See, I'm one of those who like RTX. I didn't at first because there wasn't enough info to make an informed conclusion. Once that info was out there and the potential of RTX's RTRT was explored, even just a little, I saw the value of it. That was the motivation needed to get one and have fun with it.



Hey, more power to you.  Since they ruined the Battlefield series, it doesn't hold a lot of value to me.  When I look at RTRT, I see all the other empty promises we've been given such as mGPU.  If devs haven't gotten around to a feature that would be beneficial to far more people than RTRT then I don't see how RTRT is going to take off anytime soon.  10% of users on steam have above a 1070.  Let's say that the 20 series matches that (which, to me, is doubtful at this point), what incentive do developers have to spend a metric boat load of resources (assumption on my part) to implement a feature that one out of ten users will be able to use?  That doesn't even account for consoles which won't be RTRT capable for this generation (also an assumption by me).

Look at Bethesda. They can't even be bothered to update their engine to something from this decade. I don't think CoD's engine has had an overhaul either (also an assumption). I honestly don't care what happens with RTRT, as from the videos I have seen it doesn't even look that much better. I haven't seen it in person, so I will hold final judgment until then. Instead of RTRT, I wish developers would spend those resources on actually finishing their games before launch.



lexluthermiester said:


> Slizzo said:
> 
> 
> > Except MSRP, for the 2080Ti At least, is "starting at $999". So far, only ONE SINGLE card has hit that price, on offer from EVGA: The 2080 Ti Black Edition.
> ...





lexluthermiester said:


> 2080. The 2080FE is MSRP'd at $999. The 2080tiFE is at $1199.



Ah, I was confused.  You had quoted someone referring to a 2080ti starting at $999 and said you had gotten it for $800.  I was going to call that a deal I would consider paying.


----------



## lexluthermiester (Jan 1, 2019)

moproblems99 said:


> Ah, I was confused. You had quoted someone referring to a 2080ti starting at $999 and said you had gotten it for $800.


No worries, sometimes these conversations go all over the place and can be hard to follow..


moproblems99 said:


> I was going to call that a deal I would consider paying.


That'd be a deal I'd go for too!


----------



## Fouquin (Jan 1, 2019)

lexluthermiester said:


> Ok. Here's just a few examples;
> https://www.anandtech.com/show/537/27
> https://www.anandtech.com/show/742/12
> https://www.guru3d.com/articles_pages/geforce_fx_5800_ultra_the_review,24.html
> ...



Those are good sources, and are ones that I referenced myself to determine prices (as shown by the chart reflecting the very same values). I also referred to nVidia's product pages (when prices are listed) as well as archived retailer listings, such as those from Newegg and TigerDirect. A lot of the prices were also pulled directly from our very own GPU Database, which uses information from all the aforementioned sources to provide the launch-day MSRP of a card where possible. If a specific number does not jibe with you, offer up evidence to substantiate a correction. Otherwise, the values will not be changed and the debate will continue along using the allegedly incorrect values.

Regardless, a single instance of a price decrease across two generations is sufficient to disprove your presented fact that "every generation of new GPUs gets a price increase."



lexluthermiester said:


> Thing is, I'm not going to hold anyone's hands. If I tell you I know something and that you might be wrong, you can bet your life on the fact that I've gone looking to verify info or have personal experience(that also gets re-verified).



That's all well and good, but you also made a completely false statement, presented as fact, that was incredibly easy to disprove by the very same research. I can only trust your knowledge by what you say, and when you say something that is false, that trust is shaken and becomes uncertain. As it stands, doing independent research and verification has led to exposing the actual facts, something that would never have happened had I taken your every word on blind faith.

With all that out of the way, it's clear the pricing on the 20-series GPUs is a rare case of a substantial increase over the historical average. Since it has been so long since the last comparable increase, and the increase is subjectively not justifiable, there has been an uproar surrounding it. That now taints the entire product line, so expect continued commotion whenever new information on the 20 series appears, including this news on the RTX 2060 FE.


----------



## moproblems99 (Jan 1, 2019)

lexluthermiester said:


> No worries, sometimes these conversations go all over the place and can be hard to follow..
> 
> That'd be a deal I'd go for too!



Also, just wanted to say I wasn't trying to be a jerk when I said 'more power to you'.  Gaming is supposed to be fun.  If you aren't enjoying it, you're doing it wrong.


----------



## lexluthermiester (Jan 1, 2019)

Fouquin said:


> "Every generation of new GPUs get a price increase."


Ok, I'll partly retract and re-qualify. Most generational transitions experience an increase. Still, this one isn't the largest jump in relative price.


Fouquin said:


> That's all well and good, but you also made a completely false statement presented as fact that was incredibly easy to disprove by the very same research.


The problem with your conclusion is the following: when you compare an individual series and/or model group, there has always been an increase on one level or another. So it's not that the info I stated was false; it's more that it requires a bigger-picture point of view. We could nitpick each other all day long, but hey, what would be the point beyond a bit of amusement..


moproblems99 said:


> Also, just wanted to say I wasn't trying to be a jerk when I said 'more power to you'.


I gotcha, no worries.


moproblems99 said:


> Gaming is supposed to be fun. If you aren't enjoying it, you're doing it wrong.


Exactly, and that's the most important thing!


----------



## Totally (Jan 1, 2019)

lexluthermiester said:


> That's for the FE direct from NVidia. Most did/do not pay anywhere near that price for an AIB model. For example mine was under $800. Yes, yes.


You realize he said Ti, right? Direct from Nvidia those are $1,199, with the AIB SRP being the aforementioned $999, and a quick look shows the available Ti cards matching FE prices, with a $100 markup on most. Where did you get this $800 2080 Ti?


----------



## xkm1948 (Jan 1, 2019)

Totally said:


> You realize he said Ti, right? Direct from Nvidia those are $1,199, with the AIB SRP being the aforementioned $999, and a quick look shows the available Ti cards matching FE prices, with a $100 markup on most. Where did you get this $800 2080 Ti?



He got 2080, not 2080Ti

Do you even read?


----------



## GoldenX (Jan 1, 2019)

bug said:


> There is also a difference between complaining about pricing and making that the only point in every thread that has Nvidia in its title


Not my fault RTX looks like an entire i9 lineup.


----------



## Totally (Jan 1, 2019)

xkm1948 said:


> He got 2080, not 2080Ti
> 
> Do you even read?



He was quoting someone referring to a 2080 Ti and referencing his own card, which he didn't specify was the non-Ti. Those clarifying posts weren't present yet, and were probably made while I was double-checking whether pricing had changed, so I hadn't taken note of them. So don't jump down my throat just because someone is being 'hard to follow'.


----------



## Gutterbanger (Jan 1, 2019)

All the cards were a lot easier to get before they came out with the Ring Doorbell. Just sayin'.


----------



## Tsukiyomi91 (Jan 1, 2019)

How about this: since the "teaser" data showing the RTX 2060 performing as close to or slightly better than the GTX 1070 Ti isn't reliable enough, why not let reviewers do their work once the NDA is lifted, considering some might already have their hands on the card, particularly the Founders Edition of the 2060? Just stop all the complaints, laments, and bantering, and leave it to the folks who will take their time reviewing said GPUs. Can we all do that, so we can see how the card actually performs instead of throwing opinions here and there?


----------



## GoldenX (Jan 1, 2019)

And where is the fun in that?


----------



## lexluthermiester (Jan 1, 2019)

GoldenX said:


> And where is the fun in that?


Speculation is a ton of fun, sure, but most of what has been going on in this thread has been NVidia-bashing about price and supposed performance. If it were more about interesting speculation, then it would be more fun.


----------



## Tsukiyomi91 (Jan 1, 2019)

Having constructive opinions is fine. Destructive ones while bashing others? Not cool, and best kept to yourself.


----------



## Renald (Jan 1, 2019)

lexluthermiester said:


> Here's a set of facts;
> 1. Every generation of new GPU's get a price increase.
> 2. Every generation of GPU's offers a performance increase.
> 3. People always complain about said price increase while trying to minimize or ignoring the increase in performance.



I didn't read all 7 pages but I have to respond to that because you are clearly misunderstanding the situation and manipulating information.

Facts:
1 - Indeed, that's a fact, but there must be a cause.
2 - I couldn't agree more.
3 - People do complain, but not because of the price increase itself. That's where you're forgetting point 4, which explains this point, doesn't justify point 1, and does justify the complaints.

So:
4 - The performance/cost ratio for manufacturers is ALWAYS increasing, because performance increases a lot while costs don't grow nearly as fast, and sometimes they even go down with a die shrink or a renaming.


Here is an example of your logic:
1 - A Voodoo card scores 1.
2 - A 2080 Ti card scores 1000.
3 - By your logic, the starting price should increase in line with the increase in performance.

So, saying a Voodoo cost $300 at the time (some 20 years ago), a 2080 Ti should cost $300,000 at launch. See where you're wrong, dude?
A Voodoo would maybe cost a single dollar to produce today. You have to take that as input for point 1, which explains point 3. (old ratio: 1/300 | new ratio: 1, and higher is better)

It's the same for many other things, like HDDs and SSDs (remember when an SSD cost $1/GB, with the Vertex 2 for example?). Remember the cost of an 8086? Now it's a single dollar or less, and a high-end CPU that is 200,000x more powerful is "only" $400. See? Get out of the matrix, lex, you're blinded.

People are complaining because Nvidia is putting crazy prices on something that is easy for them to produce. End of story.
And stop with "vote with your wallet" or "learn how to save money". I could build myself a rig right now with the latest Threadripper or i9 + a 2 TB SSD + a 2080 Ti and not even notice it in my accounts. It's a question of taking people for idiots, and Nvidia has been doing it for many years now.


----------



## efikkan (Jan 1, 2019)

Fouquin said:


> *Every *generation gets a price increase? Well hang up just a second there, that's not a 'fact' I've ever heard or seen presented.
> 
> View attachment 113763


You have to calculate the inflation-adjusted price for a fair comparison.
The 8800 Ultra becomes ~$1007.
The 8800 GTX (which launched at $649) becomes ~$811.
Turing's prices don't look so bad then…
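For reference, the adjustment is just multiplication by a cumulative CPI factor. A minimal sketch — the 1.25 multiplier is an assumption chosen to reproduce the $649 → ~$811 figure above, not an official figure; a real comparison should pull factors from official CPI tables:

```python
# Inflation-adjust a launch MSRP into present-day dollars.
# The 1.25 cumulative CPI multiplier (2006-era USD to late-2018 USD) is an
# assumption for illustration; use official CPI tables for real numbers.
def adjust_for_inflation(msrp: float, cpi_factor: float = 1.25) -> float:
    """Return the launch price expressed in today's dollars."""
    return msrp * cpi_factor

if __name__ == "__main__":
    gtx_8800_launch = 649  # 8800 GTX launch price, as quoted above
    print(f"${adjust_for_inflation(gtx_8800_launch):.0f}")  # ≈ $811
```

The point is only that the multiplier, whatever its exact value, applies uniformly to every card in a generation, so it shifts the whole chart without changing generation-to-generation comparisons made in same-year dollars.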


----------



## jabbadap (Jan 1, 2019)

Fouquin said:


> *Every *generation gets a price increase? Well hang up just a second there, that's not a 'fact' I've ever heard or seen presented.
> 
> View attachment 113763
> 
> ...



While I generally agree with you that not every generation sees a price increase, could you add the die sizes to that chart? Pricing is not always black and white; it depends on several variables. By die size: the GTX 280 was a really big chip (die-shrunk quickly to the smaller GTX 285 with a $359 MSRP), the Fermis were big chips but faced fierce competition, the GTX 780 was a big Kepler, the GTX 980 was on the same node as the GTX 680 and thus fairly priced, the GTX 1080 was a rip-off without competition, and the RTX 2080 is again a big chip with no competition.


----------



## B-Real (Jan 1, 2019)

lexluthermiester said:


> Wrong, and every benchmark showing performance numbers bears that out as fact. The 2080 cleanly beats the 1080 and beats the 1080 Ti where it doesn't match it. Also, RTX offers advancements Pascal can not. The 2080/2080 Ti and RTX Titan are the best on the market. NVidia knows this and demands a premium price for it. If you don't want to pay that price, ok, don't buy one. Settle for less.
> 
> Key word there..





lexluthermiester said:


> Everyone is complaining about price. Is it really that expensive? And if so, is really that difficult to save money for an extra month or two?


Haha. 2 pathetic tries to defend this gen.


----------



## dj-electric (Jan 1, 2019)

My 2 cents on pricing:

The 8800 GTX was $600, or $725 today.
The 8800 Ultra was $800, or $999 today.
The 8800 GT was only $300, or $360 today.
All with inflation.

Here's the thing: the 8800 GT gave you about 75-80% of the performance of the 8800 Ultra for a little over a third of its price. It was fairly close in performance, yet much cheaper.
That's how things were sold to us. The 8800 Ultra was the cream of the crop for fine diners, while offering slightly higher performance.

Today, you have:
RTX 2080 Ti for 999$
RTX 2080 for 799$
RTX 2070 for 499$

The performance difference between the RTX 2070 and the RTX 2080 Ti is quite substantial. You don't get 75% of flagship performance for half of its price; you get just a little over half.

The flagship-to-mainstream ratio is very different today than it was 10-11 years ago, even if cards were still sold at over $750 back then. There was compensation in the mid-range; today the mid-range is absolutely cruel.
The only things $300 will get you are an RX 590 or a GTX 1060 GDDR5X, both carrying 2+ year-old technology.
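The ratio argument can be made concrete with a little arithmetic. A sketch using the post's own price points; the performance fractions are the post's rough estimates, not benchmark data:

```python
# Flagship-relative value: how much of the flagship's performance you get
# per flagship-relative dollar spent. Performance fractions are rough
# estimates taken from the discussion above, not measured benchmarks.
def value_multiplier(perf_fraction: float, price: float, flagship_price: float) -> float:
    return perf_fraction / (price / flagship_price)

# 2007: 8800 GT at ~75% of 8800 Ultra performance, $300 vs. $800
g92_value = value_multiplier(0.75, 300, 800)    # 2.0x the flagship's value
# 2018: RTX 2070 at ~55% of RTX 2080 Ti performance, $499 vs. $999
tu106_value = value_multiplier(0.55, 499, 999)  # ~1.1x
print(round(g92_value, 2), round(tu106_value, 2))
```

Under these assumed numbers, the mid-range's value advantage over the flagship shrinks from roughly 2x to barely 1.1x, which is the "compensation for the mid range" the post says has disappeared.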


----------



## cucker tarlson (Jan 1, 2019)

eidairaman1 said:


> PSI vote with my wallet because I am on my 290 VaporX. I also dont advocate a company with the color of greed.


How many times are you going to mention that you bought an AMD card 5 years ago, so you're supporting the right company?

Fact is, they're using 12 nm to launch a $350 card that, if the leak is true, outperforms AMD's $400 V56 and nearly matches the $500 V64. It has RT and DLSS support too, while all Vega has is the gimmicky HBCC that they hyped but that turned out to have zero impact. Meanwhile, all AMD did this year was launch a 12 nm flop at $280 that is barely faster than an RX 580, OC vs. OC.

Nvidia has had the better-value cards at the ~$350-400 price point for quite some time, starting with the GTX 970, then the 1070 and 1070 Ti, and the RTX 2060 is going to be no exception. They haven't made the jump to 7 nm yet, but they are still bringing 1080/V64 performance down to the $350 level at 12 nm.


----------



## Fouquin (Jan 1, 2019)

efikkan said:


> You have to calculate the inflated price for a fair comparison.
> 8800 Ultra becomes ~$1007
> 8800 GTX (which launched at $649) becomes ~$811
> Prices for Turing doesn't look so bad then…



You don't need to adjust for inflation for MSRP between generations. Reason being that the inflation of the US Dollar between a set of product launches is not enough to significantly impact the MSRP of the product. For example; the value of the US Dollar between the GTX 680 and GTX 780 would not have changed substantially enough to justify the $150 price increase as compensation. This was not a value adjusted comparison, it was a generational comparison.


----------



## jabbadap (Jan 1, 2019)

dj-electric said:


> My 2 cents on pricing:
> 
> The 8800 GTX was 600$ or today 725$
> 8800 Ultra was 800$ or today 999$
> ...



Uhm, the 8800 GT MSRP was $199; the same-chip 8800 GTS was $349. And not to mention the 8800 GTX was manufactured on the 90 nm node, while the 8800 GT arrived half a year later on the smaller 65 nm node.


----------



## moproblems99 (Jan 1, 2019)

Renald said:


> People are complaining because Nvidia is putting crazy prices on something easy to produce for them. End of story.
> And stop with "Vote with your wallet" or "learn how to save money". I could build myself a right now with the last Threadripper or i9 + 2TB SSD + 2080Ti and don't even sense it on my accounts. It's a question of taking people for idiots and Nvidia is doing it for many years now.



Honestly, there wasn't much they could do as a publicly traded company.  The 10 series is already better than anything AMD has, and everyone had a serious surplus.  Then they released cards that are faster than their last gen.  Why lower the prices on something that is already better than the competition?  Unfortunately for us, that just isn't good business.  People need to figure out what they value and stick to it until Navi comes out, and then they'd better hope Navi is mostly competitive.  I surmise it will be similar to Vega -> Pascal.



cucker tarlson said:


> nvidia has had better value cards at the ~$350-400 price point for quite some time,starting with the gtx 970,then 1070,1070Ti and now rtx 2060 is gonna be no exception.They didn't make the jump to 7nm yet,but still are bringing the performance of 1080/V64 down to $350 level at 12nm.



That really depends on the situation.  Those of us enjoying FreeSync have a much better value metric when combining the cost of a monitor with a GPU.  Do we have the bestest and fastest?  No.  Do we have a solution that works?  Yes.

EDIT:  For what it's worth, I have $1,000 into my monitor and GPU.  Going by these leaks, a 2060 and a G-Sync monitor are going to cost a little bit more than that.  How bad is the value really?
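That comparison is easy to sanity-check as a total platform cost; all prices below are hypothetical placeholders except the leaked $350 2060 SEP:

```python
# Compare total platform cost (GPU + adaptive-sync monitor) across brands.
# The monitor and Vega prices are hypothetical placeholders; only the $350
# RTX 2060 SEP comes from the leak discussed in this thread.
def platform_cost(gpu_price: int, monitor_price: int) -> int:
    return gpu_price + monitor_price

vega_plus_freesync = platform_cost(400, 600)  # hypothetical pairing
rtx2060_plus_gsync = platform_cost(350, 700)  # leaked SEP + hypothetical monitor
print(vega_plus_freesync, rtx2060_plus_gsync)  # 1000 1050
```

With numbers in that ballpark, the G-Sync premium on the monitor eats whatever the cheaper GPU saves, which is the whole point of valuing the pair rather than the card alone.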


----------



## efikkan (Jan 1, 2019)

Fouquin said:


> You don't need to adjust for inflation for MSRP between generations. Reason being that the inflation of the US Dollar between a set of product launches is not enough to significantly impact the MSRP of the product. For example; the value of the US Dollar between the GTX 680 and GTX 780 would not have changed substantially enough to justify the $150 price increase as compensation. This was not a value adjusted comparison, it was a generational comparison.


Wrong, you missed the point: the prices must be corrected for inflation to give a fair comparison.

That does not mean that a price increase is pure inflation, there are many factors here, including varying production costs, competition, etc. But these can only be compared after we have the price corrected for inflation, otherwise any comparison is pointless.


----------



## lexluthermiester (Jan 1, 2019)

Renald said:


> I didn't read all 7 pages but I have to respond to that because you are clearly misunderstanding the situation and manipulating information.


And that is why you are way behind the conversation. But hey, that's ok. Context means very little anyway.. 


B-Real said:


> Haha. 2 *pathetic* tries to defend this gen.


Key word there. Roll credits!


----------



## cucker tarlson (Jan 1, 2019)

moproblems99 said:


> Honestly, there wasn't much they could do as a publicly traded company.  The 10 series is already better than anything AMD has and everyone had a serious surplus.  Then they released cards that are faster then their last gen.  Why lower the prices on something that is already better than the competition?  Unfortunately for us, that is just not good business.  People need to figure out what they value and stick to it until Navi comes out.  Then they better hope that Navi is mostly competitive.  I surmise it will be similar to Vega -> Pascal.
> 
> 
> 
> ...


Very true. I myself can't imagine playing without variable refresh rate anymore. I've had it since 2015, and frame synchronisation with low input lag is an absolute must for me; with G-Sync/FreeSync you're able to achieve both, effortlessly. An example from a game that I play a lot, Shadow Warrior 2: you're playing at 150 fps, suddenly there are all sorts of explosions on screen and the framerate drops to 80 for a couple of seconds. With VRR there's no tearing and no feeling of stutter while this is happening. The framerate drop is still noticeable, but the transition is very smooth.

However, that is not the point. AMD themselves tried to deflect the fact that the Vega series is simply worse than Nvidia's offerings with blind tests (the dumbest thing a GPU manufacturer has done in the last decade, or ever, IMO: "our card is as good if you don't look at the fps counter") or by including a FreeSync monitor in the price. What followed was the fake-MSRP scam, where they tried to sway the public and reviewers into believing the V64 was a $400 card while in reality it cost somewhere between a 1080 and a 1080 Ti.

Those blind tests and the fake MSRP are the reason I started to perceive RTG in a completely different light, and I laugh when I see people blindly defending it while bashing Nvidia for being shady.


----------



## Renald (Jan 1, 2019)

moproblems99 said:


> Honestly, there wasn't much they could do as a publicly traded company.  The 10 series is already better than anything AMD has and everyone had a serious surplus.  Then they released cards that are faster then their last gen.  Why lower the prices on something that is already better than the competition?  Unfortunately for us, that is just not good business.  People need to figure out what they value and stick to it until Navi comes out.  Then they better hope that Navi is mostly competitive.  I surmise it will be similar to Vega -> Pascal.


And I agree; sadly, there's not an inch of competition right now. But AMD is fueling consoles and Zen — that's quite good, no?
In the end, by lowering the need for high-end cards, AMD is saving their asses. That's why those cards aren't relevant in the current market; few people need such power except at 4K.
Let's wait for Navi.

Edit :
@lexluthermiester : got your attention, no need for insults


----------



## lexluthermiester (Jan 1, 2019)

cucker tarlson said:


> AMD themselves tried to deflect the fact that Vega series is simply worse than nvidia's offerings by blind tests (that's the dumbest thing a gpu manufacturer has done in the last decade or ever IMO,our card is as good if you don't look at the fps counter  )


While that's not wrong, let's be fair, Radeon cards are a good value for the price/performance ratio they offer. This has been demonstrated with many a review and benchmark.



Renald said:


> @lexluthermiester : stop acting like a douche.


Take your own advice sir.


----------



## cucker tarlson (Jan 1, 2019)

lexluthermiester said:


> While that's not wrong, let's be fair, Radeon cards are a good value for the price/performance ratio they offer. This has been demonstrated with many a review and benchmark.


The Fury X was a flop at $650, and while Vega is much better value, it's its price that dictates that. That's why I'm picking on the blind tests (which weren't anti-consumer, just horribly dumb), the MSRP confusion, and the low availability/high initial price. For those who pick a FreeSync monitor and can be patient, AMD is a good value proposition, true.


----------



## Renald (Jan 1, 2019)

lexluthermiester said:


> While that's not wrong, let's be fair, Radeon cards are a good value for the price/performance ratio they offer. This has been demonstrated with many a review and benchmark.


Vega had price and power-consumption problems.
RTX has the same price problem (without the consumption problems).


----------



## kings (Jan 1, 2019)

Renald said:


> In the end, by lowering the need of high-end cards, AMD is saving their asses. That's why those cards aren't relevant in current market, there's few people in need for such power except 4K.
> Let's wait for Navi.



I don't know why people keep saying that the high-end is not relevant! When AMD/ATI competed shoulder to shoulder with Nvidia at the top in past years, I saw no one saying that the high-end was not important!

It's important to compete in the high-end, as a matter of market perception. The brand that has the top-performing products is more likely to sell the rest of the range better! This happens with everything, not only PC hardware!

That's why AMD is stuck at 20%~25% market share despite having had good cards in the mid-range for a few years now! Competing only in the mid-range is not enough to gain meaningful market share, as recent years have proved!


----------



## lexluthermiester (Jan 1, 2019)

Renald said:


> Edit :
> @lexluthermiester : got your attention, no need for insults


Nice edit, now again, take your own advice.


Renald said:


> Vega had price and consumption problems.
> RTX have the same problem (without consumption problems)


So what you seem to be implying is that neither AMD nor NVidia can satisfy your requirements/needs? Good thing Intel has IGP options then..



kings said:


> When AMD competed shoulder to shoulder with Nvidia at the top in past years, I saw no one saying that high-end was not important!


This, Yes.


kings said:


> It's important to compete in the high-end, as a matter of market perception. The brand that has the top-performing products is more likely to sell the rest of the range better! This happens with everything, not only PC hardware! That's why AMD is stuck at 20%~25% market share despite having had good cards in the mid-range for a few years now! Competing only in the mid-range is not enough to gain meaningful market share, as recent years have proved!


AMD offering a high-end product to compete in the market and challenging NVidia like they used to would be excellent!


----------



## moproblems99 (Jan 1, 2019)

cucker tarlson said:


> However,that is not the point.AMD themselves tried to deflect the fact that Vega series is simply worse than nvidia's offerings by blind tests (that's the dumbest thing a gpu manufacturer has done in the last decade or ever IMO,our card is as good if you don't look at the fps counter  ) or including a freesync monitor in the price,what followed was this fake msrp scam where they tried to sway the public and reviewers that V64 was a $400 card while in reality it cost somwhere between 1080 and 1080Ti.



That is exactly their point.  What difference does it make if the games look and play well?  My current monitor is 75 Hz.  I don't care if I get 80 fps or 180 fps; above 75, it doesn't matter anymore.  Obviously, for people with high-refresh-rate monitors it does matter.  Bottom line: I can get generally the same performance at the same price point no matter which brand I choose.  Currently I have a Vega; last time I had a GTX 980.  Next time?  Since I have FreeSync, I'll likely have Navi.  However, I may get a 2080 Ti when they hit the price point at which I find value.  Currently there is none for me, because I think BF V is trash and I don't care about benchmarks.

My Afterburner overlay stays the same color no matter what color my gpu bleeds.


----------



## cucker tarlson (Jan 1, 2019)

moproblems99 said:


> That is exactly their point.  What difference does it make if the games look and play good?  My current monitor is 75hz.  I don't care if I get 80fps or 180fps.  After 75, it doesn't matter anymore.  Obviously, people with high refresh rates, it does matter.  Bottom line, I can get generally the same performance at the same price point no matter which brand I choose.  Currently, I have a Vega.  Last time, I had a GTX 980.  Next time?  Since I have Freesync, I likely will have Navi.  However, I may have a 2080ti when they hit the price point at which I find value.  Currently, there is none for me because I think BF V is trash and I don't care about benchmarks.
> 
> My Afterburner overlay stays the same color no matter what color my gpu bleeds.


Comparing a V64 with a 1080 Ti in a blind test was stupid. You can't tell me you can't pick a winner, unless you're AMD and you think I'm a gullible child.

I get that in your case your stance is reasonable, but some people choose to go by whatever standard fits their narrative. Blind testing? Sure, why not. Nvidia's Kepler cards lose a couple of frames in a few games after a driver update? Shady Nvidia clearly gimping performance; AMD is the only choice. Where were they in those circumstances to tell us that 1-2 fps in a couple of games is not something anyone will perceive? Wait till they fix the drivers in the next update? Hell no. Wait for drivers/cards from AMD? No problem, apparently.


----------



## lexluthermiester (Jan 1, 2019)

cucker tarlson said:


> Comparing a V64 with a 1080 Ti in a blind test was stupid. You can't tell me you can't pick a winner, unless you're AMD and you think I'm a gullible child.


Ah but when you factor performance/price ratio into the equation suddenly the dynamic changes.


----------



## cucker tarlson (Jan 1, 2019)

lexluthermiester said:


> Ah but when you factor performance/price ratio into the equation suddenly the dynamic changes.


What if Nvidia had compared a 1070 Ti with Fast Sync enabled on a 144 Hz non-VRR monitor vs. a V64 using FreeSync?
I'm not arguing the value of FreeSync, just the stuff they tried to sway us with when Vega launched. Pure performance-wise it was a match for a year-old GTX 1080, at a higher price and way higher power draw. That's the whole reason for doing the blind tests and counting FreeSync into the price. Good points for some, moot points for me. End of story.


----------



## GoldenX (Jan 1, 2019)

lexluthermiester said:


> Speculation is a ton of fun sure, but most of what has been going on in this thread has been NVidia bashing about price and supposed performance. If it were more about interesting speculation than it would be more fun.


We already know the 4GB variant is gimped to hell and back. So yeah, not much on that front either.


----------



## Renald (Jan 1, 2019)

kings said:


> I dont know why people keep telling that high-end is not relevant! When AMD/ATI competed shoulder to shoulder with Nvidia at the top in past years, I saw no one saying that high-end was not important!
> 
> It´s important to compete in the high-end, for a matter of market perception. The brand who have the top performance products, is more likely to sell the rest of the range better! This happens in everything, not only in PC hardware!



I didn't say it's not important, just not relevant at this time and at this price.
They can always upgrade their lineup, and that's good, but AMD has nearly a full year ahead of it before it catches up with Nvidia, which already outperforms AMD by far.

So instead of putting out higher and higher prices, they could go with the same price and a small bump in performance; AMD can't compete if Nvidia offers better performance at the same price point.
And going too far doesn't make you sell more. Ferrari can have the fastest car; that won't make everyone buy a Ferrari. It's good publicity, sure, but they have had that for 10 years now. They have nothing to prove (talking about Nvidia here).


----------



## lexluthermiester (Jan 1, 2019)

GoldenX said:


> We already know the 4GB variant is gimped to hell and back.


Not for sure we don't. Waiting to see the benchmarks and reviews.


----------



## moproblems99 (Jan 1, 2019)

cucker tarlson said:


> Comparing a V64 with a 1080 Ti in a blind test was stupid. You can't tell me you can't pick a winner, unless you're AMD and you think I'm a gullible child.



You aren't seeing the point.  Nobody can tell the difference between a Vega 64 @ 75 Hz/75 fps and a 1080 Ti @ 75 Hz/75 fps.  Chasing fps is totally useless once you hit your monitor's refresh rate.


----------



## cucker tarlson (Jan 1, 2019)

moproblems99 said:


> You aren't seeing the point. Nobody can tell the difference between a Vega 64 @ 75 Hz/75 fps and a 1080 Ti @ 75 Hz/75 fps. Chasing fps is totally useless once you hit your monitor's refresh rate.


Were the tests done on a 75 Hz monitor?


----------



## moproblems99 (Jan 1, 2019)

cucker tarlson said:


> Were the tests done on a 75 Hz monitor?



I really don't care, what's your point?


----------



## cucker tarlson (Jan 1, 2019)

moproblems99 said:


> I really don't care, what's your point?


What is yours? That there is no monitor over 75 Hz?


----------



## moproblems99 (Jan 1, 2019)

This isn't difficult.  Let's even remove the exact refresh rate to make it easier: when two cards are both vsynced to the refresh rate, you can't tell them apart.


----------



## cucker tarlson (Jan 1, 2019)

moproblems99 said:


> This isn't difficult. Let's even remove the exact refresh rate to make it easier: when two cards are both vsynced to the refresh rate, you can't tell them apart.


Was that the point of the blind test? You seem to have no idea what I'm referring to.


----------



## moproblems99 (Jan 1, 2019)

The point of the test was that you couldn't tell which card was in which machine.  That is what a blind test is used for.  I still don't see why this is so difficult.


----------



## Fouquin (Jan 1, 2019)

efikkan said:


> Wrong, you missed the point, the prices must be calculated with inflation to give a fair comparison.
> 
> That does not mean that a price increase is pure inflation, there are many factors here, including varying production costs, competition, etc. But these can only be compared after we have the price corrected for inflation, otherwise any comparison is pointless.



You missed the entire point of the comparison. It was a chart simply showing the launch price of each flagship. It was not a comparison across multiple generations (i.e., not meant to compare the adjusted value of non-subsequent products).

The entire point of the chart was to show that prices did not increase with each new generation. No adjusted value required.


----------



## GoldenX (Jan 2, 2019)

lexluthermiester said:


> Not for sure we don't. Waiting to see the benchmarks and reviews.


Sure, here are some wild predictions for when it happens: the 4 GB 128-bit variant will be crap, and so will Navi.


----------



## cucker tarlson (Jan 2, 2019)

moproblems99 said:


> The point of the test was that you couldn't tell which card was in which machine.  That is what a blind test is used for.  I still don't see why this is so difficult.


So what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs that are capped at 100 Hz? What was the point of it, really? The card could not deliver on pure performance numbers, so they capped the fps when comparing it against the 1080 Ti in the most laughable AMD-sponsored test I've seen. Can you imagine the reverse: if Navi beat the 2060 by 30% and someone did an Nvidia-sponsored blind test where they're both delivering 100 fps? You'd both be the first people to heckle it, and rightly so.



GoldenX said:


> Sure, here are some wild predictions for when it happens: the 4 GB 128-bit variant will be crap, and so will Navi.



The 4 GB 128-bit variant will easily match the 1070 at 1080p/1440p if the 2060 6 GB leak is true. And how is Navi going to be crap? Even if they just took Polaris to 7 nm with GDDR6, that would make a good card.


----------



## INSTG8R (Jan 2, 2019)

cucker tarlson said:


> So what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs that are capped at 100 Hz? What was the point of it, really? The card could not deliver on pure performance numbers, so they capped the fps when comparing it against the 1080 Ti in the most laughable AMD-sponsored test I've seen. Can you imagine the reverse: if Navi beat the 2060 by 30% and someone did an Nvidia-sponsored blind test where they're both delivering 100 fps? You'd both be the first people to heckle it, and rightly so.
> 
> 
> 
> The 4 GB 128-bit variant will easily match the 1070 at 1080p/1440p if the 2060 6 GB leak is true. And how is Navi going to be crap? Even if they just took Polaris to 7 nm with GDDR6, that would make a good card.


The point of it was that 100 FPS of smooth gaming is possible. How much more is needed? I don't worry that there are cards that can get 10-20 FPS more than mine, because I get my "100 FPS" of smooth FreeSync gaming. It's diminishing returns for e-peen after that.


----------



## lexluthermiester (Jan 2, 2019)

GoldenX said:


> Sure, here are some wild predictions for when it happens: the 4 GB 128-bit variant will be crap, and so will Navi.


That's definitely a set of wild predictions. Can't see either one happening.


----------



## EarthDog (Jan 2, 2019)

cucker tarlson said:


> The 4 GB 128-bit variant will easily match the 1070 at 1080p/1440p if the 2060 6 GB leak is true. And how is Navi going to be crap? Even if they just took Polaris to 7 nm with GDDR6, that would make a good card.


Haters gonna hate... it's just that simple.


----------



## lexluthermiester (Jan 2, 2019)

EarthDog said:


> Haters gonna hate... it's just that simple.


Right. You'd think they'd give up or have something better to do...


----------



## moproblems99 (Jan 2, 2019)

cucker tarlson said:


> So what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs that are capped at 100 Hz? What was the point of it really?



I mean, I really don't know what else to say, because we just covered it over the last two pages of posts and have come full circle here.  Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for fps chasing.  If anyone is concerned about extra power draw but then chases fps, they are being silly.  There is simply no point in worrying about frames over your monitor's refresh rate.  That was the whole purpose of the test.

If you can't understand that, good day!
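The claim being argued here can be put numerically: with vsync on, the displayed frame rate can never exceed the monitor's refresh rate, so any rendering headroom beyond it is invisible on screen. A minimal sketch, with hypothetical fps numbers matching the cards discussed in the thread:

```python
# With vsync enabled, the display shows at most one new frame per
# refresh cycle, so the visible frame rate is capped at the refresh
# rate regardless of how fast the GPU renders.

def displayed_fps(render_fps, refresh_hz):
    """Frame rate actually shown on a vsynced display."""
    return min(render_fps, refresh_hz)

# Hypothetical example: a card rendering 130 fps and one rendering
# 100 fps both show 100 fps on a 100 Hz vsynced monitor.
print(displayed_fps(130, 100))  # 100
print(displayed_fps(100, 100))  # 100
print(displayed_fps(75, 100))   # 75 -- below the cap, the gap shows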


----------



## bug (Jan 2, 2019)

moproblems99 said:


> I mean, I really don't know what else to say, because we just covered it over the last two pages of posts and have come full circle here. *Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for fps chasing*.  If anyone is concerned about extra power draw but then chases fps, they are being silly.  There is simply no point in worrying about frames over your monitor's refresh rate.  That was the whole purpose of the test.
> 
> If you can't understand that, good day!



I think we can all agree on that, if you meant "once your *min fps*\* hits your monitor's refresh rate". But even that is contingent on one title.

\*or at least the 99th or 90th percentile, because the absolute min fps is usually just a freak occurrence


----------



## moproblems99 (Jan 2, 2019)

bug said:


> I think we can all agree on that, if you meant "once your *min fps*\* hits your monitor's refresh rate". But even that is contingent on one title.
> 
> \*or at least the 99th or 90th percentile, because the absolute min fps is usually just a freak occurrence



Considering we couldn't get the basics down, I didn't want to go advanced yet.  But yes, the extra processing power is much better for your minimums.


----------



## cucker tarlson (Jan 2, 2019)

moproblems99 said:


> I mean, I really don't know what else to say, because we just covered it over the last two pages of posts and have come full circle here.  Once you *hit your monitor's refresh rate, any 'extra' performance is going down the toilet for fps chasing.*  If anyone is concerned about extra power draw but then chases fps, they are being silly.  There is simply no point in worrying about frames over your monitor's refresh rate.  That was the whole purpose of the test.
> 
> If you can't understand that, good day!


You lay it down in black and white, yet you can't understand. Tests that are limited in any way, in this case by the resolution/refresh rate (100 Hz) and the choice of games (Doom, Shadow Warrior), are just pointless and are not a good way to test performance.  I don't know how to express it in any simpler terms either. I mean, a GPU comparison with a v-sync cap? Who ever heard of something that stupid?

What is the exact point at which we should stop "fps chasing"? 90 fps? 100 fps? 85 fps? Can you please tell us? Because I was under the impression that it's subjective in every case. I can absolutely feel 100 vs. 130 fps instantly. The blind test was pointless and just pulled the wool over the public's eyes.


----------



## moproblems99 (Jan 2, 2019)

ermahgerd.....



cucker tarlson said:


> You lay it down in black and white, yet you can't understand. Tests that are limited in any way, in this case by the resolution/refresh rate (100 Hz) and the choice of games (Doom, Shadow Warrior), are just pointless and are not a good way to test performance. I don't know how to express it in any simpler terms either. I mean, a GPU comparison with a v-sync cap? Who ever heard of something that stupid?



This really is black and white.  They weren't comparing GPUs, they were comparing scenarios.  It went like this: hey, you can play X game at Y resolution at Z Hz and you can't tell the difference.  Which is absolutely correct.  Was it the most eye-opening test ever?  No.  Did it show people that if you target your monitor's refresh rate, almost any GPU will likely work?  Yes.



cucker tarlson said:


> What is the exact point at which we should stop "fps chasing"? 90 fps? 100 fps? 85 fps? Can you please tell us? Because I was under the impression that it's subjective in every case. I can absolutely feel 100 vs. 130 fps instantly. The blind test was pointless and just pulled the wool over the public's eyes.



This is perhaps the most black and white and easiest.  The correct answer is, drum roll please: your monitor's refresh rate!


----------



## cucker tarlson (Jan 2, 2019)

moproblems99 said:


> ermahgerd.....
> 
> 
> 
> ...


Why didn't they go with a 144/165 Hz one for the tests, then? And it's obvious you're unlikely to see the difference when they just tell you to sit down and play a couple of games for a couple of minutes each. But in the games you play extensively at home, you'll easily see the fps difference immediately. Imagine you've played hundreds of hours of BF1 at 100 fps and the same amount at 130 fps. You'll see the difference right away.


----------



## lexluthermiester (Jan 2, 2019)

moproblems99 said:


> your monitor's refresh rate!


Not to take sides here, because both of you have made fair points, but the above is correct. Once a system's minimum FPS matches or exceeds the maximum refresh rate of the display, fps chasing becomes a waste of time, effort, and energy.

EDIT: Realistically, 120 Hz is the physical limit of what the human eye can distinguish in real time. While we can still see a difference in "smoothness" above 120 Hz, it's only a slight perceptual difference and can be very subjective from person to person.


----------



## moproblems99 (Jan 2, 2019)

cucker tarlson said:


> Why didn't they go with a 144/165 Hz one for the tests, then? And it's obvious you're unlikely to see the difference when they just tell you to sit down and play a couple of games for a couple of minutes each. But in the games you play extensively at home, you'll easily see the fps difference immediately. Imagine you've played hundreds of hours of BF1 at 100 fps and the same amount at 130 fps. You'll see the difference right away.



We both know the answer to that question: the Vega likely can't do it.  "Wow, that's disingenuous," you're probably telling yourself.  The reason they did it is that it is very likely many more people play at 100 Hz and below.  So that is who they targeted, where it would benefit them the most.

And I don't even want to imagine playing any BF title for hundreds of hours.  It only brings up images of my own suicide.



lexluthermiester said:


> Not to take sides here, because both of you have made fair points, but the above is correct. Once a system's minimum FPS matches or exceeds the maximum refresh rate of the display, fps chasing becomes a waste of time, effort, and energy.



I don't play for sides.  There is really nothing cucker has said that is wrong; it just doesn't apply to what started this debate about the blind test.


----------



## kanecvr (Jan 3, 2019)

lexluthermiester said:


> Everyone is complaining about price. Is it really that expensive? And if so, is really that difficult to save money for an extra month or two?
> 
> Seems to be at the end of the card.



You don't seem to get it, do you? Nvidia has been steadily increasing the cost of its graphics cards for the last 4-5 generations, and swapping products around. With the 600 series, the GTX 680 was NOT actually their high-end product - no - that was the original Titan. So they released the 680, which is a mid-range card branded as a high-end one. The codename for the GPU is GK104 (G = GeForce, K = Kepler, 4 = model number). In the previous generation, Fermi, the 104 numbering was assigned to the GTX 460 - GF104 - and the high-end model was the GF100. Currently, the GP106 card is the GTX 1060 - but what bore the 106 moniker a few generations ago? The GTX 450! Yeah. The GF106 is the GTX 450, which means the GP106 SHOULD have been the GTX 1050, NOT the 1060. Nvidia keeps swapping names, making even more money off the backs of people like you, who are willing to spend the ludicrous amounts of money this bottomless pit of a company is asking for.

The 680 should have been the GK100, which Nvidia later renamed to GK110 for the 700 series, or Kepler refresh - and sold as the original Titan - a $1,000 card! No other high-end video card had sold for that much before - the GTX 480 had an MSRP of $499, and the 580 was $450 - so Nvidia doubled their profits with Kepler by simply moving the high-end and mid-range cards around their lineup and "inventing" two new models - the Titan and, later, the GK110B, the 780 Ti - but not before milking consumers with their initial run of defective GK110 chips, the GTX 780, or GK110-300. 

Now Nvidia is asking $400 for a mainstream card, almost as much as the 200, 400, and 500 series high-end models cost. But wait - the current flagship, the Titan RTX, is now 2,500 bloody dollars, and the 2080 Ti was $1,299 at launch, with prices reaching $1,500 in some places. In fact, it's still $1,500 at most online retailers in my country. F#(k that! $1,500 can buy you a lot of nice stuff - a decent car, a good bike, a boat, loads of clothes, a nice vacation - I'm not forking over to Nvidia to pay for a product which has cost around the $400 mark for the better part of 18 freakin' years. 

Don't you realize we're being taken for fools? Companies are treating us like idiots, and we're happy to oblige by forking out more cash for shittier products...


----------



## EarthDog (Jan 3, 2019)

So... you'd rather have a 'high-end' card from a 'couple' of generations ago than a modern card with the latest features which performs the same (or better)... just because it's labeled as a midrange card? 

I'd rather go with the modern card with all the latest gizmos, and less power use given the same cost and performance.


----------



## GoldenX (Jan 3, 2019)

EarthDog said:


> So... you'd rather have a 'high-end' card from a 'couple' of generations ago than a modern card with the latest features which performs the same (or better)... just because it's labeled as a midrange card?
> 
> I'd rather go with the modern card with all the latest gizmos, and less power use given the same cost and performance.


Amen to that. Plus warranty.


----------



## efikkan (Jan 3, 2019)

kanecvr said:


> With the 600 series, the GTX 680 was NOT actually their high-end product - no - that was the original Titan.


No, the first Titan was released along with the 700 series.
The top model of the 600 series was the GTX 690, having two GK104 chips.



kanecvr said:


> So they released the 680, which is a mid-range card branded as a high-end one. The codename for the GPU is GK104 (G = GeForce, K = Kepler, 4 = model number). In the previous generation, Fermi, the 104 numbering was assigned to the GTX 460 - GF104 - and the high-end model was the GF100. <snip>
> 
> The 680 should have been the GK100, which Nvidia later renamed to GK110 for the 700 series, or Kepler refresh - and sold as the original Titan - a $1,000 card!


I have to correct you there.
GK100 was bad and Nvidia had to do a fairly "last minute" rebrand of "GTX 670 Ti" into "GTX 680". (I remember my GTX 680 box had stickers over all the product names) The GK100 was only used for some compute cards, but the GK110 was a revised version, which ended up in the GTX 780, and was pretty much what the GTX 680 _should have been_.

You have to remember that Kepler was a major architectural redesign for Nvidia.


----------



## Berfs1 (Jan 3, 2019)

Renald said:


> $400 for a high-end card (meaning it's the best a company can do), like the 290 for the record, is understandable.
> $400 for a mid-range card (see what a 2080 Ti can do) is not good.
> 
> Nvidia doubled their prices because AMD is waiting until next year to propose something, because they are focusing on CPUs and don't have Nvidia's or Intel's firepower. They can't do both GPUs and CPUs.
> ...


You are saying a 2080 Ti should be $600? Because they "doubled" it? I know you are going to say "it was a metaphor", and that's where problems come in. It's NOT double; at most I would say they charged 10-25% more than it's worth. 10-25% is NOT double. That's how consumer sentiment gets distorted: by misleading the market into saying it's absolutely not worth it, when in reality it's not like what you said.



efikkan said:


> No, the first Titan was released along with the 700 series.
> The top model of the 600 series was the GTX 690, having two GK104 chips.
> 
> 
> ...


The GTX 670 Ti doesn't exist, so it's not a rebrand. Also, if I recall correctly, the 680 has the fully enabled GK104 chip (with the 690 being the same, but with two GK104 chips). First-gen Kepler capped at 1,536 cores; second-gen Kepler capped at 2,880 cores. That's just the architecture, not an intent to rip people off.


----------



## cucker tarlson (Jan 3, 2019)

It's a 750 mm² die with 11 GB of GDDR6 and hardware RT acceleration. You can have one at $1,000 or you can have none; they'll not sell it to you at $600 if Vega 64 is $400 and the 2080 Ti is 1.85x Vega's performance.

2080 Ti RTX = V64 rasterization


----------



## moproblems99 (Jan 3, 2019)

They'll sell it for what people will pay.  Which is apparently what it is listed at.


----------



## Renald (Jan 3, 2019)

Berfs1 said:


> You are saying a 2080 Ti should be 600$? Because they “doubled” it? I know you are going to say “it was a metaphor”, and that’s where problems come in. It’s NOT double, at most I would say they charged 10-25% more than it’s worth. 10-25% is NOT double. That’s where customer satisfaction can be altered, by misleading a consumer market and saying it’s absolutely not worth it where in reality it’s not like what you said.


No, that's not the comparison I wish to make. The 2080 Ti is way more powerful than a 2060, whatever memory setup is considered.

A 2080 Ti should cost ~$800-900 IMO, and a 2080 $600.
If AMD were in the competition, I think Nvidia would be much closer to that range of prices.


----------



## lexluthermiester (Jan 4, 2019)

Renald said:


> A 2080 Ti should cost ~$800-900 IMO, and a 2080 $600.


Yeah, but you're not the one making them, having to recoup manufacturing and R&D costs. And for the record, 2080s are not far from the $600 you mentioned.


----------



## wolf (Jan 4, 2019)

kings said:


> People compare this to the price of the GTX 1060, but forget the RX 480 was on the market at the same time to keep prices a bit lower! Basically, the closest competition to the RTX 2060 will be the RX Vega 56 (if we believe in the leaks), which still costs upwards of $400~$450, except for one or another occasional promotion.
> 
> Unless AMD pulls something off their hat in January, $350 to $400 for the RTX 2060 will be in tune with what AMD also offers! Nvidia with their dominant position, is not interested in disrupting the market with price/performance.





xkm1948 said:


> Nice to see bunch of couch GPU designers and financial analysts knows better than a multi million GPU company regarding both technology and pricing. It is called capitalism for a reason, no competition means Nvidia can have free say on how much they price their cards. You don’t like it then don’t buy, good for you. Someone else likes it they buy and it is entirely their own business. NVIDIA is “greedy” sure yeah they better f*ucking be greedy. They are a for profit company not a f*ucking charity.



Good to see a few people out there are onto it - probably others too, I just haven't quoted you all. In the absence of competition at certain price points - which AMD has generally been able to provide, at least in the low/mid/upper-midrange (and often top-tier) segments, for some time - Nvidia simply has the ability to charge a premium for premium performance. Add to that the fact that the upper-end RTX chips are enormous and use newer, more expensive memory, and yeah, you find them charging top dollar for them - and so they should in that position.

As has been said: don't like it? Vote with your wallet! I sure have. I bought a GTX 1080 at launch and ~2.5 years later I personally have no compelling reason to upgrade. That comes down to my rig, screen, available time to game, what I play, price/performance, etc.; add it all together and that equation is different for every buyer.

Do I think the 20-series RTX is worth it? Not yet, but I'm glad someone's doing it. I've seen BFV played with it on, and I truly hope ray tracing is in the future of gaming.

My take is that when one or both of these two things happen, prices will drop - perhaps by a negligible amount, perhaps significantly:

1. Nvidia clears out all (or virtually all) 10-series stock, which the market still seems hungry for, partly because many offerings provide more than adequate performance for the particular consumer's needs.
2. AMD answers the 20 series' upper-level performance, or releases cards matching 1080/2070/Vega performance at lower prices (or, again, both).


----------



## bug (Jan 4, 2019)

lexluthermiester said:


> Yeah, but you're not the one making them having to recoup manufacturing and R&D costs. And for the record, 2080's are not far from the $600 you mentioned.


My guess would be that at least some of the R&D has been covered by Volta. It's the sheer die size that makes Turing so damn expensive.
If Nvidia manages to tweak the hardware for their 7 nm lineup, then we'll have a strong proposition for DXR. Otherwise, we'll have to wait for another generation.


----------



## lexluthermiester (Jan 4, 2019)

bug said:


> My guess would be at least some of the R&D has been covered by Volta.


Maybe, but how many Volta cards have they actually sold? Even so, Volta and Turing are not the same. They have similarities, but are different enough that a lot of R&D is unrelated and doesn't cross over.


bug said:


> It's the sheer die size that makes Turing so damn expensive.


That is what I was referring to with manufacturing costs: pricey wafers, even if you manage a high good-die-per-wafer yield. That price goes way up if you can't manage at least an 88% yield, which will be challenging given the total size of a functional die.
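The die-size argument can be made concrete with a back-of-the-envelope sketch: count how many dies fit on a wafer, discount them by a simple defect-based yield model, and divide the wafer cost by the good dies. All numbers here (wafer cost, defect density, die areas) are illustrative assumptions, not Nvidia's or TSMC's actual figures.

```python
import math

# Rough sketch of why a huge die costs disproportionately more.
# Wafer cost, defect density, and die areas are assumed values.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common approximation for usable dies on a round wafer,
    subtracting partial dies lost at the edge."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2=0.1):
    """Simple Poisson defect model: yield falls off exponentially
    with die area, so big dies get hit twice (fewer per wafer AND
    a larger fraction defective)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

def cost_per_good_die(die_area_mm2, wafer_cost=6000):
    good_dies = dies_per_wafer(die_area_mm2) * poisson_yield(die_area_mm2)
    return wafer_cost / good_dies

# A ~750 mm^2 TU102-class die vs. a ~200 mm^2 midrange die:
print(round(cost_per_good_die(754)))
print(round(cost_per_good_die(200)))
```

Under these assumed numbers, the big die costs several times more per good chip than the midrange one, which is the double penalty being described: fewer candidates per wafer, and a lower fraction of them functional.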


----------



## bug (Jan 4, 2019)

lexluthermiester said:


> Maybe, but how many Volta cards have they actually sold? Even so, Volta and Turing are not the same. They have similarities, but are different enough that a lot of R&D is unrelated and doesn't cross over.



Well, Volta is only for professional cards. The Quadro goes for $9,000, God knows how much they charge for a Tesla.
And about differences, I'm really not aware of many, save for the tweaks Nvidia did to make it more fit for DXR (and probably less fit for general computing). I'm sure Anand did a piece highlighting the differences (and I'm sure I read it), but nothing striking has stuck.

That said, yes, R&D does not usually pay off after just one iteration. I was just saying they've already made _some_ of that back.


----------



## moproblems99 (Jan 4, 2019)

wolf said:


> Nvidia clears out all (or virtually all) 10 series stock



They could certainly just restock 10 series at the prices they are at and leave the 20 series where they are to continue this pricing model until AMD releases something.  Lord help us all if it doesn't compete.


----------



## kanecvr (Jan 6, 2019)

efikkan said:


> No, the first Titan was released along with the 700 series.
> The top model of the 600 series was the GTX 690, having two GK104 chips.



Yes, the original Titan was released with the 700 series, but both use the Kepler architecture. In fact, the 680 and 770 are identical, save for a small clock speed increase in favor of the 770 (1,045 vs. 1,006 MHz). You can even flash a 680 to a 770 and vice versa (if the cards use the same PCB, like a reference design).



efikkan said:


> I have to correct you there.
> GK100 was bad and Nvidia had to do a fairly "last minute" rebrand of "GTX 670 Ti" into "GTX 680". (I remember my GTX 680 box had stickers over all the product names) The GK100 was only used for some compute cards, but the GK110 was a revised version, which ended up in the GTX 780, and was pretty much what the GTX 680 _should have been_.
> 
> You have to remember that Kepler was a major architectural redesign for Nvidia.



No. GK100 = GK110. The reason for the extra "1" is the name change from the 600 series to the 700 series - to try to make it look like a new product. Kepler also has GK2xx-branded chips, which do contain small architectural improvements, mostly to the scheduler and power efficiency. Again, and for the last time: the 680 is NOT GK100 - it's GK104. *Nvidia did not release any GPU with the GK100 codename, not even in the professional market.* There were rumors, and the tech press did speculate, that GK100 would be reserved for the (then) new Tesla, but that never happened. This is the complete list of Kepler GPUs, both 600 and 700 series, including the Titan, Quadro, and Tesla cards:

Full GK104 - GTX 680, GTX 770, GTX 880M, and several professional cards
Cut-down GK104 - GTX 660, 760, 670, 680M, 860M, and several professional cards
GK106 - GTX 650 Ti Boost, 650 Ti, 660, and several mobile and pro cards
GK107 - GTX 640, 740, 820, and lots of mobile cards
GK110 - GTX 780 (cut-down GK110), GTX 780 Ti, and the original Titan, as well as the Titan Black, Titan Z, and loads of Tesla/Quadro cards like the K6000
GK208 - entry-level 700- and 800-series cards, both GeForce- and Quadro-branded
GK208B - entry-level 700- and 800-series cards, both GeForce- and Quadro-branded
GK210 - revised and slightly cut-down version of the GK100/GK110, launched as the Tesla K80
GK20A - GPU built into the Tegra K1 SoC
I know from a trustworthy source (an Nvidia board-partner employee) that Nvidia had no issues whatsoever with the GK100. In fact, internal testing showed what a huge leap in performance Kepler was over Fermi. This is THE REASON why Nvidia decided to launch the GK104 mid-range chip as the GTX 680: the GK104 is 30 to 50% faster than the GF100 used in the GTX 480 and 580.

Some clever people in management came up with this marketing stunt: spread Kepler over two series of cards, the 600 and 700 series; release the GK104 first; and save the full GK100 (GK110) for the later 700 series, launching it as a premium product and creating a new market segment with the 780 Ti and the original Titan. GK110 is simply the name chosen for launch, replacing the GK100 moniker, mainly to obfuscate things for savvy consumers and the tech press. As Nvidia naming schemes go, GK110 should have been an entry-level chip: GK104 > GK106 > GK107 - the smaller the last number, the larger the chip. The only Kepler revision is named GK2xx (see above) and only includes entry-level cards.


----------



## jabbadap (Jan 6, 2019)

While true, GK110 had two versions too: GK110 and GK110B; the latter could clock a bit higher. All GTX 780 "AIB GHz editions" used the GK110B version of that chip. Going back to that ancient history, everybody knows the GTX 780 Ti has aged pretty badly, mostly because of its 3 GB of VRAM. But how has the GTX 780 6 GB version aged?


----------



## Berfs1 (Jan 7, 2019)

I told y’all the power plug was on the side lol


----------

