# NVIDIA GeForce RTX 2080 Ti Founders Edition 11 GB



## W1zzard (Sep 19, 2018)

NVIDIA debuted its Turing graphics architecture today, leading straightaway with the flagship RTX 2080 Ti. This card packs the promise of real-time ray tracing at 4K UHD, alongside huge gains in performance. NVIDIA also delivered its best cooler design since TITAN, commanding a very high price for some very beautiful visuals.

*Show full review*


----------



## Deleted member 178884 (Sep 19, 2018)

I've been waiting for this - turns out it's not the improvement I expected. It's a pass, and time to save for a second 1080 Ti FTW3 and watercool them both.


----------



## TheinsanegamerN (Sep 19, 2018)

For the price, that is disappointing.

Granted, the 2080 Ti performs well, but seeing the 2080 at 1080 Ti levels more often than not, it seems per core the 2080 Ti is just Pascal all over again, and hits the same OC limits. For an arch that has been in the works for years, this seems like just the Pascal arch with core counts pumped up. That being said, I do marvel at the 2080 Ti drawing less power than a Vega 64 despite a truly massive die.

This card should be $600, not $1200.


----------



## HaKN ! (Sep 19, 2018)

Completely a joke for the price... really, Nvidia? All that RTX BS hype for that? Pffff, I really don't see anyone throwing money at this...


----------



## Vya Domus (Sep 19, 2018)

Sooooo, nothing at all to show with regards to the whole RTX thing; so much for reinventing computer graphics, huh? Color me surprised, though I finally realized why they put RTX in the name: so that we don't forget about it completely.


----------



## dj-electric (Sep 19, 2018)

Great performance, not for that terrible price


----------



## zelnep (Sep 19, 2018)

What a failure: +28% faster than the GTX 1080 Ti after 2.5 years of progress and almost a 2x price increase? I thought the RTX 2080 Ti would be around +50% faster (than the GTX 1080 Ti), and I'd still call that a failure! This is just sad. I just checked the power consumption... are you kidding me? How can this be worse than Pascal? (And it still costs twice the price.)


----------



## newtekie1 (Sep 19, 2018)

TheinsanegamerN said:


> Granted, the 2080 Ti performs well, but seeing the 2080 at 1080 Ti levels more often than not, it seems per core the 2080 Ti is just Pascal all over again, and hits the same OC limits. For an arch that has been in the works for years, this seems like just the Pascal arch with core counts pumped up. That being said, I do marvel at the 2080 Ti drawing less power than a Vega 64 despite a truly massive die.



Per core it is just Pascal all over again?  Really?  The 2080 has 17% FEWER cores than the 1080 Ti, yet still manages to outperform it by 8% overall at roughly the same clock speeds, not to mention with a narrower memory bus.  That's a pretty decent feat right there.

Granted, the price is too high...


----------



## medi01 (Sep 19, 2018)

So, AdoredTV was right, or was he not?


----------



## zelnep (Sep 19, 2018)

medi01 said:


> So, AdoredTV was right, or was he not?


AdoredTV was wrong - this is WORSE. He predicted a +40% increase over the GTX 1080 Ti - but obviously that was leaked from NVIDIA's "testing guidelines".


----------



## M2B (Sep 19, 2018)






zelnep said:


> What a failure: +28% faster than the GTX 1080 Ti after 2.5 years of progress and almost a 2x price increase? I thought the RTX 2080 Ti would be around +50% faster (than the GTX 1080 Ti), and I'd still call that a failure! This is just sad. I just checked the power consumption... are you kidding me? How can this be worse than Pascal? (And it still costs twice the price.)



This is 38 percent faster than 1080Ti, not 28 percent.


----------



## dwade (Sep 19, 2018)

High FPS even in 4K. You're going to need at least an 8700K to push this. Preferably the 9900K.


----------



## Fluffmeister (Sep 19, 2018)

TheinsanegamerN said:


> That being said, I do marvel at the 2080 Ti drawing less power than a Vega 64 despite a truly massive die.



Indeed, whilst being 85% faster at 4K!


----------



## intelzen (Sep 19, 2018)

9.2 for this kind of a joke? I get it - better not mess with Nvidia or you won't get anything to test next time :/


----------



## B-Real (Sep 19, 2018)

4K performance increase and 2 generation comparison with starting MSRPs:

1.
1080Ti - 2080: 9% for same price (or +50$ for the 2080)
980Ti - 1080 was 39% for -50$
780Ti - 980 was 9% for -150$

2.
1080Ti - 2080Ti: 39% for +300$ (more like 350$ actually)
980Ti - 1080Ti was 83% (!!!) for +50$
780Ti - 980Ti was 43% for -50$

I just hope that everyone thinks this over and stops ordering.



dj-electric said:


> Great performance, not for that terrible price


I feel that irony there. 



TheinsanegamerN said:


> That being said, I do marvel at the 2080 Ti drawing less power than a Vega 64 despite a truly massive


You will never ever draw this much power from a Vega 64 with a correct undervolt (which shouldn't be a problem for anyone who buys a GPU priced this high).



M2B said:


> This is 38 percent faster than 1080Ti, not 28 percent.


Anyway, this is a huge disappointment to every unbiased person. And the initial expectations were right: the 2070 will never match the 1080 Ti without OC, the 2080 will perform around that level, and the 2080 Ti won't come anywhere near a 50% increase over the 1080 Ti.


----------



## swirl09 (Sep 19, 2018)

Respect to that FE card, it's actually solid.

Numbers are fairly close to what I was expecting. Not thrilled with the clocks and temps of the MSI Trio, which is the one I have ordered ^_^' hopefully mine fares a little better.

Really excited to see DLSS in action; I think it's going to push this purchase from feeling the sting for 1/3 more perf to actually being worthwhile...


----------



## Fluffmeister (Sep 19, 2018)

zelnep said:


> Is there a typo? Or explain to me how *100 - 72 = 38* then*?* (relative performance at 4K; for non-4K it's even worse than that)



100/72  = 1.38. It's 38% faster at 4K.


----------



## Deleted member 178884 (Sep 19, 2018)

The issue with the price here is that you can grab two 1080 Tis in excellent condition from the used market - yeah, they can get their money from retards.


----------



## Robcostyle (Sep 19, 2018)

So, considering that all reviewers had to sign some sort of "confidentiality" agreement in order to get samples and drivers early - should I suspect real-life results of the 2080 Ti are even worse than TPU showed? What a green disgrace...


----------



## xkm1948 (Sep 19, 2018)

Why so much hate? I don’t get it.


----------



## Deleted member 178884 (Sep 19, 2018)

xkm1948 said:


> Why so much hate? I don’t get it.


Come out of the cave you've been in. It's been over a year of Pascal for 30% more - and you can get TWO 1080 Tis for the price of one 2080 Ti. It makes sense.



xkm1948 said:


> Why so much hate? I don’t get it.


Also, the 1080 Ti was around 50% faster than the 980 Ti - that's the gain people want. At this price it costs more than a Titan Xp, which is a joke.


----------



## newtekie1 (Sep 19, 2018)

zelnep said:


> Is there a typo? Or explain to me how *100 - 72 = 38* then*?* (relative performance at 4K; for non-4K it's even worse than that)



That isn't how math works.

I'll make it easy, I'll give you an example:

Card A = 72FPS
Card B = 100FPS

Card B is getting 28FPS more performance.

If you normalize to Card A, then Card B is ~39% faster: 28FPS / 72FPS ≈ 0.389, x 100 ≈ 39% faster.
If you normalize to Card B (like the graphs in the review), then Card A is 28% slower: 28FPS / 100FPS = 0.28, x 100 = 28% slower.

So, the 1080 Ti is 28% slower than the 2080 Ti, and the 2080 Ti is ~39% faster than the 1080 Ti.
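For anyone who wants to sanity-check the arithmetic, here's a quick Python sketch of the two normalizations (the 72 and 100 FPS figures are just the example numbers above, not review data):

```python
# Two ways to express the same gap, depending on which card is the baseline.
card_a = 72.0   # example: slower card's average FPS
card_b = 100.0  # example: faster card's average FPS

# Normalize to the slower card: how much faster is B than A?
pct_faster = (card_b - card_a) / card_a * 100  # ~38.9%

# Normalize to the faster card: how much slower is A than B?
pct_slower = (card_b - card_a) / card_b * 100  # 28.0%

print(f"B is {pct_faster:.1f}% faster than A")
print(f"A is {pct_slower:.1f}% slower than B")
```

Same two numbers, two different percentages - which is exactly where the 28% vs 38% confusion in this thread comes from.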



Fluffmeister said:


> 100/72  = 1.38. It's 38% faster at 4K.



Or you could do it this way.



xkm1948 said:


> Why so much hate? I don’t get it.



Really, price, and that's it.  The card performs, it's a little heavy on power usage but still not terrible, and as usual the 3rd-party cards are quiet, but even the reference card isn't too loud.  So really, it all comes down to the price.


----------



## Fluffmeister (Sep 19, 2018)

xkm1948 said:


> Why so much hate? I don’t get it.



Math problems mainly

These things are beasts, but silly expensive; we can thank the shit competition for that. Reading the Pascal reviews, a lot of people didn't appreciate those cards then either; now they are starting to.

No doubt cards like the GTX 1080 and GTX 1080 Ti remain great buys.


----------



## B-Real (Sep 19, 2018)

Xx Tek Tip xX said:


> Also the 1080 ti was around 50% than the 980 ti - that's the gains people want, at this price it costs more than a titan xp which Is a joke.


What? No... the 1080Ti was about 83% faster than the 980Ti.








Fluffmeister said:


> Math problems mainly
> 
> These things are beasts, but silly expensive; we can thank the shit competition for that. Reading the Pascal reviews, a lot of people didn't appreciate those cards then either; now they are starting to.
> 
> No doubt cards like the GTX 1080 and GTX 1080 Ti remain great buys.


How can you call this thing a beast when its gain over the 1080 Ti isn't even half of the 1080 Ti's gain over the 980 Ti? I really don't get it.


----------



## Vya Domus (Sep 19, 2018)

B-Real said:


> How can you call this thing a beast when its gain over the 1080 Ti isn't even half of the 1080 Ti's gain over the 980 Ti? I really don't get it.



When you see your favorite color in front of your eyes, even a single-digit improvement makes it a beast.


----------



## beautyless (Sep 19, 2018)

Upgrade to 2080 Ti = $1,199
Upgrade to 4K 120Hz G-Sync monitor = $1,999 (recommended)
Upgrade CPU to i7 8th/9th gen + RAM = $800 (recommended)

It's a huge cost. But at last, there's a truly 4K-capable system to buy.


----------



## W1zzard (Sep 19, 2018)

Robcostyle said:


> I suggest IRL results of 2080 Ti are even worse than TPU showed?


The numbers in our reviews are real-life results


----------



## efikkan (Sep 19, 2018)

39% more performance over the previous generation is nothing to complain about.

To all of you who said this generation would suck: you'd better buy one of these and eat your own words:








TheinsanegamerN said:


> For an arch that has been in the works for years, this seems like just the pascal arch with core counts pumped up.


21% more cores, 11% more GFLOP/s, and yet it yields 39% higher performance at 4K. No, this is *nothing* like Pascal.


----------



## Frick (Sep 19, 2018)

Dat perf/price graph.


----------



## swirl09 (Sep 19, 2018)

I'm sure we would all love Maxwell -> Pascal gains again, but the perf increase here is more in line with previous generational gains.

I get that the price isn't great, and when you consider the time between gens and the perf : price ratio, it seems miserable. But this is what happens when there is no competition. I would rather see this happen than have Nvidia sit on its hands and wait until it had something to respond to. Also, I think people are dismissing the addition of the AI cores. While I think RT looks nice, I am not optimistic it will be worth the performance hit. However, it's DLSS that stands to make a bigger impact; it's a case of seeing how well it's supported and how it looks. But it has the potential of turning the gap into that Maxwell -> Pascal jump we wanted to see.


----------



## TheOne (Sep 19, 2018)

The review needs a Titan; also, there are a lot of reviews considering this is a delayed pre-release card.


----------



## kings (Sep 19, 2018)

The problem is the lack of alternatives, and Nvidia knows this. No one can buy an imaginary AMD card.

For a person who wants something better than a GTX 1080/Vega 64 (for whatever reason), there's nothing besides Nvidia.


----------



## lemkeant (Sep 19, 2018)

efikkan said:


> 39% more performance over the previous generation is nothing to complain about.
> 
> To all of you who said this generation would suck; you better buy some of this and eat your own words:
> 
> ...



It's the cost that's the main issue. If they were priced reasonably, then the mood/opinions would be much better.

I had a 2080 Ti pre-ordered, but I cancelled it last week after the rumors weren't blowing me away. Glad I cancelled it.


----------



## moproblems99 (Sep 19, 2018)

Fluffmeister said:


> we can thank the shit competition for that



Actually, it is equally about people feeling the need to have the latest and greatest whether there is a benefit or not.  Nvidia will charge what people will pay.  There could be all the competition in the world, but if people want to make poor choices, Nvidia will happily oblige.


----------



## Fluffmeister (Sep 19, 2018)

moproblems99 said:


> Actually, it is equally about people feeling the need to have the latest and greatest whether there is a benefit or not.  Nvidia will charge what people will pay.  There could be all the competition in the world, but if people want to make poor choices, Nvidia will happily oblige.



Indeed, if they sell, they aren't too expensive; it's what the market will take.

The fact there aren't any alternatives in this market is just the unfortunate truth.

I'll leave the rest of you to complain about what other people do with their money.


----------



## xkm1948 (Sep 19, 2018)

I find it amazing how our couch GPU designers know better how to design a GPU and make it 50% better minimum per generation. Like all those engineers at Nvidia should give their jobs to you guys.

Seriously, people tend to forget the Maxwell to Pascal jump was partially due to the massive shrink from 28nm to 16nm. You cannot expect magic like that to happen every gen. Geez, stop being hypocrites.


----------



## Fleurious (Sep 19, 2018)

Nice enough performance, but I'll look for a deal on a used 1080 Ti instead.


----------



## windwhirl (Sep 19, 2018)

@W1zzard
 In the paragraph about DLSS, in the Conclusion, you left behind a "[... fill in ... ]"...


----------



## FordGT90Concept (Sep 19, 2018)

1920x1080 = 19% more performance
2560x1440 = 25% more performance
3840x2160 = 28% more performance
Price = 44% more money

Something is wrong with this picture.  NVIDIA is banking on RTX branding moving these cards.  I hope the market responds with a hearty laugh in NVIDIA's face.


----------



## zelnep (Sep 19, 2018)

Fluffmeister said:


> 100/72  = 1.38. It's 38% faster at 4K.


Yes, I was wrong, it is 38%; let's even round it up and call it *+40%*... still a failure for the price increase. And whether I like it or not, the 2080 Ti and 2080 will be the only options for 4K gaming for the next 2 years? And it starts at "just $899". Imagine Nvidia's next generation then: another +30-40% and a $1,600 price? I mean $1,599?


----------



## Joss (Sep 19, 2018)

moproblems99 said:


> Nvidia will charge what people will pay


Exactly.
The problem is not Nvidia being greedy or AMD not being competitive.
The problem is the consumer, who has lost all sense of perspective.


----------



## mypg0306 (Sep 19, 2018)

TheinsanegamerN said:


> For the price, that is disappointing.
> 
> Granted, the 2080ti performs well, but to see the 2080 at 1080ti levels more often then not, it seems per core the 2080ti is just pascal all over again, and hits the same OC limits. For an arch that has been in the works for years, this seems like just the pascal arch with core counts pumped up. That being said I do marvel at the 2080ti drawing less power then a vega 64 despite a truly massive die.
> 
> This card should be $600, not $1200.


You are absolutely correct!


----------



## bug (Sep 19, 2018)

Vya Domus said:


> Sooooo, nothing at all to show with regards to the whole RTX thing; so much for reinventing computer graphics, huh? Color me surprised, though I finally realized why they put RTX in the name: so that we don't forget about it completely.


Yes, twice as fast as a Vega 64 at 4K while drawing a little less power = nothing at all to show 
If it weren't for the insane pricing, this would have been game over for AMD.


----------



## Fluffmeister (Sep 19, 2018)

FordGT90Concept said:


> 1920x1080 = 19%
> 2560x1440 = 25%
> 3840x2160 = 28%
> Price = 178%
> ...



That will be your maths.


----------



## R0H1T (Sep 19, 2018)

efikkan said:


> 39% more performance over the previous generation is nothing to complain about.
> 
> To all of you who said this generation would suck; you better buy some of this and eat your own words:
> 
> ...


Part of that can be explained with this; do we have numbers from the *Titan Xp* to compare this with?



Yeah no, this is definitely a good leap, but let's be clear ~ this isn't what Nvidia wanted. They wanted Turing on 10nm & this is just a compromise, albeit a reasonable one & no, the price is still atrocious!


----------



## bug (Sep 19, 2018)

R0H1T said:


> Part of that can be explained with this, do we have number from *Titan Xp* to compare this with?
> 
> Yeah no, this is definitely a good leap but let's be clear ~ this isn't what Nvidia wanted, they wanted Turing on 10nm & this is just a compromise, albeit a reasonable one & no the price is still atrocious!


Price is always based mostly on die size, and this die is huge. Sucks, but on the upside it makes my decision to (not) buy a no-brainer.


----------



## Ferrum Master (Sep 19, 2018)

Pretty much, it needs a shunt mod (thanks for the pics, W1z) and a water block, then see what it really does while OCed. So far... meh...


----------



## CandymanGR (Sep 19, 2018)

Sooo... more or less as expected. Makes sense to buy if you play at 4K or 1080p/120-144Hz (in general, for high refresh rates). Doesn't make sense if you play at 1080p/60Hz unless you want the extra features. The pricing is indeed terrible (also as expected).

P.S. Nice, complete review.


----------



## ShurikN (Sep 19, 2018)

> RTX Technology not gimmicky, does bring tangible IQ improvements


Which we'll have to take your word for, 'cause there's nothing to show it at the moment...


----------



## Vya Domus (Sep 19, 2018)

> RTX Technology not gimmicky, does bring tangible IQ improvements



I have to say this is the complete opposite of tangible, there is nothing you can do with it right now.


----------



## bug (Sep 19, 2018)

ShurikN said:


> Which we'll have to take your word for, 'cause there's nothing to show it at the moment...


Has there ever been an instance where support for any technology came in software, with hardware to follow?
Of course there's no title support on launch day, but planned support looks way better than it did for DX12, async compute, or even tessellation in their time.

And the IQ improvements you have already seen in the demos (or was it just one demo?).


----------



## Fluffmeister (Sep 19, 2018)

I believe proper Microsoft DXR support doesn't drop till next month's October update anyway.


----------



## Steevo (Sep 19, 2018)

A 1080 Ti overclocked gained what, 15%?
The 2080 Ti gained 10%.

So for enthusiasts the 2080 Ti might be worth the price, but for the other 98% of people, a 1080 Ti with an easy overclock at a 50-60% lower price will reach 80-85% of this card's performance. A $689 1080 Ti on Amazon vs. $1,249 for out-of-stock cards.

It's an amazing piece of engineering, but the price is stupid.


----------



## R0H1T (Sep 19, 2018)

Steevo said:


> 1080TI Overclocked gained what, 15%
> 2080Ti gained 10%
> 
> So for enthusiasts the 2080Ti might be worth the price, but for the other 98% of people 1080Ti with a easy overclock at 50-60% lower price will reach 80-85% of this cards performance. $689 1080Ti on Amazon VS $1249 for out of stock cards.
> ...


I don't think even enthusiasts should buy the _*RTX*_es at these prices. Wait for a price drop; otherwise you're setting a precedent where $*1000* will be the new normal for future (top-end) GPUs. Do we want another *Apple* in the PC space?


----------



## ShurikN (Sep 19, 2018)

bug said:


> Has there ever been an instance where support for any technology came in software, with hardware to follow?
> Of course there's no title support on launch day, but planned support looks way better than it did for DX12, async compute or even tessellation in their time.
> 
> And the IQ improvements you have already seen in the demos (or was just one demo?).


Unlike previous generations, this one has its main selling point in its name. A selling point you cannot use.

Which demo, the Tomb Raider one with zero differences on or off, or the BFV demo with painfully fake fire?


----------



## CandymanGR (Sep 19, 2018)

Steevo said:


> Its an amazing piece of engineering, but the price is stupid.



This. In a nutshell.


----------



## bug (Sep 19, 2018)

ShurikN said:


> Unlike previous generations, this one has its main selling point in its name. A selling point you cannot use.


Ah, so now it's about the sticker on the box. You can troll better than that.


ShurikN said:


> Which demo, the Tomb Raider one with zero differences on or off, or the BFV demo with painfully fake fire?


Ok, multiple demos it is. Tomb Raider and Metro were particularly uninspired, though (TR because it was too early in the development and Metro because it was only using one light source).


----------



## ShurikN (Sep 19, 2018)

Steevo said:


> Its an amazing piece of engineering, but the price is stupid.


Only the 2080 Ti is amazing; the regular 2080 has no place in the market at this moment. It trades blows with the 1080 Ti, which you can get for far less money.


----------



## TheOne (Sep 19, 2018)

The review could also use some professional software benchmarks.


----------



## ShurikN (Sep 19, 2018)

bug said:


> Ah, so now it's about the sticker on the box. You can troll better than that.


Are you seriously writing this nonsense by yourself, or are you having outside help?
Sticker on the box??? Really, that's the straw you are grasping at? The entire product line is basically called GeForce Ray-Tracing. They just gave it a prettier name than that for obvious marketing reasons.


----------



## newtekie1 (Sep 19, 2018)

ShurikN said:


> Only the 2080Ti is amazing, the regular 2080 has no place in the market at this moment. It trades blows with 1080Ti, which you can get for far less money.



2080 vs 1080 Ti is about an 8% performance improvement for a 15% price increase.  That's about what I would expect in the high end.  Price almost never increases at the same rate as performance once you hit the high end.


----------



## Animalpak (Sep 19, 2018)

It's a nuclear power plant, but damn, this is one hell of a fast card!


----------



## ShurikN (Sep 19, 2018)

newtekie1 said:


> 2080 vs 1080 Ti is about an 8% performance improvement for a 15% price increase.  That's about what I would expect in the high end.  Price almost never increases at the same rate as performance once you hit the high end.


8% against a 1080 Ti FE, and 0% against a $100-cheaper EVGA 1080 Ti.

GN conclusion


> The RTX 2080 is poor value today. NVidia's own GTX 1080 Ti offers superior value at $150 less, in some cases, or $100 less on average. The cards perform equivalently, and yet the 1080 Ti is cheaper and still readily available (and with better models, too).
> ...
> Until we see a price drop in the 2080, compelling RTX implementations in an actually relevant game, or depleted stock on the 1080 Ti, there is no strong reason we would recommend the RTX 2080 card.


----------



## W1zzard (Sep 19, 2018)

Fluffmeister said:


> I believe proper Microsoft DXR support doesn't drop till next month's October update anyway.


That's correct. You can enable developer mode on Windows 10 April update right now to enable DXR support


----------



## Frick (Sep 19, 2018)

xkm1948 said:


> I find it amazing how our couch GPU designers know better how to design a GPU and make it 50% better minimum per generation. Like all those engineers at Nvidia should give their jobs to you guys.
> 
> Seriously, people tend to forget the Maxwell to Pascal jump was partially due to the massive shrink from 28nm to 16nm. You cannot expect magic like that to happen every gen. Geez, stop being hypocrites.



It's the price that's bad.


----------



## FeelinFroggy (Sep 19, 2018)

Hilarious, fastest consumer GPU in the world and most of the comments here are complaints from people who were never going to buy one anyway.


----------



## BluesFanUK (Sep 19, 2018)

I'm on a 980 Ti, but I'll wait for the die shrink. Next year I'll upgrade.


----------



## Razrback16 (Sep 19, 2018)

Thanks as always for the thorough review -

As expected, performance on this card is completely insufficient to justify the pricing NVidia is asking. RTX at this stage is mostly worthless to gamers. As the reviewer noted, there are currently zero games that support RTX, making the $ NVidia is asking for these cards just straight-up asinine. And don't get me wrong, I'm all for new technology, but you can't ship new technology that literally adds nothing for current games and then ask people to pay some utterly ridiculous price premium for your R&D costs when you're already rolling in money every quarter anyway. Informed customers are just not going to buy what you're selling in that case, at least not in high numbers.

I could see a $750 per base unit price tag, or ~$900-1000 if it included a pre-installed full cover waterblock with a warranty, but $1000+ for a base card that most enthusiast gamers are going to have to modify right off the bat to put a waterblock or high end air cooler on is just crazy, at least to me. I understand not everyone will agree with me, but at this price point...NVidia can keep it on the shelf. I had been looking forward to an upgrade as I do have games that are GPU limited as they sit right now at the resolution and framerates I play at, but I'll just cope for now and wait for either a massive price drop or the 3000 series.


----------



## Vya Domus (Sep 19, 2018)

FeelinFroggy said:


> Hilarious, fastest consumer GPU in the world and most of the comments here are complaints from people who were never going to buy one anyway.



You're not going to buy one if you complain about it, are you? Your point?


----------



## Frick (Sep 19, 2018)

FeelinFroggy said:


> Hilarious, fastest consumer GPU in the world and most of the comments here are complaints from people who were never going to buy one anyway.



Yep. That is literally the point of forums like this and of reviews. If only verified buyers could have opinions about anything... that would be really stupid.


----------



## R0H1T (Sep 19, 2018)

FeelinFroggy said:


> Hilarious, fastest consumer GPU in the world and most of the comments here are complaints from people who were never going to buy one anyway.


What's hilarious is that people now support egregious pricing for a "feature" that's not even "featured" in games at launch. What's even worse is that people who buy this are going to *enable Nvidia* to quite possibly "fleece" us for a long time to come, not unlike Intel until just last year. Don't wanna bring politics into this, but you know what happened over there!


----------



## alexander brett (Sep 19, 2018)

Fluffmeister said:


> Math problems mainly
> 
> These things are beasts, but silly expensive; we can thank the shit competition for that. Reading the Pascal reviews, a lot of people didn't appreciate those cards then either; now they are starting to.
> 
> No doubt cards like the GTX 1080 and GTX 1080 Ti remain great buys.


That's right, blame the lame performance and high cost on the rest of the industry instead of the company that manufactured the product. Trolling is fun.


----------



## Diverge (Sep 19, 2018)

New technology and gadgets are hobbies for a lot of people, so these are new toys for those who are willing and able to spend the money. People who can't afford them, or feel it's a waste of money, can have second-hand hand-me-downs or stay with what they currently have. In the end, no one is forcing you to buy anything new. 

That's pretty much what it all boils down to. Calling people retarded just because they have more disposable income, or want the best regardless, is pretty ignorant.


----------



## Fluffmeister (Sep 19, 2018)

FeelinFroggy said:


> Hilarious, fastest consumer GPU in the world and most of the comments here are complaints from people who were never going to buy one anyway.



Price is basically the only complaint, so that is what is clung to.



alexander brett said:


> That's right, blame the lame performance and high cost on the rest of the industry instead of the company that manufactured the product. Trolling is fun.



Thanks I will.


----------



## newtekie1 (Sep 19, 2018)

ShurikN said:


> 8% against a 1080ti FE. And 0% against a $100 cheaper Evga 1080ti



Yeah, not exactly. The EVGA 1080Ti SC2 is only about 4% faster than a 1080Ti FE, not 8%.  Oh, and the cheapest EVGA 1080Ti on Newegg right now is $710 vs. $790 for an aftermarket 2080.  So, 4% for $80; again, not unacceptable in the high-end GPU market.



alexander brett said:


> That's right, blame the lame performance and high cost on the rest of the industry instead of the company that manufactured the product. Trolling is fun.



Says the fastest consumer GPU on the market has lame performance, and then says others are trolling...


----------



## R0H1T (Sep 19, 2018)

Fluffmeister said:


> *Price is basically the only complaint, so that is what is clung to.*
> 
> 
> 
> Thanks I will.


It isn't the only complaint, but it's by far the biggest one, & rightfully so. If $*1000* is the new normal, then GPU prices across the board will explode. For someone who buys mid-range, like the vast majority of dGPU buyers, it'll basically boil down to a Ryzen^2 (GPU) from AMD or Conroe Part Deux from Intel. Can I live without a next-gen dGPU? Of course, but this affects other areas of our lives as well.

Continuing from the other thread that was locked yesterday: what people don't understand is that enabling *corporate greed* isn't a good thing ~ ever! If a non-negligible minority continues supporting practices like this, it affects all of us ~ negatively. You can put the morality spin on it, but the fact of the matter is that more power/money in the hands of "plebs" is generally a good thing. The corporate overlords aren't our friends & they'll never have our best interests at heart.


----------



## TheOne (Sep 19, 2018)

If NVIDIA uses the same pricing model for the mainstream and lower-end cards, and the pricing isn't corrected, then PC gaming could once again become the unwanted stepchild of the gaming industry - and coming right after the crypto-mining craze, that's just poor timing.

Personally, I'm hoping that the next generation, whether from NVIDIA, AMD, or Intel, will bring prices back to a reasonable level, but that is a year+ away.


----------



## Mighty-Lu-Bu (Sep 19, 2018)

Next year is going to be AMD's year- I have a feeling they are going to absolutely crush it with Zen 2 and Navi.


----------



## mcraygsx (Sep 19, 2018)

While the performance is in the same territory as the Titan Xp series, TSMC's 12nm FinFET has not helped in the power-efficiency department - I suppose because TSMC's 12nm is essentially the same as 16nm. The 2080 Ti is one hungry card.

The price itself leaves a bad taste in the mouth.


----------



## laszlo (Sep 19, 2018)

New king for sure, but even if I had the money I wouldn't spend so much on a graphics card... hope older card prices go down soon.


----------



## kings (Sep 19, 2018)

newtekie1 said:


> Says the fastest consumer GPU on the market has lame performance, and then says others are trolling...



Yeah, some people just want to complain about something...

Of course the price is insane, everyone here can agree, but lame performance? C'mon guys, be serious...

Despite the price, the 2080 Ti is a beast of a card, no doubt about that. If Nvidia sold it for $500, I guess no one would mind having such a "lame performance" card in their system.


----------



## RejZoR (Sep 19, 2018)

What I don't understand is why NVIDIA had to rush this and release an RTX product whose main and core feature is RTX when, at launch, there are exactly ZERO games for it. You're buying a €1,200 graphics card on the promise of eventually getting some of this fancy stuff. Not even Shadow of the Tomb Raider has ANY of the RTX features included. If gamers are anything like me, they'd want to play a brand new game with all the new special effects straight away, because playing it again just won't be the same. That's potential they lost because they rushed the whole thing. And that DLSS or whatever it is, same story: no actual games use it. I don't get why there was this need to rush when they could have had RTX games on launch day, DLSS games on launch day, and fixed power consumption on launch day. They aren't being forced to release by AMD, so why such a rushed, half-assed release? I really don't understand NVIDIA.

At least the performance jump is huge even at 1080p, which makes it a brilliant 144Hz gaming option for those of us who don't care about 4K but admire high and smooth framerates. Still, unless I win the lottery, I don't see myself buying one anytime soon. My GTX 1080 Ti serves me perfectly fine.


----------



## TheTechGuy1337 (Sep 19, 2018)

It is priced way too high. That's the real reason behind everyone's hate. Only people with money to blow are going to pay those kinds of premiums.


----------



## RejZoR (Sep 19, 2018)

If this was a GTX 2080 Ti (no RTX and no Tensor stuff) at the GTX 1080 Ti's original launch price, I think many would jump on it, as the gains are significant even for good old rasterization, which is used in 100% of games. Unlike RTX, which to this date is used in exactly ZERO games. Not even Shadow of the Tomb Raider has it, and it was just released...


----------



## TheOne (Sep 19, 2018)

RejZoR said:


> What I don't understand is why NVIDIA had to rush this and release a RTX product that's main and core feature is RTX and on launch, there is exactly ZERO games for it. You're buying a 1200€ graphic card on promise to eventually get some of this fancy stuff. Not even Shadow of Tomb Raider has ANY of the RTX features included. If gamers are anything like me, I'd want to play a brand new game with all the new special effects straight away. Because playing it again just won't be the same. It's potential they lost because they rushed the whole thing. And that DLSS or whatever it is, same. No actual games use it. I don't get it why there was this need to rush when they could actually have RTX games on launch day, DLSS games on launch day, fixed power consumption on launch day. They aren't being forced to release by AMD, so why have they done such a rushed half ass release? I really don't understand NVIDIA.
> 
> At least performance jump is huge even at 1080p which makes it a brilliant 144Hz gaming option for those of us who don't care about 4K, but admire high and smooth framerate. Still, unless I win a lottery, I don't see myself buying one anytime soon. GTX 1080Ti serves me perfectly fine.




Makes even less sense if you also take into account the rumors about overstock of 10-series cards; they should have waited until March. They also seem desperate to get impulse buyers. I would say they wanted to make it in time for Christmas, but the 2060 may not launch until next year, and the 2070 and higher are too expensive for most.


----------



## ppn (Sep 19, 2018)

RejZoR said:


> What I don't understand is why NVIDIA had to rush this and release a RTX product that's main and core feature is RTX and on launch, there is exactly ZERO games for it.



Which came first, the chicken or the egg?

The 20 series was designed for Samsung 10nm, to be shrunk to 8nm later. They didn't rush, they were late. This was meant to be a March 30th release.

Shrink it to ~50% of 754 mm² -> ~350 mm² and the clocks would be 20% higher. On 8nm (soon, hopefully) -> ~300 mm².


----------



## Diverge (Sep 19, 2018)

RejZoR said:


> What I don't understand is why NVIDIA had to rush this and release a RTX product that's main and core feature is RTX and on launch, there is exactly ZERO games for it. You're buying a 1200€ graphic card on promise to eventually get some of this fancy stuff. Not even Shadow of Tomb Raider has ANY of the RTX features included. If gamers are anything like me, I'd want to play a brand new game with all the new special effects straight away. Because playing it again just won't be the same. It's potential they lost because they rushed the whole thing. And that DLSS or whatever it is, same. No actual games use it. I don't get it why there was this need to rush when they could actually have RTX games on launch day, DLSS games on launch day, fixed power consumption on launch day. They aren't being forced to release by AMD, so why have they done such a rushed half ass release? I really don't understand NVIDIA.
> 
> At least performance jump is huge even at 1080p which makes it a brilliant 144Hz gaming option for those of us who don't care about 4K, but admire high and smooth framerate. Still, unless I win a lottery, I don't see myself buying one anytime soon. GTX 1080Ti serves me perfectly fine.




They didn't need to release anything, but did, because they know the audience who buys Titans will upgrade. They have zero competition, so this wasn't a need... but development of new tech needs to start somewhere, and RTX is it. Technology improvements come before implementation in games.


----------



## EatingDirt (Sep 19, 2018)

> RTX Technology not gimmicky, does bring tangible IQ improvements



This is the definition of gimmick: an ingenious or novel device, scheme, or stratagem, especially one designed to attract attention or increase appeal.

Currently RTX is a gimmick, as ray tracing is right now a novel device designed to increase appeal (especially on the 2080 (non-Ti)). It has no tangible value currently, as nothing uses it, and we can't see performance with it on/off. It may look great, but if we have to turn down settings to get acceptable framerates, will it be worth the IQ losses elsewhere?


----------



## RejZoR (Sep 19, 2018)

Rushed in the sense of not offering it at the right time. Imagine if they had released the RTX cards and on launch day said, "You can now actually play X number of games with RTX." Game studios could also sell this as a feature; being the first ray-traced game in existence is kind of a big deal. Saying "oh, we released a patch for it two months later" is pretty weak, since most people don't even care about the game anymore by then. But experiencing a brand new game with a brand new special effect, that's big. Just remember the first time we played pixel-shaded games and stared at the water for hours. That's why. They could have orchestrated a big release with game studios, from which both would benefit greatly, even if it meant delaying the launch solely for that. And even with a one-month delay they wouldn't lose anything for the holiday spending season.


----------



## kings (Sep 19, 2018)

RejZoR said:


> What I don't understand is why NVIDIA had to rush this and release a RTX product that's main and core feature is RTX and on launch, there is exactly ZERO games for it. You're buying a 1200€ graphic card on promise to eventually get some of this fancy stuff. Not even Shadow of Tomb Raider has ANY of the RTX features included. If gamers are anything like me, I'd want to play a brand new game with all the new special effects straight away. Because playing it again just won't be the same. It's potential they lost because they rushed the whole thing. And that DLSS or whatever it is, same. No actual games use it. I don't get it why there was this need to rush when they could actually have RTX games on launch day, DLSS games on launch day, fixed power consumption on launch day. They aren't being forced to release by AMD, so why have they done such a rushed half ass release? I really don't understand NVIDIA.
> 
> At least performance jump is huge even at 1080p which makes it a brilliant 144Hz gaming option for those of us who don't care about 4K, but admire high and smooth framerate. Still, unless I win a lottery, I don't see myself buying one anytime soon. GTX 1080Ti serves me perfectly fine.



It seems to me that this 20 series is Nvidia testing the waters for next year.

Launching these new features together with a brand new manufacturing process would probably have been a big risk (they've been working with 12nm for over a year, so they knew what they could count on).

And since AMD is MIA and will continue like that for some time, there was no better time for this, no pressure whatsoever.

As a 980 Ti user, I will also pass on this generation. The card still serves me well, so it should last perfectly well until 7nm arrives.


----------



## W1zzard (Sep 19, 2018)

TheOne said:


> Makes less sense if you also take into account the rumors about overstock on 10 series cards, they should have waited until March, they also seem desperate to get impulse buyers, I would say they wanted to make it in time for Christmas, but the 2060 may not launch until next year, and the 2070 and higher are to expensive for most.


I heard some thoughts on that. It seems that due to the end of the mining boom, NVIDIA is trying to bring in some more $$ for 2018, to show good financials and keep stockholders happy.


----------



## xkm1948 (Sep 19, 2018)

Mighty-Lu-Bu said:


> Next year is going to be AMD's year- I have a feeling they are going to absolutely crush it with Zen 2 and Navi.



Zen2 maybe. Navi... keep on dreaming bro. Navi will still be GCN, a hot and slow mess



EatingDirt said:


> This is the definition for gimmick: an ingenious or novel device, scheme, or stratagem, especially one designed to attract attention or increase appeal.
> 
> Currently RTX is a gimmick, as the Ray Tracing is currently a novel device designed to increase appeal(especially on the 2080(non-ti)). It has no tangible value currently, as nothing uses it, and we can't see performance with it on/off. It may look great, but if we have to turn down settings to get acceptable framerate, will it be worth the IQ losses elsewhere?




Sure it is a gimmick. Just like hardware T&L, or DirectX 9. Every new technology is a gimmick to you, I guess.


----------



## moproblems99 (Sep 19, 2018)

R0H1T said:


> The corporate overlords aren't our friends & they'll never have our best interests in their heart.



Keep in mind, the only entity with your best interests in mind is you. If you go through life expecting anything different, you are setting yourself up for failure.



xkm1948 said:


> Zen2 maybe. Navi... keep on dreaming bro. Navi will still be GCN, a hot and slow mess



Care to enlighten us on your knowledge base?

EDIT: Fixed quotes.


----------



## EatingDirt (Sep 19, 2018)

xkm1948 said:


> Sure it is a gimmick. Just like hardware T&L, or DirectX 9. Every new technology is a gimmick to you i guess.




I'm fine with new technologies, but when a statement in the pro column of the article says "RTX Technology not gimmicky, does bring tangible IQ improvements", then proof needs to be provided, and the only evidence in the review is screenshots of a Star Wars demo we've already seen. Until we know how capable these cards are at ray tracing in an actual game, it may very well be a gimmick for these cards.


----------



## CheapMeat (Sep 19, 2018)

I'm not understanding all the complaints. The ONLY truly bad thing is the price; there are no bad products, just bad pricing. If the 2080 Ti was $100, you'd all shut up (well... maybe not... eh), but that's not possible. I can never afford the newest GPUs, so I wouldn't be able to get it anyway. But nothing about it screams "terrible" like some of the posters make it seem. It still beats a 1080 Ti if you were a big spender anyway, and certainly beats anything AMD has bothered with. If you're gaming at 4K, this will absolutely be on your radar. And luckily some of the newer features aren't as proprietary as Nvidia tends to make them; there will absolutely be games coming, with much broader support than PhysX or GameWorks, etc. I think the complaints are coming from those who spend top dollar every single cycle and buy the top $60 AAA game on release day, and anything that doesn't meet those highest marginal "wants" is looked at like peasantry. Any reasonable person is more cautious with their money, regardless of where in the rankings said product is.


----------



## Captain_Tom (Sep 19, 2018)

TheOne said:


> If NVIDIA uses the same pricing model for the main and lower end cards, and if the pricing isn't corrected, then pc gaming could once again become the unwanted step child of the gaming industry, and this coming after the crypto mining craze is just poor timing.
> 
> Personally I'm hoping that the next generation, whether they're from NVIDIA, AMD or Intel, will bring the price back to a reasonable level, but that is a year+ away.



It could actually be closer than you think, but it isn't a guarantee. If AMD really wanted to, they have several options for launching a full line-up to take this on as early as Q1 2019. The question is whether they think it is worth it, when Ryzen 3 will make them way more money.


----------



## Assimilator (Sep 19, 2018)

W1zzard said:
			
		

> AMD's fastest, the Vega 64 is far behind, reaching only about half the performance of RTX 2080 Ti.



HAHAHAHA oh wow. That is just pathetic.

To top that off, we have the clueless hordes here crying that a $1,200 card (2080 Ti) that's twice as fast as a $600 card (Vega 64) is overpriced. Who wants to bet half of them will own a Turing card in 6 months' time?
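For what it's worth, the arithmetic behind that comparison is easy to check. A quick sketch using only the prices and the "twice as fast" figure quoted in this thread (not fresh benchmark data):

```python
# Back-of-the-envelope cost-per-performance check, using the $1,200 / $600
# prices and the "twice as fast" relative-performance figure quoted above.

cards = {
    "RTX 2080 Ti": {"price": 1200, "relative_perf": 2.0},  # vs. Vega 64 = 1.0
    "Vega 64":     {"price": 600,  "relative_perf": 1.0},
}

for name, c in cards.items():
    dollars_per_perf = c["price"] / c["relative_perf"]
    print(f"{name}: ${dollars_per_perf:.0f} per unit of relative performance")

# Both work out to $600 per unit of relative performance: twice the speed at
# twice the price is linear scaling, which is rare for a halo card.
```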


----------



## bug (Sep 19, 2018)

CheapMeat said:


> I'm not understanding all the complaints. The ONLY true bad thing is the price. There are no bad products, just bad pricing.  I never can afford the newest GPU's, so I wouldn't be able to get it anyway. But nothing about it screams "terrible" like some of the posters make it seem. It still beats out a 1080Ti if you were a big spender anyway and certainly beats out anything AMD bothered with. And luckily some of the newer features aren't as proprietary as Nvidia tends to make them. There will absolutely be games coming up.  I  think the complaints are coming from those who spend top dollar every single cycle and buy the top $60 AAA game on release dates and everything that doesn't meet that highest end is looked at like peasantry.  Any reasonable person is more cautious with their money regardless of where in the rankings said product is.


What's not to understand?
Nvidia introduces GameWorks, it catches flak for using proprietary technologies.
Nvidia introduces Pascal, it catches flak for introducing a mere refinement over Maxwell.
Nvidia introduces Turing, it catches flak for RTX not being in widespread use already.

Is the pattern more obvious now? 



Assimilator said:


> HAHAHAHA oh wow. That is just pathetic.
> 
> To top that off, we have the clueless hordes here crying that a $ 1,200 card (2080 Ti) that's twice as fast as a $600 card (Vega 64) is overpriced. Who wants to bet half of them will own a Turing card in 6 months' time?


Well, yes. Price never increases linearly with performance. Not at the high end, at least.


----------



## Captain_Tom (Sep 19, 2018)

CheapMeat said:


> I'm not understanding all the complaints. The ONLY true bad thing is the price. There are no bad products, just bad pricing.  If a 2080Ti was $100, you'd all shut up. That's not possible. I can never can afford the newest GPU's, so I wouldn't be able to get it anyway. But nothing about it screams "terrible" like some of the posters make it seem. It still beats out a 1080Ti if you were a big spender anyway and certainly beats out anything AMD bothered with. And luckily some of the newer features aren't as proprietary as Nvidia tends to make them. There will absolutely be games coming up.  I  think the complaints are coming from those who spend top dollar every single cycle and buy the top $60 AAA game on release dates and everything that doesn't meet that highest marginal "wants" is looked at like peasantry.  Any reasonable person is more cautious with their money regardless of where in the rankings said product is.



I completely agree: the time to complain about Nvidia was when they launched a $1000 Titan in 2013, a $700+ GTX 1080 in 2016, or lied about how much VRAM their cards had (multiple times, actually). What's happening right now is no different than GK110 for $1000, or GP104 for $700.

In fact, at least Nvidia has a massive performance advantage this time around, and 4K gaming is FINALLY conquered. Turing actually impresses me; it outperformed estimates. Anyone expecting the price tag to be "justified" was delusional, because Nvidia has never justified its price tags before. But at least now they can claim it's because they have no competition...


----------



## Mighty-Lu-Bu (Sep 19, 2018)

xkm1948 said:


> Zen2 maybe. Navi... keep on dreaming bro. Navi will still be GCN, a hot and slow mess
> 
> 
> 
> ...



I mean, Vega was essentially AMD's first time in the high-end GPU market, because previously they just heavily saturated the mid-tier market. But how did AMD do? Well, Sapphire's Nitro+ Vega 64 consistently beats the GTX 1080 FE, so I think AMD did pretty well for their first time in the high-end market.

Navi is supposed to be even better than Vega, with next-gen memory, and I guarantee you that Navi will be cheaper than the new Nvidia cards. Unless you are solely gaming at 4K, the GTX 1080 Ti and the RTX 2080 Ti are complete overkill. I personally think 1440p is the ideal resolution for gaming, and both the Vega 56 and 64 handle 1440p like champs.


----------



## efikkan (Sep 19, 2018)

The only thing to complain about is the price, and it's fine to let Nvidia know it's too high.
But the price doesn't change the fact that Turing is a great achievement in performance gains, efficiency, etc. A 39% gain over the previous generation is a substantial step forward, and more than most generational changes in the past. If AMD had achieved this, no one would complain at all.



bug said:


> What's not to understand?
> Nvidia introduces GameWorks, it catches flak for using proprietary technologies.
> Nvidia introduces Pascal, it catches flak for introducing a mere refinement over Maxwell.
> Nvidia introduces Turing, it catches flak for RTX not being in widespread use already.
> Is the pattern more obvious now?


Nvidia introduces CUDA, it catches flak for using proprietary technologies.
Nvidia introduces G-Sync, it catches flak for using proprietary technologies.

AMD introduces Mantle, a proprietary AMD-only alternative to Direct3D and OpenGL with "support" in a handful of games. Even after it turned out that Mantle only had marginal gains for low-end APUs and would never catch on among developers, AMD is still praised.

I wonder if there is a double standard?


----------



## yeeeeman (Sep 19, 2018)

Only DLSS can save Nvidia. They bet on a large die with this generation, and they cannot sell it for peanuts. They simply cannot afford to sell a 2080 Ti at 1080 Ti prices, because they'd make almost no profit: the 2080 Ti is on a newer node and its die is much bigger than the 1080 Ti's (because of the RT and Tensor cores). If DLSS can add at least 25% more performance, these cards might sell pretty well.


----------



## bug (Sep 19, 2018)

@efikkan Yeah, "price is too high" is an understatement.
But even that is only in part because of greed and/or lack of competition. The price is high because chips this large are _that_ expensive.
Like you said, it's OK to let Nvidia know they went too far, because at the end of the day, justified or not, these prices mean extremely few gamers will be able to afford these cards. Even then, low sales will convey the message better than any forum whining.


----------



## kings (Sep 19, 2018)

Mighty-Lu-Bu said:


> I mean Vega was essentially AMD's first time in the high end GPU market because previously they just heavily saturated the mid-tier market, but how did AMD do? Well Sapphire's Nitro+ Vega 64 consistently beats the GTX 1080 FE so I think AMD did pretty good for their first time in the high end market.



Are you serious? Vega was the first high-end card from AMD?

What about the Fury X? What about the 290X? What about the HD 7970/GHz Edition? What about the multiple dual-GPU cards they released?

AMD has been competing in the high-end market for a long time.


----------



## EatingDirt (Sep 19, 2018)

efikkan said:


> The only thing to complain about is the price, and it's fine letting Nvidia know it's too high.
> But the price doesn't change the fact that Turing is a great achievement in performance gains, efficiency, etc. 39% gains over the previous generation is a substantial step forward, and is still more than most generational changes in the past. If AMD achieved this, no one would complain at all.
> 
> 
> ...




So, CUDA I don't know enough about, so I won't comment on it. However, FreeSync is essentially just AMD's name for Adaptive-Sync, a VESA standard. It's something Nvidia could support, but instead they chose to continue with G-Sync, which is basically identical except for the added price premium.

Your statement about Mantle is entirely false, though. Mantle was not designed to be a proprietary AMD-only API. Vulkan is what we got from AMD's API development: an open-standard API that every single gamer should support (unless they work for Microsoft).


----------



## TheinsanegamerN (Sep 19, 2018)

Mighty-Lu-Bu said:


> *I mean Vega was essentially AMD's first time in the high end GPU market *because previously they just heavily saturated the mid-tier market, but how did AMD do? Well Sapphire's Nitro+ Vega 64 consistently beats the GTX 1080 FE so I think AMD did pretty good for their first time in the high end market.
> 
> Navi is suppose to be even better than Vega with next gen memory and I guarantee you that Navi will be cheaper than the new Nvidia cards. Unless you are solely gaming at 4k, the GTX 1080ti and the GTX 2080ti are completely overkill. I personally think that 1440p is the ideal resolution for gaming and both the Vega 56 and 64 handle 1440p like a champ.


So the 9800 Pro, 2900 XT, 4870, 4870 X2, 5870, 5970, 6970, 6990, 7970, 290X, 295X2, Fury X, etc., do NONE of those exist anymore?


----------



## efikkan (Sep 19, 2018)

bug said:


> @efikkan Yeah, "price is too high" is an understatement.
> 
> But even that is only in part because of greed and/or lack of competition. The price is high because chips this large are that expensive.
> 
> Like you said, it's ok to let Nvidia know they went too far, because at the end of the day, justified or not, these prices mean extremely few gamers will afford to buy these cards. Even then, low sales will convey the message better than any forums whining.


We still have to remember that Nvidia is trying to dump the remaining stock of Pascal cards as well. It remains to be seen whether the price stays unchanged in a few months, once the Pascal cards run out.

Nvidia should be cautious, though; driving up the cost of desktop gaming might hurt the long-term gains in user base we've seen versus consoles.

Still, we have to remember that AMD has done similar things that have long been forgotten. When Vega launched, it was priced $100 over "MSRP" with "$100 worth of games included", but they quickly changed this after massive criticism.



EatingDirt said:


> So, Cuda I don't know enough about so I won't comment. However Freesync is essentially just AMD's name for adaptive sync, a VESA standard. It's something Nvidia could support, but instead choose to continue on with Gsync which is basically identical except for the added price premium.
> 
> Your statement about Mantle is entirely false though. Mantle was not designed to be a proprietary AMD-only API. Vulcan is what we got from AMD's API development, an open standard API that every single gamer should support(unless they work for Microsoft).


G-Sync still relies on adaptive sync, with advanced processing on the monitor side, which FreeSync doesn't have. Whether it's worth it or not is up to you, but they are not 100% comparable.

You're wrong about Mantle. It was a proprietary API designed around GCN. Direct3D and Vulkan inherited some of its function names, but not the underlying structure; they are only similar on the surface. Vulkan is built on SPIR-V, which is derived from OpenCL. The next version of Direct3D is planning to copy this compiler structure from Vulkan, something Mantle lacked completely.


----------



## Mighty-Lu-Bu (Sep 19, 2018)

kings said:


> Are you serious? Vega was the first hig-end card from AMD?
> 
> What about Fury X? What about 290X? What about HD7970/Ghz Edition? What about the multiple dual-GPU cards they released?
> 
> AMD has been competing in the high-end market for a long time.



So neither the Fury X nor the 290X (or its multiple revisions) is considered high-end, though they may be in the upper echelon of mid-tier cards. By today's standards, you don't start hitting the high-end GPU market until you get to GTX 1070 levels. The Fury X's performance is slightly above a GTX 1060 6GB, but it is nowhere near a GTX 1070. The 290X and its family can't even compete with the GTX 1060 6GB, which is a mid-tier card. If these cards had been released 4-6 years earlier, that would be a different story, but when they were released they weren't really high-end.

The HD 7970 is a different story, because for its time it was definitely high-end, but that was also 7 years ago. AMD hasn't made high-end GPUs for a very long time, so I guess it was a poor choice of wording on my end. What I should have said was that Vega was essentially AMD's return to the high-end GPU market. Would that be more acceptable?


----------



## Vya Domus (Sep 19, 2018)

CUDA means absolutely nothing to the average consumer.



Mighty-Lu-Bu said:


> So neither the Fury X or the 290X (or its multiple revisions) are considered high end,



Absolutely not true. The 290X was faster than anything else at its time of release; you cannot get more high-end than that.

I think you are confusing it with something else.


----------



## TheinsanegamerN (Sep 19, 2018)

Mighty-Lu-Bu said:


> So neither the Fury X or the 290X (or its multiple revisions) are considered high end, though they may be in the upper echelon of mid-tier cards. *With today's standards, you don't start hitting the high-end GPU market until you get to GTX 1070 levels.* The Fury X's performance is slightly above a GTX 1060 6GB, but it is no where near a GTX 1070. The 290X and its family can't even compete with the GTX 1060 6GB which is a mid tier card. If these cards were released 4-6 years ago that would be a different story, but when they were released they weren't really high end.
> 
> The HD7970 is a different story because for it's time it was definitely high end, but that was also 7 years ago. AMD hasn't made high end GPUs for a very long time so I guess it was a poor choice of wording on my end- what I should have said was Vega was essentially AMD's return to the high end GPU market. Would that be more acceptable?


So Nvidia never competed in the high end either, because the 780 Ti wasn't anywhere near the 1070, right?

I think you just don't understand what the term "high-end GPU" means.

The 290X was faster than the 780 and traded blows with the 780 Ti. In case you don't remember, the 290X made Nvidia panic: the 780 Ti was not slated for a consumer release and was rushed out after the 290X turned out to be way better than the 780. The Fury X was trading blows with the 980 Ti after a few months of driver updates. Ergo, those cards are "high-end". The 9800 Pro was king of the hill for over a year after its launch and absolutely embarrassed Nvidia. The 5870 at launch was the fastest GPU available.

Also, it's interesting that you have gone from "AMD never competed in the high end" to "well, AMD DID compete in the high end, but that was like 7 years ago". Which is it?


----------



## kings (Sep 19, 2018)

Mighty-Lu-Bu said:


> So neither the Fury X or the 290X (or its multiple revisions) are considered high end, though they may be in the upper echelon of mid-tier cards. With today's standards, you don't start hitting the high-end GPU market until you get to GTX 1070 levels. The Fury X's performance is slightly above a GTX 1060 6GB, but it is no where near a GTX 1070. The 290X and its family can't even compete with the GTX 1060 6GB which is a mid tier card. If these cards were released 4-6 years ago that would be a different story, but when they were released they weren't really high end.
> 
> The HD7970 is a different story because for it's time it was definitely high end, but that was also 7 years ago. AMD hasn't made high end GPUs for a very long time so I guess it was a poor choice of wording on my end- what I should have said was Vega was essentially AMD's return to the high end GPU market. Would that be more acceptable?



I ask again, are you serious or trolling at this point?


----------



## Mighty-Lu-Bu (Sep 19, 2018)

Vya Domus said:


> CUDA means absolutely nothing to the average consumer.
> 
> 
> 
> ...



I mean, the 290X was not faster than the GTX 780 Ti, and they were released within a month of each other. After the GTX 780 Ti, Nvidia went with the 900 series, and how did AMD respond? By rehashing the 290X as the 390X.


----------



## moproblems99 (Sep 19, 2018)

Mighty-Lu-Bu said:


> If these cards were released 4-6 years ago that would be a different story, but when they were released they weren't really high end.



Did I miss the Fury or 290X release the other day? I could have sworn I read something about them many, many moons ago.


----------



## Mighty-Lu-Bu (Sep 19, 2018)

TheinsanegamerN said:


> So nvidia never competed in the high end either, because the 780ti wasnt anywhere near the 1070 right?
> 
> I think you just dont understand what the term high end GPU means.
> 
> ...



The Fury X got absolutely stomped on by the GTX 980 Ti... it wasn't really trading blows.

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-AMD-R9-Fury-X/3439vs3498

The last AMD card that I had was the R9 390X, which was released around the same time as the GTX 980 Ti, but Nvidia's card was a lot better in terms of performance. The Fury X got smoked, and the 290X/390X got smoked. I sold my R9 390X and got a GTX 1070, yet I am about to sell my GTX 1070 because I just got a Vega 64. Either way you spin it, it boils down to this: AMD hasn't been competitive in the high-end GPU market for 3+ years or so.


----------



## Vya Domus (Sep 19, 2018)

Mighty-Lu-Bu said:


> I mean the 290x was not faster than the GTX 780ti and they were released within a month of each other. From the GTX 780ti, Nvidia went with the 900 series and how did AMD respond? By rehashing the 290x with the 390x.



That has nothing to do with your claim that the 290X wasn't high end, which is undoubtedly not true.



Mighty-Lu-Bu said:


> The Fury X got absolutely stomped on by the GTX 980ti... wasn't really trading blows.
> 
> http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-AMD-R9-Fury-X/3439vs3498



Userbenchmark, ehm... OK. Though have a look at TPU's own rankings: the gap is 7%. We certainly have different definitions of "stomped"; still a high-end card nonetheless.


----------



## Mighty-Lu-Bu (Sep 19, 2018)

Vya Domus said:


> That has nothing to do with your claim that the 290X wasn't high end, which is undoubtedly not true.
> 
> 
> 
> ...



Instead of showing me some useless benchmark, why don't you look at this and compare and contrast.

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-AMD-R9-Fury-X/3439vs3498

The funny thing is that I am an AMD fan, but until Vega, they haven't really been relevant in the high end GPU market. Why buy a Fury X when I can buy a GTX 980ti that is not only cheaper, but completely outperforms it?


----------



## EatingDirt (Sep 19, 2018)

efikkan said:


> G-Sync still relies on adaptive sync, with advanced processing on the monitor side, which FreeSync doesn't have. If it's worth it or not is up to you, but they are not 100% comparable.
> 
> You're wrong about Mantle. It was a proprietary API designed around GCN. Direct3D and Vulkan inherited some of the function names from it, but not the underlying structure, they are only similar on the surface. Vulkan is built on a SPIR-V, which is derived from OpenCL. The next version of Direct3D is planning to copy this compiler structure from Vulkan, which is something Mantle lacks completely.



G-Sync & FreeSync are absolutely directly comparable. They do the same basic thing: synchronizing a monitor's refresh rate with the frames the GPU is producing, that's it. There are multiple videos, both objective and subjective, that show the hardware Nvidia adds simply isn't needed. Simply put, Nvidia doesn't need to use G-Sync modules.

As for Mantle, we know AMD gave their Mantle code to Khronos. With where AMD was in the market in 2014, it's unlikely they thought they could get away with Mantle being a proprietary AMD-only API. Mantle was essentially the awakening of low-level APIs, and it would be more sane to praise AMD for developing Mantle than to criticize them for it.

AMD simply doesn't get a lot of flak because they support many open standards (Adaptive Sync, OpenCL, etc.), and maybe because of where they are in the market they need to; however, the same cannot be said about Nvidia (G-Sync, CUDA, etc.).


----------



## Vya Domus (Sep 19, 2018)

Mighty-Lu-Bu said:


> Instead of showing me some useless benchmark, why don't you look at this and compare and contrast.
> 
> http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-AMD-R9-Fury-X/3439vs3498



Thanks for spamming the same link, didn't notice it the first time. Still quite irrelevant, but have it your way: 980ti stomped Fury X. TPU doesn't know shit about benchmarking.


----------



## Naito (Sep 19, 2018)

Performs well, but Nvidia went overboard with the prices. Also, can't help but wonder how much better they would have been on 10nm? Lower power usage and higher clocks? Ten to twenty percent more performance?


----------



## terroralpha (Sep 19, 2018)

medi01 said:


> So, AdoredTV was right, or was he not?





zelnep said:


> adored was wrong - this is WORSE, he predicted a +40% increase over the gtx 1080 ti - but obviously that was leaked from nvidia's "testing guidelines".



adored was not the only one who predicted this, everyone did. everyone was saying the 2080 Ti would be 35%-40% faster than the 1080 Ti. this review showed it was 38% faster, other reviewers got 35% or so. seems spot on to me. the 1080p and 1440p benches don't count as many of them were bottlenecked by the CPU, and in the case of Doom a lot of the cards were hitting the 200 FPS hard limit. if you were picking either of these up for 1080p, you need to get your head examined. the 2080 Ti still makes sense for ultrawide monitors (performance wise) if you absolutely must get 100+ FPS in every game with everything maxed out.



ShurikN said:


> Only the 2080Ti is amazing, the regular 2080 has no place in the market at this moment. It trades blows with 1080Ti, which you can get for far less money.



are we comparing prices of used 1080 Tis? in the US, new 1080 Tis cost about $700 while 2080 can be picked up for as little as $750 and custom models with beefy coolers start at $800 (links in prices). so for $50 - $100 extra you get a card that's a little faster *BUT most importantly it does not choke in properly optimized DX12 titles and async*, which has been a problem for nvidia for a while now. in productivity benchmarks, both Turing cards destroy everything. if i was getting into high end GPUs now, i would buy a 2080 over 1080 Ti. if you're already spending $700+, what's another $50 for a card that won't get owned by $500 AMD cards anytime async is involved?


and you can't blame nvidia for these prices either. we, the consumers, are not obligated to buy anything. however, nvidia's board partners still have millions of pascal cards they have to move. these high RTX prices will allow them to do so. if the 2080 Ti was the same price as the 1080 Ti, a lot of the partners would take massive hits to their profits.


----------



## Mighty-Lu-Bu (Sep 19, 2018)

Vya Domus said:


> Thanks for spamming the same link, didn't notice it the first time. Still quite irrelevant but have it your way , 980ti stomped Fury X.



It's completely relevant because it shows that the 980ti beats the Fury X (sometimes significantly) in almost every performance category. The Fury X can't even run Witcher 3 on max settings at 1080p at a stable 60fps, but the 980ti can.


----------



## xkm1948 (Sep 19, 2018)

Assimilator said:


> HAHAHAHA oh wow. That is just pathetic.
> 
> To top that off, we have the clueless hordes here crying that a $ 1,200 card (2080 Ti) that's twice as fast as a $600 card (Vega 64) is overpriced. Who wants to bet half of them will own a Turing card in 6 months' time?




Man the Red Guards are out in droves this time.

When RTG GPU delivers: OMFG world peace happily ever after

When RTG GPU sucks: They have smaller budget, underdog, FineWine(tm) etc etc

When NV GPU delivers: We demand 1000fps at 8k for $200, while still complaining about “big bad green”

When NV GPU sucks:  Hell yeah death the devil green!

Meanwhile in reality when you check their system profiles, most of them rock a Nvidia “evil” GPU

All you haters should go buy a Vega64 to support RTG today.


----------



## tfdsaf (Sep 19, 2018)

Seems like your numbers are quite a ways off; for example, Hardware Unboxed, on the same games and resolutions, found the RTX 2080 much closer to the GTX 1080 Ti, and even losing in two games.

Again, maybe you should recheck your results as they seem off compared to other more reputable reviewers. 

The cards themselves are impressive, no doubt about it, pure performance-wise. Value-wise they are really bad actually; for the price increase they actually offer negative value. Both the RTX 2080 and RTX 2080 Ti offer negative value compared to the GTX 1080 and GTX 1080 Ti, respectively.


----------



## terroralpha (Sep 19, 2018)

xkm1948 said:


> Man the Red Guards are out in droves this time.
> 
> When RTG GPU delivers: OMFG world peace happily ever after
> 
> ...



i wish i could like this 1000 times. i've owned more AMD hardware than any of the morons calling me an nvidia shill. i put a vega 64 into my VR rig over a 1080 not because it's faster (it is slower) but because i wanted to support the little guy. meanwhile they're all salivating over used 1080 ti mining cards on ebay.


----------



## xkm1948 (Sep 19, 2018)

tfdsaf said:


> Seems like your numbers are quite ways off, for example hardware unboxed on the same games and resolutions found the RTX 2080 much closer to the GTX 1080ti and even losing in two games.
> 
> Again, maybe you should recheck your results as they seem off compared to other more reputable reviewers.
> 
> The cards themselves are impressive, no doubt about it, pure performance-wise. Value-wise they are really bad actually; for the price increase they actually offer negative value. Both the RTX 2080 and RTX 2080 Ti offer negative value compared to the GTX 1080 and GTX 1080 Ti, respectively.




You call a well-established 10+ year reviewer and review site less reputable than some self-proclaimed "youtuber"? WTF are you smoking? Seriously?


----------



## Totally (Sep 19, 2018)

Fluffmeister said:


> 100/72  = 1.38. It's 38% faster at 4K.




If it's already normalized you add/subtract not multiply, since it is already in terms of "other card performance divided by 2080 ti performance" which is .72 or 72%


----------



## terroralpha (Sep 19, 2018)

tfdsaf said:


> Seems like your numbers are quite ways off, for example hardware unboxed on the same games and resolutions found the RTX 2080 much closer to the GTX 1080ti and even losing in two games.
> 
> Again, maybe you should recheck your results as they seem off compared to other more reputable reviewers.
> 
> The cards themselves are impressive, no doubt about it, pure performance-wise. Value-wise they are really bad actually; for the price increase they actually offer negative value. Both the RTX 2080 and RTX 2080 Ti offer negative value compared to the GTX 1080 and GTX 1080 Ti, respectively.



i haven't seen adored's crap yet, but TPU seems in line with everything i've seen so far. here is bitwit's relative performance: https://www.techpowerup.com/forums/attachments/bitwit-jpg.107144/
note the difference in overall 4k performance: 8.7%, which is almost exactly what TPU got.

tech jesus got pretty much the same thing. maybe it's adored who is wrong?



Totally said:


> If it's already normalized you add/subtract not multiply, since it is already in terms of "other card performance divided by 2080 ti performance" which is .72 or 72%



please tell me this is a joke or a troll. i know that people in the states are generally awful with math but this is getting scary.
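for anyone who wants to check the chart math themselves, here's a quick sketch (plain python; the 100 and 72 are TPU's normalized 4K numbers quoted above):

```python
# TPU's relative-performance chart normalizes the 2080 Ti to 100%
# and shows the 1080 Ti at 72%. A speedup is a ratio of the two
# throughputs, not a difference of percentage points.
ti_2080 = 100.0
ti_1080 = 72.0

speedup = ti_2080 / ti_1080              # ≈ 1.389
print(f"{speedup - 1:.1%}")              # 38.9% faster, 1080 Ti as baseline

# Subtracting instead measures the gap from the *other* baseline:
print(f"{1 - ti_1080 / ti_2080:.1%}")    # 28.0% slower, 2080 Ti as baseline
```

same gap, two different baselines: "the 2080 Ti is 38.9% faster" and "the 1080 Ti is 28% slower" are both correct readings of the same chart.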


----------



## Mighty-Lu-Bu (Sep 19, 2018)

xkm1948 said:


> Man the Red Guards are out in droves this time.
> 
> When RTG GPU delivers: OMFG world peace happily ever after
> 
> ...



In my defense, the only reason I have a GTX 1070 is because at the time, one of the best AMD cards was the R9 390X (which I had), but I was having serious slowdown in a heavily modded Skyrim; my GTX 1070 solved that. Now I am going from a GTX 1070 to a Vega 64.


----------



## W1zzard (Sep 19, 2018)

Vya Domus said:


> Userbenchmark, ehm ... OK. Though have a look at TPUs own rankings:


Protip: look at the data in this review. I've included the Fury X and 980 Ti in last week's rebench.


----------



## Vya Domus (Sep 19, 2018)

W1zzard said:


> Protip: look at the data in this review. I've included Fury X and 980 Ti in last week's rebench



So turns out Fury X is slightly ahead actually.


----------



## springs113 (Sep 19, 2018)

Fluffmeister said:


> Math problems mainly
> 
> This things are beasts, but silly expensive, we can thank the shit competition for that. Reading the Pascal reviews a lot of them didn't appreciate those cards then either, now they are starting to.
> 
> No doubt cards like the GTX 1080 and GTX 1080 Ti remain great buys.


No, don't blame the competition for shit prices... blame the consumers that know the prices are shit and still buy anyway. If company A releases a product marginally better than company B's but charges 2x the price, how can we as consumers purchase company A's product and then complain? Don't buy and watch the prices fall... unfortunately, something so simple is oh so complicated to do.


----------



## xkm1948 (Sep 19, 2018)

Vya Domus said:


> So turns out Fury X is slightly ahead actually.



While being an "overclocker's dream."

Any 980Ti with mild OC will blow FuryX away in all resolutions. FuryX can’t OC.


----------



## Fluffmeister (Sep 19, 2018)

Competition forces prices to go down, as it stands there is nothing marginally worse.

Reading this and other threads suggests the mighty GTX 1080 Ti is about to get more love... Nvidia don't mind.


----------



## terroralpha (Sep 19, 2018)

Vya Domus said:


> So turns out Fury X is slightly ahead actually.



that would make sense because the reference 980 Ti has a garbage cooler, which is what TPU and others use to bench. but the fury X gets absolutely pummeled by custom 980 Tis, like the strix. the most glaring case was the MSI 980 Ti lightning, which was on average 25% faster than the fury x at every resolution. i personally had a 980 Ti kingpin which at the time of my purchase was the same exact price as a reference fury X, and it was even faster than the lightning card TPU tested. mine boosted to 1540 MHz on the GPU and 2100 MHz on the memory. TPU's lightning went to 1518 and 2090 MHz respectively.


----------



## Mighty-Lu-Bu (Sep 19, 2018)

Fluffmeister said:


> Competition forces prices to go down, as it stands there is nothing marginally worse.
> 
> Reading this and other threads suggest the mighty GTX 1080 Ti is about to get more love.... Nvidia don't mind.



While this is true, AMD won't have anything that competes with the new Nvidia cards until mid to late 2019 at the earliest which is a big bummer.



terroralpha said:


> that would make sense because the reference 980 Ti has a garbage cooler, which is what TPU and others use to bench. but the fury X gets absolutely pummeled by custom 980 Tis, like the strix. the most glaring case was the MSI 980 Ti lightning, which was on average 25% faster than the fury x at every resolution. i personally had a 980 Ti kingpin which at the time of my purchase was the same exact price as a reference fury X and it was even faster than the lightning card TPU tested.



That's why I have been saying that the GTX 980ti is superior to the Fury X. It also overclocks significantly better.


----------



## Vya Domus (Sep 19, 2018)

xkm1948 said:


> While being an overclockers dream.
> 
> Any 980Ti with mild OC will blow FuryX away in all resolutions. FuryX can’t OC.



Ah of course, I know the Fury X is a touchy subject for you. Go on, tell us once again how horrible it was owning one.

Can we stop arguing about cards released many years ago that are quite irrelevant? Fury X was trash, as is RTG in general, got it. Can we move on now?


----------



## moproblems99 (Sep 19, 2018)

xkm1948 said:


> Man the Red Guards are out in droves this time.
> 
> When RTG GPU delivers: OMFG world peace happily ever after
> 
> ...



Thanks, I did!  Although, in honesty, the only thing I hate is stupidity.

Also, in case you haven't noticed: most of the people that think Vega is slow, power hungry, and hot don't actually have one. The only thing to complain about with Turing is the fact that the prices are too damn high and RTX is useless* for gaming at the moment. Even when the games come out over the next few months, 1080p@60Hz is not what most people are going to be thrilled with running. Already people complain about consoles targeting 1080@60.

*Assuming RTX demos are final performance numbers.


----------



## B-Real (Sep 19, 2018)

swirl09 said:


> Respect to that FE card, its actually solid.
> 
> Numbers are fairly close to what i was expecting. Not thrilled with the clocks and temps of the MSI trio, which is the one I have ordered ^_^’ hopefully my one fares a little better.
> 
> Really excited to see DLSS in action, I think its going to push this purchase from feeling the sting for 1/3 more perf, to actually making it worthwhile...


This is a required justification for spending your $1100-1200 on a single GPU...



kings said:


> Yeah, some people just want to complain about something...
> 
> Of course price is insane, everyone here could agree, but lame performance? C´mom guys, be serious...
> 
> Despite the price, the 2080Ti is a beast of a card, no doubt about that. If Nvidia sold it for $500, I guess no one would mind to have such a lame performance card in their system.



No, people weren't expecting a $500 2080 Ti. I think even an $800 2080 Ti would have been OK. But a $1000 MSRP, which was $1100 in reality, is nonsense.


----------



## moproblems99 (Sep 19, 2018)

Fluffmeister said:


> Competition forces prices to go down, as it stands there is nothing marginally worse.



Wrong.  Consumers not purchasing drives prices down.  You don't need competition to have consumers not buy something.  Prices will fall until consumers buy.


----------



## Vya Domus (Sep 19, 2018)

moproblems99 said:


> Wrong.  Consumers not purchasing drives prices down.  You don't need competition to have consumers not buy something.  Prices will fall until consumers buy.



Yep, competition drives technological advancement not prices.


----------



## terroralpha (Sep 19, 2018)

im seriously considering a Bykski block for my 2080 Ti. anyone used these before? i'm seeing good reviews on amazon and other places for their previous models.

https://www.ebay.com/itm/113249948373
i have a targeted ebay coupon that will make the price $100.


----------



## Fluffmeister (Sep 19, 2018)

moproblems99 said:


> Wrong.  Consumers not purchasing drives prices down.  You don't need competition to have consumers not buy something.  Prices will fall until consumers buy.



Consumers will buy, so ultimately the prices are fine. If it doesn't fit your budget, other cards are available.

Shocking.


----------



## bug (Sep 19, 2018)

Mighty-Lu-Bu said:


> While this is true, AMD won't have anything that competes with the new Nvidia cards until mid to late 2019 at the earliest which is a big bummer


AMD was already behind because of lack of TBR (supposedly in the Vega silicon, though apparently never actually enabled). Now, even if you disregard RTX, they also miss the hardware for advanced AA and mesh shading. Hoping they'll be able to close that gap within the year is a bit optimistic. I wish they would, but experience has taught me that whenever AMD isn't leaking upcoming products, upcoming products disappoint.


----------



## xkm1948 (Sep 19, 2018)

terroralpha said:


> im seriously considering a Bykski block for my 2080 Ti. anyone use these before? i'm seeing good reviews on amazon and other places from their previous models.
> 
> https://www.ebay.com/itm/113249948373
> i have a targeted ebay coupon that will make the price $100.




I am thinking of going EK MLC for my 2080 Ti. A custom loop is still out of my comfort zone.


----------



## moproblems99 (Sep 19, 2018)

Fluffmeister said:


> Consumers will buy, so the prices are fine ultimately.  If it doesn't fit your budget other cards are available.



I don't disagree with you that people will buy.  As to your other comment, I can afford anything I like thank you.  Luckily, my needs are met so I don't need to worry about this generation.  Had my needs not been met, it would be time to consider how much I value PC gaming versus something more....productive and useful.


----------



## Fluffmeister (Sep 19, 2018)

moproblems99 said:


> I don't disagree with you that people will buy.  As to your other comment, I can afford anything I like thank you.  Luckily, my needs are met so I don't need to worry about this generation.  Had my needs not been met, it would be time to consider how much I value PC gaming versus something more....productive and useful.



And that is absolutely fine, you have a Vega 56 and no doubt waited long enough for that.

Others always want more... as it stands the RTX 2080 Ti is 85% faster than a Vega 64 at 4k.


----------



## moproblems99 (Sep 20, 2018)

Fluffmeister said:


> Others always want more... as it stands the RTX 2080 Ti is 85% faster than a Vega 64 at 4k.



So, real question, anyone can answer: next year, when the 3080 Ti (or whatever its name may be) is released, is $1500 USD going to be an acceptable price because it is 38% faster than the 2080 Ti?


----------



## Fluffmeister (Sep 20, 2018)

moproblems99 said:


> So, real question, anyone can answer: next year, when the 3080 Ti (or whatever its name may be) is released, is $1500 USD going to be an acceptable price because it is 38% faster than the 2080 Ti?



Well no, because apparently, despite the popular belief here, if the "competition" undercuts you with what amounts to the same product, you can't sell at silly prices.

Besides, the RTX 2080 Ti will be more affordable by then anyway.


----------



## terroralpha (Sep 20, 2018)

moproblems99 said:


> So, real question, anyone can answer: next year, when the 3080 Ti (or whatever its name may be) is released, is $1500 USD going to be an acceptable price because it is 38% faster than the 2080 Ti?



ok, i'll bite. if the fastest AMD card on the market at the time (navi 64?) is only half as fast but costs $750, then yea, i can see why someone would spend that much on it. if i buy a 4k 120Hz or 144Hz monitor, i might get it. it's unlikely as i prefer ultrawide screens, but i'm not ruling it out.

the prices are dictated by the market, at least in the US. product and service costs are set by what people are willing to pay. here is an example: i run an IT consulting company in NYC. my clients include law and accounting firms and ad agencies. i bill $250/hour. there are other IT companies that bill half as much or less. so why do my clients not dump me and go elsewhere? because those clowns do not do as good of a job as i do. why should i lower my prices because some idiot elsewhere in the city, who can't do what I can, charges a lower price?

same principle applies to everything. i have been using iphones since they first became available on verizon wireless, but i will not buy the new iphone because i don't think they are worth it. in fact, i'm probably going to switch back to android. and i'm not the only one who thinks so. their stock lost 5% in the last week due to lackluster pre-orders. however, i do think the 2080 Ti is worth it, so i will keep my pre-order. i was lucky enough to get a gigabyte "A"-binned 2080 Ti for $1030 after tax on amazon during the nvidia keynote, before the prices all went off the rails. but if i had to pay $1200 i would still do it.


----------



## Blueberries (Sep 20, 2018)

Everyone's hung up on the price and glossing over a MASSIVE 40% 4K performance improvement?!

18% Performance/Watt improvement!
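Put those two figures together and they also pin down the power story; a back-of-the-envelope sketch using only the numbers above (treated as exact, which they aren't quite):

```python
# If performance is up ~40% and performance-per-watt is up ~18%,
# the implied increase in power draw is just the ratio of the two.
perf_gain = 1.40          # 4K performance vs GTX 1080 Ti
efficiency_gain = 1.18    # performance per watt vs GTX 1080 Ti

power_ratio = perf_gain / efficiency_gain
print(f"{power_ratio - 1:.0%}")  # about 19% more power for 40% more performance
```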


----------



## swirl09 (Sep 20, 2018)

moproblems99 said:


> So, real question, anyone can answer: next year, when the 3080 Ti (or whatever its name may be) is released, is $1500 USD going to be an acceptable price because it is 38% faster than the 2080 Ti?


I'll answer your real (/hypothetical) question. If I'm being honest, I would say I'd have to see how I feel at the time, but I believe I would be leaning towards a no.

The only reason I purchased a 2080 Ti is because I remember thinking the wait between the Titan and the 1080 Ti was painfully slow, and I thought to myself I would rather have spent the money on the Titan than wait again. I wasn't expecting to be buying any card right now; I thought Turing would launch with the 2070/2080, with the 2080 only just edging the 1080 Ti, and my hope was that a non-silly-$3k Titan would show up soon after and I would buy it.

But Nvidia seems to be pushing up the Titan brand, and to my luck (I use "luck" loosely here) the 2080 Ti is announced right out of the gate at the old Titan price point.

It is pushing things to their tolerance point, but I am sure that's no mistake on Nvidia's part. It may be more than I'd have liked to spend, but I appreciate that it is a physically huge chunk of silicon and that R&D cost is a thing (whether we like it or not).

For what it's worth (IMO!), I don't see the prices going up further, but I do think a grand for a flagship will be the new norm.


----------



## kings (Sep 20, 2018)

The sad truth is, if Apple can sell an iPhone for $1200+ in a market where there is a lot of competition... Nvidia will have no problem selling a $1200 graphics card that is almost twice as fast as the best AMD card.


----------



## Metroid (Sep 20, 2018)

I was not expecting anything major here, and it wasn't in the end. Coming from 16nm to 12nm, that is 4nm of difference, or 25%; 25% is what I expected and it was delivered. The problem is the price, and that is what Nvidia got wrong. Nvidia should have set the MSRP at a maximum of $499 for the RTX 2080 Ti, $399 for the RTX 2080 and $299 for the RTX 2070, and since AMD's 7nm is coming, this might become one of the worst releases to date concerning price.


----------



## Frick (Sep 20, 2018)

CheapMeat said:


> I'm not understanding all the complaints. The ONLY true bad thing is the price. There are no bad products, just bad pricing.  If a 2080Ti was $100, you'd all shut up (well...maybe not...eh). That's not possible. I can never afford the newest GPUs, so I wouldn't be able to get it anyway. But nothing about it screams "terrible" like some of the posters make it seem. It still beats out a 1080Ti if you were a big spender anyway and certainly beats out anything AMD bothered with. If you're gaming at 4K, this will absolutely be on your radar.  And luckily some of the newer features aren't as proprietary as Nvidia tends to make them. There will absolutely be games coming up and much broader support than PhysX or GameWorks, etc.  I think the complaints are coming from those who spend top dollar every single cycle and buy the top $60 AAA game on release dates, and everything that doesn't meet that highest marginal "wants" is looked at like peasantry.  Any reasonable person is more cautious with their money regardless of where in the rankings said product is.



How do you acquire a product? By buying it with money. What determines if a product is worth the money? The cost in relation to what you get. It's like that with every single thing we buy. Some things provide value by being exclusive and niched (like Mont Blanc pens), while others provide value by sacrificing form in order to make a statement or have a lower price. The only exception is art, which can't really have a purpose other than itself.

This is a consumer GPU in a hobby market. It's not special, exclusive or niched (that would be the purpose of Titan) and thus the price has to make sense, but it really, really doesn't. And that "you wouldn't complain if it was cheaper" argument is ... I really don't understand what you mean. No, we wouldn't. That's the whole point.

And any complaint about the performance is really a complaint about either the perf/price ratio or about the state of the industry as a whole (lack of competition, creeping prices), and both complaints are entirely valid. These things are good (especially at 4K), but not for the money.


----------



## moproblems99 (Sep 20, 2018)

Metroid said:


> I was not expecting anything major here, and it wasn't in the end. Coming from 16nm to 12nm, that is 4nm of difference, or 25%; 25% is what I expected and it was delivered. The problem is the price, and that is what Nvidia got wrong. Nvidia should have set the MSRP at a maximum of $499 for the RTX 2080 Ti, $399 for the RTX 2080 and $299 for the RTX 2070, and since AMD's 7nm is coming, this might become one of the worst releases to date concerning price.



So, it may be expensive but maybe lay off the alcohol.


----------



## DEFEATEST (Sep 20, 2018)

...And they will be all gone in no time....cuz sheep and youtube.


----------



## jaggerwild (Sep 20, 2018)

DEFEATEST said:


> ...And they will be all gone in no time....cuz sheep and youtube.



 Cause we all think they won't, you mean? (I'm lost)........


----------



## jigar2speed (Sep 20, 2018)

It's not a bad card, it's the bad pricing.


----------



## techy1 (Sep 20, 2018)

jigar2speed said:


> It's not a bad card, it's the bad pricing.


is there any way we can get this product without the price? if no - then it is a bad product.


----------



## wolf (Sep 20, 2018)

Great comprehensive review as always @W1zzard 

Always get a laugh when people complain that the single fastest/best product in a category is expensive/overpriced... of course it is.

This is what it costs to have the best, by a wide margin, on day 1. You gotta pay for the privilege of being the top dog. Oh how quickly it's forgotten that nobody is innocent in this game...

I'd like to see what Turing is truly capable of with a node shrink and a fully unlocked core count, but even for now it's a commendable effort.


----------



## Frick (Sep 20, 2018)

wolf said:


> Great comprehensive review as always @W1zzard
> 
> Always get a laugh when people complain that the single fastest/best product in a category is expensive/overpriced... of course it is.
> 
> This is what it costs to have the best, by a wide margin, on day 1. You gotta pay for the privilege of being the top dog. Oh how quickly it's forgotten that nobody is innocent in this game...



And I just get sad when people miss the point.


----------



## medi01 (Sep 20, 2018)

xkm1948 said:


> When NV GPU delivers: We demand 1000fps at 8k for $200, while still complaining about “big bad green”



The strawman is strong in you, citizen.

It's about how far you can bend over backwards, really.
The 1080 was 27% more expensive than the 980, but then it was 40%-ish faster; yet while Nvidia was making revenue record after revenue record, there were multiple excuses on top of it ("new process is expensive") ON THIS VERY SITE.

Now, a brief look at the perf/$ in TechPowerUp's chart clearly demonstrates the issue: *with the 2xxx series, prices rise faster than performance* (and that on a site that did sign the "innovative" NDA... we will get the full picture when heise.de/computerbase.de review the cards).
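You can check that claim with this thread's own numbers; a rough sketch, assuming the ~38% 4K uplift from this review and the $700/$1200 street prices quoted earlier in the thread (not official MSRPs):

```python
# Perf-per-dollar comparison using figures quoted in this thread:
# 2080 Ti ≈ +38% over a 1080 Ti at 4K, roughly $1200 vs $700.
perf = {"GTX 1080 Ti": 1.00, "RTX 2080 Ti": 1.38}
price = {"GTX 1080 Ti": 700, "RTX 2080 Ti": 1200}

perf_per_dollar = {card: perf[card] / price[card] for card in perf}
change = perf_per_dollar["RTX 2080 Ti"] / perf_per_dollar["GTX 1080 Ti"] - 1
print(f"{change:+.1%}")  # about -19.5%: the price rose faster than the performance
```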







No no, of course you can bend over backwards, no worries.




wolf said:


> Always get a laugh when people complain that the single fastest/best product in a category is expesnive/overpriced... of course it is.



The 2080 is not the "single fastest best product" (in fact it is rivaled by the 1080 Ti), nor is the 2070.
It's not hard to see, really; you just need to break free from that shackle of lies of yours.



zelnep said:


> adored was wrong - this is WORSE, he predicted a +40% increase over the gtx 1080 ti - but obviously that was leaked from nvidia's "testing guidelines".


"Testing guidelines" weren't available when AdoredTV made that claim, so, uh, would you mind?


----------



## Voluman (Sep 20, 2018)

Thank you for the great review!

Can you check it with the Unity Adam standalone pack? I am really curious how it would do.


----------



## Assimilator (Sep 20, 2018)

Frick said:


> This is a consumer GPU in a hobby market. It's not special, exclusive or niched...



Completely incorrect. The fastest anything in any segment occupies - by its very definition - a special, exclusive niche that allows it to command whatever price it wants, until it is superseded by something faster.



medi01 said:


> 1080 was 27% more expensive than 980, but then it was 40%-ish faster, but while nvidia was making revenue record after revenue record, there were multiple excuses on top of it, "new process is expensive" ON THIS VERY SITE.



What, exactly, is your point?

NVIDIA isn't making life-saving medical supplies for everyone, they are producing a luxury product for a small market, and they are within their rights to charge whatever they think that market will bear. The fact that you think that's exploitative and unfair and wrong doesn't matter one iota, because this is pure capitalism at work and it doesn't give a s**t about your feelings.

If you want prices to come down, stop complaining about a company that has effectively monopolised its market by simply being better than everyone else, and start complaining about the everyone else that consistently fails to compete. Until AMD RTG pull their collective heads out of their collective you-know-whats and git gud, or Intel actually finally manages to deliver a discrete GPU that isn't terrible, NVIDIA is the only game in town, and the only game in town gets to charge whatever the hell it wants.



medi01 said:


> 2080 is not "single fastest best product", in fact it is rivaled by 1080Ti, nor is 2070.



We're talking about the 2080 Ti, the subject of this review - do try to keep up. And the fact of the matter is that RTX 2080 Ti is, hands down, the single fastest graphics card in the world to date.


----------



## B-Real (Sep 20, 2018)

Checking other reviewers' results, the 2080 and 2080 Ti look even worse than in your review: the 2080 is ~ 1080 Ti (not a 9% difference at 4K but 0-1%) and the 2080 Ti is around 20-30% faster (not 39%).

An absolute failure gen in terms of price/performance.


----------



## ensabrenoir (Sep 20, 2018)

...the cards are a total "winged unicorn" to those willing to buy the dream. Curious to see what the RTX 2070s bring to the table.


----------



## Captain_Tom (Sep 20, 2018)

B-Real said:


> Checking other reviewers' results, the 2080 - 2080 Ti is even worse than in Your reviews. 2080 is ~ 1080 Ti (no 9% difference in 4K but 0-1%) and 2080 Ti is around 20-30 (not 39%).
> 
> An absolute failure gen in terms of price/performance.



Because the GK110 at $1000 and GP104 at $700+ were _good_ price/perf?


----------



## efikkan (Sep 20, 2018)

RTX 2080 might be about $50 too expensive, but isn't a bad deal. RTX 2080 Ti is too expensive, but this is still not unprecedented; GeForce 8800 Ultra launched at $850 (~$1034 today).

Still, it's sad to see that another thread is just polluted by FUD. You know people are spoiled when they complain about a successor _just_ being 39% faster, _just_ being 18% more energy efficient, and offering features you can't fully utilize yet. The tone was completely different when the shoe was on the other foot; not just forum members, but also most of the tech press touted AMD's glorious Mantle and (imaginary) Direct3D 12 advantage.


----------



## zilul (Sep 20, 2018)

efikkan said:


> but also most of the tech press touted AMD's glorious Mantle and (imaginary) Direct3D 12 advantage.



Because it isn't a proprietary technology, it's for the benefit of the whole industry, and RTX probably wouldn't be possible if it weren't for DX12.

As for my opinion, if NV can charge this much, it logically means there are people willing to pay this huge price, and it will continue until the next major economic crisis or war; in times of stability, people tend to destroy themselves with compulsive consumerism.


----------



## Vya Domus (Sep 20, 2018)

zilul said:


> and probably RTX won't be possible if it wasn't for DX12.



RTX runs under DXR. Nvidia, AMD and Microsoft probably talked about and developed these technologies long before we got to know about them.


----------



## moproblems99 (Sep 20, 2018)

efikkan said:


> The tone was completely different when the shoe was on the other foot



You know, everybody says that, but I just don't see it.  The only difference is who is complaining.  I'll admit that I didn't think Turing would get the performance increases it did, and NVIDIA should most likely be commended for pulling that off.  I just don't see RTX struggling to push 1080p@60 on the top card as an accomplishment.  In fact, I am more impressed with what they were able to do with 'base' performance improvements.

All that said, the price is still too damn high.


----------



## W1zzard (Sep 20, 2018)

zilul said:


> and probably RTX won't be possible if it wasn't for DX12.


RTX works with NVIDIA OptiX (their own API that's been out for like 5 years), Microsoft DXR and Vulkan RT. Vulkan is open source


----------



## dwade (Sep 20, 2018)

Preordered 2. 9900K by next month to drive these two beasts. I'm not waiting for Navi and neither should anyone, since it'll be a PS5 exclusive---AMD abandoned PC gamers for console gamers; remember that when you decide to support this company.


----------



## zilul (Sep 20, 2018)

dwade said:


> Preordered 2. 9900k by next month to drive these two beast. I'm not waiting for Navi and neither should anyone since it'll be a PS5 exclusive---AMD abandoned PC gamers for console gamers; remember that when you decide to support this company.



Yea, why not, but bringing up AMD and beating the dead horse in every one of your comments won't help with the lack of high-end competition either; right now you come across as a diehard Nvidia shill, indirectly telling people "I'm biased, so don't take my comments seriously".

Many people are happy with their RX 580/570/560 etc. They didn't make a bad choice or a supportive purchase, as your comment implies; they bought it because the product is competitive and maybe cheaper depending on where they live, that's all. 



W1zzard said:


> RTX works with NVIDIA OptiX (their own API that's been out for like 5 years), Microsoft DXR and Vulkan RT. Vulkan is open source



Thanks for the info, but I was talking about game engines, which use DX or Vulkan as far as I know.


----------



## TheinsanegamerN (Sep 20, 2018)

dwade said:


> Preordered 2. 9900k by next month to drive these two beast. I'm not waiting for Navi and neither should anyone since it'll be a PS5 exclusive---AMD abandoned PC gamers for console gamers; remember that when you decide to support this company.


Low quality bait, m8. This kinda post works better over on reddit. 

There is no evidence that Navi will be PS5 exclusive. Sony may get early access if they are helping to finance it, but that does not mean it won't be made into PC GPUs. 

AMD also didn't abandon the PC for consoles. They released the Polaris chips to great success, and released the half-baked Vegas. They are not very competent at GPUs right now, but they are still there. 

Also keep in mind AMD has way less revenue than either Nvidia or Intel, and is competing against both and still doing SOMETHING. The fact that the 480 was as close to Nvidia's 1060 as it was was a miracle. Now, with Ryzen making serious dough and AMD finally clawing back some server sales, investment in RTG should allow a new arch that is at least somewhat competitive with Turing.


----------



## Paganstomp (Sep 20, 2018)

No. Not a first-day buyer here, since I don't own or play any game that needs a HYBRID RAY TRACING card. This would be a complete waste of money. There are just too many things in life that need to be taken care of. But it would be great for elitist snobs that tend to get themselves into inappropriate situations with 3D printed guns and underage Texas girls.


----------



## dwade (Sep 20, 2018)

TheinsanegamerN said:


> Low quality bait M8. This kinda post work better over on reddit.
> 
> There is no evidence that Navi will be PS5 exclusive. Sony may get early access if they are helping to finance it, but that does not mean it wont be made into PC GPUs.
> 
> ...


https://www.forbes.com/sites/jasone...nys-playstation-5-vega-suffered/#31816a024fda
Vega suffered because AMD decided to reallocate their resources onto Navi for Sony's console. Raja left for a reason. AMD chose console gaming over PC gaming. End of story.

Back on topic. Solid review for a solid graphics card. If only there are more of them in stock since they're selling out like crazy.


----------



## terroralpha (Sep 21, 2018)

ensabrenoir said:


> ...the cards are a total "winged unicorn" to those willing to buy the dream. Curious to see what the RTX 2070s bring to the table.



absolutely nothing. you will get 1080-like performance, but the card will be way too slow for ray tracing, so no one will ever be able to use it. the only upside will be that the 2070 will not choke to RX 580 speed anytime async is implemented. that's it.


----------



## Caring1 (Sep 21, 2018)

dwade said:


> Raja left for a reason.


It's called corporate poaching; he has no loyalty, he clearly prefers the almighty dollar.



dwade said:


> AMD chose console gaming over PC gaming. End of story.


AMD chose to focus on the Processor side of their business for the short term, end of story, now back to the topic.


----------



## Captain_Tom (Sep 21, 2018)

dwade said:


> https://www.forbes.com/sites/jasone...nys-playstation-5-vega-suffered/#31816a024fda
> Vega suffered because AMD decided to reallocate their resources onto Navi for Sony's console. Raja left for a reason. AMD chose console gaming over PC gaming. End of story.
> 
> Back on topic. Solid review for a solid graphics card. If only there are more of them in stock since they're selling out like crazy.



Vega was a well executed compute card, and if Raja couldn't see that - it's good he is gone.  Bye bye.

AMD needs the PS5 WAY more than Vega needed that extra 20% performance.  The revenue generated for AMD by the PS5 will allow them to truly fund a mega gaming architecture for the next decade, and that IS what they need.

I really don't get why that's so hard for people to understand.  Furthermore, Nvidia is only as far ahead as AMD lets them be anyway.  If AMD wanted to, they could launch an entire new line-up that's fairly competitive right now.  However, their money is better spent on EPYC.


----------



## Assimilator (Sep 21, 2018)

Captain_Tom said:


> Furthermore, Nvidia is only as ahead as AMD lets them be ahead anyways.  If AMD wanted to, they could launch an entire new line-up that's fairly competitive right now.



Dafuq are you smoking?


----------



## Captain_Tom (Sep 21, 2018)

Assimilator said:


> Dafuq are you smoking?



What are you smoking? lol

"There's no way AMD can compete with the Titan!"  Then 290X launches and roflstomps it with a smaller die.  It's happened before, get a longer memory.

Things are definitely worse for Radeon right now - Turing is very impressive, in my opinion - but that doesn't mean AMD couldn't launch an entire line-up by January that crushes the majority of Turing.  It really wouldn't be very hard.


----------



## R0H1T (Sep 21, 2018)

Assimilator said:


> Dafuq are you smoking?


At *7nm* TSMC, *if *AMD could get higher clocks out of Vega or _*refined*_ Vega then yes it is possible for them to compete with Nvidia up to 2080 IMO. They won't touch 2080Ti anytime soon, unless they radically change their GPU uarch.


----------



## W1zzard (Sep 21, 2018)

Captain_Tom said:


> It really wouldn't be very hard.


The biggest issue for AMD is figuring out power efficiency, as you can't just add "more stuff" like you do to raise performance.

What's the biggest perf step AMD has ever made generation over generation? They need to double existing perf..
Once that's achieved they need to cut power of this new design in half, i.e. double perf/watt.. both at the same time

This is not like Ryzen, where Intel made only small single-digit percentage improvements generation over generation
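The compound target described above can be made concrete with a back-of-the-envelope sketch; the figures below are normalized illustrations, not measurements:

```python
# Illustrative sketch of the two-step target: double performance,
# then halve power of the new design. Numbers are normalized, not real.
perf, power = 1.0, 1.0            # baseline generation (normalized)

perf2 = perf * 2.0                # step 1: double raw performance...
power2 = power                    # ...at unchanged board power
step1_ppw = perf2 / power2        # perf/watt doubles from step 1 alone

power3 = power2 / 2.0             # step 2: halve power of the new design
step2_ppw = perf2 / power3        # perf/watt doubles again

print(step1_ppw, step2_ppw)       # 2.0 4.0 -> the requirements compound
```

The point of the sketch is that the two requirements compound: doubling performance at constant power already doubles perf/watt, and halving power on top of that doubles it again.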


----------



## Assimilator (Sep 21, 2018)

Captain_Tom said:


> What are you smoking? lol
> 
> "There's no way AMD can compete with the Titan!"  Then 290X launches and roflstomps it with a smaller die.  It's happened before, get a longer memory.
> 
> Things are definitely worse for Radeon right now - Turing is very impressive (Especially in my opinion), but that doesn't mean AMD couldn't launch an entire line-up by January that crushes the majority of Turing.  It really wouldn't be very hard.



Either you're the world's most useless troll, or literally retarded. Where, exactly, is AMD going to procure this mythical faster-than-Turing architecture from, considering their current fastest GPU architecture is only half as performant as NVIDIA's, and they have made it very clear that there is no replacement for that arch arriving in the near future?

Unless you think that the shrink of Vega to 7nm will suddenly, magically make a poor architecture perform 200% better.... which it won't because that's not how it works. If it did, AMD would have been hyping it up like mofos and current Vega cards would already have been replaced.

But keep on with your delusions if that's what floats your boat, the rest of us will continue to live in the real world.


----------



## btarunr (Sep 21, 2018)

No namecalling please.


----------



## medi01 (Sep 21, 2018)

Assimilator said:


> What, exactly, is your point?


That greed was masked by "reasons".



Assimilator said:


> NVIDIA isn't making life-saving medical supplies for everyone, they are producing a luxury product for a small market,


GPUs are not luxury products (not even high end) and the GPU market is by no means "a small market", but again, that has nothing to do with the discussion.



Assimilator said:


> If you want prices to come down...


Why? I want them to go up.



W1zzard said:


> This is not like Ryzen where Intel made only small single digit percentage improvements generation over generation



I think it's unrealistic to expect AMD to invest in a high-end GPU (way too risky).
*Vega was a fluke* (NVIDIA likely has a bunch and a half of its own flukes which we have never seen, but unlike AMD it has enough money to fund R&D on multiple fronts), but the 580 is a chip of roughly the same size as the 1060 and it certainly doesn't consume "twice" the amount of power. (I also keep asking myself how many of the tested cards are cherry-picked)

But given "full 7nm commitment", what is unrealistic about 1H 2019 Navi?

And regarding "Intel was doing nothing", uh, was it:

Steady progress through all these years, with the Conroe jump looking that big only because P4 IPC was that low.

You should know better than me that GPUs are inherently "more parallel" and hence can still scale more easily with die size than CPUs.


----------



## Assimilator (Sep 21, 2018)

medi01 said:


> *Vega was a fluke*



Wrong. See: Fury, Vega's predecessor. Not as bad as Vega, but GCN should have been retired then.



medi01 said:


> But given "full 7nm commitment", what is unrealistic about 1H 2019 Navi?



The fact that AMD has said it won't happen until 2H 2019 at best? Which means 2020 more likely, since their forecasts are always overly optimistic.

But even if they did manage to get 7nm Navi shipped in 1H 2019, GeForce RTX is available right now, which means it will benefit from two of the most important shopping events: Black Friday and Christmas of 2018. A GPU launching in 2019 misses both of those; a GPU launching in 2020 might as well not bother.

Vega's failure has caused AMD to give up on the high-end GPU market. Going forward, NVIDIA will be the only game in town in that bracket, unless Intel is somehow able to compete.



medi01 said:


> And regarding "Intel was doing nothing", uh, was it:
> 
> 
> 
> ...



He didn't say "nothing". He said small improvements, which is absolutely true.


----------



## medi01 (Sep 21, 2018)

Assimilator said:


> Wrong. See: Fury, Vega's predecessor.


Fury beat the 980 Ti at 4K, while Vega could barely take on the 1080, so, no.



Assimilator said:


> The fact that AMD has said it won't happen until 2H 2019 at best?


I'd appreciate a link and quote.



Assimilator said:


> GeForce RTX is available right now, which means it will benefit from two of the most important shopping events...:


I'm not sure if it's stupidity caused by greed, some evil plan, or both, but NVIDIA doesn't seem to consider dropping prices on the older gen; 2xxx cards at the given prices are hard to justify, and RTX features crippling the 2080 Ti to 1080p and poor fps are hardly exciting.  And, uh, we also have that mining craze, which I still can't figure out whether it's over or not; once it is over, we'll get a market flooded with those cards, which is, again, yay.

But even more importantly:
1) AMD doesn't have to "not miss" the mentioned shopping events
2) Exorbitant pricing set by Huang benefits AMD too
3) Mobile market is where they seem to have offerings.



Assimilator said:


> He said small improvements, which is absolutely true.


The same improvements as before, while he implied Intel was slow because there was no competition, so no, not true.



Assimilator said:


> Vega's failure has caused AMD to give up on the high-end GPU market. Going forward, NVIDIA will be the only game in town in that bracket, unless Intel is somehow able to compete.


After AMD recovers the mid to mid-high end (which it could, having much more resources to spend on R&D than before), the lucrative high end will naturally follow.


----------



## Captain_Tom (Sep 21, 2018)

W1zzard said:


> The biggest issue for AMD is figuring out power efficiency as you can't just add "more stuff" like you do to raise performance
> 
> What's the biggest perf step AMD has ever made generation over generation? They need to double existing perf..
> Once that's achieved they need to cut power of this new design into half, ie double perf/watt.. both at the same time
> ...



I said _most_ of Nvidia's current line-up, and yes, they really could launch 14nm chips right now with GDDR6 that easily compete with everything up to the 2080.  Vega M in laptops has the same or better efficiency than Pascal, and heck, we already know the XBX has a 384-bit Polaris card with extra compute units that would beat the 1070.  But neither of these is on desktop because AMD is literally choosing not to care.

They launched a small midrange chip in 2016, and then a compute chip with half the RAM in 2017 - that's it, lol!  *Anyone acting like AMD has no options fails to see they haven't even tried.*



Assimilator said:


> Either you're the world's most useless troll, or literally retarded. Where, exactly, is AMD going to procure this mythical faster-than-Turing architecture from, considering their current fastest GPU architecture is only half as performant as NVIDIA's



LOL, see my other comment - AMD has released almost nothing for 2 years, and that's because it's all they need to do to keep 1/3rd of the "retarded" PC gaming market.  They aren't even trying at all, lol.   But if you want me to tell you about this "mythical" Turing-killer, I need _you_ to tell me which process AMD is using.  There are multiple ways to skin a cat:


GF 12nmFF
GF 12nm-Planar 
TSMC 7nmFF
^ Choose.   I could even conceive of a GF 14nmFF  process that at least crushes Turing in Price/perf, but top performance would be hard to achieve using such an inferior node.


----------



## efikkan (Sep 21, 2018)

R0H1T said:


> At 7nm TSMC, if AMD could get higher clocks out of Vega or refined Vega then yes it is possible for them to compete with Nvidia up to 2080 IMO. They won't touch 2080Ti anytime soon, unless they radically change their GPU uarch.


7nm is not going to save AMD. Pascal already offers ~80% and Turing ~100% more performance per watt. Even the best node shrink will not offset that.

The reason why AMD's GPUs struggle is not lack of theoretical performance, it's lack of utilization. Throwing in more cores is not going to solve it either, doing so will only increase the gap to Nvidia. Look at what Nvidia have been doing; they don't _just_ add more cores, they also keep squeezing more and more performance out of them, and that is ultimately also the key to energy efficiency.

Before Polaris arrived, people claimed it would put AMD ahead of Nvidia in efficiency thanks to the supposedly superior 14 nm vs. 16 nm, but in reality it barely reached parity with Maxwell on 28 nm. Now people are desperately clinging to the hope that 7 nm is going to save them, despite the efficiency gap being larger than ever.
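The perf/watt gap quoted above can be sanity-checked with rough relative-performance and board-power numbers; the specific figures in this sketch are illustrative assumptions, not exact review data:

```python
# Hypothetical relative-performance (Vega 64 = 100) and board-power figures,
# loosely in the range launch reviews reported; treat them as assumptions.
def perf_per_watt(rel_perf, watts):
    return rel_perf / watts

vega64  = perf_per_watt(100, 292)   # assumed ~292 W average gaming power
gtx1080 = perf_per_watt(103, 166)   # assumed ~3% faster at ~166 W
rtx2080 = perf_per_watt(140, 215)   # assumed ~40% faster at ~215 W

print(round(gtx1080 / vega64, 2))   # ~1.81 -> Pascal's "~80%" lead
print(round(rtx2080 / vega64, 2))   # ~1.9  -> Turing close to "~100%"
```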


----------



## Captain_Tom (Sep 21, 2018)

efikkan said:


> 7nm is not going to save AMD. Pascal already offers ~80% and Turing ~100% more performance per watt. Even the best node shrink will not offset that.
> 
> The reason why AMD's GPUs struggle is not lack of theoretical performance, it's lack of utilization. Throwing in more cores is not going to solve it either, doing so will only increase the gap to Nvidia. Look at what Nvidia have been doing; they don't _just_ add more cores, they also keep squeezing more and more performance out of them, and that is ultimately also the key to energy efficiency.



Wake me up when Pascal can make 15 W GPUs for laptops that compete with the 1050 Ti...


----------



## Durvelle27 (Sep 21, 2018)

Definitely saw this coming. 

Performance isn’t terrible, but the price makes it very unappealing. Nvidia could have sold many more and had better adoption if it was priced better, especially for those coming from a 980 Ti-1080. But at this price you’re better off buying a 1080 Ti and overclocking it.


----------



## swirl09 (Sep 22, 2018)

swirl09 said:


> I'll answer your real (/hypothetical) question. If I'm being honest, I would say I'd have to see how I feel at the time, but I believe I would be leaning towards a no.


I know it's probable that no one cares, but I want to change my answer. I am aware that most say whatever on the internet and it's almost immediately discarded/forgotten, but I would like to think I stand by what I say - if only for myself.

With that said, the question was whether or not I would pay $1,500 for another 38% perf bump next time around. Quoted was my answer. I'm changing this to a flat no. Having just paid only a little less than that for my still-pending 2080 Ti, the thought of only paying a couple of hundred more to go another 38% (in the here and now!) was tempting, but of course if I had to wait another gen AND pay more, it was a _probably_ no.

The thing I did not consider was that while a third or so more perf now (going to the 2080 Ti) is decent, part of my purchase was due to the addition of the new cores. Even with them being (largely) unproven at this point, the prospect of what DLSS could bring was enough to tip the scales. So "just" getting another similar bump next time around at a higher cost would NOT do it for me.


----------



## R0H1T (Sep 22, 2018)

efikkan said:


> 7nm is not going to save AMD. Pascal already offers ~80% and Turing ~100% more performance per watt. Even the best node shrink will not offset that.
> 
> The reason why AMD's GPUs struggle is not lack of theoretical performance, it's lack of utilization. *Throwing in more cores is not going to solve it either, doing so will only increase the gap to Nvidia*. Look at what Nvidia have been doing; they don't _just_ add more cores, they also keep squeezing more and more performance out of them, and that is ultimately also the key to energy efficiency.
> 
> Before Polaris arrived, people claimed it would put AMD ahead of Nvidia in efficiency, *with the claimed superior 14 nm vs. 16 nm*, but in reality it barely became on par with Maxwell on 28 nm. Now people are desperately clinging to the hope that 7 nm is going to save them, *despite the efficiency gap being larger than ever*.


I'm not saying anything about more (GPU) cores, but higher clocks, which can & probably will help Vega - again, provided it clocks higher on 7nm without killing efficiency.

The assumption was that GF 14nm was gonna be superior to TSMC 16nm, which probably turned out to be untrue.

So you're saying 7nm TSMC will be worse than 12nm (SS?) or Vega is just bad regardless of what node it's on?


----------



## efikkan (Sep 22, 2018)

R0H1T said:


> I'm not saying anything about more (GPU) cores but higher clocks which can & probably will help Vega, again provided it clocks higher on 7nm without killing efficiency.


If we're lucky, we're looking at a choice between retaining the clocks and improving efficiency by up to 50-60%, or retaining the maximum TDP and pushing the clocks ~30%, or a mix of the two. One thing to remember is that hotter chips are harder to shrink, because watts per mm² also become a scaling threshold, forcing a sacrifice of either density or clock speed. More efficient designs benefit more from better nodes than inferior designs do; that's why Nvidia increased their lead when going from 28 nm to 16 nm/14 nm.

When Turing has 100% more performance per watt and nearly 100% more raw performance, all of this with just ~15% more Flop/s than Vega 64, it should start to paint a picture of how far behind AMD is.
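The clocks-vs-efficiency trade-off outlined above can be sketched with the usual first-order CMOS approximation, dynamic power ≈ C·f·V²; the voltage figures for the hypothetical 7 nm part are assumptions chosen only to illustrate the two endpoints:

```python
# First-order dynamic power model: P ~ C * f * V^2 (normalized units).
def power(freq, volt, cap=1.0):
    return cap * freq * volt ** 2

base = power(freq=1.0, volt=1.0)      # old node at nominal clocks/voltage

# Option A: keep clocks, take an assumed ~20% voltage reduction on 7 nm
opt_a = power(freq=1.0, volt=0.8)     # same perf at ~64% of the power

# Option B: spend the headroom on ~30% higher clocks at roughly base TDP
opt_b = power(freq=1.3, volt=0.877)   # ~30% more perf, ~same power

print(round(opt_a, 2))                # ~0.64 -> ~56% better perf/watt
print(round(opt_b, 2))                # ~1.0  -> perf up, efficiency flat
```

Any mix of the two lands between these endpoints, which is roughly the 50-60% efficiency vs. ~30% clocks window described above.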



R0H1T said:


> So you're saying 7nm TSMC will be worse than 12nm (SS?) or Vega is just bad regardless of what node it's on?


GCN is outdated and has fallen behind. 7 nm is not going to save them, and by the time AMD has 7 nm, Nvidia will also have access to it.


----------



## medi01 (Sep 22, 2018)

efikkan said:


> . Pascal already offers ~80%


Right:






Almost. Sort of, kind of. But actually less, once we step out of that reality distortion field of yours.


----------



## newtekie1 (Sep 22, 2018)

medi01 said:


> Right:
> 
> 
> 
> ...



In that chart, Vega 64 vs. the GTX 1080 puts the 1080 at 85% better performance per watt.


----------



## Diverge (Sep 22, 2018)

Anyone know the max power limit in afterburner/precision on the FE cards?

EVGA released an updated BIOS for their cards that takes them to a 130% max power limit... so I'm curious what the review-sample FE cards have. If it's less, do you think NVIDIA will release a BIOS update? (Do they ever release BIOS updates?)

https://forums.evga.com/tm.aspx?m=2858793

If they are limited to 115%, that seems odd, because nvidia specifically said that the 20 series has a robust power delivery, and more headroom, and previous gen FE cards were limited to 120%.


----------



## Xzibit (Sep 22, 2018)

Diverge said:


> Anyone know the max power limit in afterburner/precision on the FE cards?
> 
> Evga released an updated bios for their cards, that takes them to 130% max power limit... so curious what the review sample FE cards have. If it is less, do you think nvidia will release a bios update? (do they ever release bios updates?)
> 
> ...



One of the youtubers (I forgot which one) had one with 130%. It didn't do much different. As in, it peaked but then settled down to 204x-208x. Runs nice and cool.

*It was jayztwocents @ 9:00*

Nvidia already said they were voltage locked, so it's going to need some hard workarounds


----------



## garrick (Sep 23, 2018)

Is it worth $1,200.00? I have to think about this... OK, no!!!


----------



## StrayKAT (Sep 23, 2018)

garrick said:


> its it worth 1,200.00 i have to think about this ok no!!!



If I had $1200 I'd buy 128 GB of RAM. If I'm going to be that wasteful on something I don't need, I may as well go that route.


----------



## tfdsaf (Sep 23, 2018)

xkm1948 said:


> You call a well established 10+ years reviewer and review site less reputable than some self claimed “youtuber” WTF are you smoking? Seriously?


Yea I do, sue me!


----------



## Xzibit (Sep 23, 2018)

xkm1948 said:


> You call a well established 10+ years reviewer and review site less reputable than some self claimed “youtuber” WTF are you smoking? Seriously?



Just FYI, you realize that youtuber's reviews are part of TechSpot, which is ranked higher than this site globally and in the USA by a bit (thousands of places).


----------



## medi01 (Sep 23, 2018)

AdoredTV summarizes the discrepancies across sites, mentions the FE being overclocked out of the box, and explains the differences:












terroralpha said:


> adored was not the only one who predicted this, everyone did. everyone was saying 2080 ti would be 35%-40% faster than 1080 Ti


Citation needed.



tfdsaf said:


> Seems like your numbers are quite ways off, for example hardware unboxed on the same games and resolutions found the RTX 2080 much closer to the GTX 1080ti and even losing in two games..



Could you be more specific?


----------



## Assimilator (Sep 23, 2018)

Xzibit said:


> Just FYI. You realize that youtuber reviews are part of TechSpot which is ranked higher then this site Globaly and in USA by a bit (thousands).



Views are quantity, not quality.


----------



## medi01 (Sep 24, 2018)

Assimilator said:


> Views are quanitity, not quality.


Want to talk about "quality of the viewers" seriously, on this site, where "tech savvy" users claim the 1060 is "much faster than the 580"?


----------



## thebluebumblebee (Oct 5, 2018)

My apologies if this was asked somewhere else in the 9 pages of feedback, but I wasn't going to read them all...

@W1zzard , I know that what I'm about to ask would be a lot of work, but after watching a jayz2cents video, I'd like to see you test this, and maybe the 2080, on a Ryzen 2700X so that we would have "apples to apples" comparisons to use when people ask "Intel or AMD for a gaming build?"


----------

