# ASUS GeForce RTX 2080 Ti STRIX OC 11 GB



## W1zzard (Sep 19, 2018)

The ASUS ROG Strix RTX 2080 Ti OC is the company's flagship graphics card, designed to rival the quality of NVIDIA's own Founders Edition products. A high factory overclock, bolstered by a strong VRM solution, adds to its premium credentials.

*Show full review*


----------



## kastriot (Sep 19, 2018)

Even the best card can't hit 4K 60 fps in Deus Ex. Disappointing..


----------



## W1zzard (Sep 19, 2018)

Conclusion has been added


----------



## neatfeatguy (Sep 19, 2018)

W1zzard said:


> Conclusion has been added



Fourth paragraph, last sentence - it doesn't read very well and bugs me:

_"To me this looks like a more balanced approach, that ensures power draw and thus heat output don't that high, so the heatsink won't have to work as hard. "_

Not sure if the first comma is necessary, but it's also missing a word or two to make it read properly. Perhaps something like this:

"To me this looks like a more balanced approach that ensures a lower power draw, thus resulting in a lower heat output so the heatsink won't have to work as hard."

OR

"To me this looks like a more balanced approach. This ensures a lower power draw, thus a lower heat output so the heatsink won't have to work as hard."

Otherwise everything else is good. Awesome work on all these reviews. Too bad I don't think the price justifies the card. However, from my personal perspective as a 980 Ti owner, the 2080/2080 Ti can provide up to, or more than, twice the performance of my card. On that end of things, these cards look impressive if I were to upgrade. But, as I said, those prices are just not worth it in my opinion, and I'll gladly stick with what I have, since my 980 Ti more than meets my current needs.

It's been a good time sink reading through all these reviews. Thanks for the hard work.


----------



## W1zzard (Sep 19, 2018)

neatfeatguy said:


> it doesn't read very well and bugs me:


fixed, thanks


----------



## Sasqui (Sep 19, 2018)

neatfeatguy said:


> Too bad I don't think the price justifies the card. However, from my personal perspective as a 980 Ti owner, the 2080/2080 Ti can provide up to, or more than, twice the performance of my card. On that end of things, these cards look impressive if I were to upgrade. But, as I said, those prices are just not worth it in my opinion, and I'll gladly stick with what I have, since my 980 Ti more than meets my current needs.



$1299 for a graphics card, that's really hardcore.  I'll be playing with my Vega 64 for many years to come lol.


----------



## Fatalfury (Sep 20, 2018)

I like how the gaming power consumption is still less than Vega 64's... LOL


----------



## Solid State Soul ( SSS ) (Sep 20, 2018)

Hmm, I don't like the fact that ASUS is using 5K capacitors on their super-flagship product; those capacitors are commonly used on cheap budget motherboards.


----------



## JalleR (Sep 20, 2018)

Hehe, it is fun to see that Intel really set its mark on "rebranding":
Intel Core i7-8700K @ 4.8 GHz
(Coffee Lake, 8192 KB Cache)??

Great review as always, thx.
It's in all the tests BTW. @W1zzard


----------



## W1zzard (Sep 20, 2018)

JalleR said:


> 8192 KB Cache


Big fail. Nobody ever noticed it: not me, not the editorial team, not the proofreader, nor any of the readers.

fixing


----------



## newtekie1 (Sep 20, 2018)

Sasqui said:


> $1299 for a graphics card, that's really hardcore.  I'll be playing with my Vega 64 for many years to come lol.



Yeah, I remember back in the day when companies weren't that crazy and were only releasing graphics cards for $1,500...wait...no...people here praised those cards like they were the second coming.  I wonder what the difference was.


----------



## Sasqui (Sep 20, 2018)

newtekie1 said:


> Yeah, I remember back in the day when companies weren't that crazy and were only releasing graphics cards for $1,500...wait...no...people here praised those cards like they were the second coming.  I wonder what the difference was.



I suspect I would have said the same thing back then.  There are people out there willing to fork out that much cash, such as for 1080 or 1080 Ti in SLI or Vega 64 in CF.


----------



## Shatun_Bear (Sep 20, 2018)

newtekie1 said:


> Yeah, I remember back in the day when companies weren't that crazy and were only releasing graphics cards for $1,500...wait...no...people here praised those cards like they were the second coming.  I wonder what the difference was.



The price is absolutely ludicrous here and now. Prices have never been this high.


----------



## newtekie1 (Sep 21, 2018)

Shatun_Bear said:


> The price is absolutely ludicrous here and now. Prices have never been so expensive.


They've been higher, go learn some history.


----------



## DarthJedi (Sep 21, 2018)

newtekie1 said:


> They've been higher, go learn some history.



Hmm, when?


----------



## Prima.Vera (Sep 21, 2018)

Yeah, $1,300 WITHOUT taxes...
In Europe it will probably be ~€1,500 at retail.
Good luck...


----------



## Stevostin (Sep 21, 2018)

Does it have FanConnect II? In the gallery you show pictures of the 2080 and we can see the FanConnect headers, but I can't find any mention of FanConnect on the 2080 Ti. Can you confirm its presence or not?


----------



## newtekie1 (Sep 21, 2018)

DarthJedi said:


> Hmm, when?



April 2014 and April 2016.


----------



## mahanddeem (Sep 21, 2018)

Long story short: if you have a 1080 or 1080 Ti, play at 1440p or less, and don't want to burn money unnecessarily, skip the new gen.


----------



## Stevostin (Sep 22, 2018)

As I feared, I got my answer. Not sure if anyone else is interested, but they did remove FanConnect on the 2080 Ti. This doesn't make any sense, since the 2080 has this feature. For such a price they could have included FanConnect, which was quite useful for me...


----------



## newtekie1 (Sep 22, 2018)

Stevostin said:


> As I feared, I got my answer. Not sure if anyone else is interested, but they did remove FanConnect on the 2080 Ti. This doesn't make any sense, since the 2080 has this feature. For such a price they could have included FanConnect, which was quite useful for me...
> 
> View attachment 107331



That is odd: one picture in W1z's review shows the fan headers there, but the PCB shots in W1z's review show them missing. There are spaces for them on the PCB, but the actual headers aren't soldered on.

Makes me wonder if ASUS planned to have them but had to cut them for some reason. Or maybe they just aren't there on the sample boards?

ASUS claims on their site that both the 2080 and 2080 Ti have two FanConnect headers.

https://rog.asus.com/articles/gamin...nd-rtx-2080-graphics-cards-from-rog-and-asus/


----------



## W1zzard (Sep 22, 2018)

Stevostin said:


> Not sure if anyone else is interested, but they did remove FanConnect on the 2080 Ti


My sample is an early one, so I assume they just forgot to put it in, since the more recently produced 2080 has it. Let me double-check.


----------



## DarthJedi (Sep 22, 2018)

newtekie1 said:


> April 2014 and April 2016.



But I don't recall that. It has never been like this; this is a whole new level and then some.
Can you be more specific? When were those cards priced this much? What cards? How much?


----------



## newtekie1 (Sep 22, 2018)

Sasqui said:


> I suspect I would have said the same thing back then.  There are people out there willing fork out that much cash, such as 1080 or 1080 TI in SLI or Vega 64 in CF.



Actually, your exact words were "what a monster"; you seemed to have no issue with the $1,500 price back then. In fact, you can go through that entire thread: there is no outrage about the price, despite it being even higher than the 2080 Ti's over 4 years ago.

Don't get me wrong, $1,200 is still a crazy price to pay for a graphics card, and way beyond what I'd be willing to spend. However, everyone is freaking out about the price when in reality it is pretty well in line with the historical pricing of a completely unrivaled king-of-the-hill graphics card. If anything, it's priced a little lower than they have been in the past...



DarthJedi said:


> When were those cards priced this much?



I already answered that.  The dates I mentioned were the release dates of the cards, and they were priced that much at release.



DarthJedi said:


> What cards?



AMD R9 295X2 and Radeon Pro Duo.  You can maybe give them a pass on the Radeon Pro Duo's high price, because they tried to market it as a "prosumer" product like the Titans, which usually carry a higher price tag, but the R9 295X2 was a straight-up desktop card priced at $1,500 (AFAIK still the highest-priced desktop card to date).



DarthJedi said:


> How much?



Both were priced at $1,500 at release.


----------



## Fluffmeister (Sep 23, 2018)


Yeah, the lack of outrage at the price of that gas guzzler is striking.

I'd hate to think there are double standards going on here.

There are.


----------



## DarthJedi (Sep 23, 2018)

newtekie1 said:


> Actually, your exact words were "what a monster"; you seemed to have no issue with the $1,500 price back then. In fact, you can go through that entire thread: there is no outrage about the price, despite it being even higher than the 2080 Ti's over 4 years ago.
> 
> Don't get me wrong, $1,200 is still a crazy price to pay for a graphics card, and way beyond what I'd be willing to spend. However, everyone is freaking out about the price when in reality it is pretty well in line with the historical pricing of a completely unrivaled king-of-the-hill graphics card. If anything, it's priced a little lower than they have been in the past...
> 
> ...



No, it doesn't qualify.
Neither of those cards was actually a single product from the manufacturing perspective; they were special single-slot dual-GPU cards that were two cards in every way but slot usage. There was no difference from a usual dual-card SLI/CrossFire setup, except for the package (a single PCB).


----------



## Stevostin (Sep 23, 2018)

W1zzard said:


> My sample is an early one, so I assume they just forgot to put it in, since the more recently produced 2080 has it. Let me double check



What's really telling is that you can find mention of FanConnect on the product page of the 2080 but not on the 2080 Ti product page.
Something's fishy here...


----------



## newtekie1 (Sep 23, 2018)

DarthJedi said:


> No, it doesn't qualify.
> Neither of those cards was actually a single product from the manufacturing perspective; they were special single-slot dual-GPU cards that were two cards in every way but slot usage. There was no difference from a usual dual-card SLI/CrossFire setup, except for the package (a single PCB).



What? Just because they were dual-GPU cards doesn't mean they weren't a single video card when they reached the consumer. They could put 4 or 200 GPUs on them to get the performance they need, or even multiple PCBs like nVidia did. If they sold the consumer a single product whose primary purpose was to render graphics for video games, and charged $1,500 for it, it counts.


----------



## DarthJedi (Sep 23, 2018)

newtekie1 said:


> What? Just because they were dual-GPU cards doesn't mean they weren't a single video card when they reached the consumer. They could put 4 or 200 GPUs on them to get the performance they need, or even multiple PCBs like nVidia did. If they sold the consumer a single product whose primary purpose was to render graphics for video games, and charged $1,500 for it, it counts.



Not really. Multi-GPU cards are multi-GPU cards: they cost like two GPUs, they perform like two GPUs, and they have the limits of two GPUs. They could never put more than two GPUs on a card, or four in total.
It simply doesn't count, because otherwise you'd count SLI systems or two-card bundles.
It doesn't work that way. nVidia dual-GPU cards are actually SLI, just like AMD dual-GPU Radeons are actually CrossFire. You could never get more than two, and you'd have SLI.

The 2000 series from nVidia is a real price jump because it's a jump for a single GPU, not a single card. They could put two Turings on a card, and that would not dethrone a single one as the most expensive product.


----------



## newtekie1 (Sep 23, 2018)

DarthJedi said:


> Not really. Multi-GPU cards are multi-GPU cards: they cost like two GPUs, they perform like two GPUs, and they have the limits of two GPUs. They could never put more than two GPUs on a card, or four in total.
> It simply doesn't count, because otherwise you'd count SLI systems or two-card bundles.
> It doesn't work that way. nVidia dual-GPU cards are actually SLI, just like AMD dual-GPU Radeons are actually CrossFire. You could never get more than two, and you'd have SLI.
> 
> The 2000 series from nVidia is a real price jump because it's a jump for a single GPU, not a single card. They could put two Turings on a card, and that would not dethrone a single one as the most expensive product.



No, it counts.  I don't care if it is multiple GPUs, multiple PCBs; hell, it could be two cards bundled in one package.  If AMD or nVidia puts a single unique model number on it and advertises it as a single graphics card, then it counts, period.


----------



## DarthJedi (Sep 23, 2018)

newtekie1 said:


> No, it counts.  I don't care if it is multiple GPUs, multiple PCBs; hell, it could be two cards bundled in one package.  If AMD or nVidia puts a single unique model number on it and advertises it as a single graphics card, then it counts, period.



Well, you're alone then. Not even people from the industry itself view it that way. Dual-GPU products were always special product hacks and were never addressed as "cards" outside of marketing materials hoping someone wouldn't notice the trick and buy anyway. Engineers and manufacturers refer to chips and GPUs.
Only if we move to chiplets with graphics hardware will we be able to talk about such designs as single GPUs.

Anyway, we can only take the Titans as real highly priced GPUs.


----------



## newtekie1 (Sep 23, 2018)

DarthJedi said:


> Well, you're alone then. Not even people from the industry itself view it that way. Dual-GPU products were always special product hacks and were never addressed as "cards" outside of marketing materials hoping someone wouldn't notice the trick and buy anyway. Engineers and manufacturers refer to chips and GPUs.
> Only if we move to chiplets with graphics hardware will we be able to talk about such designs as single GPUs.
> 
> Anyway, we can only take the Titans as real highly priced GPUs.



No, all the reviews treat them as one card.  They test them as one card, they rate them as one card, and they directly compare them to all the other single-GPU graphics cards.  They are presented to the consumer as one card, a single product with a single model number designating that product, and they are treated as such by the industry.

The industry does not treat them any differently than cards with single GPUs in them. Yes, we know the engineering behind them is more complex, but the end result to the consumer is the same.  They were using dual GPUs to hit performance targets; today they are making bigger GPUs, but when that wasn't possible they just added more GPUs to the card.  In fact, I dare you to find me a review of the R9 295X2 that doesn't refer to it as a card or graphics card.  Heck, you can go back to W1z's review of nVidia's GTX 295, which uses two PCBs, and even that is referred to as a "graphics card", not cards, not a hybrid abomination thing, nothing like that, just "graphics card".


----------



## DarthJedi (Sep 24, 2018)

newtekie1 said:


> No, all the reviews treat them as one card.  They test them as one card, they rate them as one card, and they directly compare them to all the other single-GPU graphics cards.  They are presented to the consumer as one card, a single product with a single model number designating that product, and they are treated as such by the industry.
> 
> The industry does not treat them any differently than cards with single GPUs in them. Yes, we know the engineering behind them is more complex, but the end result to the consumer is the same.  They were using dual GPUs to hit performance targets; today they are making bigger GPUs, but when that wasn't possible they just added more GPUs to the card.  In fact, I dare you to find me a review of the R9 295X2 that doesn't refer to it as a card or graphics card.  Heck, you can go back to W1z's review of nVidia's GTX 295, which uses two PCBs, and even that is referred to as a "graphics card", not cards, not a hybrid abomination thing, nothing like that, just "graphics card".



- No, these are NEVER tested as one card, because these were *ALWAYS SLI or CrossFire*; that's how they work. They have SLI profiles, SLI issues, SLI scaling...
- Engineering is not "more complex"; it's simply SLI/CF on a single board. You integrate the bridge into the PCB.
- The end result for the consumer is NOT the same as one card; it's VASTLY different: it's SLI/CF, not single-GPU performance behavior.


----------



## newtekie1 (Sep 24, 2018)

DarthJedi said:


> - No, these are NEVER tested as one card, because these were *ALWAYS SLI or CrossFire*; that's how they work. They have SLI profiles, SLI issues, SLI scaling...



Look at this review and then tell me it wasn't treated like a single card in the review.  Again, I dare you to find me any reviews that didn't refer to them as a graphics card and treated as such.



DarthJedi said:


> - Engineering is not "more complex"; it's simply SLI/CF on a single board. You integrate the bridge into the PCB.



It actually is pretty complicated to get all that into a single unit.



DarthJedi said:


> - The end result for the consumer is NOT the same as one card; it's VASTLY different: it's SLI/CF, not single-GPU performance behavior.



Yes it is the same.  It doesn't matter what components they use or how they get to the end result.  The card is released with its own model number and the consumer sees a single graphics card, because it is, and it cost $1,500.

I'm done with this discussion.  You haven't made any convincing argument.  We in the industry know how they work, and they are still regarded as a single graphics card by us, even the reviewers treat them as a single graphics card.


----------



## Splinterdog (Sep 24, 2018)

I wouldn't think twice about buying this card - if I had money to burn.


----------



## Sasqui (Sep 24, 2018)

newtekie1 said:


> Actually, your exact words were "what a monster"; you seemed to have no issue with the $1,500 price back then. In fact, you can go through that entire thread: there is no outrage about the price, despite it being even higher than the 2080 Ti's over 4 years ago.



How dare you quote something from 2014, lol.  The difference there was, in fact, that the card was a dual-GPU solution.  And the heart of the comment was how it compared to two single-GPU cards in CF with the same die(s).

There are obviously those willing to pay that much for a consumer graphics card, and I personally know one of those people.  So be it!


----------



## W1zzard (Sep 26, 2018)

Stevostin said:


> Does it have FanConnect II? In the gallery you show pictures of the 2080 and we can see the FanConnect headers, but I can't find any mention of FanConnect on the 2080 Ti. Can you confirm its presence or not?





Stevostin said:


> As I feared, I got my answer. Not sure if anyone else is interested, but they did remove FanConnect on the 2080 Ti. This doesn't make any sense, since the 2080 has this feature. For such a price they could have included FanConnect, which was quite useful for me...



Just finished discussing this with ASUS. Our review sample lacks FanConnect due to a technical obstacle in the first production run, but ASUS has overcome it, and all cards from the second batch onward will have this feature.


----------



## Stevostin (Sep 26, 2018)

W1zzard said:


> Just finished discussing this with ASUS. Our review sample lacks FanConnect due to a technical obstacle in the first production run, but ASUS has overcome it, and all cards from the second batch onward will have this feature.


Thanks for looking into that; I guess ASUS won't lose a customer over such a tiny feature now ^^


----------



## DarthJedi (Sep 26, 2018)

newtekie1 said:


> Look at this review and then tell me it wasn't treated like a single card in the review.  Again, I dare you to find me any reviews that didn't refer to them as a graphics card and treated as such.
> 
> Yes it is the same.  It doesn't matter what components they use or how they get to the end result.  The card is released with its own model number and the consumer sees a single graphics card, because it is, and it cost $1,500.
> 
> I'm done with this discussion.  You haven't made any convincing argument.  We in the industry know how they work, and they are still regarded as a single graphics card by us, even the reviewers treat them as a single graphics card.



I'm sure @W1zzard would agree that's simply not true. You can't treat them as a single GPU, because they were NEVER a single GPU and never worked that way. They are SLI or CF dual-GPU setups. One card for one GPU or four of them doesn't matter for anything but how it fits into the computer case. By virtue of distance, they could have 0-1% better latencies, but that wouldn't mean much. They were multi-GPU, with all the scaling, driver, and support issues any such combo has.

"You in the industry"? I'm keen to learn what makes you part of the industry. As an engineer, I am pretty sure we don't mean the same industry when we say the word. Unless you're a salesman, which would explain why you can't distinguish between a GPU and a card.

FWIW, telling customers anything but a warning that these are dual-GPU and will have all the SLI/CF characteristics, including a limit of only 2 cards instead of 4 (because the limit doesn't go away if you put them in one box), would have been cheating. And, thankfully, they never did that. Both nVidia and AMD were always straightforward about it.


----------



## Deleted member 171912 (Sep 26, 2018)

Good review. Thanks.

Nice new technologies and a performance improvement, but it is still not good enough for me to switch to 4K gaming and upgrade my rig (CPU, graphics card, and monitor).

Why? In short: FPS, poor 1st-gen ray-tracing performance and nonexistent content, and overpriced graphics cards (2080 Ti) and good gaming monitors (4K IPS).

OK, I will wait 2 more years for the next-gen NVIDIA and ASUS GeForce RTX (2180?) Ti card and a 4K gaming monitor. No problem. My rig at Ultra on 1440p @ 144 Hz is still good enough to enjoy all my games.


----------

