# NVIDIA SLI GeForce RTX 2080 Ti and RTX 2080 with NVLink



## W1zzard (Sep 26, 2018)

In our Turing NVLink review, we test RTX 2080 Ti SLI against RTX 2080 SLI in 23 games and also include Pascal's GTX 1080 Ti SLI numbers. Our results confirm that NVLink does give higher performance than previous generations' SLI, even though game support could be better.



----------



## Renald (Sep 26, 2018)

I'm already laughing at the guy whose name I forget, with the Trump icon, who said "I already ordered two of them. You are just too poor to admit that this generation rocks".

Sure, dude: most console games run worse with SLI than with the card alone, and sometimes worse than a 1080 Ti alone.
These cards are a waste. The 2080 Ti is only there for 4K, but that's all.


----------



## Vya Domus (Sep 26, 2018)

Atrocious scaling, as expected. SLI is an ancient technology that should have kicked the bucket long ago and made way for better alternatives.


----------



## Assimilator (Sep 26, 2018)

Vya Domus said:


> ... better alternatives.



Like what? The much-vaunted DX12 multi-GPU, which never materialised as the review notes?


----------



## Vya Domus (Sep 26, 2018)

Assimilator said:


> Like what? The much-vaunted DX12 multi-GPU, which never materialised as the review notes?



Of course it never will; we have SLI. Now with NVLink, so it sounds fancier when you drop $2,500.


----------



## Upgrayedd (Sep 26, 2018)

Vya Domus said:


> Of course it never will; we have SLI. Now with NVLink, so it sounds fancier when you drop $2,500.


Witcher 3 seems to scale wonderfully... I blame the developers, because certain games scale fine.


----------



## Ferrum Master (Sep 26, 2018)

Can we get an SLI performance graph as a function of CPU frequency? Isn't it CPU-starved already?


----------



## jabbadap (Sep 26, 2018)

Don't know why, but that review avatar made me think this was a sewing machine review (oh, a new Pfaff)... But yeah, I had to chuckle a bit at 1080p with RTX cards in SLI. You could just leave that out altogether (a lot of your good work gone in vain); no one in their right mind should buy one of these cards, let alone two, and run them at that kind of puny resolution. Maybe 8K, or some sort of higher-than-4K surround, would be a better showcase, but 4K and 1440p would have been enough.

Edit: for SLI, a frametime analysis, or at least minimum-fps 1%/0.1% percentiles, would have been good to see. And of course RX Vega 64 CF numbers.


----------



## mouacyk (Sep 26, 2018)

Blame the death of AFR SLI on shading methods that rely on data shared between frames. A common culprit is temporal anti-aliasing, which does wonders for removing pixel crawl in motion and is becoming more popular, but relies on the previous frame to be effective. That throws a mega wrench in the gears of AFR SLI.
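The frame-dependency problem can be sketched in a toy model (illustrative only, not real driver or engine code; all names are made up): in AFR, consecutive frames land on alternating GPUs, so any pass that reads the previous frame's output forces a cross-GPU copy and synchronization on nearly every frame.

```python
# Toy model of alternate-frame rendering (AFR) with a temporal-AA pass
# that reads the previous frame's color buffer. Purely illustrative.

def render_afr(num_frames, num_gpus=2):
    """Count cross-GPU transfers forced by a previous-frame dependency."""
    transfers = 0
    for frame in range(num_frames):
        gpu = frame % num_gpus           # AFR: frames alternate between GPUs
        prev_gpu = (frame - 1) % num_gpus
        if frame > 0 and prev_gpu != gpu:
            # TAA needs frame N-1's buffer, which was rendered on the other
            # GPU: it must be copied over the bridge/PCIe and the GPUs must
            # synchronize, eating into AFR's theoretical 2x speedup.
            transfers += 1
    return transfers

print(render_afr(100))  # with 2 GPUs, 99 of 100 frames need a copy
```

With a single GPU the dependency is free (the buffer is already local), which is exactly why these techniques only hurt in AFR setups.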


----------



## sweet (Sep 26, 2018)

Where are Vega CF's numbers? They would make NVlink look less "astounding", wouldn't they?


----------



## nemesis.ie (Sep 26, 2018)

@sweet I was about to write something similar.  

RE: SLI on the Ti, it looks like "The more you buy, the lower your frames".


----------



## Lightofhonor (Sep 26, 2018)

sweet said:


> Where are Vega CF's numbers? They would make NVlink look less "astounding", wouldn't they?


I was thinking the same. These numbers by themselves don't really prove anything.


----------



## GhostRyder (Sep 26, 2018)

Hmm, interesting read-up, as I have been reading a lot about this new NVLink on the pro cards and RTX cards and wanted to see some solid numbers. I did not know, however (guess I just missed it previously), that the RTX 2070 lacks support... So at this point only the RTX 2080 and its Ti variant support it, which is a real letdown. I thought the 2070 seemed a logical option for people who still want budget high performance, versus those on the high end who just want the best, since I figured two 2070s would eclipse the 2080 and get close to the 2080 Ti for a little less money (or for those wanting more performance in the future without replacing their current card).

Either way, I may still get a 2080 Ti, but I will not be investing in SLI/NVLink (meaning more than one card) anymore, unless something I play constantly really demands it.


----------



## 0x4452 (Sep 27, 2018)

W1zzard, could you include 5K numbers too (perhaps instead of 1080p)?

Spending $2.5k on GPUs deserves a nice monitor too, and I suspect scaling might be good there.


----------



## hat (Sep 27, 2018)

The number of results where 2080 SLI performs worse than a single 2080, or even a 1080 Ti in some cases, is staggering... absolutely worthless unless you only play the handful of games that actually support it properly.


----------



## dalekdukesboy (Sep 27, 2018)

Renald said:


> I'm already laughing at the guy whose name I forget, with the Trump icon, who said "I already ordered two of them. You are just too poor to admit that this generation rocks".
> 
> Sure, dude: most console games run worse with SLI than with the card alone, and sometimes worse than a 1080 Ti alone.
> These cards are a waste. The 2080 Ti is only there for 4K, but that's all.



Agreed. I saw his post and didn't notice the Trump icon, or forgot whether it was positive or negative for Trump. Anyway, I didn't remember it because liking his icon or not has nothing to do with me totally disagreeing with him. I believe in the same thread I mostly crapped on the cards, and one guy (maybe him, maybe another?) said he was excited to get a couple, so I was nice and replied that I'm sure he'll like them, and congrats, but that I had doubts and was waiting on reviews... My inner being is very happy I got a cheap 1080 Ti a couple of months back. I got to use it a couple of months longer than the 2080 has been out, and I spent less than half of what the 2080 Ti is going for.


----------



## Tsukiyomi91 (Sep 27, 2018)

Dual 2080 Tis in NVLink SLI are only worthwhile at 4K or higher, and only in certain games. Newer games from fall 2018 into next year MAY scale better than current or older games, because devs are either under a time crunch or just don't bother coding their games to support multi-GPU scaling, regardless of how capable the game engine is. Still, spending ~$2,500 for two of those cards plus the NVLink bridge just to get the "best gaming experience" is a little overkill, IMO. I would just settle for a single non-FE 2080 Ti and a decent 4K monitor, and tweak the settings to get that sweet, buttery-smooth 60 fps in all games. $1,200 for the FE variant doesn't really make a difference unless one has really, really deep pockets and nothing to lose...


----------



## londiste (Sep 27, 2018)

Assimilator said:


> Like what? The much-vaunted DX12 multi-GPU, which never materialised as the review notes?


Wasn't Deus Ex: Mankind Divided supposed to have received DX12 mGPU support with a patch?
Was DX:MD tested in DX11 or DX12?


----------



## Bytales (Sep 27, 2018)

The new NVLink bridge is an assault on your wallet and an affront to decency.

Typical Nvidia!
That's why I swore never to buy an Nvidia graphics card ever again. Damn, I even have second thoughts about getting a Nintendo Switch, since it has an Nvidia chipset, but the fact that Diablo 3 will launch on the Switch has convinced me in the end.

However, this is the only exception I am ever going to make.
Nvidia needs to die already. It has become way too greedy.


----------



## W1zzard (Sep 27, 2018)

londiste said:


> Wasn't Deus Ex: Mankind Divided supposed to have received DX12 mGPU support with a patch?
> Was DX:MD tested in DX11 or DX12?


DX12 of course, and we used the latest patch


----------



## iO (Sep 27, 2018)

Nice review, but what about frame times? Does NVLink improve frame pacing, or is mGPU still the same stuttery mess as before?


----------



## slsmaster (Sep 27, 2018)

I would like to see titles like Crysis, Crysis 2, Crysis 3, The Witcher 3, and GTA 5 at 8K with 2x 2080 Ti. I am considering getting two Tis and an 8K monitor from Dell, so please, I need help deciding. PLEASE run some tests.


----------



## TheinsanegamerN (Sep 27, 2018)

I'm disappointed. I hoped NVLink would be more driver/engine-independent than SLI was, but it appears to just be SLI on steroids.

I'll just stick to a single Vega this generation. Maybe in 2020 they'll figure out multi-GPU again.


----------



## Breit (Sep 27, 2018)

Did you actually check whether both cards are being used (e.g., in the Win10 Task Manager or GPU-Z) when SLI is enabled?
I noticed some irregularities with my dual 980 Tis in SLI in some games when using DX12 and the latest Nvidia driver: only one card was used for rendering (with lower clocks due to SLI). With DX11 in the same game, both cards are used for rendering. Maybe this explains the drop in performance in some games when SLI is used.
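One quick way to sanity-check this is to poll per-GPU utilization while the game runs, e.g. with `nvidia-smi --query-gpu=index,utilization.gpu --format=csv,noheader,nounits`. The sketch below parses output of that shape; the sample data and the 10% threshold are made up for illustration.

```python
# Sketch: flag GPUs that are sitting idle, given CSV in the shape of
#   nvidia-smi --query-gpu=index,utilization.gpu --format=csv,noheader,nounits
# The sample output below is invented for illustration.

def idle_gpus(csv_text, threshold=10):
    """Return indices of GPUs whose utilization is below `threshold` percent."""
    idle = []
    for line in csv_text.strip().splitlines():
        index, util = (field.strip() for field in line.split(","))
        if int(util) < threshold:
            idle.append(int(index))
    return idle

sample = """\
0, 98
1, 2
"""
print(idle_gpus(sample))  # -> [1]: the second card is barely doing anything
```

If the second card reports near-zero utilization in a "working" SLI profile, you are seeing exactly the DX12 fallback behavior described above.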


----------



## John Naylor (Sep 27, 2018)

It's not that SLI technology has slipped; it's that graphics card performance grows by leaps and bounds every generation while everything else has been stagnant. With CPUs, generation-to-generation increases are in the single digits, while GPU generations bring 30-50% increases and no one blinks. We are not seeing the scaling we used to see with SLI because a) performance is bottlenecked by other componentry, and b) with no competition from AMD, Nvidia is competing with itself, sacrificing profits when two lesser cards are used in SLI instead of the pricey flagship models.


----------



## Razrback16 (Sep 27, 2018)

You know, I ran one GPU for a really long time. When I finally got out of college and started making some money, I was able to move to Crossfire with AMD cards, and that was a great experience. Then I was able to start moving into the high end, first with 780 Ti SLI, then Titan X Maxwell SLI, and now 1080 Ti SLI. I personally LOVE the technology and have had excellent experiences with each setup.

I've been gaming in 4K for about 3-4 years, either via a true 4K monitor (currently a Philips 40'' 4K that I've had for a couple of years) or a 1080p monitor using DSR to upscale. It was this move to UHD resolutions that really pushed me into the SLI arena with the high-end cards, as it was the only way to get a smooth 60 fps in most games at that resolution with high IQ settings. For the most part, the games I play support SLI, but I'm noticing more and more that developers simply don't want to support it because of the small percentage of folks who use it. The fault is two-fold: on developers like Ubisoft, who don't want to implement SLI because of the development time, yet spend tons of money on multiple forms of DRM for their games (and the games still get cracked, lol); and on Nvidia, for the reason noted in the review. Back in the 900-series days, even the 960 had SLI. Nvidia keeps removing SLI support from the mid-range cards and making it a luxury item, so fewer and fewer users have it, making developers less and less willing to implement it. Nvidia needs to fix their side, and the developers need to make more effort on behalf of their customers as well. With NVLink now, where the VRAM can stack, that would have helped push SLI adoption for mid-range users big time, if only Nvidia had planned for it.

It's really a great technology, from my many years of experience using it. I would love to see it come back on mid-range cards so more people could make use of it at affordable prices. After all, I started Crossfire on a pair of 4870 cards back when I didn't have a lot of money and had a fantastic experience, and that's what got me into using it more. I'd love to see other people be able to enjoy it as well.


----------



## phill (Sep 28, 2018)

I do wish SLI were a load better than it is, but if the developers don't put support for it in the games, then we have no hope. I do ask myself sometimes: why do I have an expensive PC with only one GPU instead of going with a console (I know there's a big performance gap, but still)? When we have all these fancy liquid-cooled systems with three or four cards on one side and another two or three cards on the other, for the cost of the build, is it really worth it?

I used to run a lot of SLI and Crossfire and had no real bad experiences with either. I liked having them both: it meant faster frame rates, and for me more tweaking, which I always enjoy...

I do agree with @Razrback16 that if Nvidia takes it away from the lower-priced cards, that will limit who has it even more. We all know that if a game doesn't support it, then you're only as fast as a single card, but when it works, on Nvidia or AMD cards, it's an utter joy.


----------



## Razrback16 (Sep 28, 2018)

Excellent points phill - 

Nvidia, especially with their pricing on Turing, is making it really hard on PC gamers. It's like they're trying to chase us away or something. They make the latest line so unaffordable that sales are undoubtedly going to drop (we just don't know to what degree, but we already know their stock dropped about 2% after the Turing reviews came out), then they further limit SLI to only the big dawg models of Turing, and then, just to put the cherry on top, they make the new SLI bridge something like $80, for a device that probably costs $5 to make, like W1zzard said in the review.

I hope AMD is ready to take advantage of this, because Nvidia is really kind of shooting themselves in the foot big time with this release. IMO, anyway.


----------



## phill (Sep 29, 2018)

We are in a crazy world at the moment, and this Nvidia monopoly isn't good for anyone. AMD, we need you back to show Nvidia how to make affordable gaming cards; how you do with this next series of cards will make or break you, I think... I hope you're seeing what Nvidia is doing, and do something completely opposite...

Make us proud again, AMD... Please don't be stupid...


----------



## Vayra86 (Oct 1, 2018)

Assimilator said:


> Like what? The much-vaunted DX12 multi-GPU, which never materialised as the review notes?





Vya Domus said:


> Of course it never will; we have SLI. Now with NVLink, so it sounds fancier when you drop $2,500.



Don't you realize it doesn't matter what name you give it? Devs have to do most of the work regardless, helped by bags of money plus Nvidia engineers if need be. It's always a collaborative effort, and that alone makes it destined to fail. SLI is a niche, and the moment they killed SLI in the midrange was the moment it effectively ended.

Nobody optimizes for a niche, and this review shows what that looks like.


----------



## HTC (Oct 2, 2018)

I'm really intrigued here: how can SLIed cards perform better across all tested games than across *only games that scale*?

Something's off, unless I'm seriously misreading the performance summary.


----------



## W1zzard (Oct 3, 2018)

HTC said:


> I'm really intrigued here: how can SLIed cards perform better across all tested games than across *only games that scale*?
> 
> Something's off, unless I'm seriously misreading the performance summary.


Not sure how you are misreading them. Higher percentage value = higher performance.

Put down your thoughts; I'd be happy to elaborate.


----------



## HTC (Oct 3, 2018)

W1zzard said:


> Not sure how you are misreading them. Higher percentage value = higher performance
> 
> Put down your thoughts, I'd be happy to elaborate



It seems I misinterpreted the chart: my bad.

A question, if I may: does that 215% in the 4K results for the SLIed 2080 Ti mean it actually scales past 100%?


----------



## W1zzard (Oct 3, 2018)

HTC said:


> It seems I misinterpreted the chart: my bad.
> 
> A question, if I may: does that 215% in the 4K results for the SLIed 2080 Ti mean it actually scales past 100%?


215% relative to a single GTX 1080 Ti (which is the 100% score)
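To make the baseline point concrete: since the chart percentages are all relative to a single GTX 1080 Ti at 100%, the actual SLI scaling factor is the ratio of two chart entries, not the SLI entry over 100. A tiny sketch; only the 215% comes from the chart, the 130% single-card figure is a hypothetical number purely for illustration:

```python
# Summary charts use a single GTX 1080 Ti = 100% as baseline.
# SLI scaling is therefore SLI% / single-card%, not SLI% / 100.

def sli_scaling(sli_rel, single_rel):
    """Both arguments are chart percentages relative to the same baseline."""
    return sli_rel / single_rel

# 215% for 2080 Ti SLI at 4K (from the chart) vs. a hypothetical 130%
# for a single 2080 Ti:
print(round(sli_scaling(215, 130), 2))  # -> 1.65, i.e. ~65% gain over one card
```

So a 215% chart entry says nothing about per-card scaling by itself; it has to be divided by the single-card entry on the same chart.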


----------



## HTC (Oct 3, 2018)

W1zzard said:


> 215% relative to a single GTX 1080 Ti (which is the 100% score)



I thought the baseline was the *2*080 Ti: my bad, again.


----------



## phill (Oct 3, 2018)

If only SLI or Crossfire/X scaled well, or developers made the most of why people have PCs, we'd have such a different setup for GPUs. That said, with Nvidia only putting SLI on their top-end cards and charging a small fortune for those, AMD has the ability and the chance to really make a difference for the rest of the people who don't wish to spend $1,600 to $2,400 or so (£1,500+) on GPUs... As said before, please, AMD, make us proud...


----------



## W1zzard (Oct 3, 2018)

phill said:


> If only SLI or Crossfire/X scaled well, or developers made the most of why people have PCs, we'd have such a different setup for GPUs. That said, with Nvidia only putting SLI on their top-end cards and charging a small fortune for those, AMD has the ability and the chance to really make a difference for the rest of the people


At Computex I sat in a small meeting with four other journalists and AMD's head honchos. David Wang made it clear that he does not believe multi-chip modules (similar to Ryzen) are viable for GPUs, specifically because of the difficulty of getting multi-GPU to work.


----------



## phill (Oct 3, 2018)

It's a real shame, considering it used to be something that was worked on, and in some cases it worked very well with the scaling. You'd have thought that, given the cost of GPUs now, this might have been considered more, not less... Otherwise, why don't we just grab a console and be done with it?


----------



## HTC (Oct 3, 2018)

W1zzard said:


> At Computex I sat in a small meeting with 4 other journalists and AMD head honchos. David Wang made it clear that *he does not believe that multi-chip modules (similar to Ryzen) are viable for GPUs.* *Specifically because of the difficulties of getting multi-GPU to work.*



IMO, the only way they'll manage to make it work is if, *and that's a big if*, however many small chips there are, they're seen as one big chip by the OS and drivers: only then will they pull it off.

If they somehow do manage it, Nvidia is screwed, because their whole design revolves around a very big monolithic chip (I'm talking high-end stuff here, not midrange and lower). Essentially, Nvidia would be in the position Intel is in now, minus the 10nm woes (again, referring to the higher-end stuff only).


----------



## Aquinus (Oct 3, 2018)

> This explains why you don't see flexible NVLink bridges printed on polyethylene substrates as those would need to be single-layer, running the entire width of the NVLink connector and resembling an ugly ribbon cable, such as IDE, hampering airflow for the Founders Edition axial-flow cooler.


Ummm, what? I seriously doubt it would obstruct airflow, and at least a ribbon is flexible, so your placement of the cards wouldn't need to revolve around an inflexible NVLink adapter. I think the reality is that the adapter would be wider as a ribbon because of the size of the connector. Honestly, I think there is absolutely no good reason to make it rigid other than to have a [bad] reason to charge 80 dollars for it. Also, it's certainly not worth it, considering SLI scaling still seems to be garbage.

Side note: I remember Crossfire scaling better than this when I used CFX with two 6870s. Maybe this should be contrasted with how AMD's cards scale, or did they ditch CFX support?


----------



## John Naylor (Oct 3, 2018)

phill said:


> I do wish SLI were a load better than it is, but if the developers don't put support for it in the games, then we have no hope. I do ask myself sometimes: why do I have an expensive PC with only one GPU instead of going with a console (I know there's a big performance gap, but still)? When we have all these fancy liquid-cooled systems with three or four cards on one side and another two or three cards on the other, for the cost of the build, is it really worth it?



I think there's more going on here than we realize. 

- When I got my 780s, it was just after the Ti surfaced and prices went in the toilet. In addition to the price drop, I got $200 worth of game coupons. So I bought one and my son bought me one (a Christmas gift that I paid for)... and when all was said and done, it was cheaper and a lot faster than a single 780 Ti.
- Later, when my son did his "I just graduated kolludge and got a job" build, we did the reverse. He bought a 970 and I bought one, along with a crapload of game coupons. On average, using TPU's test suite, the SLI option was 40% faster. Some games didn't support it or scaled poorly, which a lot of critics like to banter about, but neither of us cared, as we were getting well above 60 fps in those games. What we cared about was that 42 fps on the 980 became over 60 fps on the twin 970s. That made the game playable... going from 78 fps on an unsupported game to 87 fps on a 980 was a "who cares".

In essence, with no competition from AMD, Nvidia is basically competing with itself... and it's far more profitable to sell one x80 than two x60s or x70s. I also remember looking and saying, "Wait a minute? Why does Nvidia have a thermal limit on the x70 that is 5°C lower than the x80's? If they should vary at all, shouldn't it be the other way around?" Then again, why is it that average scaling at 1080p is a measly 18% or so, at 1440p it's up to 34%, and at 2160p it's over 50% (some games did 95-100%)? By keeping scaling reasonable at 4K, with x80 performance at 4K under 60 fps, they had good reason to keep 4K SLI a viable option. And finally, you can no longer buy twin x70s for the price of an x80:

Twin 560 Tis were cheaper and faster than the 580.
Twin 650 Tis were much cheaper than the 680 and performed about the same.
Twin 780s and 970s we already covered.

It seemed to me that Nvidia was intentionally nerfing the x70's performance to create a wider gap between the 70 and 80 models, so as to make purchasing an 80 more attractive. This only became possible due to the lack of competition from AMD in this price/performance niche. They had little fear of losing sales to AMD... so why compete with themselves when they could just make the x70, or x70 SLI, less attractive?


----------



## lilkwarrior (Oct 8, 2018)

Vya Domus said:


> Atrocious scaling, as expected. SLI is an ancient technology that should have kicked the bucket long ago and made way for better alternatives.





TheinsanegamerN said:


> I'm disappointed. I hoped NVLink would be more driver/engine-independent than SLI was, but it appears to just be SLI on steroids.
> 
> I'll just stick to a single Vega this generation. Maybe in 2020 they'll figure out multi-GPU again.


NVLink maximizes communication between the two cards, while DX12 or Vulkan handles the multi-GPU part as far as gaming goes. That's why they didn't bring Quadro's NVLink memory pooling to the GeForce cards: DX12 and Vulkan are supposed to be doing that.

Now that DX12 and Vulkan have ray tracing (their first killer feature, besides Vulkan being cross-platform) and Windows 8 has reached mainstream end-of-life, Nvidia and everyone involved are hoping modern PC gamers will make more use of DX12 or Vulkan going forward, including their multi-GPU capabilities.



Aquinus said:


> Ummm, what? I seriously doubt it would obstruct airflow and at least a ribbon is flexible and it's not like your placement of the cards needs to revolve around the inflexible NVLink adapter. I think the reality is that the adapter would be wider if it were a ribbon because of the size of the connector. I honestly think there is absolutely no good reason to make it solid other than to have a [bad] reason charge 80 dollars for it. Also, it's certainly not worth it considering SLI scaling seems to still be garbage.
> 
> Side note, I remember Crossfire scaling better than this when I used CFX with two 6870s. Maybe this should be contrasted with how AMD's cards are scaling or did they ditch CFX support?


NVLink, which has been available in a far more capable form on Quadro-class cards for years, has never had a ribbon form, and that would not make sense given what the technology is: there would not be room for the hardware needed for the far more efficient inter-GPU communication it provides.



Vayra86 said:


> Don't you realize it doesn't matter what name you give it, devs have to do most of the work regardless, helped by bags of money + Nvidia engineers if need be. Its always a collaborative effort and that alone makes it destined to fail. SLI is a niche and the moment they killed SLI in the midrange was the moment it literally just ended.
> 
> Nobody optimizes for niche, and this review shows what that looks like.


Nvidia and AMD expect developers to leverage DX12's and Vulkan's explicit multi-GPU modes, not their obsolete, proprietary methods.

Now that Nvidia has done their part (providing NVLink, consumer GPUs with dedicated deep-learning hardware for anti-aliasing and RT cores, and influencing the standardization of ray tracing in Vulkan and DX12), and Microsoft has done theirs (WindowsML, DXR, and Windows 8 reaching end-of-life), devs should now do their part, as all major engines have versions that support ray tracing and explicit multi-GPU mode out of the box.


----------



## R-T-B (Oct 12, 2018)

lilkwarrior said:


> Devs should now do their part, as all major engines have versions that support ray tracing and explicit multi-GPU mode out of the box.



Forgive my skepticism...  but as a part-time developer myself, good luck there...


----------



## Spyle (Oct 22, 2018)

You acknowledge that the "2080 Ti is able to saturate PCI-Express gen 3.0 x8", then proceed to test SLI in an x8/x8 configuration?


----------



## Aquinus (Oct 23, 2018)

Spyle said:


> You acknowledge that "2080 Ti is able to saturate PCI-Express gen 3.0 x8 ", then proceed to test SLI in an x8/x8 configuration?


The 8600K doesn't have enough PCIe lanes for two cards running at x16: the CPU only has 16 lanes available, so two x8 slots are literally all you can use on an Intel mainstream desktop platform. The reality is that most people have a setup like this; even a guy I know with a 2080 Ti has this kind of setup. I suspect W1zz isn't going to change his entire testing setup to something unrealistic for most consumers, as most consumers (even those buying a 2080 Ti) aren't likely to have an HEDT build.

So, while you're right that each 2080 Ti could use 16 lanes, the reality is that most builds don't have that many PCIe lanes, and reviews really should capture realistic real-world usage. Realistic doesn't mean a $5,000 USD build, even more so when those HEDT CPUs don't really do these GPUs justice with their lower clocks and high core counts.

With that said, PCIe bandwidth doesn't even begin to explain why performance with multiple GPUs is so terrible with the 2080 Ti.
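For context, the raw numbers behind the x8-vs-x16 debate are easy to put down: PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding, so x8 gives roughly 7.9 GB/s per direction versus roughly 15.8 GB/s for x16. A back-of-the-envelope sketch (protocol overhead such as packet headers is ignored):

```python
# Back-of-the-envelope PCIe 3.0 bandwidth, to put x8 vs. x16 in numbers.
# PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding.

def pcie3_bandwidth_gbps(lanes):
    """Usable bandwidth per direction in GB/s (ignores protocol overhead)."""
    per_lane = 8 * (128 / 130) / 8   # 8 GT/s, encoding efficiency, bits -> bytes
    return lanes * per_lane

print(round(pcie3_bandwidth_gbps(8), 2))   # x8  -> ~7.88 GB/s
print(round(pcie3_bandwidth_gbps(16), 2))  # x16 -> ~15.75 GB/s
```

Whether that ~8 GB/s difference actually matters in games is exactly what the thread is arguing about; raw link capacity alone doesn't settle it.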


----------



## Spyle (Oct 23, 2018)

Aquinus said:


> With that said, PCIe bandwidth doesn't even begin to explain why performance with multiple GPUs is so terrible with the 2080 Ti.



But it may not be that terrible; it could be the PCIe bandwidth holding it back. I recently went from an X99 platform to Z390. While my new CPU is far better and I gained significant performance in BF1, other games that were GPU-bottlenecked took around a 10% hit because I was now running x8/x8. And this was with a Titan X Pascal, so I imagine that with faster cards the lack of PCIe lanes becomes even more apparent.


----------



## claydough (Dec 24, 2018)

The article acknowledges use cases like 3D Vision and multi-monitor Surround ("wider is better", as the WSGF says).

As a fan of both, I look to these SLI benchmarks to incorporate as much of that as possible into their findings (you know, for those of us who might benefit via stereo rendering across three QHD monitors!), where NVLink's bandwidth sounds potentially revolutionary.

In which case, why do these reviews never acknowledge that there are actually TWO FLAVORS of SLI: 3D Vision and accelerated Surround? I can't think of a single title I play that doesn't benefit! Scaling, bad or not, often makes the difference in making 3D Vision in Surround "possible".

For a fan of 3D Vision in NV Surround, scaling isn't nearly as important as "finally playable". There are die-hard communities relying on homebrewed solutions to support games where the devs leave off, like Hayden at the WSGF, or Helix for 3D Vision support.

We ain't going anywhere! So how about a little representation (love) in your reviews? Include Surround SLI in these benchmark reviews where SLI is covered; then you could even tout a comprehensive review!


----------



## cars10 (Jan 12, 2019)

An SLI benchmark bottlenecked in 8x/8x. You guys should be ashamed.



Aquinus said:


> as most consumers (even those buying a 2080 Ti,) aren't likely going to have a HEDT build.



PRECISELY because of BOGUS reviews like this one from TechPowerUp. If people knew more about the x8/x8 PCIe lane bottleneck, they would likely use an HEDT platform, and people who have money for SLI and the 2080 Ti are already swimming in cash, obviously. It is only out of ignorance, and the "clock speed > core count" conventional wisdom, that they go for the mainstream platform.

The LEAST they could do is add a big fat red amendment to the review: "NOTE: This setup is bottlenecked!" Probably best to change the charts too, so they say "only games that scale, but bottlenecked".

This sheds a bad light on SLI and makes me absolutely livid. If the review doesn't get amended, I am sharing this with every tech-related subreddit I can think of.


----------



## Tatty_One (Jan 12, 2019)

cars10 said:


> An SLI benchmark bottlenecked in 8x/8x. You guys should be ashamed.


That's the NVlink protocol, nothing to do with the PCI-E bus.


----------



## cars10 (Jan 12, 2019)

Tatty_One said:


> That's the NVlink protocol, nothing to do with the PCI-E bus.



What are you talking about? Scaling has everything to do with the PCIe bus. The big question was whether NVLink would alleviate it when running x8/x8, and there is evidence to confirm that it does not (example: check out the GamersNexus reviews).


----------



## Tatty_One (Jan 12, 2019)

cars10 said:


> What are you talking about ? Scaling has everything to do with the PCI-E Bus. The big question was if NVLink would alleviate it when in 8x/8x, which there is evidence to confirm that it does not. (Example: check out the GamersNexus reviews)


Yes, it has everything to do with the PCIe bus, but of course it will also be limited by the NVLink protocol. I agree scaling is poor, which is clear in the review, but the link has improved things; just not enough.


----------



## cars10 (Jan 12, 2019)

Tatty_One said:


> but the link has improved things



Hard to say, because we can't test NVLink on Pascal cards, nor HB-SLI on Turing cards. It may just be future-proofing, or a way to let people use NVLink's advantages in compute applications like sharing RAM.

Either way, for gaming, this review is severely skewed, and readers should be advised of that, especially since it was already known that the 2080 Ti saturates a 3.0 x8 link by itself.


----------



## Tatty_One (Jan 12, 2019)

Well, to be fair, unless the enthusiast has money to burn, I doubt there will be many who even consider spending all that cash for the return they will get on that second card.


----------



## cars10 (Jan 12, 2019)

Tatty_One said:


> Well to be fair, unless the enthusiast has money to burn I doubt there will be many even considering spending all that cash for the return they will get on that 2nd card.



I don't think that is fair. That is the whole point of review sites, to show you the full picture so you can decide what to spend on.
By that logic, a TITAN RTX or TITAN RTX SLI review would be pointless, because the enthusiast doesn't have that kind of money to spend.
And to add more fallacy to it: in this review they are already spending THOUSANDS just for the two 2080 TI's! Are you kidding me? As if the additional cost to the HEDT platform would then tip the iceberg.

Your excuse is poor at best. Instead of debating with me, you should be hitting up the editors to inform them of the misinformation they are spreading.
Otherwise, I must take it upon myself to inform the public and discredit your otherwise great site and reputation.


----------



## Vayra86 (Jan 12, 2019)

cars10 said:


> I don't think that is fair. That is the whole point of review sites: to show you the full picture so you can decide what to spend on.
> By your logic, a TITAN RTX or TITAN RTX SLI review would be pointless, because the enthusiast doesn't have that kind of money to spend.
> And to add more fallacy to that logic: they are already spending THOUSANDS just for the two 2080 Tis! Are you kidding me? As if the additional cost of the HEDT platform would then break the bank.
> 
> ...



Can't say I agree. SLI on this card is a monumental waste of money and, more importantly, it is a niche within a niche, and you're adding HEDT on top of that. Where does the number of configs to test end? If the market share for that use case is more than 0.05%, I would be surprised. Totally not interesting to spend time on.

As for PCIe bus scaling, TPU does revisit that every so often. Perhaps that is the place for such a test. But in regular reviews? Meh


----------



## cars10 (Jan 12, 2019)

Vayra86 said:


> Can't say I agree. SLI on this card is a monumental waste of money and, more importantly, it is a niche within a niche. If the market share for that use case is more than 0.05%, I would be surprised. Totally not interesting to spend time on.
> 
> As for PCIe bus scaling, TPU does revisit that every so often. Perhaps that is the place for such a test. But in regular reviews? Meh



That is beside the point. It doesn't matter what you agree with, because the fact of the matter is this IS a review ABOUT the "niche within a niche".
Market share or whatever, they already SPENT the time on it. It is DONE. And it is _inaccurate._
Debating that is pointless.

As for it being a waste of money, that is entirely subjective and up to the user. Some say gaming by itself is a waste of money!
If you want to max out the best monitors out there, like the ASUS PG27UQ that does 4K @ 144 Hz, or the new ones from CES 2019 (>4K, 175 Hz), you absolutely *NEED* 2080 Ti SLI, as you won't get anywhere near that with just one (in games that support SLI, of course).
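Rough pixel-rate arithmetic for that monitor class (spec-sheet assumptions on my part, not figures from the review):

```python
# Raw pixel data rate for 4K at 144 Hz with 10-bit RGB (30 bits per pixel),
# counting active pixels only and ignoring blanking overhead.
w, h, hz, bpp = 3840, 2160, 144, 30

gbit_s = w * h * hz * bpp / 1e9  # ~35.8 Gbit/s of raw pixel data
print(f"4K @ 144 Hz, 10-bit RGB: {gbit_s:.1f} Gbit/s")

# DisplayPort 1.4 (HBR3) payload tops out around 25.92 Gbit/s, which is
# why the PG27UQ falls back to 4:2:2 chroma subsampling at 144 Hz.
DP14_PAYLOAD_GBIT = 25.92
print("Fits DP 1.4 without subsampling:", gbit_s <= DP14_PAYLOAD_GBIT)
```

That pixel rate is the regime these monitors live in, and no single card today comes close to feeding it at max settings.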

If my argumentation is too vicious for you and your members, Tatty_One, I would appreciate it if you could pass the info up the chain so I can rest my campaign.


----------



## Tatty_One (Jan 12, 2019)

cars10 said:


> That is beside the point. It doesn't matter what you agree with, because the fact of the matter is this IS a review ABOUT the "niche within a niche".
> Market share or whatever, they already SPENT the time on it. It is DONE. And it is _inaccurate._
> Debating that is pointless.
> 
> ...



This is kindergarten fluffy bunny stuff compared with some here, so don't worry. As for "passing it up the chain", it's a 4-month-old review, so your "campaign" is clearly playing catch-up.


----------



## cars10 (Jan 13, 2019)

Tatty_One said:


> This is kindergarten fluffy bunny stuff compared with some here, so don't worry. As for "passing it up the chain", it's a 4-month-old review, so your "campaign" is clearly playing catch-up.


Haha, ok. I thought I was shadowbanned, since nothing I posted was showing up anymore.

The 4-month part is my fault. I didn't bother to double-check the test setup and assumed everything was in order.


----------



## Saint-Gamer (Jan 30, 2019)

Could the combination of 2-way NVLink RTX 2080 with an i7-6900K on a Rampage V Edition 10 mobo produce any sort of bottleneck while gaming or rendering? They will be paired with 16 GB of 3000 MHz RAM.


----------



## cars10 (Jan 30, 2019)

Saint-Gamer said:


> Could the combination of 2-way NVLink RTX 2080 with an i7-6900K on a Rampage V Edition 10 mobo produce any sort of bottleneck while gaming or rendering? They will be paired with 16 GB of 3000 MHz RAM.



Of course. For gaming, clock speed is still king. Ideally you would want the 9900K, but with 32 PCIe lanes. Since you can't get both (which is completely ridiculous), you'd have to overclock an i7-9800X, for example.


----------



## Saint-Gamer (Jan 30, 2019)

cars10 said:


> Of course. For gaming, clock speed is still king. Ideally you would want the 9900K, but with 32 PCIe lanes. Since you can't get both (which is completely ridiculous), you'd have to overclock an i7-9800X, for example.


Well... I have overclocked my i7-6900K to 4 GHz; should it work fine with 2-way RTX 2080? However, someone told me you only experience a bottleneck at 1080p, and that at 1440p or 4K I would never see any bottleneck.


----------



## stuartiannaylor (Jan 31, 2019)

It's sort of strange that, after a review about NVLink and what it is, so many have commented on PCI-E limitations.

I am more optimistic about future NVLink compatibility and scaling than about the RTX part of the technology in these cards.

If you have a game that is compatible with NVLink, then scaling is good; if not, then DX12 will flood PCIe 3.0.
People need to wise up about forced bottlenecks in multi-GPU, which could otherwise allow a scaling upgrade equivalent to a two-generation jump.
We should be asking why this is not better and why developers are not implementing it, not this BS that they don't have the time.

Bad scaling is enforced so that you have to spend $ on a new GPU.


----------

