# NVIDIA GeForce RTX 2080 Ti PCI-Express Scaling



## W1zzard (Sep 24, 2018)

It takes a lot of bandwidth to support the fastest graphics card, especially one that can play anything at 4K 60 Hz, with an eye on 120 Hz. The GeForce RTX 2080 Ti could be the most bandwidth-heavy non-storage PCIe device ever built. PCI-Express gen 3.0 is facing its design limits.
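For rough context on the bandwidth numbers behind the thread's x8/x16 debate, here is a minimal sketch (illustrative figures of the commonly cited theoretical rates, not numbers from the review) of per-direction PCIe link bandwidth:

```python
# Commonly cited theoretical per-lane bandwidth in GB/s, per direction,
# after encoding overhead (8b/10b for gen 1/2, 128b/130b for gen 3/4).
# Real-world throughput is lower due to protocol overhead.
PER_LANE_GBPS = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969}

def pcie_bandwidth(gen: int, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a given link."""
    return PER_LANE_GBPS[gen] * lanes

for gen, lanes in [(2, 16), (3, 8), (3, 16), (4, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{pcie_bandwidth(gen, lanes):.1f} GB/s")
```

Note that PCIe 3.0 x8 and PCIe 2.0 x16 land at roughly the same figure (~8 GB/s), which is why scaling tests can treat them as near-equivalent configurations.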



----------



## TheOne (Sep 24, 2018)

With only the 2080 and 2080 Ti supporting NVLink/SLI, it seems NVIDIA has officially killed SLI for consumers and is focusing on prosumers and professionals.


----------



## Joss (Sep 24, 2018)

Thanks for the test(s), very clarifying.


----------



## Deleted member 157035 (Sep 24, 2018)

It'd be great if you could provide 0.1% and 1% lows as well. Average frame-rate doesn't tell the whole story.


----------



## Prima.Vera (Sep 25, 2018)

Any chance of an SLI/NVLink review anytime soon for the new RTX 2080-series cards?
Thanks


----------



## king of swag187 (Sep 25, 2018)

Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX
*with RTX GPUs*


----------



## Darktalon (Sep 25, 2018)

Did you test with G-Sync enabled? We have seen results in the past showing that G-Sync takes more bandwidth and can make the gap between x8 and x16 even greater.


----------



## notb (Sep 25, 2018)

TheOne said:


> With only the 2080 and 2080 Ti supporting NVLink/SLI, it seems NVIDIA has officially killed SLI for consumers and is focusing on prosumers and professionals.


Why wouldn't they?
For the price of two cheaper cards in SLI you can get a more expensive model from the lineup that will be just as fast, if not faster.
Including SLI makes card design harder and more expensive.
Hence, it only makes sense in the top-of-the-range models, when a 2080 or 2080 Ti is still not enough for a client's needs. They're adding two performance "steps" on top of this generation.


----------



## king of swag187 (Sep 25, 2018)

SLI, when it first came out, was a wondrous thing for a consumer. You could buy a low-end or midrange GPU (let's say a GTX 460, for instance), and when prices fell enough that you could afford it, you'd snag another, heavily discounted GPU for a killer price and get near-flagship performance.
Unfortunately, since this doesn't make Nvidia money, they killed it (as did developers, who dropped it completely).


----------



## notb (Sep 25, 2018)

king of swag187 said:


> SLI, when it first came out, was a wondrous thing for a consumer. You could buy a low-end or midrange GPU (let's say a GTX 460, for instance), and when prices fell enough that you could afford it, you'd snag another, heavily discounted GPU for a killer price and get near-flagship performance.
> Unfortunately, since this doesn't make Nvidia money, they killed it (as did developers, who dropped it completely).


When SLI reached peak popularity, most people (and almost 100% of gamers) had large ATX, custom-built boxes. Hence, there was great potential for selling them another card later on.

It's very different today.
First of all: many people game on laptops. I don't know how many... half? Maybe more.
We generally moved to smaller desktops as well, usually imposing a one-GPU limit.

Nvidia is simply moving with the times. And clearly this strategy has worked for them.


----------



## TheOne (Sep 25, 2018)

notb said:


> Why wouldn't they?
> For the price of two cheaper cards in SLI you can get a more expensive model from the lineup that will be just as fast, if not faster.
> Including SLI makes card design harder and more expensive.
> Hence, it only makes sense in the top-of-the-range models, when a 2080 or 2080 Ti is still not enough for a client's needs. They're adding two performance "steps" on top of this generation.



I was just pointing out that SLI isn't really targeted at gamers anymore. But with Pascal prices fluctuating, I could buy another 1070 right now to boost my performance rather than spend nearly double on the upcoming 2070. I'm not going to, but it is as swag187 says.


----------



## notb (Sep 25, 2018)

TheOne said:


> I was just pointing out that SLI isn't really targeted at gamers anymore. But with Pascal prices fluctuating, I could buy another 1070 right now to boost my performance rather than spend nearly double on the upcoming 2070. I'm not going to, but it is as swag187 says.


Does this strategy make sense to you in the long term? Over at least three cycles?

Also, the "I could" here is rather weak. Do you find your 1070 really lacking? Because 1070 SLI is like what... ~1080? Not a big jump, IMO.


----------



## jigar2speed (Sep 25, 2018)

Excellent review. Now only CPU scaling is left, so people can know the minimum core count that's still an effective solution for the RTX series.


----------



## TheOne (Sep 25, 2018)

notb said:


> Does this strategy make sense to you in the long term? Over at least three cycles?
> 
> Also, the "I could" here is rather weak. Do you find your 1070 really lacking? Because 1070 SLI is like what... ~1080? Not a big jump, IMO.



Based on the reviews for the 2080 and 2080 Ti, it looks like the 2070 will be on par with a 1080. So if 1070 SLI is on par with a 1080, it could be $200-300 cheaper to use SLI and get similar performance, with the only incentive for a 2070 maybe being DLSS in a few titles.

Also, I was just giving you an example of how it could still be relevant to gamers; I'm not endorsing SLI.


----------



## SetsunaFZero (Sep 25, 2018)

TheOne said:


> With only the 2080 and 2080 Ti supporting NV-Link/SLI it seems NVIDIA has officially killed SLI for consumers and are focusing on prosumers and professionals.


SLI has been dead since Maxwell. NV stopped optimizing drivers and profiles after Maxwell.


----------



## Tomorrow (Sep 25, 2018)

king of swag187 said:


> Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX


You mean at all, or for RTX specifically?

I'm sure there are plenty of people still running 32 nm Intel parts, if not for lack of money then for the soldered heatspreader or the small IPC improvements in newer parts. Thankfully a viable upgrade is coming (9xxx-series K chips with soldered heatspreaders).

I agree that very few people would shell out $800+ to get Turing while running 2011-era platforms. Though there are some people running Pascal on those. I myself run a GTX 1080 on a P67 platform with a 2500K @ 4.7 GHz. I did manage to pop in an i7-3770K once, and there was a performance improvement. Though it's hard to say exactly how much my PCIe 2.0 x16 is holding back the GTX 1080. Probably very little. Mostly it's the CPU itself that's bottlenecking.

FYI, yes, I am upgrading next year if not earlier. Though I will skip Turing and wait for the RTX 3080 Ti, or whatever it ends up being called, on 7 nm. That would be a worthy upgrade, considering it would offer approximately a 100% performance increase over the GTX 1080. I paid 700€ for the 1080 when it came out over two years ago, so assuming the next flagship ships at the same price (and Nvidia does not increase prices... again), then 1400€ would be justified considering the resulting performance increase.

As it stands, Turing is mostly for developers and early adopters.


----------



## notb (Sep 25, 2018)

TheOne said:


> Based on the reviews for the 2080 and 2080 Ti, it looks like the 2070 will be on par with a 1080. So if 1070 SLI is on par with a 1080, it could be $200-300 cheaper to use SLI and get similar performance, with the only incentive for a 2070 maybe being DLSS in a few titles.


How come?
A 1070 costs around $400 now.
The 2070 is expected to cost $500-$600 depending on the variant (just like the 1080 at the moment).

Sure, you can spend less money this time getting another 1070 instead of a 1080/2070. But you've already spent ~$400 on your first 1070, right?
So you're spending more on your PC in the end.
In fact, the 1080 MSRP was $500-600 as well. You could have just borrowed the missing cash. Even with interest you would be well below $800.

That's exactly why I asked about the long-term cost. I just don't see how SLI could be cheaper than simply buying the more expensive cards.


----------



## TheOne (Sep 25, 2018)

notb said:


> How come?
> A 1070 costs around $400 now.
> The 2070 is expected to cost $500-$600 depending on the variant (just like the 1080 at the moment).
> 
> ...



Newegg has a 1070 for $360 (after MIR); I'm pretty sure I saw one a couple of weeks ago for less than that. The 2070 will probably cost $550 or more.


----------



## laszlo (Sep 25, 2018)

It's not clear to me whether the bandwidth is used at 100%.

I only see FPS, which I don't know how to translate into bandwidth usage...


----------



## trog100 (Sep 25, 2018)

Running the basic Fire Strike, my pair of Palit 1070 cards (both set to 80% max power for mining) still shows "better than 99% of all cards tested"...

1070 SLI isn't simply "a 1080": it can be just a 1070, or it can be better than a 1080 Ti. It all depends on what you run and how SLI scales.

I would like to see more of how games scale in SLI in gaming reviews... in fact, it should be part of the review...

trog


----------



## Ferrum Master (Sep 25, 2018)

king of swag187 said:


> Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX



We have plenty of Nehalem users here...

Also, many SB-E boards can enable PCIe 3.0 via a tool or a registry tweak.


----------



## rtwjunkie (Sep 25, 2018)

king of swag187 said:


> Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX


Why not? There are still a lot of SB-E owners. Nvidia even has a small patch to enable PCIe 3.0 on many of those X79 systems. There are also people with mainstream SB.


----------



## Enterprise24 (Sep 25, 2018)

Thanks for the review, *W1zzard*.

If anyone is interested, this is a good 28-minute video about PCIe lanes with various graphics cards in SLI.


----------



## Gasaraki (Sep 25, 2018)

king of swag187 said:


> Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX



Huh? I'm running on a 6-core, 12-thread first-gen i7.


----------



## TheinsanegamerN (Sep 25, 2018)

Count me in as eagerly awaiting NVLink benches. I'm curious whether it will work better/more seamlessly than SLI.

SLI's biggest problem was having to be supported individually in every program. Something like NVLink, if it did not require per-program support, could see far wider adoption.


notb said:


> When SLI reached peak popularity, most people (and almost 100% of gamers) had large ATX, custom-built boxes. Hence, there was great potential for selling them another card later on.
> 
> It's very different today.
> First of all: many people game on laptops. I don't know how many... half? Maybe more.
> ...


ATX motherboards still sell incredibly well. While SFX builds and laptops are more popular now, ATX desktops still dominate the gaming world, and SLI adoption was never very high to begin with, even back in the late 2000s.

The size of the case didn't kill SLI; the lack of user interest did. As GPUs became powerful enough to push 1080p ultra, dual-GPU rigs became even rarer than they already were, and with such high support costs, both in dollars and man-hours, it's no surprise the idea eventually fizzled out.



king of swag187 said:


> Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX


There are still a LOT of Sandy Bridge and FX machines out there. While the high-end gaming machines may have left those platforms years ago for PCIe 3.0 and USB 3, there are plenty of midrange machines still on older arches. Hell, I just went from Ivy Bridge to Ryzen, not for CPU performance, but because my motherboard was on the fritz. I still know people on Core 2 Quads.

CPU performance isn't a driving upgrade factor anymore, and a ton of these older platforms still exist. While their owners will not be buying 2080s, they WILL probably be interested in 3060s next gen, and those may also saturate PCIe 2.0 x16.


----------



## diatribe (Sep 25, 2018)

SetsunaFZero said:


> SLI has been dead since Maxwell. NV stopped optimizing drivers and profiles after Maxwell.



Not true at all.

As of nVidia's latest driver, 411.63, SLI optimizations are still being released. The following games are specified in the release notes:

HOB
Lake Ridden
NieR:Automata
Northgard
Pure Farming 2018
Raid: World War II
Star Wars: Battlefront II (2017)
TT Isle of Man


----------



## stance_changer (Sep 25, 2018)

Can someone explain how physically blocking PCIe connectors works? How can the system know that lanes are blocked? I assume all the connectors are identical and each pin has the same bandwidth as the rest, but other than that, this is a mystery to me.
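On the mechanism behind this question: during link initialization, the two ends of a PCIe link perform link training and settle on the widest standard width (x1/x2/x4/x8/x16) that both sides can actually drive; lanes that are taped off simply never train, so the link comes up narrower. A toy sketch of that width negotiation (heavily simplified; real training also negotiates speed, polarity, and lane reversal):

```python
def negotiated_width(card_max_lanes: int, slot_usable_lanes: int) -> int:
    """Return the link width a PCIe device would train to, given how many
    lanes the card supports and how many are physically usable in the slot.
    Links only come up at the standard widths x1, x2, x4, x8, x16."""
    usable = min(card_max_lanes, slot_usable_lanes)
    for width in (16, 8, 4, 2, 1):
        if width <= usable:
            return width
    raise RuntimeError("no usable lanes: link fails to train")

# A x16 card with 8 lanes taped off trains at x8:
print(negotiated_width(16, 8))   # 8
# Leaving only 3 lanes exposed still yields just a x2 link:
print(negotiated_width(16, 3))   # 2
```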


----------



## Famorak (Sep 25, 2018)

So let me get this straight: if you have a new Turing card and an M.2 drive, you take a performance hit? So better to wait for PCIe 4.0?


----------



## king of swag187 (Sep 25, 2018)

notb said:


> When SLI reached peak popularity, most people (and almost 100% of gamers) had large ATX, custom-built boxes. Hence, there was great potential for selling them another card later on.
> 
> It's very different today.
> First of all: many people game on laptops. I don't know how many... half? Maybe more.
> ...


I game on a laptop with dual 1080s and an 8700K, lol; it's smaller than my desktop by a lot.

As for all the people who quoted my 32 nm post: I meant with RTX-series cards, not any other cards. They're still great for gaming, lol.


----------



## ERazer (Sep 25, 2018)

Famorak said:


> So let me get this straight: if you have a new Turing card and an M.2 drive, you take a performance hit? So better to wait for PCIe 4.0?


Depends on the mobo, how its PCIe lanes are set up, and how many PCIe lanes your CPU has.
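As a sketch of why the answer is board-dependent: a mainstream desktop CPU of that era exposes 16 general-purpose PCIe lanes, and on some boards a CPU-attached M.2 slot bifurcates the GPU slot from x16 to x8, while a chipset-attached M.2 slot leaves it alone. A hypothetical lane-budget check (the lane counts are illustrative, not any specific board's manual):

```python
# Illustrative lane budget for a mainstream desktop platform.
CPU_GPU_LANES = 16   # CPU lanes normally dedicated to the x16 GPU slot

def gpu_link_width(m2_shares_cpu_lanes: bool) -> int:
    """Width the GPU slot runs at, depending on where the M.2 slot hangs."""
    if m2_shares_cpu_lanes:
        # Board bifurcates the slot: the GPU drops to x8 and the M.2
        # takes x4 (remaining lanes depend on the board's routing).
        return CPU_GPU_LANES // 2
    # M.2 on chipset lanes: the GPU keeps its full x16 link.
    return CPU_GPU_LANES

print(gpu_link_width(False))  # 16
print(gpu_link_width(True))   # 8
```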


----------



## Jism (Sep 25, 2018)

king of swag187 said:


> Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX



Still an FX, Bulldozer or Vishera for that matter, running here. It's doing great, actually.


----------



## king of swag187 (Sep 25, 2018)

Jism said:


> Still an FX, Bulldozer or Vishera for that matter, running here. It's doing great, actually.


No doubt about it, but I meant to say "with an RTX card"


----------



## Jism (Sep 25, 2018)

That card is way overpriced, to be honest. The 2080 doesn't make sense either, being $100-200 more expensive and not doing that much better than a 1080 Ti.

It's just that there is no competition, and Nvidia is the new Apple on the market with their exclusive iPhone RTX editions.


----------



## king of swag187 (Sep 25, 2018)

I picked up a Windforce OC 1080 Ti for $475 shipped, and a 7700K system w/ 16 GB RAM for $450, so $925 total. So far it's been good.


----------



## nemesis.ie (Sep 26, 2018)

Famorak said:


> So let me get this straight, if you have a new turing card and a M2 drive you take a performance hit?  so better to wait for PCIe 4.0?



Threadripper would be fine in this case - with 3 x M.2.


----------



## Salty_sandwich (Oct 7, 2018)

I can't afford to keep up with the pricing of new high-end or even midrange graphics cards these days, plus the cost of building a whole new computer to run a new graphics card. It got to the point where I just gave up on PC gaming and sold all my computer gear. I ended up buying an Xbox One X and game quite happily on that, although after about a year I decided to build a low-end PC out of the cheapest second-hand parts I could find locally.

I just can't justify the cost just to play games on a PC anymore; probably an age-related as well as wallet-related issue :/ ...


----------



## John Naylor (Jan 23, 2019)

The main thing that I think killed SLI was that it had nVidia competing with itself. The 780 overclocked was faster than the 290X overclocked, so the Ti version was going against twin 770s and twin 780s... it was an easy choice. Then came the 9xx series, and nVidia took another tier: two 970s were 40% faster than a single 980 for the same cost. AMD had nothing to throw against it, so nVidia was losing top-tier card sales to twin mid-level cards. Yeah, a few games didn't benefit, but the ones you wanted to play did. And when it didn't scale you were maybe 10% slower than the 980, but with two you were as much as 70% faster. From a competition standpoint, improved scaling in SLI only hurts nVidia... I still find it curious that few questioned why 1080p scaled on average 18%, 1440p 33%, and 2160p 56%... the lower the resolution, the more the scale was tipped against SLI... but for those who wanted 60+ FPS at 2160p, they gave those customers what they wanted... they were still happy to sell you two 1080 Tis... just not two 1070s, so 1080p/1440p scaling was nerfed.


----------



## jaggerwild (Jan 23, 2019)

king of swag187 said:


> I picked up a Windforce OC 1080 Ti for $475 shipped, and a 7700K system w/ 16 GB RAM for $450, so $925 total. So far it's been good.


Now we can LOLZ at you windforce haa!!!


----------



## king of swag187 (Jan 23, 2019)

jaggerwild said:


> Now we can LOLZ at you windforce haa!!!


Can't believe that was months ago; that system is *longggggggggg* gone by now. I have no idea how, but I found a free 8700K at my job, and so am currently rocking that with a Z390 Gaming X and my old gal, the Fury X. All these years and she's still going strong.


----------



## Hardcore Games (Mar 15, 2019)

My R5 2400G provides 8 PCIe 3.0 lanes for a discrete card. I use a GTX 1060 and I have not had performance issues with it.

Next-gen processors may be faster, but that would be overkill for my present video card.


----------



## tpapas (Dec 28, 2019)

Great review!

Still rocking a 3770K with 32 GB DDR3 and an RTX 2080 on 3x1080p monitors for ultrawide.
Still no reason to go to Ryzen. SLI-ing two 2080s still makes more sense than upgrading to a new mobo/DDR4/CPU, even with a performance hit. Of course, even in 2021 it's still not worth it currently.

The Optane 900P + 2080 config lowers the Optane to 1.6 GB/s instead of the max 3.5 GB/s, and the 2080 loses 3-5 FPS. So what...


----------



## 529th (Dec 30, 2019)

Great review. I was limited not too long ago with my X58 rig and my GTX 1070.


----------



## AteXIleR (Mar 17, 2020)

*It is (was) not facing its design limits yet.*

Even after the coming 3000 series, there will still be room for a fourth generation to show itself at full power on the PCI-Express 3.0 x16 interface.
Perhaps by the fifth generation we may see real disadvantages, but two more generations working properly in the following years is highly likely.


----------



## Mykel67 (Sep 9, 2020)

Still rocking a P67 i7-2700 with PCIe 2.0 x16 and a GTX 1070, on a 1440p setup.

Monitoring the system while gaming shows the GPU bottlenecking (100% usage) in titles such as Witcher 3 or, lately, New World (alpha).

I was thinking of upgrading later this year, but I still can't figure out if I'll get enough FPS for the price of a whole new rig (MB, CPU, RAM)...

Looking at this test, I think it could work. What do you guys think?


----------



## Tomorrow (Sep 9, 2020)

Mykel67 said:


> Still rocking a P67 i7-2700 with PCIe 2.0 x16 and a GTX 1070, on a 1440p setup.
> 
> Monitoring the system while gaming shows the GPU bottlenecking (100% usage) in titles such as Witcher 3 or, lately, New World (alpha).
> 
> ...


I went from a P67/2500K to an X570/3800X, but I did not change my GPU (GTX 1080). Also on 1440p (165 Hz).
I can tell you that games, especially multiplayer games, became much smoother. Minimum FPS went up by a lot, even without changing the GPU.

For roughly $400 you should get a nice upgrade buying new (B450, Ryzen 5 3600, and 2x8 GB DDR4 @ 3200 CL16).


----------



## Mykel67 (Sep 9, 2020)

Tomorrow said:


> I went from a P67/2500K to an X570/3800X, but I did not change my GPU (GTX 1080). Also on 1440p (165 Hz).
> I can tell you that games, especially multiplayer games, became much smoother. Minimum FPS went up by a lot, even without changing the GPU.
> 
> For roughly $400 you should get a nice upgrade buying new (B450, Ryzen 5 3600, and 2x8 GB DDR4 @ 3200 CL16).



Was the 2500K a bottleneck for the 1080? Currently the i7 is at around 50-60% usage when the 1070 is at 100%...

Did you measure the difference? 10/20%?

Thanks for your feedback.


----------



## Tomorrow (Sep 9, 2020)

Mykel67 said:


> Was the 2500K a bottleneck for the 1080? Currently the i7 is at around 50-60% usage when the 1070 is at 100%...
> 
> Did you measure the difference? 10/20%?
> 
> Thanks for your feedback.


Yes, based on my experience the 2500K was a massive bottleneck. The last few years on it were pretty laggy.
Though this has more to do with the 4c/4t nature of the CPU. Your 4c/8t i7 should hold up better.

The problem is that you can't upgrade to a faster GPU without it bottlenecking horribly. A 1070 is pretty well balanced with a 2nd-gen i7. I went a bit too far pairing a 1080 with the i5, though.

Hardware Unboxed did a GPU scaling benchmark only weeks ago. What they found was that with an RTX 2060 or faster you will notice CPU bottlenecking with older CPUs. This is at 1080p, though:


----------



## Mykel67 (Sep 10, 2020)

Tomorrow said:


> Yes, based on my experience the 2500K was a massive bottleneck. The last few years on it were pretty laggy.
> Though this has more to do with the 4c/4t nature of the CPU. Your 4c/8t i7 should hold up better.
> 
> The problem is that you can't upgrade to a faster GPU without it bottlenecking horribly. A 1070 is pretty well balanced with a 2nd-gen i7. I went a bit too far pairing a 1080 with the i5, though.
> ...



Very interesting video! Thanks for sharing. Sad that they didn't try it with 1440p systems, as that would use more GPU than CPU.

My config isn't far from the i7-2600K + GTX 1650S, with 1440p instead of 1080p, so it is clear that I currently have no bottleneck.
(Did you have PCIe x8, maybe?)

What will also be interesting to see is NVIDIA's RTX IO, which will use less CPU for some real-time rendering...

But as you said, it's pretty certain the 3070 will bring Sandy Bridge to an end. Still, I have some hope, haha.

This test also shows the 2080 Ti being bottlenecked by the CPU:

RTX 2080 Ti with i7-2600 1080p, 1440p, Ultrawide, 4K benchmarks at Ultra Quality (GPUCheck United States / USA, www.gpucheck.com): "We benchmark RTX 2080 Ti and i7-2600 at Ultra Quality settings in 48 games and in 1080p, 1440p, and 4K. With a review of specifications, price, power, temperature, and CPU bottlenecks."


----------



## Tomorrow (Sep 10, 2020)

Nope, I had PCIe x16, but my P67 only supported 2.0 speeds. I don't remember if it was Z68 or Z77 that brought 3.0.
I doubt 2.0 vs 3.0 affected my 1080 too much, though. It was mostly CPU-limited due to four cores and no HT.

I had it overclocked to 4.7 GHz, and that helped a lot. So it could have been worse. I still have the board too: an ASUS P67 Sabertooth, from back when the TUF brand meant a five-year warranty and quality components. Though over time the board has developed an issue with the BIOS chip being unable to hold any changes (I did swap out the battery). Thankfully it's socketed, so at some point I plan to order a new chip to test, since I keep the board, CPU, and RAM around for tinkering.


----------

