
NVIDIA GeForce RTX 2080 Ti PCI-Express Scaling

W1zzard

It takes a lot of bandwidth to support the fastest graphics card, especially one that can play anything at 4K 60 Hz, with an eye on 120 Hz. The GeForce RTX 2080 Ti could be the most bandwidth-heavy non-storage PCIe device ever built. PCI-Express gen 3.0 is facing its design limits.
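For reference, here is how the theoretical per-direction bandwidth works out; a quick sketch in Python (per-lane transfer rate times encoding efficiency, divided by 8 bits per byte):

# Theoretical per-direction PCIe bandwidth by generation and link width.
# PCIe 1.x/2.0 use 8b/10b encoding; PCIe 3.0 uses 128b/130b.
GENERATIONS = {
    "1.1": (2.5, 8 / 10),
    "2.0": (5.0, 8 / 10),
    "3.0": (8.0, 128 / 130),
}

for gen, (gt_per_s, efficiency) in GENERATIONS.items():
    for lanes in (8, 16):
        gb_per_s = gt_per_s * efficiency * lanes / 8
        print(f"PCIe {gen} x{lanes}: {gb_per_s:5.2f} GB/s")

Note that PCIe 3.0 x8 (~7.88 GB/s) and PCIe 2.0 x16 (8.00 GB/s) end up at nearly the same ceiling, which is why those two configurations tend to track each other closely in the charts.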

With only the 2080 and 2080 Ti supporting NVLink/SLI, it seems NVIDIA has officially killed SLI for consumers and is focusing on prosumers and professionals.
 
Any chance of an SLI/NVLink review anytime soon for the new RTX 2080-series cards?
Thanks
 
Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX with RTX GPUs.
 
Did you test with G-Sync enabled? We have seen results in the past showing that G-Sync takes more bandwidth and can make the gap between x8 and x16 even greater.
 
With only the 2080 and 2080 Ti supporting NVLink/SLI, it seems NVIDIA has officially killed SLI for consumers and is focusing on prosumers and professionals.
Why wouldn't they?
For the price of two cheaper cards in SLI you can get a more expensive model from the lineup that will be just as fast, if not faster.
Including SLI support makes card design harder and more expensive.
Hence, it only makes sense in the top-of-the-range models, when even a 2080 or 2080 Ti is not enough for a client's needs. They're adding two performance "steps" on top of this generation.
 
SLI, when it first came out, was a wondrous thing for a consumer. You could buy a low-end or midrange GPU (let's say a GTX 460, for instance) and, when prices fell enough that you could afford it, snag another heavily discounted GPU for a killer price and get near-flagship performance.
Unfortunately, since this doesn't make Nvidia money, they killed it (as did developers, who dropped support for it completely).
 
When SLI reached peak popularity, most people (and almost 100% of gamers) had large ATX, custom-built boxes. Hence, there was great potential for selling them another card later on.

It's very different today.
First of all: many people game on laptops. I don't know how many... half? Maybe more.
We generally moved to smaller desktops as well, which usually impose a one-GPU limit.

Nvidia is simply moving with the times. And clearly this strategy has worked for them. :-)
 
Why wouldn't they?
For the price of two cheaper cards in SLI you can get a more expensive model from the lineup that will be just as fast, if not faster.
Including SLI support makes card design harder and more expensive.
Hence, it only makes sense in the top-of-the-range models, when even a 2080 or 2080 Ti is not enough for a client's needs. They're adding two performance "steps" on top of this generation.

I was just pointing out that SLI isn't really targeted at gamers anymore. But with Pascal prices fluctuating, I could buy another 1070 right now to boost my performance rather than spend nearly double on the upcoming 2070. I'm not going to, but it is as swag187 says.
 
Does this strategy make sense to you in the long term? Over at least three upgrade cycles?

Also, the "I could" here is rather weak. Do you find your 1070 really lacking? Because 1070 SLI is like what... ~1080? Not a big jump, IMO.
 
Excellent review. Now only CPU scaling is left, so people can find out the lowest core count that still makes sense for the RTX series.
 
Does this strategy make sense to you in the long term? Over at least three upgrade cycles?

Also, the "I could" here is rather weak. Do you find your 1070 really lacking? Because 1070 SLI is, what... roughly a 1080? Not a big jump, IMO.

Based on the reviews of the 2080 and 2080 Ti, it looks like the 2070 will be on par with a 1080. So if 1070 SLI is on par with a 1080, it could be $200-300 cheaper to use SLI and get similar performance, with the only incentive for a 2070 perhaps being DLSS in a few titles.

Also, I was just giving you an example of how it could still be relevant to gamers; I'm not endorsing SLI.
 
Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX
You mean at all or for RTX specifically?

I'm sure there are plenty of people still running 32 nm Intel parts. If not for lack of money, then for the soldered heatspreader alone, or because of the small IPC improvements in newer parts. Thankfully a viable upgrade is coming (9xxx-series K chips with soldered heatspreaders).

I agree that very few people would shell out $800+ to get Turing while running a 2011-era platform. Though there are some people running Pascal on those. I myself run a GTX 1080 on a P67 platform with a 2500K @ 4.7 GHz. I did manage to pop in an i7-3770K once and there was a performance improvement. Though it's hard to say exactly how much my PCIe 2.0 x16 is holding back the GTX 1080. Probably very little; mostly it's the CPU itself that's bottlenecking.

FYI, yes, I am upgrading next year if not earlier. Though I will skip Turing and wait for the RTX 3080 Ti, or whatever it ends up being called, on 7 nm. That would be a worthy upgrade, considering it would offer approximately a 100% performance increase over the GTX 1080. I paid 700€ for the 1080 when it came out over 2 years ago, so assuming the next flagship ships at the same price (and Nvidia does not increase prices... again), then even 1400€ would be justified by the resulting performance increase.

As it stands, Turing is mostly for developers and early adopters.
 
Based on the reviews of the 2080 and 2080 Ti, it looks like the 2070 will be on par with a 1080. So if 1070 SLI is on par with a 1080, it could be $200-300 cheaper to use SLI and get similar performance, with the only incentive for a 2070 perhaps being DLSS in a few titles.
How come?
A 1070 costs around $400 now.
The 2070 is expected to cost $500-$600 depending on the variant (just like the 1080 at the moment).

Sure, you can spend less money this time by getting another 1070 instead of a 1080/2070. But you've already spent ~$400 on your first 1070, right?
So you're spending more on your PC in the end.
In fact, the 1080's MSRP was $500-600 as well. You could have just borrowed the missing cash; even with interest you would be well below $800.

That's exactly why I asked about the long-term cost. I just don't see how SLI could be cheaper than simply buying the more expensive card.
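To make the long-term arithmetic concrete, here is a rough sketch using the ballpark prices quoted in this thread (all figures are assumptions, not exact market prices):

# Hypothetical long-term cost comparison, using ballpark prices from this thread.
first_1070 = 400     # what the first GTX 1070 cost
second_1070 = 400    # a second 1070 at today's street price
gtx_1080 = 600       # top of the 1080's $500-600 MSRP range

sli_total = first_1070 + second_1070
print(f"Two 1070s over time: ${sli_total}")  # $800 for roughly 1080-class performance
print(f"One 1080 up front:   ${gtx_1080}")   # no SLI profiles or scaling issues

Even with the second card discounted, the two-step path costs more in total than the single faster card would have.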
 

Newegg has a 1070 for $360 (after MIR); I'm pretty sure I saw one a couple of weeks ago for less than that. The 2070 will probably cost $550 or more.
 
It's not clear to me whether the bandwidth is actually used at 100%.

I only see FPS numbers, which I don't know how to translate into bandwidth usage...
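If you want to measure this yourself, NVML exposes PCIe throughput counters. Here is a minimal sketch using the pynvml bindings (assuming an NVIDIA GPU and the nvidia-ml-py package; NVML reports KB/s sampled over a short window, so treat the numbers as approximate):

import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Theoretical per-direction ceiling for PCIe 3.0 x16 (128b/130b encoding).
PCIE3_X16_GB_S = 8.0 * (128 / 130) * 16 / 8  # ~15.75 GB/s

for _ in range(10):
    rx_kb_s = pynvml.nvmlDeviceGetPcieThroughput(handle, pynvml.NVML_PCIE_UTIL_RX_BYTES)
    tx_kb_s = pynvml.nvmlDeviceGetPcieThroughput(handle, pynvml.NVML_PCIE_UTIL_TX_BYTES)
    rx_gb_s = rx_kb_s / 1e6  # KB/s -> GB/s
    print(f"RX {rx_gb_s:.2f} GB/s, TX {tx_kb_s / 1e6:.2f} GB/s "
          f"({100 * rx_gb_s / PCIE3_X16_GB_S:.1f}% of the x16 ceiling)")
    time.sleep(1)

pynvml.nvmlShutdown()

Run it while a game is rendering to see how much of the link is actually in use.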
 
Running the basic Fire Strike, my pair of Palit 1070 cards (both set to 80% max power for mining) still shows "better than 99% of all cards tested"..

1070 SLI isn't simply "a 1080". It can be no faster than a single 1070, or it can be better than a 1080 Ti; it all depends on what you run and how well SLI scales.

I would like to see more of how games scale in SLI in gaming reviews.. in fact, it should be part of the review..

trog
 
Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX

We have plenty of Nehalem users here...

Also, many SB-E boards can enable PCIe 3.0 via a tool or a registry tweak.
 
Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX
Why not? There are still a lot of SB-E owners. Nvidia even has a small patch to enable PCIe 3.0 on many of those X79 systems. There are even people on mainstream Sandy Bridge, too.
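If anyone applies that patch and wants to confirm it took effect, the negotiated link can be read back through NVML. A minimal sketch with the pynvml bindings (assuming the nvidia-ml-py package):

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Compare the currently negotiated link against the card's maximum.
cur_gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(handle)
cur_width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)
max_gen = pynvml.nvmlDeviceGetMaxPcieLinkGeneration(handle)
max_width = pynvml.nvmlDeviceGetMaxPcieLinkWidth(handle)
print(f"Link: Gen{cur_gen} x{cur_width} (card supports up to Gen{max_gen} x{max_width})")

pynvml.nvmlShutdown()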
 
Thanks for the review, W1zzard.

If anyone is interested, here is a good 28-minute video about PCIe lanes with various graphics cards in SLI.

 
Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX

Huh? I'm running a 6-core/12-thread first-gen i7.
 
Count me in as eagerly awaiting NVLink benches. I'm curious whether it will work better/more seamlessly than SLI.

SLI's biggest problem was having to be supported individually by every program. Something like NVLink, if it did not require per-program support, could see far more widespread adoption.
When SLI reached peak popularity, most people (and almost 100% of gamers) had large ATX, custom-built boxes. Hence, there was great potential for selling them another card later on.

It's very different today.
First of all: many people game on laptops. I don't know how many... half? Maybe more.
We generally moved to smaller desktops as well, which usually impose a one-GPU limit.

Nvidia is simply moving with the times. And clearly this strategy has worked for them. :)
ATX motherboards still sell incredibly well. While SFF builds and laptops are more popular now, ATX desktops still dominate the gaming world, and SLI adoption was never very high to begin with, even back in the late 2000s.

The size of the case didn't kill SLI; the lack of user interest did. As GPUs became powerful enough to push 1080p ultra, dual-GPU rigs became even rarer than they already were, and with such high support costs, both in dollars and man-hours, it's no surprise the idea eventually fizzled out.

Interesting, but I seriously doubt someone would use a platform as old as 32 nm Sandy Bridge or, heaven forbid, AMD FX
There are still a LOT of Sandy Bridge and FX machines out there. While high-end gaming machines may have left those platforms years ago for PCIe 3.0 and USB 3.0, there are plenty of midrange machines still on older architectures. Hell, I just went from Ivy Bridge to Ryzen, not for CPU performance, but because my motherboard was on the fritz. I still know people on Core 2 Quads.

CPU performance isn't a driving upgrade factor anymore, and a ton of these older platforms still exist. While those owners will not be buying 2080s, they WILL probably be interested in 3060s next generation, and those may also saturate PCIe 2.0 x16.
 