
An Anthem for SLI: Bioware's New Universe in 60 FPS 4K Run on Two NVIDIA GeForce GTX 1080Ti GPUs

Joined
Dec 30, 2010
Messages
2,164 (0.43/day)
I have some bad news for you guys. Multi-GPU is useless because you can't even get a consistent frame rate from a single GPU. You either get inconsistent frame rates from both cards, which leads to stuttering, or the so-called solution: capping the frame rate to the slower card's output, which changes constantly and in its own way leads to lag. In both scenarios gameplay loses the fluidity and smoothness of a single GPU. So good luck fooling yourself forever.

https://www.anandtech.com/show/6857/amd-stuttering-issues-driver-roadmap-fraps/3

Here's a good explanation of why micro-stuttering occurs. It's not really a hardware bug, but mostly a software one. To get the best out of two or more cards, the system needs to be adapted seriously. I've always wondered what effect overclocking the PCI-E bus would have on stuttering. I know this sounds stupid, but a video card isn't always about pure bandwidth; I think the speed of these lanes is a very important factor as well. Nobody ever moved past the 100 MHz barrier, yet we see noticeable improvement when going from 100 to 112 MHz, for example.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,233 (2.57/day)
PUBG did excellently with 2 RX 580's. Noticeable difference when going from 1 to 2 cards.
SLI doesn't work with this app. Not sure why Crossfire is relevant... the SLI profile turns SLI OFF. It's funny, because I have dual 1080's and one is simply collecting dust. I re-install it every once in a while to see if it works with the games I play, and it does with a couple, but not the ones where I really want it to. I mean, I am gaming @ 2560x1600 at 60 Hz, which is still too much res for many games and a single 1080. That's what makes SLI so frustrating for me... I have the second card, but it doesn't do much of anything when I need it to.

PUBG is one of the few exceptions where I actually think nothing bad can happen with multi-GPU, because the game in itself has microstutters and whatnot.
I've used SLI and CF, and the microstutter, oh my god, it's terrible.

My last SLI was with Maxwell and my last CF was 6970's.

Yay, I have 200 fps, yet it still feels worse than 80 fps...

Neither vendor seems to be able to do anything about it.

Microstutter these days is not as bad as it used to be, and isn't so jarring, but it is definitely still a problem at times. I will never forget having dual 2900XTs, having one die, and finding, when I RMA'd the card, that one GPU just FELT better than two. I started looking into it and found what is now called microstutter to be a problem, long before anyone else was complaining about it... it seems I'm very sensitive to it. That said, since I've been aware of it, I've paid close attention to how this all works out, and there has definitely been progress, but no, it is not gone completely.
 
Joined
Dec 30, 2010
Messages
2,164 (0.43/day)
AMD has a few profiles you can set, and some options for how CF behaves. There are various approaches to splitting frame rendering between two GPUs; some have advantages, some fewer. But for the majority of games there are plenty of profiles to select from. For just 60 Hz at 2560x1080 even a single RX 580 will do the job, but when you go 120~144 Hz, that's where SLI really comes into play. Two cards are for when you want the best and the maximum the hardware can do.

If I'm not mistaken, weren't the AMD Crossfire chipsets back in the day better than their Nvidia counterparts?
 
Joined
Jun 28, 2014
Messages
2,388 (0.65/day)
Location
Shenandoah Valley, Virginia USA
System Name Home Brewed
Processor i9-7900X and i7-8700K
Motherboard ASUS ROG Rampage VI Extreme & ASUS Prime Z-370 A
Cooling Corsair 280mm AIO & Thermaltake Water 3.0
Memory 64GB DDR4-3000 GSKill RipJaws-V & 32GB DDR4-3466 GEIL Potenza
Video Card(s) 2X-GTX-1080 SLI & 2 GTX-1070Ti 8GB G1 Gaming in SLI
Storage Both have 2TB HDDs for storage, 480GB SSDs for OS, and 240GB SSDs for Steam Games
Display(s) ACER 28" B286HK 4K & Samsung 32" 1080P
Case NZXT Source 540 & Rosewill Rise Chassis
Audio Device(s) onboard
Power Supply Corsair RM1000 & Corsair RM850
Mouse Generic
Keyboard Razer Blackwidow Tournament & Corsair K90
Software Win-10 Professional
Benchmark Scores yes

Deleted member 177333

Guest
Good to see - I've been running multi-GPU solutions for a number of years. I ran Crossfire cards for a while, then when I switched over to NVidia, I ran 780 Ti SLI, Titan X Maxwell SLI, and now 1080 Ti SLI, and have overall had a great experience. I simply can't game at the image quality and framerates I expect without multiple video cards; a single GPU just isn't powerful enough. It's good to see companies making an effort to help us gamers out with SLI support.

After Arkham Knight's disaster, I stopped buying graphically demanding games if they don't support SLI. Gotta vote with your wallet. :)

Thanks for the article.
 
Joined
Jun 28, 2014
Messages
2,388 (0.65/day)
Location
Shenandoah Valley, Virginia USA
System Name Home Brewed
Processor i9-7900X and i7-8700K
Motherboard ASUS ROG Rampage VI Extreme & ASUS Prime Z-370 A
Cooling Corsair 280mm AIO & Thermaltake Water 3.0
Memory 64GB DDR4-3000 GSKill RipJaws-V & 32GB DDR4-3466 GEIL Potenza
Video Card(s) 2X-GTX-1080 SLI & 2 GTX-1070Ti 8GB G1 Gaming in SLI
Storage Both have 2TB HDDs for storage, 480GB SSDs for OS, and 240GB SSDs for Steam Games
Display(s) ACER 28" B286HK 4K & Samsung 32" 1080P
Case NZXT Source 540 & Rosewill Rise Chassis
Audio Device(s) onboard
Power Supply Corsair RM1000 & Corsair RM850
Mouse Generic
Keyboard Razer Blackwidow Tournament & Corsair K90
Software Win-10 Professional
Benchmark Scores yes
I stopped buying graphically demanding games if they don't support SLI. Gotta vote with your wallet. :)

Me too. Most of mine do use SLI. (although my shooters are not that demanding)
 
Joined
Mar 14, 2014
Messages
1,342 (0.35/day)
Processor i7-4790K 4.6GHz @1.29v
Motherboard ASUS Maximus Hero VII Z97
Cooling Noctua NH-U14S
Memory G. Skill Trident X 2x8GB 2133MHz
Video Card(s) Asus Tuf RTX 3060 V1 FHR (Newegg Shuffle)
Storage OS 120GB Kingston V300, Samsung 850 Pro 512GB , 3TB Hitachi HDD, 2x5TB Toshiba X300, 500GB M.2 @ x2
Display(s) Lenovo y27g 1080p 144Hz
Case Fractal Design Define R4
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply EVGA Supernova G2 850w
Mouse Glorious Model D
Keyboard Rosewill Full Size. Red Switches. Blue Leds. RK-9100xBRE - Hate this. way to big
Software Win10
Benchmark Scores 3DMark FireStrike Score : needs updating
I have some bad news for you guys. Multi-GPU is useless because you can't even get a consistent frame rate from a single GPU. You either get inconsistent frame rates from both cards, which leads to stuttering, or the so-called solution: capping the frame rate to the slower card's output, which changes constantly and in its own way leads to lag. In both scenarios gameplay loses the fluidity and smoothness of a single GPU. So good luck fooling yourself forever.
Do you have SLI? Because there are some people here saying it's fine, over multiple generations.
There seem to be a lot of inexperienced, word-of-mouth SLI haters, but apparently I'm an idiot for wanting to try.

Or maybe low-profile cards expose the lie of multi-GPU and the bridge more clearly, because they are more prone to FPS drops, which get much worse when using a multi-GPU solution whose *supposed* main purpose is increasing FPS and therefore increasing gaming fluidity.
Mid-range SLI was eating into Nvidia's high-end sales, so they stopped that. It really is that simple.
 

Deleted member 177333

Guest
Mid-range SLI was eating into Nvidia's high-end sales, so they stopped that. It really is that simple.

Hah, yeah, when I did Crossfire many years back, that's what I did - I didn't make a lot of money back then, so I had to go budget as much as possible. I'd usually run a couple of $150-200 cards together and net some really good performance with them. By the time I switched to NVidia, they were starting to remove SLI on the middle-tier cards.
 
Joined
Mar 14, 2014
Messages
1,342 (0.35/day)
Processor i7-4790K 4.6GHz @1.29v
Motherboard ASUS Maximus Hero VII Z97
Cooling Noctua NH-U14S
Memory G. Skill Trident X 2x8GB 2133MHz
Video Card(s) Asus Tuf RTX 3060 V1 FHR (Newegg Shuffle)
Storage OS 120GB Kingston V300, Samsung 850 Pro 512GB , 3TB Hitachi HDD, 2x5TB Toshiba X300, 500GB M.2 @ x2
Display(s) Lenovo y27g 1080p 144Hz
Case Fractal Design Define R4
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply EVGA Supernova G2 850w
Mouse Glorious Model D
Keyboard Rosewill Full Size. Red Switches. Blue Leds. RK-9100xBRE - Hate this. way to big
Software Win10
Benchmark Scores 3DMark FireStrike Score : needs updating
4K, 45-ish average on the Far Cry 5 benchmark with a 1080 Ti FTW3 and a 6600K at 4.5 GHz, on Ultra - perfectly playable, and SLI isn't worth it.

https://babeltechreviews.com/gtx-1080-ti-sli-performance-25-games/3/

44 min
51 avg
60 max
V-Sync enabled.
1080 Ti SLI is useless until 4K 144 Hz drops - and it won't drive that anyway.
They have a newer SLI review with the 1070 Ti. They say the HB bridge reduces stutter.
I have G-Sync and I still absolutely hate going below 50 fps. There's no tearing, but it's not nearly as smooth as 80+ fps.
 
Joined
Apr 26, 2008
Messages
1,131 (0.19/day)
Location
london
System Name Staggered
Processor Intel i5 6600k (XSPC Rasa)
Motherboard Gigabyte Z170 Gaming K3
Cooling RX360 (3*Scythe GT1850) + RX240 (2*Scythe GT1850) + Laing D5 Vario (with EK X-Top V2)
Memory 2*8gb Team Group Dark @3000Mhz 16-16-16-36 1.25v
Video Card(s) Inno3D GTX 1070 HerculeZ
Storage 256gb Samsung 830 + 2*1tB Samsung F3 + 2*2tB Samsung F4EG
Display(s) Flatron W3000H 2560*1600
Case Cooler Master ATCS 840 + 1*120 GT1850 (exhaust) + 1*230 Spectre Pro + Lamptron FC2 (fan controller)
Power Supply Enermax Revolution 85+ 1250W
Software Windows 10 Pro 64bit
I thought that with DX12 it would pool any GPUs together, and the memory as well - something along the lines of any GPUs working together to render games, with no need for SLI or Crossfire... Was I wrong? Can't remember.
How Vulkan and DX12 work in general:
  • Start the program and grab a list of all available physical devices (IGP, discrete GPU etc.)
  • Query the physical devices to find one that supports the features you need, then create a logical device out of it.
  • Attach a command pool (the thing you've heard of in articles that makes DX12 multithreaded) to the logical device.
  • Record work into command buffers allocated from that pool, then submit them to the device's queue to get the physical device to do work. Rinse and repeat.
The way multi-GPU works is that you just create more than one logical device, each with its own command pool. Then you spread the work between them (rough sketch below).
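To make that flow concrete, here is a minimal, untested C++ sketch of those steps against the Vulkan C API: enumerate the physical devices, create a logical device per GPU, and give each its own command pool. It hard-codes queue family 0 and skips feature queries, error handling and cleanup, so treat it as an outline rather than real renderer code.

#include <vulkan/vulkan.h>
#include <vector>

int main() {
    // Create the instance (application info and validation layers omitted).
    VkInstanceCreateInfo instInfo{};
    instInfo.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    VkInstance instance;
    vkCreateInstance(&instInfo, nullptr, &instance);

    // Grab the list of all available physical devices (IGP, discrete GPU, etc.).
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    // One logical device and one command pool per physical device.
    for (VkPhysicalDevice gpu : gpus) {
        float priority = 1.0f;
        VkDeviceQueueCreateInfo queueInfo{};
        queueInfo.sType = VK_STRUCTURE_TYPE_DEVICE_QUEUE_CREATE_INFO;
        queueInfo.queueFamilyIndex = 0;   // a real app queries the queue families first
        queueInfo.queueCount = 1;
        queueInfo.pQueuePriorities = &priority;

        VkDeviceCreateInfo devInfo{};
        devInfo.sType = VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO;
        devInfo.queueCreateInfoCount = 1;
        devInfo.pQueueCreateInfos = &queueInfo;
        VkDevice device;
        vkCreateDevice(gpu, &devInfo, nullptr, &device);

        // Work recorded into command buffers from this pool and submitted to
        // this device's queue runs on this GPU only.
        VkCommandPoolCreateInfo poolInfo{};
        poolInfo.sType = VK_STRUCTURE_TYPE_COMMAND_POOL_CREATE_INFO;
        poolInfo.queueFamilyIndex = 0;
        VkCommandPool pool;
        vkCreateCommandPool(device, &poolInfo, nullptr, &pool);
    }
    return 0;
}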

The reason why devs don't really use it is because this is all new and they're still trying to get to grips with it.
Different GPUs have different features available and perform differently, so spreading work evenly between them usually ends up slower. You have to load-balance the work you send to each GPU, which takes a bunch of testing to get just right.
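As a purely illustrative C++ sketch of that load-balancing idea (hypothetical numbers, not any engine's actual code): split a frame into slices sized by each GPU's recent frame time, so the faster card gets the bigger slice and both finish at roughly the same moment.

#include <cstdio>

int main() {
    const int screenHeight = 2160;        // split a 4K frame into two horizontal slices
    double frameTimeMs[2] = {9.0, 12.0};  // hypothetical last measured frame time per GPU

    // Give each GPU work proportional to its speed (1 / frame time),
    // so both slices should finish at about the same time.
    double speed0 = 1.0 / frameTimeMs[0];
    double speed1 = 1.0 / frameTimeMs[1];
    int rows0 = static_cast<int>(screenHeight * speed0 / (speed0 + speed1));
    int rows1 = screenHeight - rows0;

    std::printf("GPU0: rows 0-%d (%d rows), GPU1: rows %d-%d (%d rows)\n",
                rows0 - 1, rows0, rows0, screenHeight - 1, rows1);
    return 0;
}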
Then there's the problem of cross-GPU communication. Sometimes work depends on the results of previous work (e.g. post-processing) which could be on a different GPU, requiring communication between GPUs, which adds latency. That latency could end up making things slower than doing everything on a single GPU, because GPUs are extremely fast and any time spent outside them (e.g. copying 3D model data from RAM to VRAM) is wasted time. Ideally the program would require zero cross-GPU communication, but that would take dev resources to redesign their renderer code.

Over time, as devs learn more and tools get better, DX12/Vulkan (which are very similar) will end up replacing DX11/OpenGL* and we'll live in a magical world of cross-vendor multi-GPU.
Of course the naysayers say that will never happen, but they said the same about DX9, DX10, DX11, dual cores, quad cores and now 6+ cores. So they can safely be ignored.

*Khronos said Vulkan isn't supposed to replace OpenGL, but that's what will happen. Graphics programming is commonly referred to as black magic or a dark art by other programmers because, while it may be easy to do a basic "hello world", it's extremely time consuming to get good at it. The reason is that pre-DX12, the API was a black box (a bloated finite state machine that does mysterious things) which was interpreted by the driver, another black box (also bloated, because it's interpreting a bloated thing), which in turn relays commands to the GPU, another (you guessed it) black box.
The way you learn a black box is by throwing a bunch of inputs at it and seeing what it outputs. This is time consuming enough for a single black box, but when you have 3 layers of black boxes, each interchangeable, you can imagine how this becomes a dark art.
With DX12/Vulkan, instead of a bloated FSM they give you a model of a GPU (which accurately maps to how modern GPUs work) and functions to directly interact with different parts of that model, removing its black-box nature.
Since the model maps almost one-to-one to the physical GPU, drivers can be much leaner (which is why AMD is so good now; they aren't hampered by their bad drivers anymore). The driver ends up being just a glorified conversion layer which you can ignore because it's not doing anything special, removing its black-box nature.
And finally, since the model maps so closely to the physical GPU and the driver is ignorable, the GPU also stops being a black box.
Writing a basic "hello world" is harder in DX12/Vulkan because you can essentially interact with any part of the GPU in any order. However, once you've written that "hello world" program, you will have learnt most of the API. I've messed around with Vulkan and I'd compare it to C: it doesn't hold your hand, but it does make everything very transparent, so you can easily tell what your code is doing at the hardware level. It even feels the same, with all the memory management.
 
Joined
Dec 28, 2012
Messages
3,651 (0.86/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
Plz elaborate. Games are still releasing with SLI support coming out with SLI support a year later

FTFY. SLI support at launch is practically non-existent, more than 50% scaling within the first year is rare, a lot of games simply never get SLI support, and SLI is still completely MIA in DX12 games.

SLI is a far cry from the golden days of 2010, when every game was at least somewhat supported and proper SLI support was the norm by the three-month mark.
 
Joined
Mar 14, 2014
Messages
1,342 (0.35/day)
Processor i7-4790K 4.6GHz @1.29v
Motherboard ASUS Maximus Hero VII Z97
Cooling Noctua NH-U14S
Memory G. Skill Trident X 2x8GB 2133MHz
Video Card(s) Asus Tuf RTX 3060 V1 FHR (Newegg Shuffle)
Storage OS 120GB Kingston V300, Samsung 850 Pro 512GB , 3TB Hitachi HDD, 2x5TB Toshiba X300, 500GB M.2 @ x2
Display(s) Lenovo y27g 1080p 144Hz
Case Fractal Design Define R4
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply EVGA Supernova G2 850w
Mouse Glorious Model D
Keyboard Rosewill Full Size. Red Switches. Blue Leds. RK-9100xBRE - Hate this. way to big
Software Win10
Benchmark Scores 3DMark FireStrike Score : needs updating
FTFY. SLI support at launch is practically non-existent, more than 50% scaling within the first year is rare, a lot of games simply never get SLI support, and SLI is still completely MIA in DX12 games.

SLI is a far cry from the golden days of 2010, when every game was at least somewhat supported and proper SLI support was the norm by the three-month mark.
Lol, "FTFY"... did you even look at the 3 games I mentioned?
 
Joined
Jun 18, 2015
Messages
575 (0.17/day)
Do you have SLI? Because there are some people here saying it's fine, over multiple generations.
There seem to be a lot of inexperienced, word-of-mouth SLI haters, but apparently I'm an idiot for wanting to try.

Yes, I tried both, and tried to use them with 120 Hz monitors, but the result in the best cases was an easily noticeable loss of smoothness compared to a single GPU. I used to play many games at a constant 125 fps and 250 fps (old games at low settings) on 120 Hz monitors, got used to that smoothness, and now notice and get bothered by anything less smooth, which was really obvious with GTX 680 SLI. As for the guys here who claim their SLI setups are working: it looks like it's working because they are using high-end cards, so the stuttering and lag issues are less noticeable because they are getting higher framerates, BUT they are still there. There is either stuttering (at low framerates, or with low to mid-range cards) or lag (at high framerates, when using high-end cards). The lag is noticeable and obvious, especially when playing games online. So maybe they aren't noticing it because they are casual gamers and not used to playing at high framerates.
 
Joined
Feb 18, 2017
Messages
688 (0.25/day)
I've been running SLI setups since the GTX 280 days. I've tried dual, triple, quad SLI. Dual SLI seems to be the sweet spot. With dual SLI you'll see around 60-80% performance boost as opposed to just using 1 card in many games. With triple SLI, the 3rd card usually only gives an extra 15-20% performance boost on top of dual SLI. Then a 4th card usually only gives you an extra 4% performance on top of triple SLI. So 3-way and 4-way SLI are absolutely not worth it but 2-way SLI gives you quite a huge boost in FPS.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/20.html

70% when only the normally-scaling games are included. This means you lose 30% of the money you invested in a second card. And if you take all games into consideration (it's not certain that your beloved upcoming game will get "better" two-card support), it drops to nearly 50%, meaning HALF of your money is thrown into the dustbin when buying a second card. So to the one who says it's dead: it really is dead. Awful scaling, more heat, more noise. The only acceptable scaling would be 90% for scaling games, and 70-75% including the non- (or less-) scaling games.

What games do you play? What does your monitor setup look like?


Then wait for a better GPU. SLI and CF are both awful.
 
Joined
Dec 18, 2005
Messages
8,253 (1.21/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transend sata.. 1T 970 evo nvme m 2..
Display(s) 27" Asus PG279Q ROG Swift 165Hrz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestike 29500.. timepsy 14000..
Good to see - I've been running multi-GPU solutions for a number of years. I ran Crossfire cards for a while, then when I switched over to NVidia, I ran 780 Ti SLI, Titan X Maxwell SLI, and now 1080 Ti SLI, and have overall had a great experience. I simply can't game at the image quality and framerates I expect without multiple video cards; a single GPU just isn't powerful enough. It's good to see companies making an effort to help us gamers out with SLI support.

After Arkham Knight's disaster, I stopped buying graphically demanding games if they don't support SLI. Gotta vote with your wallet. :)

Thanks for the article.


that about sums it up.. vote with your wallet.. if the game doesn't support sli don't buy it.. i am sure the sli support situation would improve dramatically if more folks did this.. :)

as for it not being cost effective.. since when has top end performance ever been cost effective.. if you want the best performance available it costs..

trog
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.16/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Vote with your wallet only works if a significant number of the people buying the product stop, but the fact is SLI, and even Crossfire, users make up a very small percentage of the people buying these games. So the game companies don't even notice the lost sales.

This is the first generation that I have not run an SLI setup since I first started running SLI with 7800GTs over a decade ago, and I've run SLI with every GPU generation since then. But the support for it in games (and a large part of this is due to nVidia dropping SLI support on lower-end cards) just isn't there anymore. So I opted to wait until I could afford the one big GPU and buy it, instead of buying a lower-tier GPU and then buying a second a few months later.
 
Joined
Jan 6, 2014
Messages
192 (0.05/day)
System Name #projectEVO v4
Processor AMD R9 3950X
Motherboard ASUS X570 Crosshair Impact
Cooling EK Supremacy EVO CPU, Alphacool Monsta 240, Alphacool UT60 360, Aquacomputer Aqauero 6X
Memory GSkills Trident Z Neo 32GB DDR4 CL16 @3600MHz
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 970 500GB NVME M.2, Seagate 4TB 7200rpm HDD (x2)
Display(s) Samsung G9 Odyssey 49" Ultrawide @240hz | ASUS MG279Q 1440P @144hz | HTC VIVE
Case CaseLabs Mercury S5
Audio Device(s) Marantz PM7000N | Schitt Stack UBER | Beyerdynamic DT1990 | Sennheiser HD6XX | HiFi Man HE4XX
Power Supply EVGA 1600 Watt T2
Mouse HyperX Haste, Logi GroX Superlight, Logi GPro Wireless, Razer Viper Ultimate
Keyboard Drop ALT , HHKB Pro 2, RealForce Topre 104UB, Topre Realforce RGB
Software Windows 10 PRO
Yes, I tried both, and tried to use them with 120 Hz monitors, but the result in the best cases was an easily noticeable loss of smoothness compared to a single GPU. I used to play many games at a constant 125 fps and 250 fps (old games at low settings) on 120 Hz monitors, got used to that smoothness, and now notice and get bothered by anything less smooth, which was really obvious with GTX 680 SLI. As for the guys here who claim their SLI setups are working: it looks like it's working because they are using high-end cards, so the stuttering and lag issues are less noticeable because they are getting higher framerates, BUT they are still there. There is either stuttering (at low framerates, or with low to mid-range cards) or lag (at high framerates, when using high-end cards). The lag is noticeable and obvious, especially when playing games online. So maybe they aren't noticing it because they are casual gamers and not used to playing at high framerates.

Curious what resolution and monitors you're using at 120 Hz. When was the last time you used an SLI setup? (GTX 680?) We've had major advances in minimizing latency and stuttering since then.

SLI and Crossfire user here with triple 1440p @ 144 Hz (7680x1440 @ 144 Hz) monitors and 4K @ 60 Hz monitors. Although SLI and Crossfire were more prevalent generations ago, especially with low and mid-tier GPUs, a setup like this is really only recommended for users at the bleeding edge pushing hardware and graphics to the limit today. SLI, Crossfire, and multi-GPU aren't for everyone. If anything, there are major diminishing returns for users playing below 1440p @ 144 Hz with current titles; you just won't load the GPUs enough to warrant that second video card. Hence the complaints of people with negative multi-GPU experiences. (Game selection is another factor.)

I personally recommend going the SLI, Crossfire, multi-GPU route only if you're playing at 4K+ resolutions, high refresh rates, or multi-monitor setups, with two of the highest-tier GPUs. That gives you the best chance of good performance across the games on the market regardless of support. I would never recommend mid or low-tier SLI, Crossfire, or multi-GPU setups, since you should be purchasing the single best GPU you can afford if you're playing below 1440p @ 144 Hz (including 1080p @ 240 Hz).

My point is to focus on building a balanced system, matching your video card to the monitor setup, for the best results. Does it hurt to pick up a GTX 1080 Ti for a 1080p monitor? Not at all, but don't expect increased performance at that resolution and refresh rate from adding another one.

Then wait for a better GPU. SLI and CF are both awful.

Even if you have the highest-tier GPU available? Yeah, OK... Try using a single GPU to run a setup with triple 1440p @ 144 Hz (7680x1440 @ 144 Hz)... (That's a long wait.) I'm really curious what monitor, resolution, refresh rate, and game selection you're playing at to say it's awful... Would love to know your current experience. I'll wait...
 
Last edited:
Joined
Mar 14, 2014
Messages
1,342 (0.35/day)
Processor i7-4790K 4.6GHz @1.29v
Motherboard ASUS Maximus Hero VII Z97
Cooling Noctua NH-U14S
Memory G. Skill Trident X 2x8GB 2133MHz
Video Card(s) Asus Tuf RTX 3060 V1 FHR (Newegg Shuffle)
Storage OS 120GB Kingston V300, Samsung 850 Pro 512GB , 3TB Hitachi HDD, 2x5TB Toshiba X300, 500GB M.2 @ x2
Display(s) Lenovo y27g 1080p 144Hz
Case Fractal Design Define R4
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply EVGA Supernova G2 850w
Mouse Glorious Model D
Keyboard Rosewill Full Size. Red Switches. Blue Leds. RK-9100xBRE - Hate this. way to big
Software Win10
Benchmark Scores 3DMark FireStrike Score : needs updating
Yes, I tried both, and tried to use them with 120 Hz monitors, but the result in the best cases was an easily noticeable loss of smoothness compared to a single GPU. I used to play many games at a constant 125 fps and 250 fps (old games at low settings) on 120 Hz monitors, got used to that smoothness, and now notice and get bothered by anything less smooth, which was really obvious with GTX 680 SLI. As for the guys here who claim their SLI setups are working: it looks like it's working because they are using high-end cards, so the stuttering and lag issues are less noticeable because they are getting higher framerates, BUT they are still there. There is either stuttering (at low framerates, or with low to mid-range cards) or lag (at high framerates, when using high-end cards). The lag is noticeable and obvious, especially when playing games online. So maybe they aren't noticing it because they are casual gamers and not used to playing at high framerates.
Since your SLI days they have released two new SLI bridges to combat the issues...
Most people talking smack about SLI here are talking about the old SLI hardware; I haven't seen anyone talking badly about their HB SLI bridge setup, just the stuff of yesteryear.

@B-Real wow, those 1080 reviews are more than 2 years old, lmao, we need new cards so bad.

So what are we all going to do when they keep releasing SLI games in 6 months? Keep claiming it's dead? Keep asking for people to prove to them that SLI games are still coming out because they don't have the patience to look it up themselves? Looking at you @cadaveca...
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,233 (2.57/day)
Since your SLI days they have released two new SLI bridges to combat the issues...
Most people talking smack about SLI here are talking about the old SLI hardware; I haven't seen anyone talking badly about their HB SLI bridge setup, just the stuff of yesteryear.

@B-Real wow, those 1080 reviews are more than 2 years old, lmao, we need new cards so bad.

So what are we all going to do when they keep releasing SLI games in 6 months? Keep claiming it's dead? Keep asking for people to prove to them that SLI games are still coming out because they don't have the patience to look it up themselves? Looking at you @cadaveca...
I have both the SLI and the HB-SLI bridge, as well as a lit HB SLI bridge, all MSI, to go with my MSI cards on what was supposed to be an MSI motherboard. The perfect example I have of SLI NOT working is PUBG... when Crossfire works fine. That's crap support. That's why I have the complaints I do… because I've been using both SLI and Crossfire since you first could (A8N SLI board, IIRC), and today support is far worse than it used to be, even with titles that apparently have support. What's more concerning is that SLI works fine for me in benchmarks... just not games. I'd like support in Destiny 2 as well... but again, it doesn't work (just tried again today, I get graphical anomalies).

So I'm left with a question: maybe my bridge is bad? OK, try a new one. Nope, still doesn't work. Try a new board... nope. New platform (I have them all)... nope. Maybe it's a bad card? Nope... the cards work fine on their own.

So why do I have problems getting SLI to work? I'm the guy with many machines and a store's worth of hardware on my shelf, and nothing gets things working right. I use two different configs for monitors... 3x 1200p and a single 2560x1600... I got into multi-GPU because 2560x1600 required it. Then I went with multi-monitor because of Eyefinity… that was crap too... swapped to SLI and things were better... and then things went downhill from there.

So great, it works for you, but it doesn't work for me. How is that possible? We use our PCs differently. To me, for SLI to be a success (or Crossfire, for that matter), we should all have the same experience, but we don't. It's too bad.
 
Last edited:
Joined
Jan 6, 2014
Messages
192 (0.05/day)
System Name #projectEVO v4
Processor AMD R9 3950X
Motherboard ASUS X570 Crosshair Impact
Cooling EK Supremacy EVO CPU, Alphacool Monsta 240, Alphacool UT60 360, Aquacomputer Aqauero 6X
Memory GSkills Trident Z Neo 32GB DDR4 CL16 @3600MHz
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 970 500GB NVME M.2, Seagate 4TB 7200rpm HDD (x2)
Display(s) Samsung G9 Odyssey 49" Ultrawide @240hz | ASUS MG279Q 1440P @144hz | HTC VIVE
Case CaseLabs Mercury S5
Audio Device(s) Marantz PM7000N | Schitt Stack UBER | Beyerdynamic DT1990 | Sennheiser HD6XX | HiFi Man HE4XX
Power Supply EVGA 1600 Watt T2
Mouse HyperX Haste, Logi GroX Superlight, Logi GPro Wireless, Razer Viper Ultimate
Keyboard Drop ALT , HHKB Pro 2, RealForce Topre 104UB, Topre Realforce RGB
Software Windows 10 PRO
I have both the SLI and the HB-SLI bridge, as well as a lit HB SLI bridge, all MSI, to go with my MSI cards on what was supposed to be an MSI motherboard. The perfect example I have of SLI NOT working is PUBG... when Crossfire works fine. That's crap support. That's why I have the complaints I do… because I've been using both SLI and Crossfire since you first could (A8N SLI board, IIRC), and today support is far worse than it used to be, even with titles that apparently have support. What's more concerning is that SLI works fine for me in benchmarks... just not games. I'd like support in Destiny 2 as well... but again, it doesn't work (just tried again today, I get graphical anomalies).

So I'm left with a question: maybe my bridge is bad? OK, try a new one. Nope, still doesn't work. Try a new board... nope. New platform (I have them all)... nope. Maybe it's a bad card? Nope... the cards work fine on their own.

So why do I have problems getting SLI to work? I'm the guy with many machines and a store's worth of hardware on my shelf, and nothing gets things working right. I use two different configs for monitors... 3x 1200p and a single 2560x1600... I got into multi-GPU because 2560x1600 required it. Then I went with multi-monitor because of Eyefinity… that was crap too... swapped to SLI and things were better... and then things went downhill from there.

So great, it works for you, but it doesn't work for me. How is that possible? We use our PCs differently. To me, for SLI to be a success (or Crossfire, for that matter), we should all have the same experience, but we don't. It's too bad.

PUBG is a terrible example for SLI/Crossfire support. I agree it was working when V1.0 dropped for both Crossfire and SLI, but the update after that killed any existence of both for me. I have SLI GTX 1080Ti FTW3's and Crossfire VEGA 64's. Had to run that game in 4K and/or 2560x1440P since that game also has terrible support for Triple monitors. With that said a single high-tier GPU was good enough.

What issues are you experiencing with Destiny 2? Seems to work for my setup with both VEGA 64's and GTX 1080Ti's.

Are you playing at 60hz for your triples and 1600P? I'd argue your setup is probably still good for a single high tier gpu. I don't imagine you would be stressing a single GTX 1080(Ti) and VEGA 64 at that resolution and refresh of 60hz. What other games are you trying to run with multi-gpu?

For reference I also own a GTX 1070 FTW, GTX 970 FTW, VEGA 56 NANO, Crossfire FURY X's, Crossfire VAPOR-X 290X's 8GB, Crossfire HD7970's that i've tested with my setups.
 
Last edited:

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,233 (2.57/day)
PUBG is a terrible example for SLI/Crossfire support. I agree it was working when V1.0 dropped for both Crossfire and SLI, but the update after that killed any existence of both for me. I have SLI GTX 1080Ti FTW3's and Crossfire VEGA 64's. Had to run that game in 4K and/or 2560x1440P since that game also has terrible support for Triple monitors. With that said a single high-tier GPU was good enough.

What issues are you experiencing with Destiny 2? Seems to work for my setup with both VEGA 64's and GTX 1080Ti's.

Are you playing at 60hz for your triples and 1600P? I'd argue your setup is probably still good for a single high tier gpu. I don't imagine you would be stressing a single GTX 1080(Ti) and VEGA 64 at that resolution and refresh of 60hz. What other games are you trying to run with multi-gpu?

For reference I also own a GTX 1070 FTW, GTX 970 FTW, VEGA 56 NANO, Crossfire FURY X's, Crossfire VAPOR-X 290X's 8GB, Crossfire HD7970's that i've tested with my setups.
The real problem with SLI, as I came to find out (and is present in my cards), is that if your cards reach differing clocks due to Boost, it throws frame sync off in a big way, and can lead to post-processing graphical glitches, such as lighting and fog issues (half the frame is out of sync). In the past, NVidia restricted cards to the same clocks, but they do not any more, and this causes problems. I found I could mitigate some of these problems by having the slower card as primary, but if your secondary card runs slower than the primary by more than about 39 MHz, you start getting bad glitches.

This is such a basic issue, one NVidia dealt with before but which has come back; they stopped matching clocks (IMHO) because benchmarks look better when your cards run at the maximum clocks possible.
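As an aside, here is a tiny, purely illustrative C++ toy model (not NVidia's actual frame-pacing logic, and the numbers are made up) of why mismatched per-card frame times hurt in alternate-frame rendering: even though the average FPS looks fine, the gaps between presented frames start alternating between short and long, which is exactly what people describe as microstutter.

#include <algorithm>
#include <cstdio>

int main() {
    const double frameA = 10.0;    // faster card, ms per frame (illustrative)
    const double frameB = 11.5;    // slower card, e.g. settling at a lower boost clock
    double readyA = 0.0;           // GPU A renders even frames, starts at t = 0
    double readyB = frameA / 2.0;  // GPU B renders odd frames, dispatched half a frame later
    double lastPresent = 0.0;

    for (int frame = 0; frame < 8; ++frame) {
        double &ready = (frame % 2 == 0) ? readyA : readyB;
        ready += (frame % 2 == 0) ? frameA : frameB;
        double present = std::max(ready, lastPresent);  // frames can't be shown out of order
        std::printf("frame %d shown at %5.1f ms (gap %4.1f ms)\n",
                    frame, present, present - lastPresent);
        lastPresent = present;
    }
    return 0;
}

With frameA == frameB the gaps settle to an even rhythm; with the mismatch above they oscillate, even though the average frame rate barely changes.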
 
Joined
Jan 6, 2014
Messages
192 (0.05/day)
System Name #projectEVO v4
Processor AMD R9 3950X
Motherboard ASUS X570 Crosshair Impact
Cooling EK Supremacy EVO CPU, Alphacool Monsta 240, Alphacool UT60 360, Aquacomputer Aqauero 6X
Memory GSkills Trident Z Neo 32GB DDR4 CL16 @3600MHz
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 970 500GB NVME M.2, Seagate 4TB 7200rpm HDD (x2)
Display(s) Samsung G9 Odyssey 49" Ultrawide @240hz | ASUS MG279Q 1440P @144hz | HTC VIVE
Case CaseLabs Mercury S5
Audio Device(s) Marantz PM7000N | Schitt Stack UBER | Beyerdynamic DT1990 | Sennheiser HD6XX | HiFi Man HE4XX
Power Supply EVGA 1600 Watt T2
Mouse HyperX Haste, Logi GroX Superlight, Logi GPro Wireless, Razer Viper Ultimate
Keyboard Drop ALT , HHKB Pro 2, RealForce Topre 104UB, Topre Realforce RGB
Software Windows 10 PRO
The real problem with SLI, as I came to find out (and is present in my cards), is that if your cards reach differing clocks due to Boost, it throws frame sync off in a big way, and can lead to post-processing graphical glitches, such as lighting and fog issues (half the frame is out of sync). In the past, NVidia restricted cards to the same clocks, but they do not any more, and this causes problems. I found I could mitigate some of these problems by having the slower card as primary, but if your secondary card runs slower than the primary by more than about 39 MHz, you start getting bad glitches.

This is such a basic issue, one NVidia dealt with before but which has come back; they stopped matching clocks (IMHO) because benchmarks look better when your cards run at the maximum clocks possible.

I agree this really is a problem; I'm not sure whether Nvidia designed their drivers to simply use the lowest-spec GPU as the baseline when using SLI. Generally you want two identical cards for the best performance - you really should have considered that. Unlike AMD's Crossfire, which does exactly that, since you're allowed to Crossfire a lower-tier card from the same family, for example Fury + Fury X, or Vega 56 + Vega 64.

EDIT: To be clear, your cards should match, including the core and memory clocks. Same exact card and SKU.

Another thing to think about is that SLI/Crossfire currently have limitations with post-processing such as temporal AA, which can cripple performance. Some games have it on by default without any means to disable it, such as Gears of War 4. Disabling certain post-processing should increase your performance.

It's little details like this that make SLI, Crossfire, and multi-GPU more of an advanced-user feature that most people are not ready for or familiar with.
 
Last edited:
Joined
Jun 28, 2014
Messages
2,388 (0.65/day)
Location
Shenandoah Valley, Virginia USA
System Name Home Brewed
Processor i9-7900X and i7-8700K
Motherboard ASUS ROG Rampage VI Extreme & ASUS Prime Z-370 A
Cooling Corsair 280mm AIO & Thermaltake Water 3.0
Memory 64GB DDR4-3000 GSKill RipJaws-V & 32GB DDR4-3466 GEIL Potenza
Video Card(s) 2X-GTX-1080 SLI & 2 GTX-1070Ti 8GB G1 Gaming in SLI
Storage Both have 2TB HDDs for storage, 480GB SSDs for OS, and 240GB SSDs for Steam Games
Display(s) ACER 28" B286HK 4K & Samsung 32" 1080P
Case NZXT Source 540 & Rosewill Rise Chassis
Audio Device(s) onboard
Power Supply Corsair RM1000 & Corsair RM850
Mouse Generic
Keyboard Razer Blackwidow Tournament & Corsair K90
Software Win-10 Professional
Benchmark Scores yes
The real problem with SLI, as I came to find out (and is present in my cards), is that if your cards reach differing clocks due to Boost, it throws frame sync off in a big way, and can lead to post-processing graphical glitches, such as lighting and fog issues (half the frame is out of sync).

Would this be so with matching GPUs?
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,233 (2.57/day)
Would this be so with matching GPUs?
Yeah, I guess it's possible. I have one plain Gaming 1080 and one Gaming X 1080, and one card runs at 1836 MHz, the other at 1949 MHz. It took me a long time to figure out that it was the clock difference causing a fair amount of the issues I have. These are the same card, really - same PCB etc... just different clocks.
 