
SLI is really not dead if you have an SLI rig

Being close to 550 watts isn't that bad when you consider that the RTX 4090 is on a node roughly 4x smaller while still needing up to around 450 watts stock, and overclocked models push it right back up to 550-600 watts.
Also, like I said before, the smaller single bus is going to help. It's one 384-bit bus vs two 384-bit buses totaling 768 bits; that's a lot for a CPU to handle.

No, it is bad when it takes two cards to get half the FPS at the same power draw.
In what world is that not bad?
That 450 W is the maximum rating; in the TechPowerUp review the 4090 FE used 346 W in the gaming test.
In the video I watched, the guy running the test said the two 2080 Tis used 550 W together.
If he was being truthful, the settings were the same with each card.

The buses don't work like that; each card will have the exact same bus to move its data through.
Just the processing power is doubled most of the time. The bus is for the memory on the card, not the CPU.

(The memory bus can be thought of literally as lanes of traffic: the more lanes dedicated to traffic, the greater the flow.
The graphics processor is connected to the RAM on the card via a memory bus (384-bit).) The CPU doesn't touch the memory bus.

mlee49 posted this 10 years ago on here.
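To put rough numbers on the lanes-of-traffic analogy: peak memory bandwidth is just bus width times the effective per-pin data rate. The figures in this sketch are assumed round numbers for illustration, not measurements of any particular card:

```cpp
#include <cstdio>

int main() {
    // Assumed example values: a 384-bit bus with 21 Gbps effective GDDR6X.
    const double bus_width_bits = 384.0;
    const double data_rate_gbps = 21.0;  // effective transfer rate per pin

    // Peak bandwidth in GB/s = (bus width in bytes) * (Gbps per pin)
    const double bandwidth_gbs = (bus_width_bits / 8.0) * data_rate_gbps;
    std::printf("Peak VRAM bandwidth: %.0f GB/s\n", bandwidth_gbs);  // ~1008 GB/s

    // Two cards in SLI each have their own 384-bit bus to their own VRAM;
    // the widths don't combine into one 768-bit bus that anything has to drive.
    return 0;
}
```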

 
"Support" is a pretty broad term in this case.
you want the list of which cards support it?

RX 5600 XT
RX 5700
RX 5700 XT

RX 6400
RX 6500 XT
RX 6600
RX 6650 XT
RX 6700
RX 6700 XT
RX 6750 XT
RX 6800
RX 6800 XT
RX 6900 XT
RX 6950 XT

RX 7600
RX 7600 XT
RX 7700 XT
RX 7800 XT
RX 7900 XT
RX 7900 XTX

Or
do you want the list of games that support mGPU?
I only know a few games that support it.

1. Ashes of the Singularity
2. Chasm: The Rift (Vulkan)
3. Deus Ex: Mankind Divided
4. Rise of the Tomb Raider
5. Shadow of the Tomb Raider
6. Quake II RTX

No, it is bad when it takes two cards to get half the FPS at the same power draw.
In what world is that not bad?

The buses don't work like that; each card will have the exact same bus to move its data through.
Just the processing power is doubled most of the time. The bus is for the memory on the card, not the CPU.

(The memory bus can be thought of literally as lanes of traffic: the more lanes dedicated to traffic, the greater the flow.
The graphics processor is connected to the RAM on the card via a memory bus (384-bit).) The CPU doesn't touch the memory bus.

mlee49 posted this 10 years ago on here.


That is not true for today's cards with ReBAR. It's also not true when the VRAM buffer is too small: the GPU will still have to talk to the CPU's memory controller to use system RAM. The CPU will touch the memory bus when the GPU calls for system RAM once VRAM runs out; it will still have to swap data out.
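As a back-of-the-envelope sketch (assumed, round numbers) of why spilling out of VRAM hurts: once resources have to live in system RAM, the GPU is reading them through the PCIe link, which is a tiny fraction of its local VRAM bandwidth.

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Assumed round numbers, for illustration only.
    const double vram_gbs = 1000.0;  // local GDDR6X, roughly 1 TB/s class
    const double pcie_gbs = 32.0;    // PCIe 4.0 x16, roughly 32 GB/s per direction
    const double dram_gbs = 80.0;    // dual-channel DDR5 system RAM, rough figure

    // Anything that overflows VRAM gets read back through the slowest hop in the chain.
    const double spill_gbs = std::min(pcie_gbs, dram_gbs);
    std::printf("Local VRAM:         %6.0f GB/s\n", vram_gbs);
    std::printf("Spilled to sys RAM: %6.0f GB/s (about %.0fx slower)\n",
                spill_gbs, vram_gbs / spill_gbs);
    return 0;
}
```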
 
What happened to DX12 vendor-agnostic mGPU?
 
What happened to DX12 vendor-agnostic mGPU?
Simple answer: DLSS happened, and everyone thinks it's free performance, when it's not.
Using AI costs money, power, and time.
 
It's wishful thinking I'm afraid. At the point we're at, the 4090 is already 2x faster than the 2080 Ti; add inefficiencies into the mix and it's just not gonna work. Since NVIDIA no longer releases the compatibility bits, it's truly become a Hail Mary situation. The newest cards that can do SLI are the 3090/3090 Ti I believe, but by then it had all but been abandoned.

If you want to experience what SLI was at its prime, buy a couple of old Kepler cards, maybe a GTX 690 or even a Titan Z if you find one for an acceptable price, and run games from the 2010-2015 timeframe at most. The newer the game, the worse it'll be, I promise.
The 4090 is only faster if you use DLSS; without it 60 FPS is the norm, so in reality the actual card sucks.

SLI is dead because it was costly to code into games, and single cards were doing just fine. Also, why would anyone want to run two 250-400 W cards that'll add stuttering, latency, and issues?
Wow, SLI removes stuttering and latency issues. I remember Crysis 3 running on a single card: I got tearing, pauses, latency issues and stuttering; in SLI it was smooth as glass.

Run any game with stupid enough settings and your GPU will choke no matter how beefy it is or how many of them you have. Also the super finicky driver support needed for a game to work well with SLI would make it an unattractive option given how spoiled we are with things like VRR. I would rather have a buttery smooth, framerate capped 120FPS with freesync/gsync over a "max throughput" 240FPS that's shot to hell with tearing and microstutter any day.

Also calling the 4090 a "POS card" is a dead giveaway that you haven't actually experienced one before. Those things are godly. A 4090 would roll those SLI 2080Tis up in a joint and smoke them.
This is 4K on a 4090 running Cyberpunk with no DLSS: 30 fps. That's what you call godly? WOW, that's what I call a POS. I'm sorry to say this, but WTH?
 
What happened to DX12 vendor-agnostic mGPU?
Ashes of the Singularity was one of the first DX12 games and does support that.
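For context on what "DX12 mGPU" means on the API side: explicit multi-adapter simply exposes every adapter to the developer, and it is then entirely on the engine to split work between them (which is why so few games bothered). A minimal enumeration sketch using standard DXGI/D3D12 calls, error handling mostly omitted:

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    // Walk every adapter the OS exposes; with explicit multi-adapter the game,
    // not the driver, decides how (or whether) to use more than one of them.
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip WARP

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_12_0,
                                        IID_PPV_ARGS(&device)))) {
            // NodeCount > 1 means linked-node mGPU (e.g. an NVLink/SLI-linked pair
            // exposed as one device); unlinked cards show up as separate adapters.
            std::wprintf(L"Adapter %u: %ls, nodes: %u\n",
                         i, desc.Description, device->GetNodeCount());
        }
    }
    return 0;
}
```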
 
The 4090 is only faster if you use DLSS; without it 60 FPS is the norm, so in reality the actual card sucks.
Based on that logic, every GPU sucks.
I don't use a stupid feature that makes the picture worse.

I would like to know where you see 60 fps as the norm for a 4090.
I know for me it is, because I use V-sync and I have my display set at 4K60.
If I run unlocked at 4K120, I am constantly getting over 60 fps in games, and no, I don't mean I'm only getting 61 fps.

I like how you picked a video where an overrated feature is used that would hurt even a precious 3090 Ti SLI setup.
Go look at a TechPowerUp review of the 4090: if you don't waste your time with RT, the 4090 is the only card that gets over 60 fps in Cyberpunk.
 
The 4090 is only faster if you use DLSS; without it 60 FPS is the norm, so in reality the actual card sucks.


Wow, SLI removes stuttering and latency issues. I remember Crysis 3 running on a single card: I got tearing, pauses, latency issues and stuttering; in SLI it was smooth as glass.


This is 4K on a 4090 running Cyberpunk with no DLSS: 30 fps. That's what you call godly? WOW, that's what I call a POS. I'm sorry to say this, but WTH?
The only things that suck about a 4090 are power draw and price.

SLI adds latency and stuttering. I literally ran an SLI setup until a month or so ago on a different machine. I had SLI in my main rig years ago and games ran better on a single card.

2077 is a horrible benchmark; however, max settings include path tracing, which will eat ANY card on the market. Run normal RTX and I'm sure the 4090 is fine.

This thread is a PEBCAK error with you.
 
2077 is a horrible benchmark; however, max settings include path tracing, which will eat ANY card on the market. Run normal RTX and I'm sure the 4090 is fine.

The TechPowerUp review gets 41 fps with RT on, and I just ran a test and got about 47 fps with RT on medium and the other settings a mix of high and ultra at 4K.
So if you are set on having RT on in Cyberpunk, you either have to kiss 60 fps gaming goodbye, lower settings to low/medium, or use DLSS, even with a 4090.

Now Lycanwolfen can say: see how the 4090 is a POS card, since it can't do 60 fps in one game with extreme settings.
 
I had a Ryzen 5 2600 back then :rockout:

Yeah, that would have been pretty limiting. Not enough PCIe lanes, and the IPC was too slow.

The only things that suck about a 4090 are power draw and price.

SLI adds latency and stuttering. I literally ran an SLI setup until a month or so ago on a different machine. I had SLI in my main rig years ago and games ran better on a single card.

2077 is a horrible benchmark; however, max settings include path tracing, which will eat ANY card on the market. Run normal RTX and I'm sure the 4090 is fine.

This thread is a PEBCAK error with you.

People aaaaalways use that argument with the 4090 - "but powahhh draw!!!"... but the thing is, essentially anyone with any sort of smarts runs them undervolted...

I run mine at 2700 MHz @ 925 mV... meaning power draw in games is reduced by roughly 150 watts.
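For a rough sense of why undervolting saves so much: to first order, dynamic power scales with frequency times voltage squared. The stock operating point and wattage below are assumptions purely to show the shape of the math, not measured figures for any particular 4090:

```cpp
#include <cstdio>

int main() {
    // Assumed stock vs undervolted operating points, for illustration.
    const double stock_mhz = 2750.0, stock_mv = 1050.0, stock_watts = 420.0;
    const double uv_mhz    = 2700.0, uv_mv    = 925.0;   // the poster's settings

    // First-order CMOS approximation: P ~ f * V^2 (ignores leakage, memory,
    // fans and board power, so treat the result as a ballpark only).
    const double ratio = (uv_mhz / stock_mhz) * (uv_mv / stock_mv) * (uv_mv / stock_mv);
    std::printf("Estimated core power: %.0f W -> %.0f W (%.0f%% of stock)\n",
                stock_watts, stock_watts * ratio, ratio * 100.0);
    return 0;
}
```

It's only a ballpark estimate, but it lands in the same neighborhood as the roughly 150 W saving reported above.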
 
People aaaaalways use that argument with the 4090 - "but powahhh draw!!!"... but the thing is, essentially anyone with any sort of smarts runs them undervolted...

I run mine at 2700 MHz @ 925 mV... meaning power draw in games is reduced by roughly 150 watts.
That 2000+ EUR price tag is the reason why I literally hate the whole card. It felt difficult to pay even the 480 EUR I paid for my 6700 XT.
 
This thread is comedy. Complaining about a 4090 in CP2077 with RT on, when the 2080 Tis can barely do RT, and RT is not a need but a want, and an expensive want at that. Trollaololo... facepalm.
 
The 4090 is only faster if you use DLSS; without it 60 FPS is the norm, so in reality the actual card sucks.


Wow, SLI removes stuttering and latency issues. I remember Crysis 3 running on a single card: I got tearing, pauses, latency issues and stuttering; in SLI it was smooth as glass.


This is 4K on a 4090 running Cyberpunk with no DLSS: 30 fps. That's what you call godly? WOW, that's what I call a POS. I'm sorry to say this, but WTH?

I'm afraid the 4090 is invariably faster, and its render latency is much lower with far better utilization of the hardware. I'm willing to bet my RTX 4080 is too in most if not all scenarios.
 
SLI is as dead to developers as this thread is to moderators wasting their efforts with it rather than locking it.
 
I had better luck with Crossfire than SLI, but it always seemed like I was CPU bound when I ran such setups, so it wasn't very helpful except for benchmarks. In the games where it did seem to work right, I felt like I was either bandwidth bound or hitting the VRAM limit anyway.

But I do believe in multi-GPU, and it could make a comeback in some form with multiple GPU-core chiplet dies. Maybe the HBM or 3D cache technologies that we have now could solve some of the latency issues.
 
What happened to DX12 vendor-agnostic mGPU?

Two things happened.

1. With DX12 mGPU it was now on game devs rather than GPU vendors to support multi-GPU in games... and anyone with half a brain knew exactly what was going to happen as soon as it was announced... exactly what did end up happening: multi-GPU effectively died as soon as games started using DX12.

2. Temporal effects became the big thing (TAA etc.), meaning you either had to disable them (which looks... very bad... in games made around them), or performance TANKED when one GPU had to fetch previous-frame data from the other GPU, leaving the bandwidth between the GPUs completely saturated (rough numbers in the sketch below).

The last game to run well and look good with SLI was Battlefield V with a bit of SLI hacks in DX11 mode (because official SLI support had stopped at that point).
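A rough sketch of that previous-frame problem under alternate-frame rendering: with AFR each GPU owns every other frame, so the TAA history buffer (and usually velocity/exposure data too) has to be copied from the other GPU every single frame, on the critical path. The buffer format and link speeds here are assumptions for illustration only:

```cpp
#include <cstdio>

int main() {
    // Assumed example: a 4K RGBA16F history buffer that TAA reads every frame.
    const double width = 3840.0, height = 2160.0;
    const double bytes_per_pixel = 8.0;                                // RGBA16F
    const double history_mb = width * height * bytes_per_pixel / 1e6;  // ~66 MB

    const double target_fps = 120.0;
    const double cross_gpu_gbs = history_mb * target_fps / 1000.0;     // ~8 GB/s

    std::printf("History buffer: %.0f MB per frame\n", history_mb);
    std::printf("Cross-GPU copies at %.0f fps: %.1f GB/s\n", target_fps, cross_gpu_gbs);
    // Assumed link speeds: an SLI HB bridge moves only a couple of GB/s and
    // PCIe 3.0 x16 about 16 GB/s, so this traffic alone can choke the link --
    // and the new frame can't finish until the copy does.
    return 0;
}
```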

I had better luck with Crossfire than SLI, but it always seemed like I was CPU bound when I ran such setups, so it wasn't very helpful except for benchmarks. In the games where it did seem to work right, I felt like I was either bandwidth bound or hitting the VRAM limit anyway.

But I do believe in multi-GPU, and it could make a comeback in some form with multiple GPU-core chiplet dies. Maybe the HBM or 3D cache technologies that we have now could solve some of the latency issues.

It depended a lot on your setup. You wanted a CPU with at least 40 PCIe lanes, and you wanted GPUs with a good amount of VRAM - preferably the double-VRAM versions.

I remember back in the day in Far Cry 2, I was running quad SLI - one GPU got me 40 fps, all four GPUs got me 110 fps... I was very impressed (and surprised) by the scaling.

But CryEngine and the Frostbite engine always loved SLI.
 
Hi,
Did I miss the OP's SLI gameplay proving his claims?
 
I still buy boards that can do it, because old cards can be fun sometimes. I have not attempted to run SLI since 2016 or thereabouts.. so it isn't a real priority for me, but I still think about it :D
 
I still buy boards that can do it, because old cards can be fun sometimes. I have not attempted to run SLI since 2016 or thereabouts.. so it isn't a real priority for me, but I still think about it :D
Yeah, I mean, implementing it on motherboards doesn't cost much.
 
I still buy boards that can do it, because old cards can be fun sometimes. I have not attempted to run SLI since 2016 or thereabouts.. so it isn't a real priority for me, but I still think about it :D

Using the quad-SLI hack for 1080 Tis could net you some pretty insane performance with pre-2018 games at 8K, if you ever decide to mess with it :p

Yeah, I mean, implementing it on motherboards doesn't cost much.

Honestly though, it is de facto dead at this point - all the new tech being used in games today relies on previous-frame info - something you can't do with SLI without a massive performance penalty.

They may as well save the money.
 
NVIDIA SLI GeForce RTX 2080 Ti and RTX 2080 with NVLink Review
The games that support SLI still do, but there is a very strange trend: the 2080 Ti is now even more expensive than what I paid for mine 6 months ago. Apparently people think the 2080 Ti is worth it for some reason compared to a 4060 Ti 16GB. The RTX 5070 will probably do in 200 W what the hypothetical SLI setup can do in 400 W with the power limit set to 200 W per card, or 600 W without.
 
Do you want the list of which cards support it?

RX 5600 XT
RX 5700
RX 5700 XT

RX 6400
RX 6500 XT
RX 6600
RX 6650 XT
RX 6700
RX 6700 XT
RX 6750 XT
RX 6800
RX 6800 XT
RX 6900 XT
RX 6950 XT

RX 7600
RX 7600 XT
RX 7700 XT
RX 7800 XT
RX 7900 XT
RX 7900 XTX

Or
do you want the list of games that support mGPU?
I only know a few games that support it.

1. Ashes of the Singularity
2. Chasm: The Rift (Vulkan)
3. Deus Ex: Mankind Divided
4. Rise of the Tomb Raider
5. Shadow of the Tomb Raider
6. Quake II RTX



That is not true for today's cards with ReBAR. It's also not true when the VRAM buffer is too small: the GPU will still have to talk to the CPU's memory controller to use system RAM. The CPU will touch the memory bus when the GPU calls for system RAM once VRAM runs out; it will still have to swap data out.

Shadow of the Tomb Raider runs horribly though, unless you only use FXAA, which looks absolutely TERRIBLE in motion, as the game was very clearly designed for TAA.

I will gladly share screenshots highlighting the issues tomorrow.
 
Do you want the list of which cards support it?

RX 5600 XT
RX 5700
RX 5700 XT

RX 6400
RX 6500 XT
RX 6600
RX 6650 XT
RX 6700
RX 6700 XT
RX 6750 XT
RX 6800
RX 6800 XT
RX 6900 XT
RX 6950 XT

RX 7600
RX 7600 XT
RX 7700 XT
RX 7800 XT
RX 7900 XT
RX 7900 XTX

Or
do you want the list of games that support mGPU?
I only know a few games that support it.

1. Ashes of the Singularity
2. Chasm: The Rift (Vulkan)
3. Deus Ex: Mankind Divided
4. Rise of the Tomb Raider
5. Shadow of the Tomb Raider
6. Quake II RTX



That is not true for today's cards with ReBAR. It's also not true when the VRAM buffer is too small: the GPU will still have to talk to the CPU's memory controller to use system RAM. The CPU will touch the memory bus when the GPU calls for system RAM once VRAM runs out; it will still have to swap data out.
Does 3DMark support it? I saw results with an RX 6700 + 6700 XT.
 