
Could Intel create an equivalent tech to SLI/Crossfire?

Joined
Jun 2, 2017
Messages
8,158 (3.19/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
They could, but would they pay for it? Remember, AMD already did this on AM3. I forget the exact chipset, but you could pair an R7 250 with the APU to get more frames. At the root, though, you'd need developers to write that support into games, and the GPUs are weak, while APUs that can game already eat 8 PCIe lanes.
 
Joined
Apr 18, 2019
Messages
2,155 (1.15/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
I'd say it's plausible, *if* each card's resources could be virtually pooled together as one MCM-over-PCIe GPU. I may be incorrect, but I believe Arc's Xe HPG lineage might allow for such a thing.
After all, Gen4 x16 is considerably higher bandwidth than Gen3 x16.
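To put rough numbers on that, here's a back-of-the-envelope comparison (the link rates are the standard PCIe figures; the VRAM number is a ballpark for a midrange GDDR6 card, not any specific product):

```python
# Rough one-direction bandwidth: PCIe link vs. local VRAM.
# PCIe 3.0 runs at 8 GT/s per lane, PCIe 4.0 at 16 GT/s,
# both with 128b/130b encoding overhead.

def pcie_bandwidth_gbs(gt_per_s: float, lanes: int) -> float:
    """One-direction bandwidth in GB/s for a PCIe link."""
    return gt_per_s * lanes * (128 / 130) / 8  # bits -> bytes after encoding

gen3_x16 = pcie_bandwidth_gbs(8.0, 16)   # ~15.8 GB/s
gen4_x16 = pcie_bandwidth_gbs(16.0, 16)  # ~31.5 GB/s

local_vram = 448.0  # GB/s, e.g. a 256-bit GDDR6 bus at 14 Gbps (ballpark)

print(f"Gen3 x16: {gen3_x16:.1f} GB/s")
print(f"Gen4 x16: {gen4_x16:.1f} GB/s")
print(f"Local VRAM is ~{local_vram / gen4_x16:.0f}x faster than Gen4 x16")
```

Even doubled, the link is still an order of magnitude slower than local VRAM, which is the core problem with naively pooling resources across cards.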

The issue that comes up is less the hardware and more the software.
Even if Intel created something similar for GPGPU workloads, Intel is notorious for keeping tech in-industry or charging for 'features' (which they routinely get away with in-industry, but less so in the consumer market).
Not to mention, it's yet another thing for their Arc driver devs to have to deal with, and they're doing their best already.
 
Joined
Aug 14, 2013
Messages
2,373 (0.60/day)
System Name boomer--->zoomer not your typical millenial build
Processor i5-760 @ 3.8ghz + turbo ~goes wayyyyyyyyy fast cuz turboooooz~
Motherboard P55-GD80 ~best motherboard ever designed~
Cooling NH-D15 ~double stack thot twerk all day~
Memory 16GB Crucial Ballistix LP ~memory gone AWOL~
Video Card(s) MSI GTX 970 ~*~GOLDEN EDITION~*~ RAWRRRRRR
Storage 500GB Samsung 850 Evo (OS X, *nix), 128GB Samsung 840 Pro (W10 Pro), 1TB SpinPoint F3 ~best in class
Display(s) ASUS VW246H ~best 24" you've seen *FULL HD* *1O80PP* *SLAPS*~
Case FT02-W ~the W stands for white but it's brushed aluminum except for the disgusting ODD bays; *cries*
Audio Device(s) A LOT
Power Supply 850W EVGA SuperNova G2 ~hot fire like champagne~
Mouse CM Spawn ~cmcz R c00l seth mcfarlane darawss~
Keyboard CM QF Rapid - Browns ~fastrrr kees for fstr teens~
Software integrated into the chassis
Benchmark Scores 9999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999
Iirc the only real issue was the lack of developers/studios actually supporting it. Sometimes it scaled really well, but most of the time it was an afterthought with minor performance improvements or just flat out unsupported.

I mean, of course it’s possible — it was standard on almost all GPUs for a decade plus. But, having used it myself three or four times, it was just too poorly supported to bother with (and yes microstutter and heat).
 
Joined
Feb 24, 2023
Messages
2,306 (4.95/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / RX 480 8 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / Viewsonic VX3276-MHD-2
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / FSP Epsilon 700 W / Corsair CX650M [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
This will all of a sudden make the RTX 4060 Ti 16 GB make some sense. Please don't unembarrass this GPU. I object!
 
Joined
Mar 21, 2016
Messages
2,217 (0.74/day)
Couldn't Intel do something a bit like Lucid Hydra, using DirectStorage to shorten the round-trip time with compression/decompression? I would think that between DirectStorage, PCIe 4.0/5.0, and DDR5, not to mention CPUs and GPUs with larger caches, SLI/CF-style mGPU would easily be better today.

Beyond all of that, they could utilize something like CFR (checkerboard frame rendering), where each GPU uses its own resources. That was actually supposed to be one of the major mGPU features of DX12 that Microsoft touted at the time: being able to leverage the VRAM allocation on each GPU, instead of one GPU's allocation being mirror-copied between them.

What could be done differently today with multiple-GPU rendering is to develop a seamless way to combine checkerboard frame rendering with variable rate shading techniques.

Now think about that for a moment with post-processing and with AA and/or AF, for example. Take a 3x3 pixel tile and you've got an anti-aliasing tile block; multiply it and you've got several. Want a long column or row? Link them together. Want a bigger square? Join them. Gradient effects are no problem either: 100%/90%/80%/70%/60%/50% across a run of tiles. Either way, you get the idea: tile-based variable rate shading seems like an obvious place to leverage multiple GPUs with checkerboard frame rendering. Want to include or exclude a tile, or a whole row or column of them? No problem; the inference algorithm can simply skip them this frame and come back when it needs them again.
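A minimal sketch of that tile scheme, with everything (tile size, rate thresholds, the alternating assignment) made up purely for illustration:

```python
# Sketch: checkerboard tile split across two GPUs plus a per-tile
# shading rate. Tile size, thresholds, and rates are invented values.

TILE = 8  # pixels per tile edge (hypothetical)

def assign_tiles(width: int, height: int) -> dict:
    """Checkerboard assignment: alternate tiles between GPU 0 and GPU 1."""
    return {(tx, ty): (tx + ty) % 2
            for ty in range(height // TILE)
            for tx in range(width // TILE)}

def shading_rate(tx: int, ty: int, focus=(0, 0)) -> float:
    """Coarser shading for tiles further from a focus point, as a
    crude stand-in for a variable-rate-shading heuristic."""
    dist = abs(tx - focus[0]) + abs(ty - focus[1])
    return 1.0 if dist < 4 else 0.5 if dist < 8 else 0.25

tiles = assign_tiles(64, 64)   # an 8x8 grid of tiles
work0 = [t for t, gpu in tiles.items() if gpu == 0]
work1 = [t for t, gpu in tiles.items() if gpu == 1]
print(len(work0), len(work1))  # each GPU shades half the tiles
print(shading_rate(0, 0), shading_rate(12, 0))  # full rate near focus, coarse far away
```

The point of the sketch is just that the checkerboard split and the per-tile rate are independent knobs, so each GPU can shade its own tiles at whatever rate the heuristic picks.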
 
Joined
Oct 18, 2013
Messages
5,617 (1.45/day)
Location
Everywhere all the time all at once
System Name The Little One
Processor i5-11320H @4.4GHZ
Motherboard AZW SEI
Cooling Fan w/heat pipes + side & rear vents
Memory 64GB Crucial DDR4-3200 (2x 32GB)
Video Card(s) Iris XE
Storage WD Black SN850X 4TB m.2, Seagate 2TB SSD + SN850 4TB x2 in an external enclosure
Display(s) 2x Samsung 43" & 2x 32"
Case Practically identical to a mac mini, just purrtier in slate blue, & with 3x usb ports on the front !
Audio Device(s) Yamaha ATS-1060 Bluetooth Soundbar & Subwoofer
Power Supply 65w brick
Mouse Logitech MX Master 2
Keyboard Logitech G613 mechanical wireless
Software Windows 10 pro 64 bit, with all the unnecessary background shitzu turned OFF !
Benchmark Scores PDQ
I feel like Intel needs to work on a lot of other things before this should even make it's way onto the list of things their GPU division needs to do.
^^THIS^^

IF, & when, they can ever produce a credible card, with high-quality & stable drivers, to meet or beat the other 2, then perhaps they could look into the mGPU thingy. But given their other issues right now, that would be a massive waste of time & money....even IF they could convince the gamz dev's to support it, which would probably be at least as difficult & expensive, if not more so, than developing the card(s) to start with...
 
Joined
Dec 12, 2020
Messages
1,755 (1.38/day)
Since Intel is very much the underdog in the gaming GPU market now and doesn't seem to have anything competitive w/the duopoly at anything except low tier, they're going to have to do something to generate interest in their GPU products.
 
Joined
Jan 10, 2011
Messages
1,346 (0.28/day)
Location
[Formerly] Khartoum, Sudan.
System Name 192.168.1.1~192.168.1.100
Processor AMD Ryzen5 5600G.
Motherboard Gigabyte B550m DS3H.
Cooling AMD Wraith Stealth.
Memory 16GB Crucial DDR4.
Video Card(s) Gigabyte GTX 1080 OC (Underclocked, underpowered).
Storage Samsung 980 NVME 500GB && Assortment of SSDs.
Display(s) LG 24MK430 primary && Samsung S24D590 secondary
Case Corsair Graphite 780T.
Audio Device(s) On-Board.
Power Supply SeaSonic CORE GM-650.
Mouse Coolermaster MM530.
Keyboard Kingston HyperX Alloy FPS.
VR HMD A pair of OP spectacles.
Software Ubuntu 22.04 LTS.
Benchmark Scores Me no know English. What bench mean? Bench like one sit on?
Why can't mGPU be implemented in drivers transparently (i.e., invisibly to game developers)?

Because the implicit (xfire/SLI) approach adds more load to the driver development process to produce compatibility profiles and, in some (many?) cases, still needs gamedevs to modify their games to achieve any meaningful gains.

Forcing the gamedev to handle multi-GPU workload distribution greatly simplifies the process, makes driver development easier, and opens up more possibilities than with the rigid, implicit approach.

On topic: KISS.
SLI/Xfire were a complex thing that required much work for gains easily beaten by a generational upgrade or even jumping tiers in the same gen.
Focusing on improving single GPU performance benefits everyone. Wasting resources on implicit mGPU benefits a few (and harms everyone else).
 
Joined
Nov 26, 2021
Messages
1,372 (1.49/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Could? Yes. Would they want to? Or anyone, for that matter? No. Multiple-GPU rendering is dead, and honestly, it was more of a bandaid fix for anemic performance back then. It's all about finesse now. Things like shader execution reordering - that's the path forward.
Shader execution reordering is a very cool feature, but to be fair, Intel beat Nvidia in bringing this feature to the market.
 
Joined
Mar 14, 2014
Messages
1,297 (0.35/day)
Processor i7-4790K 4.6GHz @1.29v
Motherboard ASUS Maximus Hero VII Z97
Cooling Noctua NH-U14S
Memory G. Skill Trident X 2x8GB 2133MHz
Video Card(s) Asus Tuf RTX 3060 V1 FHR (Newegg Shuffle)
Storage OS 120GB Kingston V300, Samsung 850 Pro 512GB , 3TB Hitachi HDD, 2x5TB Toshiba X300, 500GB M.2 @ x2
Display(s) Lenovo y27g 1080p 144Hz
Case Fractal Design Define R4
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply EVGA Supernova G2 850w
Mouse Glorious Model D
Keyboard Rosewill Full Size. Red Switches. Blue Leds. RK-9100xBRE - Hate this. way to big
Software Win10
Benchmark Scores 3DMark FireStrike Score : needs updating
I loved SLI when it worked.

What I'd rather see today is a dedicated RT card, just like we used to have with PhysX.
 
Joined
Mar 21, 2016
Messages
2,217 (0.74/day)
I would hope Intel could create something that isn't equivalent, since SLI/CF were a puke-red mess of abandoned rubbish. That's just my hot take, though: they were full of problems, and just when you thought maybe they'd iron them out with some new innovation at the obvious bottleneck choke points, both were abandoned nearly outright. I don't think it's that multiple-GPU options can't be done today; it's simply that they've concluded they make more money by not offering them, selling halo-tier options each generation and repeating the process with incremental uplift. People seem willing to pay it, though, out of desperation for more performance.

There is the developer angle as well, but instead of SLI/CF they're now supporting three different god damn upscaling techniques in its place in some cases, so really, what did they gain in the end!? If they were too lazy to do SLI/CF well, I can tell you right now they won't be any less lazy with upscaling or RTRT effects or pretty much anything else that takes real effort, rather than quick production of half-baked, early-access, microtransaction-riddled dumpster-fire games. Just the same, games are difficult to make well as an individual or a small group. There is a lot that goes into making a good, proper game, given the expectations society has for the money spent on them.

I don't know if Intel will take a serious swing at multiple-GPU rendering or not, but I can't imagine they could do any worse at it. It would be hard to do worse given all the improvements in areas like DirectStorage, PCIe, system memory, CPU cores and cache, plus things like variable rate shading and post-processing, which could perhaps be offloaded to another GPU or alternated between them. There is no reason it should be worse today with modern approaches; it would be rather impressively broken if it were. Also, generational GPU progress has slowed down a lot, so chances are the uplift would seem a whole lot more reasonable now even without perfect scaling. Scaling was pretty good anyway, and the biggest fault was micro-stutter, which was latency- and bandwidth-related, and that has improved a modest amount within systems.

Notice that NVIDIA killed off SLI at the low end and mid range first; that tells you all you need to know about their agenda. They would much sooner push single cards at higher cost, and AMD doesn't mind either, nor do the lazy developers spared from coding for SLI/CF, especially as they're being pushed to adopt fakescale technology instead and pitch the result as near-native quality, as if an orange were an apple. To me it's like watered-down skim milk: milk with 120% water added is no longer milk, it's closer to water. Don't have a cow, man, but maybe enjoy real 100% god damn good high-quality milk. And no matter how much you like almonds, you don't get almond milk out of a cow unless you feed them a lot of almonds, and I'm pretty sure that's not how it's made either.
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.13/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
We have this, it's called "DX12 Explicit multi GPU" and it's up to game devs to implement

Which they do not.


SLI and Crossfire died because past a certain point it became a serious issue to get them working in more and more complex game engines. Like how DLSS etc. all need to be coded in by the game devs: the market wasn't there, so they never did.

No one wants a game that needs two GPUs to run, so the higher-ups see no point in marketing for it or spending money on it. Only Nvidia and AMD ever profited, and they couldn't get games working right.

Starcraft II (DX9) had an SLI profile; by the time its final expansion dropped, that profile was "disable SLI".
Killing Floor 2 had an SLI profile, which could be summed up as "flickering mess".


None of this was ever fixed; they reached a point where they couldn't do it without help from the devs, and the devs gave no shits
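For readers unfamiliar with the split: "explicit" means the application, not the driver, decides which GPU does what. A toy model of the alternate-frame case (the device class and scheduling policy here are illustrative, not any real API):

```python
# Toy model of explicit alternate-frame rendering: the application
# itself decides which device renders each frame. FakeDevice stands
# in for a per-adapter device/queue; nothing here is a real API.

from itertools import cycle

class FakeDevice:
    def __init__(self, name: str):
        self.name = name
        self.frames_rendered = 0

    def render(self, frame_id: int) -> str:
        # A real engine would record and submit command lists here.
        self.frames_rendered += 1
        return f"{self.name} rendered frame {frame_id}"

devices = [FakeDevice("gpu0"), FakeDevice("gpu1")]
scheduler = cycle(devices)  # round-robin: even frames -> gpu0, odd -> gpu1

log = [next(scheduler).render(frame) for frame in range(6)]
print(log[-1])                     # gpu1 rendered frame 5
print(devices[0].frames_rendered)  # 3
```

The scheduling itself is trivial; the hard part the thread describes is everything around it (inter-frame dependencies, synchronization, resource copies), which is exactly the work devs declined to do.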
 
Joined
Mar 14, 2014
Messages
1,297 (0.35/day)
Processor i7-4790K 4.6GHz @1.29v
Motherboard ASUS Maximus Hero VII Z97
Cooling Noctua NH-U14S
Memory G. Skill Trident X 2x8GB 2133MHz
Video Card(s) Asus Tuf RTX 3060 V1 FHR (Newegg Shuffle)
Storage OS 120GB Kingston V300, Samsung 850 Pro 512GB , 3TB Hitachi HDD, 2x5TB Toshiba X300, 500GB M.2 @ x2
Display(s) Lenovo y27g 1080p 144Hz
Case Fractal Design Define R4
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply EVGA Supernova G2 850w
Mouse Glorious Model D
Keyboard Rosewill Full Size. Red Switches. Blue Leds. RK-9100xBRE - Hate this. way to big
Software Win10
Benchmark Scores 3DMark FireStrike Score : needs updating
We have this, it's called "DX12 Explicit multi GPU" and it's up to game devs to implement

Which they do not.
The last one I can think of off the top of my head is RDR2, using Vulkan. It's still a little difficult to find a good mGPU comparison of it.
 
Joined
Apr 18, 2019
Messages
2,155 (1.15/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
AMD Presentation on Explicit mGPU:

nVidia 1st party deep-dive on DX12 Explicit mGPU implementations:

Dunno if anyone else has noticed, but (at least) Crossfire(X) has been entirely supplanted by DX12/VK mGPU. (nVidia appears to still use SLI terminology intermittently.)
AMD very directly implies that AMD MGPU works generically on titles predating DX12/VK mGPU support (supplanting CrossfireX):
Multi-GPU support and performance varies by applications and graphics APIs. For example, games/applications using DirectX® 9, 10, 11 and OpenGL must run in exclusive full-screen mode to take advantage of AMD MGPU. For DirectX® 12 and Vulkan® titles, multi-GPU operation is exclusively handled by the application and configured from the in-app/game menu for graphics/video.
See also: https://community.amd.com/t5/graphics/about-mgpu-technology/m-p/575585

Even in 3rd-party marketing (example: my 6500 XT was advertised as CrossfireX Ready, and others still are).


As mentioned, support is abysmal
As of 4 years ago, there were only a handful of titles:
  1. Ashes Of The Singularity(http://www.kitguru.net/gaming/matth...and-amd-team-up-for-total-war-warhammer-dx12/)
  2. Rise Of The Tomb Raider(http://wccftech.com/rise-of-the-tomb-raider-pc-patch-dx12-multi-gpu-async-compute-vsync/)
  3. Deus Ex: Mankind Divided(https://gaming.radeon.com/en/deus-ex-directx-12-mgpu/)
  4. Total War: Warhammer(Now here in Beta form https://www.reddit.com/r/totalwar/comments/4qr92s/_/d4vc4oc )
  5. Civilization VI(http://www.pcworld.com/article/3143...directx-12-new-multiplayer-mode-and-maps.html)
  6. Hitman (video embed: in action)
  7. Gears of War 4 (video embed: in action)
  8. Sniper Elite 4 (video embed; performance is superior to DX11 Crossfire mode)
  9. Halo Wars 2(https://www.tweaktown.com/news/5821...-update-includes-multi-gpu-support/index.html) This is a DX12 (Feature Level 12) only game, so although the article refers to SLI/Crossfire, it is in fact DX12 mGPU that has been patched in.
 
Joined
Jul 15, 2019
Messages
500 (0.28/day)
Location
Hungary
System Name Detox sleeper
Processor Intel i9-7980XE@4,5Ghz
Motherboard Asrock x299 Taichi XE (custom bios with ecc reg support, old microcode)
Cooling Custom water: Alphacool XT45 1080 + 9xArctic P12, EK-D5 pump combo, EK Velocity D-RGB block
Memory 8x16Gb Hynix DJR ECC REG 3200@4000
Video Card(s) Intel Arc A770 LE 16Gb + RTX 3080 FE 10Gb
Storage Samsung PM9A1 1Tb + PM981 512Gb + Kingston HyperX 480Gb + Samsung Evo 860 500Gb
Display(s) HP ZR30W (30" 2560x1600)
Case Chieftec 1E0-500A-CT04 + AMD Sempron sticker
Audio Device(s) Genius Cavimanus
Power Supply Super Flower Leadex 750w Platinum
Mouse Logitech G400
Keyboard Dell Oem + Focus Fk2000 plus
Software Windows 11 Pro x64
I have an A770 and an A380; I will try them when I have time.
I think DX12 mGPU games will work fine, because they don't need any driver support, just DX12 support.
It's very sad that only a few games support it.
 
Joined
Apr 30, 2020
Messages
870 (0.58/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 16Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
Because the implicit (xfire/SLI) approach adds more load to the driver development process to produce compatibility profiles and, in some (many?) cases, still needs gamedevs to modify their games to achieve any meaningful gains.

Forcing the gamedev to handle multi-GPU workload distribution greatly simplifies the process, makes driver development easier, and opens up more possibilities than with the rigid, implicit approach.

On topic: KISS.
SLI/Xfire were a complex thing that required much work for gains easily beaten by a generational upgrade or even jumping tiers in the same gen.
Focusing on improving single GPU performance benefits everyone. Wasting resources on implicit mGPU benefits a few (and harms everyone else).
That whole last statement is false.
The option of more choices should matter to everyone, especially since it's a feature of DX12 itself, and because of the obscene prices of newer-generation GPUs.
 
Joined
Jan 14, 2019
Messages
10,162 (5.16/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Multi-GPU is dead with DirectX 12 and modern GPU architectures, so I highly doubt it. Not to mention, they have to get their single-GPU performance and power consumption in order first.
 
Joined
Mar 14, 2014
Messages
1,297 (0.35/day)
Processor i7-4790K 4.6GHz @1.29v
Motherboard ASUS Maximus Hero VII Z97
Cooling Noctua NH-U14S
Memory G. Skill Trident X 2x8GB 2133MHz
Video Card(s) Asus Tuf RTX 3060 V1 FHR (Newegg Shuffle)
Storage OS 120GB Kingston V300, Samsung 850 Pro 512GB , 3TB Hitachi HDD, 2x5TB Toshiba X300, 500GB M.2 @ x2
Display(s) Lenovo y27g 1080p 144Hz
Case Fractal Design Define R4
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply EVGA Supernova G2 850w
Mouse Glorious Model D
Keyboard Rosewill Full Size. Red Switches. Blue Leds. RK-9100xBRE - Hate this. way to big
Software Win10
Benchmark Scores 3DMark FireStrike Score : needs updating
Multi-GPU is dead with DirectX 12 and modern GPU architectures, so I highly doubt it. Not to mention, they have to get their single-GPU performance and power consumption in order first.
The GTX 480 was like a 250 W card. Sure, the current upper tier probably pulls too much power to have SLI throughout the stack, but the smaller cards are prime for it. Two 4080s under 300 W each would be acceptable, but 30-50 W less from the upper stack would be much, much better.
 
Joined
Jan 14, 2019
Messages
10,162 (5.16/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
The GTX 480 was like a 250 W card. Sure, the current upper tier probably pulls too much power to have SLI throughout the stack, but the smaller cards are prime for it. Two 4080s under 300 W each would be acceptable, but 30-50 W less from the upper stack would be much, much better.
You'd be limited by VRAM and PCI-e bandwidth, just like you were back in the day. That's why SLI and CF were never that popular, I guess. Not to mention, Nvidia killed SLI in the lower end, then the mid-range, and only later in the high-end.
 
Joined
Mar 14, 2014
Messages
1,297 (0.35/day)
Processor i7-4790K 4.6GHz @1.29v
Motherboard ASUS Maximus Hero VII Z97
Cooling Noctua NH-U14S
Memory G. Skill Trident X 2x8GB 2133MHz
Video Card(s) Asus Tuf RTX 3060 V1 FHR (Newegg Shuffle)
Storage OS 120GB Kingston V300, Samsung 850 Pro 512GB , 3TB Hitachi HDD, 2x5TB Toshiba X300, 500GB M.2 @ x2
Display(s) Lenovo y27g 1080p 144Hz
Case Fractal Design Define R4
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply EVGA Supernova G2 850w
Mouse Glorious Model D
Keyboard Rosewill Full Size. Red Switches. Blue Leds. RK-9100xBRE - Hate this. way to big
Software Win10
Benchmark Scores 3DMark FireStrike Score : needs updating
You'd be limited by VRAM and PCI-e bandwidth, just like you were back in the day. That's why SLI and CF were never that popular, I guess.
Ima need a reminder cause my old 980Tis in SLI never seemed hindered by either of those.
Doesn't NVLink pool the memory anyways?
 
Joined
Mar 21, 2016
Messages
2,217 (0.74/day)
Killing off support at the lower end of the product range was the nail in its coffin. It really didn't help that Nvidia's lower-end and mid-range GPUs were the more anemic on VRAM. Meanwhile, they put a lot more VRAM on certain cards that can barely utilize it, and not enough on cards that could utilize it better. They want to force people into sooner upgrades, or into buying the next halo tier up.

Ima need a reminder cause my old 980Tis in SLI never seemed hindered by either of those.
Doesn't NVLink pool the memory anyways?

That was a halo-tier card, though, and outside of Titan it's the Tesla and workstation GPUs that are honestly best suited for SLI. The GTX 960 4GB scaled pretty well, but the VRAM was still rather limited (something like 6 GB would have been more ideal), and the memory bus was not exciting, so memory didn't scale great either. It still did pretty well on scaling relative to a GTX 980, though.

Now, if you took a Tesla card of similar GPU performance with 8 GB of VRAM and put those in SLI, I'd expect it to easily outperform GTX 960 4GBs in SLI, especially at higher resolutions, and it might even at times give a GTX 980 Ti a run for its money.

Design limitations hamper expectations depending on usage. A lot of GPUs become VRAM-limited too quickly, or are too resource-limited to push past the tier above them, or are just poorly balanced designs. It's tricky, from a product-stack standpoint, to design something ideal for both single-card and multi-GPU usage if you don't keep the bottlenecks in mind; some combinations end up much better balanced for one scenario, while others double down on design limitations that weren't ideal to begin with.
 
Joined
Jan 14, 2019
Messages
10,162 (5.16/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Ima need a reminder cause my old 980Tis in SLI never seemed hindered by either of those.
Doesn't NVLink pool the memory anyways?
SLI stored a mirror image in each card's VRAM, so with two 4 GB cards you doubled your theoretical performance (it was closer to 1.5x in practice), but you still only had 4 GB of VRAM.
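Putting illustrative numbers on the mirrored-vs-pooled distinction:

```python
# Implicit SLI/CFX (alternate-frame rendering) mirrors every resource
# on every card, so usable VRAM stays at one card's worth. Explicit
# DX12 mGPU lets the application place distinct heaps on each GPU.

def usable_vram_gb(per_card_gb: float, cards: int, mirrored: bool) -> float:
    """Usable VRAM under the mirrored (implicit) vs. pooled (explicit) model."""
    return per_card_gb if mirrored else per_card_gb * cards

print(usable_vram_gb(4.0, 2, mirrored=True))   # 4.0 -> classic SLI/CFX
print(usable_vram_gb(4.0, 2, mirrored=False))  # 8.0 -> explicit per-GPU heaps
```

Even in the explicit model the pooled figure is theoretical: the application has to manage what lives where and pay for any cross-adapter copies itself.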
 
Joined
Mar 21, 2016
Messages
2,217 (0.74/day)
SLI stored a mirror image in each card's VRAM, so with two 4 GB cards you doubled your theoretical performance (it was closer to 1.5x in practice), but you still only had 4 GB of VRAM.

DirectX 12, though, was supposed to allow mGPU setups to each utilize their full memory allocation.
 
Joined
Nov 26, 2021
Messages
1,372 (1.49/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
DirectX 12, though, was supposed to allow mGPU setups to each utilize their full memory allocation.
The developers still have to do the heavy lifting; no wonder SLI and Crossfire are dead.
 