Can't say I'll complain about SLI and Crossfire going away. It's a waste of time, because really, all that dev effort buys you maybe a 10-15% discount on a certain performance level through the second GPU (better perf/dollar lower down the stack), in exchange for endless fiddling and small nuisances like microstutter, heat, reduced OCs, things just being broken, etc.
I always kinda wondered if multi-GPU didn't just exist for the epeen value, with 'oh yeah, it also adds some performance' as an afterthought. It does feel good having a rig full of hardware, never mind the practicalities.
Dual GPU has been dead since the 290X. Devs don't care. They can't even release a game without a day-one 15GB patch. It's all a joke, just like gaming in general today. AAA titles are crap, microtransactions (along with pay2win), sold on lies, etc.
Remember when gaming was good? Pepperidge Farm remembers.
You must be fun at LAN parties
Gaming's in a pretty good place if you care to look beyond the front-page news, really. Maybe you're just burned out on it in general?
I miss dual GPU. In this era of slowing tech advancement, multi-GPU is the easiest way to get more power, given that GPU generations are taking longer and making smaller steps every time. "Crossfire" in the old sense may be dead, but Crossfire, as in some multi-GPU solution, isn't going to stay dormant forever. MCM GPU configurations will be the future of high-res, high-Hz gaming setups once we can't shrink nodes anymore.
You're actually very right about that. But we also place higher demands on our games now, and the dependency on VRAM, I think, was a catalyst for SLI's demise. Nvidia started killing it right around the time AMD started looking at HBM; Nvidia had to move to delta compression, and VRAM capacities doubled overnight.
Now look at today: the high-end GPU gained another 4GB between Maxwell and Pascal (970 > 1070), and the top end even goes to eleven.
This makes it even harder to sell 'wasted' hardware resources like doubled VRAM. And I reckon it's also harder to push all that data across the bridge/bus; Nvidia had to scale those up already.
I think MCM solutions will inevitably meet the same problems as SLI/Crossfire, but with MCM you can solve them all within a single chip/design/board, and with much shorter paths.
The subject has come up in this thread that multiple GPUs in Crossfire or SLI are necessary for 4K, which isn't correct. I just checked the review of the 2080 Ti FE here, and of the 23 games benched at 4K, only 3 fell below a 60 FPS average:
Deus Ex: Mankind Divided 51 FPS
Ghost Recon Wildlands 48 FPS
Monster Hunter World 43 FPS
Most of the rest were way, way over 60 FPS average.
Bear in mind also that these games were benched at their highest quality settings. For the very small handful of games that don't perform to your expectations, you could turn down the settings a bit. I'm not defending the price of the 2080 Ti, but you could go that route with 4K. It is possible.
I expect the 3080 Ti will probably be quite a bit faster than the 2080 Ti.
But 4K gaming is still a niche market even after all these years. It's for people who are willing to pay for it.
Teehee. I remember the 1080 Ti reviews, and I saw those exact same framerates across the testing at 4K. 60 FPS at 4K (as in minimums) is still far away from us. And yet prices are soaring for marginal performance bumps. Look how long it took for us to say we could finally 'kill 1080p' with a specific GPU. And even then, some new games cripple even the higher-end models at that res.
Games evolve. A resolution bump is simply a major bump in your requirements for smooth gameplay, and it won't ever be fixed by new releases if you keep playing new games. Well, it will, but you can safely expect a period of a decade for that to materialize. I think many people can now say they made a major mistake buying into 4K gaming (the monitor, mostly) as GPU performance increases slow down.