
AMD CEO Lisa Su: "CrossFire Isn't a Significant Focus"

Multiadapter was supposed to make it easy for developers to utilize multiple GPUs, any GPUs, even integrated ones, all at the same time. Apparently MS didn't make it easy enough, though, because no one used it.
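
For anyone curious what "easy" actually meant in practice, here's a minimal sketch (my own illustration, assuming the standard Windows SDK headers, not code from any particular engine) of just the very first step explicit multiadapter demands: the application itself has to enumerate every adapter, including an integrated GPU, and create a device for each one. Everything after that - splitting the work, syncing, copying between GPUs - is also on the developer.

```cpp
// Minimal sketch: enumerate every physical adapter and create a D3D12 device
// for each one. This is only the entry ticket to explicit multiadapter.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDeviceForEachAdapter()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device); // one device per physical GPU
    }
    // From here on, load balancing, fences between queues and cross-adapter
    // resource copies are entirely the application's responsibility.
    return devices;
}
```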

It could be easy as heck but you seem to be missing the point that developers have absolutely no reason to spend a dime on it. They are already complaining about money and crunch.

If you were in their shoes, dealing with budget and crunch woes, would you create more work for yourself for the whole two people in the world that use Crossfire? Work that will give you absolutely zero returns?
 
Both mGPU techs have gone the way of the dodo. Good riddance. :)

When it worked, scaling was always meh, maybe 50-70% on average. A few games did better, more did worse or didn't scale at all. Only if it is a NEED, say 4K with mid-range tier cards, can I see it being worthwhile.
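
Some quick back-of-the-envelope numbers on that (purely hypothetical frame rates, just to show why 50-70% scaling only pays off when you genuinely need it):

```cpp
// Rough illustration of the 50-70% scaling claim with made-up numbers.
#include <cstdio>
#include <initializer_list>

int main()
{
    const double single_card_fps = 40.0;      // e.g. a mid-range card at 4K
    for (double scaling : {0.5, 0.7})         // second card adds 50-70%
    {
        double dual_fps = single_card_fps * (1.0 + scaling);
        std::printf("scaling %.0f%% -> %.0f fps for 2x the cost and power\n",
                    scaling * 100.0, dual_fps);
    }
    // 40 fps becomes 60-68 fps: only worth it when that jump is the difference
    // between unplayable and playable, e.g. 4K on mid-range cards.
}
```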

In the words of Elsa....LET IT GO... LET IT GO! CANT HOLD IT BACK ANY MORE!!!!! :p

But but b...I can buy two cheap gpus to make a midrange card.... stooooooooopit..lol
 
Back when the GTX 280 was king I played with TRI-SLI for a while and it was fun to play around with, but the buzz almost came from putting one GTX 280 in... then another... THEN ANOTHER.
 
It reminds me .. single core CPUs, then 2 cores.. now 32.. OK I know it is not the same with GPUs, but it all comes down to the right technology implementation, to start the dual GPU core magic!!!
;)
 
You might be right as you no longer need a bridge connection to enable crossfire on AM4 or X399 boards. And just about all of them have crossfire support written on the box.

You no longer need a bridge connection because AMD gaming GPUs use an XDMA engine on-die to communicate over PCIe. Each GPU can access the other's memory directly. It's been this way since Hawaii (R9 290X), after AMD's frametimes were shown to be abysmal in earlier Crossfire implementations. Once this is removed in hardware, Crossfire support can no longer be enabled (Radeon VII and 5700/XT).

Radeon Instinct MI50 and MI60 need a bridge for Infinity Fabric Link to support high-bandwidth transfers (200GB/s+).

I run 2 Vega64s in CF and it's mostly fine. Newer driver features like Enhanced Sync or even Freesync cause stuttering though, so as long as you know about the limitations, it's okay. Using ReLive during Crossfire gameplay also reduces performance as it triggers adaptive GPU clocking, which is normally disabled in Crossfire to maximize XDMA performance.

Drivers after 19.7.1 have Crossfire profiles missing too.
 
It could be easy as heck but you seem to be missing the point that developers have absolutely no reason to spend a dime on it. They are already complaining about money and crunch.

If you were in their shoes, dealing with budget and crunch woes, would you create more work for yourself for the whole two people in the world that use Crossfire? Work that will give you absolutely zero returns?

If it were as easy as heck, then the free market would supply an employee who could do it rather cheaply, or M$ could subsidize it, similar to how Nvidia sends some of its engineers to game studios to help them out from time to time. So it must not be all that important to M$ beyond a pretty headline from 4 years ago. /shrug

Any other thoughts captain?
 
I couldn't care less about CF or SLI.
 
R9 290 CF user here. Not my first multi-GPU setup, but when it works, second card gives a nice boost. I'd say that the thermals are the worst problem (only one card is watercooled).
 
Dual GPU has been dead since 290x. Devs don't care. They can't even release a game without a day one 15GB patch. It's all a joke just like gaming in general, today. AAA titles are crap, microtransactions (along with pay2win), sold on lies, etc.

Remember when gaming was good? Pepperidge Farm remembers.
 
Yeah, sadly the modern consoles are weak. Hence the whole industry is being held back.
 
I got into Crossfire with my old HD 4850 setup. It was fun to play with, but honestly I spent more doing that than just buying a high-end card. I actually had a 4850x2 plus one more 4850 in Crossfire, and man, that was a heater. I haven't really done Crossfire since. I've found that buying a sub-$500 GPU tends to be a more efficient and cost-effective way to manage my setup.
 
The way forward is what they did with their CPUs: chiplets. Once they do that on GPUs, it will be a big step forward.
 
If it were as easy as heck, then the free market would supply an employee who could do it rather cheaply, or M$ could subsidize it, similar to how Nvidia sends some of its engineers to game studios to help them out from time to time. So it must not be all that important to M$ beyond a pretty headline from 4 years ago. /shrug

Any other thoughts captain?

First, what does a free market have to do with anything? Microsoft doesn't care. They make an operating system, a graphics API to go with it, and publish a few games. Again, Microsoft doesn't really stand to gain much either. People already have Windows whether mGPU works or not. They aren't going to make any more money.

If anyone stands to gain from mGPU, it's AMD. If mGPU worked well, AMD could sell two 580s at a pop so someone could get 2080 performance, or two 5700s to get a 2080 Ti. Nvidia probably doesn't care too much about it because they make way more money on 2080+ GPUs than they would on the lower cards. I'd wager they make more money on a single 2080S than they do on two 2060S purchases. Probably why they have been slowly raising the bar for the GPUs that can do SLI - 960, 1070, 2080.

With all that, both of the GPU makers say it isn't worth spending resources developing drivers and helping studios implement mGPU. Why? Because almost no one uses it.
 

Multiadapter was supposed to make it easy for developers to utilize multiple GPUs, any GPUs, even integrated ones, all at the same time. Apparently MS didn't make it easy enough, though, because no one used it.

From the very article you linked:

"It may be free performance for gamers, but that doesn't mean it's free for developers to implement. As PCPer points out, "Unlinked Explicit Multiadapter is also the bottom of three-tiers of developer hand-holding. You will not see any benefits at all, unless the game developer puts a lot of care in creating a load-balancing algorithm, and even more care in their QA department to make sure it works efficiently across arbitrary configurations."

And:

"Likewise, DirectX12 making it possible for Nvidia and AMD graphics cards to work together doesn't guarantee either company will happily support that functionality. "


If it were as easy as heck, then the free market would supply an employee who could do it rather cheaply, or M$ could subsidize it, similar to how Nvidia sends some of its engineers to game studios to help them out from time to time. So it must not be all that important to M$ beyond a pretty headline from 4 years ago. /shrug

Any other thoughts captain?

You seem to be the only one saying it was easy. The very article pointed out that it wouldn't be easy and that the game developer and hardware driver teams would still have to dedicate resources to it. Just because you decided something should be easy doesn't mean anyone lied to you, especially when they said that it wouldn't be from the beginning.
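
To make the "load-balancing algorithm" part of that quote concrete, here's a minimal sketch (my own illustration, not something from the article) of the kind of per-frame decision an engine has to make with unlinked explicit multiadapter - and this is before cross-adapter copies, memory budgets and QA across arbitrary hardware combos even enter the picture:

```cpp
// Sketch: split each frame's work between two GPUs in proportion to how fast
// each one finished last frame, so both finish at roughly the same time.
#include <cstdio>

struct GpuTiming { double last_frame_ms; };

// Returns the fraction of the frame's work to hand to GPU 0.
// If GPU 0 is twice as fast (half the frame time), it should get ~2/3 of the work.
double SplitForGpu0(const GpuTiming& gpu0, const GpuTiming& gpu1)
{
    double speed0 = 1.0 / gpu0.last_frame_ms;
    double speed1 = 1.0 / gpu1.last_frame_ms;
    return speed0 / (speed0 + speed1);
}

int main()
{
    GpuTiming discrete  {10.0};  // hypothetical: dGPU renders its share in 10 ms
    GpuTiming integrated{30.0};  // hypothetical: iGPU takes 30 ms for the same share
    std::printf("give the discrete GPU %.0f%% of the frame\n",
                SplitForGpu0(discrete, integrated) * 100.0);
    // A real engine also has to keep this stable under load spikes and verify it
    // across arbitrary hardware pairs, which is the "lot of care" the article means.
}
```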
 
From the very article you linked:

"It may be free performance for gamers, but that doesn't mean it's free for developers to implement. As PCPer points out, "Unlinked Explicit Multiadapter is also the bottom of three-tiers of developer hand-holding. You will not see any benefits at all, unless the game developer puts a lot of care in creating a load-balancing algorithm, and even more care in their QA department to make sure it works efficiently across arbitrary configurations."

And:

"Likewise, DirectX12 making it possible for Nvidia and AMD graphics cards to work together doesn't guarantee either company will happily support that functionality. "




You seem to be the only one saying it was easy. The very article pointed out that it wouldn't be easy and that the game developer and hardware driver teams would still have to dedicate resources to it. Just because you decided something should be easy doesn't mean anyone lied to you, especially when they said that it wouldn't be from the beginning.

AMD/Nvidia often send their own engineers to game studios to push brand-specific features. M$ could have done this as well; they just prefer hoarding their money and resources instead, and talking a lot of crap.
 
AMD/Nvidia often send their own engineers to game studios to push brand-specific features. M$ could have done this as well; they just prefer hoarding their money and resources instead, and talking a lot of crap.

Ok, obviously you have created a narrative for yourself and no amount of facts and proof is going to change your mind.
 
I would have really liked a world in which hybrid crossfire worked well. Have an APU, add a GPU and that performance is just added on top of the APU performance.
 
"To be honest, the software is going faster than the hardware, I would say that CrossFire isn't a significant focus"

Isn't that the perfect situation to make multiple GPUs work? I mean, if they can't keep up with the software's demands using a single card, then "just throw in more".
Since AMD doesn't have a real high-end card, it would seem like a good idea to focus on CrossFire, but I guess it's not only their fault. Game developers would need to put in a bit of extra work to make scaling efficient, and that's not worth it for them, or for their partner GPU vendor (which is usually Nvidia with their ****Works).

A dual 5700 XT (or many cheap cards) setup would beat a 2080 TI for much less money in ideal situations and that would make the most expensive card irrelevant.

It has nothing to do with Nvidia, really. Both SLI and CF use the same technique, called AFR (alternate frame rendering). Modern game engines simply don't like how AFR works.
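
A rough sketch of that AFR problem (hypothetical, just to illustrate why engines built around temporal data dislike it): frames alternate between GPUs, but anything temporal - TAA history, reprojection, temporal reflections - needs last frame's result, which now lives on the other GPU.

```cpp
// Sketch: AFR assigns alternating frames to alternating GPUs. Any pass that
// reads the previous frame's output now reads memory sitting on the *other*
// GPU, forcing a cross-GPU copy and sync every single frame.
#include <cstdio>

int main()
{
    const bool uses_temporal_history = true;  // almost every modern engine does
    for (int frame = 0; frame < 6; ++frame)
    {
        int gpu      = frame % 2;        // AFR: even frames on GPU 0, odd on GPU 1
        int prev_gpu = (frame - 1) % 2;  // where last frame's data ended up

        if (uses_temporal_history && frame > 0 && gpu != prev_gpu)
            std::printf("frame %d on GPU %d: must copy frame %d history from GPU %d\n",
                        frame, gpu, frame - 1, prev_gpu);
        else
            std::printf("frame %d on GPU %d: no cross-GPU dependency\n", frame, gpu);
    }
    // With AFR the "copy history" branch is hit every frame, which is why
    // engines built around temporal data scale poorly or stutter in SLI/CF.
}
```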


Multiadapter was supposed to make it easy for developers to utilize multiple GPUs, any GPUs, even integrated ones, all at the same time. Apparently MS didn't make it easy enough, though, because no one used it.
It was supposed to make it easier, but that doesn't mean game developers will suddenly embrace it just because it is much easier to do. Easy or not, it is still regarded as extra work, and not many people run such setups. Game developers are most often already very busy fixing their games post-launch; adding multi-GPU support will only add more problems.

It reminds me .. single core CPUs, then 2 cores.. now 32.. OK I know it is not the same with GPUs, but it all comes down to the right technology implementation, to start the dual GPU core magic!!!
;)
Lol, our GPUs already have thousands of cores inside them, called shaders.
 
Really? A single 1080 Ti was already achieving 4K/60fps in most games back in 2017, and the 2080 Ti is definitely achieving well above 60fps in pretty much all games (unless it's some poorly optimised junk)!

Maybe you are not up to date with GPU tech ......

You clearly haven't done your research. Here is a chart with the 2080 Ti at 4K ultra settings; tell me if those frame rates are all playable. Good luck with that.
[attached chart: RTX 2080 Ti average FPS at 4K, ultra settings]



this is the article it is taken from
 
People here are railing on MS about DX12 - do you really think Nvidia or AMD wants to spend resources to make use of the feature to marry together opposing GPUs? Or game developers?
Be realistic, please.

You clearly haven't done your research. Here is a chart with the 2080 Ti at 4K ultra settings; tell me if those frame rates are all playable. Good luck with that.

Every single one of those games is playable at those framerates. Rephrase the question to "Is this 4K60 gaming?" and then the answer is indeed no.
 
A while ago I bought another HD 6950 for CF and it was the worst decision ever; I had higher fps with one card than with two in CF... in newer games which didn't support it, however...

In my opinion, CF & SLI are a waste of money, especially due to the higher power consumption, if games don't support these features.
 
No, 60 Hz and 40 fps are not considered good when SLI gives you double that, or at the very least 50% more.
 
To be honest, I always thought that there's a conflict between pushing graphics further and playing at higher resolutions. We haven't even reached the point where 4K gaming is really mainstream, and we are already hearing stuff about 8K gaming... I'm starting to doubt that silicon will ever make 4K60 a reality in the mainstream segment, unless we stop pushing better graphics for a while.
 
No, 60 Hz and 40 fps are not considered good when SLI gives you double that, or at the very least 50% more.

That isn't what you asked. You asked if they were all "playable". And yes, every game in that chart was playable at those frame rates. Are they perfect or ideal? No, so turn down the quality settings.
 