Monday, September 25th 2017

AMD Phasing Out CrossFire Brand With DX 12 Adoption, Favors mGPU

An AMD representative recently answered PC World's query regarding the absence of "CrossFire" branding on the latest Radeon Software release, which introduced multi-GPU support for AMD's Vega line of graphics cards. According to the representative, it comes down to a technicality: "CrossFire isn't mentioned because it technically refers to DX11 applications. In DirectX 12, we reference multi-GPU as applications must support mGPU, whereas AMD has to create the profiles for DX11. We've accordingly moved away from using the CrossFire tag for multi-GPU gaming."
The CrossFire brand has been an AMD staple for years - it even predates AMD's stewardship, having been introduced to the market by ATI in 2005 as a way to market multiple Radeon GPUs used in tandem. For years, this was seen as a semi-viable way for users to spread out their investment in graphics hardware by spending smaller amounts at a time: buy a mid-range GPU now, then pair it with another one later to either bring performance up to the demands of the latest games or match a more expensive single-GPU solution. In practice, it has always been a little hit or miss with both vendors, for a number of reasons.

But now, the power to implement CrossFire or SLI no longer rests solely in the GPU vendors' (AMD's and NVIDIA's) hands. With the advent of DX 12 and explicit multi-adapter, it's up to game developers to explicitly support mGPU, which could even allow graphics cards from different manufacturers to work in tandem - though history has proven that to be more of a pipe dream than anything. AMD phasing out the CrossFire branding is a sign of the times, in which the full responsibility for making multi-GPU setups work no longer rests at AMD's or NVIDIA's feet - at least in DX 12 titles.
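For context on what "explicit" means here: under DX 12's explicit multi-adapter model, the engine itself has to discover and manage every GPU, rather than letting the driver pair them up. The sketch below is a minimal, hypothetical illustration of that first step - enumerating D3D12-capable adapters through DXGI (Windows-only, assuming the standard d3d12/dxgi headers and import libraries); a real engine would then create a device per adapter and divide rendering work between them itself.

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#include <vector>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

// Minimal sketch: list every hardware adapter that could take part in DX12
// "unlinked" explicit multi-adapter. Nothing here is vendor-specific, which
// is why mixing cards from different manufacturers is possible in theory.
int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<IDXGIAdapter1>> usable;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip WARP/software

        // Probe for D3D12 support without actually creating the device yet.
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        __uuidof(ID3D12Device), nullptr))) {
            wprintf(L"GPU %u: %s (vendor 0x%04X)\n", i, desc.Description, desc.VendorId);
            usable.push_back(adapter);
        }
    }
    // A real engine would now create one ID3D12Device per entry in 'usable'
    // and split its frame work across them explicitly - the part AMD/NVIDIA
    // used to handle in the driver for CrossFire/SLI.
    return 0;
}
```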
Source: PCWorld

55 Comments on AMD Phasing Out CrossFire Brand With DX 12 Adoption, Favors mGPU

#26
londiste
RejZoRNot quite. For Crossfire, it was up to AMD to create functional profiles for a game. With mGPU, it's entirely up to game developers to add support for multi-GPU. Which also means the engine has to run DirectX 12, because DX11 doesn't even support such a thing. Not sure how it's with Vulkan in this regard...
well, not exactly.
while crossfire has become mGPU as AMD's marketing term for the thing, nothing has really changed in the technology itself.
the story with DX11 and OpenGL stays the same. Vulkan is a bit up in the air.
for DX12 it matters whether we are talking about implicit or explicit multi-adapter.
implicit is pretty much the same as it is with DX11.
DX12 with explicit multi-adapter is what you are talking about. also, that comes in two flavours - linked (two GPUs, logically similar to SLI/XF, exposed as one logical adapter) and unlinked (each GPU appears as its own adapter).
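To make that linked/unlinked split concrete, here's a rough, self-contained D3D12 sketch (Windows-only, standard d3d12 header and library assumed, illustrative rather than production code) that creates a device on the default adapter and checks how many physical GPUs it exposes:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter (nullptr lets D3D12 pick one).
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) return 1;

    UINT nodes = device->GetNodeCount();
    if (nodes > 1) {
        // Linked explicit multi-adapter: one logical adapter, several physical
        // GPUs ("nodes"), each addressed via a node-mask bit - conceptually the
        // closest thing DX12 has to classic SLI/CrossFire.
        std::printf("Linked adapter group with %u nodes\n", nodes);
    } else {
        // Unlinked explicit multi-adapter: any additional GPUs show up as
        // separate DXGI adapters, each needing its own ID3D12Device, with the
        // application shuttling data between them itself.
        std::printf("Single node; extra GPUs (if any) are separate, unlinked adapters\n");
    }
    return 0;
}
```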
Posted on Reply
#27
GhostRyder
Hence why I don't do multi-GPU anymore. It normally worked well enough in the games I play that I was happy, but it's becoming a problem for AMD and Nvidia to keep up with. I'm probably not going to build multi-GPU systems for myself anymore, so this probably won't matter to me.
Posted on Reply
#28
Steevo
OctopussI thought the whole industry was done with that colossal failure multi GPU setups were? And now I see AMD simply calls it a different name. WTF?
It's DX12 - which is a Microsoft product - that is doing away with Xfire and SLI being controlled in the driver domain, instead moving it to explicit coding in the application. This has little to do with AMD.
Posted on Reply
#30
evernessince
ZoneDymoI like progress, away with crossfire and sli and welcome mGPU.
I doubt many game devs are going to take the time to implement this. It makes zero sense for them financially for 0.0001% of their player base.
Posted on Reply
#31
thesmokingman
evernessinceI doubt many game devs are going to take the time to implement this. It makes zero sense for them financially for 0.0001% of their player base.
Nail, meet head. There's no incentive for them. The only parties that stand to gain from mGPU are AMD and Nvidia, yet they do very little subsidizing of game developers. When you think about it, it's quite ironic.
Posted on Reply
#32
evernessince
thesmokingmanNail, meet head. There's no incentive for them. The only group to stand to gain from mgpu are AMD and Nvidia, yet they do very little subsidizing in regards to game developers. When we think about it is quite ironic.
Nice extrapolation on my thoughts - exactly what I was thinking. As you stated, it only made sense for Nvidia or AMD to do it because they can make money off multi-GPU sales. Game devs do not, and unfortunately, with AMD dropping support, it's going the way of the dodo. I don't think it will be a huge deal though, because AMD is planning to release its scalable Navi architecture, which should allow it to make a GPU of any size it needs. I'm sure Nvidia has one in development as well; they have even said that MCM (multi-chip module) designs are the future.

GPUs stand to benefit the most from MCM because of their massive die sizes. As you increase die size, cost goes up exponentially. MCM technology in GPUs would reduce the cost of producing high-end GPUs by a large amount, and it would allow them to scale a product up to almost any size. I say almost because there is a limit on the number of dies that can be put on a single package with current tech.
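To put a rough number on that argument: the snippet below uses the textbook Poisson yield model (yield = e^(-defect density x die area)) with made-up defect-density and die-size figures, purely to illustrate why one huge die costs disproportionately more per working chip than several small ones. The values are not real foundry or GPU numbers.

```cpp
#include <cmath>
#include <cstdio>

// Illustrative only: classic Poisson yield model, yield = exp(-D0 * A).
// D0 (defects per cm^2) and the die areas below are assumed numbers, not
// actual process or GPU figures.
int main() {
    const double d0 = 0.1;                       // assumed defect density, per cm^2
    const double areas[] = { 2.0, 4.0, 8.0 };    // small chiplet vs. big monolithic die
    for (double a : areas) {
        double yield = std::exp(-d0 * a);
        std::printf("die area %.1f cm^2 -> estimated yield %.0f%%\n", a, yield * 100.0);
    }
    // Output: roughly 82%, 67%, 45%. Four 2 cm^2 chiplets keep small-die yields,
    // while one 8 cm^2 monolithic die loses more than half its candidates,
    // which is why cost per good die climbs much faster than area does.
    return 0;
}
```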
Posted on Reply
#33
FordGT90Concept
"I go fast!1!11!1!"
Thing is, developers generally aren't going to support mGPU unless they're paid to do it by AMD or NVIDIA. It's yet another thing to debug that doesn't really improve their game in a way that matters.
Posted on Reply
#34
bug
thesmokingmanNail, meet head. There's no incentive for them. The only group to stand to gain from mgpu are AMD and Nvidia, yet they do very little subsidizing in regards to game developers. When we think about it is quite ironic.
Tbh, this could make a difference for professional software. There's always someone out there that needs to squeeze just a bit more juice out of any system. But it's really a solution in search of a problem for home users.
Posted on Reply
#35
Vya Domus
FordGT90ConceptThing is, developers generally aren't going to support mGPU unless they're paid to do it by AMD or NVIDIA.
If Nvidia can do it with GameWorks and PhysX, they can do it with this too.
Posted on Reply
#36
FordGT90Concept
"I go fast!1!11!1!"
They can't because of the closer-to-the-metal nature of D3D12 and Vulkan. Drivers no longer have the data they need to split frame rendering to multiple cards.
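To illustrate why that responsibility now sits with the application: in a DX12 linked-adapter setup, the game itself has to say which physical GPU each queue and command list targets, via node masks - something the driver used to decide behind the scenes for SLI/CrossFire alternate-frame rendering. Below is a minimal, hypothetical fragment (assuming 'device' is an ID3D12Device whose GetNodeCount() reported more than one node; names and error handling are illustrative).

```cpp
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Sketch: create a direct command queue and command list bound to one physical
// GPU ("node") of a linked adapter group. 'node' would be 0 or 1 on a two-GPU
// setup; the application decides how frames or passes are split between them.
bool CreatePerNodeObjects(ID3D12Device* device, UINT node,
                          ComPtr<ID3D12CommandQueue>& queue,
                          ComPtr<ID3D12CommandAllocator>& allocator,
                          ComPtr<ID3D12GraphicsCommandList>& list) {
    D3D12_COMMAND_QUEUE_DESC qd = {};
    qd.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    qd.NodeMask = 1u << node;                    // bit n selects physical GPU n
    if (FAILED(device->CreateCommandQueue(&qd, IID_PPV_ARGS(&queue)))) return false;

    if (FAILED(device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                              IID_PPV_ARGS(&allocator)))) return false;

    // Command lists also carry a node mask, so the work recorded into them
    // executes on the GPU the application chose - not one the driver chose.
    return SUCCEEDED(device->CreateCommandList(1u << node,
                                               D3D12_COMMAND_LIST_TYPE_DIRECT,
                                               allocator.Get(), nullptr,
                                               IID_PPV_ARGS(&list)));
}
```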
Posted on Reply
#37
thesmokingman
Vya DomusIf Nvidia can do it with GameWorks and Physx , they can do it with this too.
FordGT90ConceptThey can't because of the closer-to-the-metal nature of D3D12 and Vulkan. Drivers no longer have the data they need to split frame rendering to multiple cards.
I think you missed Vya's point, which, if I have it right, is that since Nvidia was successful in subsidizing something as divisive and pervasive as GameWorks, they could do the same for mGPU. This is more about the business of it and less about the technical merits.
Posted on Reply
#38
FordGT90Concept
"I go fast!1!11!1!"
GameWorks isn't as successful as you think. Case in point: if the game doesn't carry the TWIMTBP logo, GameWorks is rarely supported. Again, we're only going to see mGPU in games where AMD/NVIDIA pay for it to be included.
Posted on Reply
#39
thesmokingman
FordGT90ConceptGameworks isn't as successful as you think. Case in point: if the game isn't TWIMTBP logo'd, GameWorks is rarely supported. Again, we're only going to see mGPU in games that AMD/NVIDIA pay for it to be included.
lol everyone's a contrarian on the internets.
Posted on Reply
#40
FordGT90Concept
"I go fast!1!11!1!"
How many new releases actually get a 90%+ framerate improvement running CrossFire/SLI? 1%? Even that? AMD and NVIDIA have both ended 3- and 4-way support, and both have strongly hinted that the days are running out for 2-way. mGPU, going forward, is in developers' hands, not NVIDIA's and not AMD's.

I can only name two games that use GameWorks off the top of my head: the Arkham series and Witcher 3. PhysX was open-sourced not too long ago because, outside of UE4 and a few other games, it doesn't get used.
Posted on Reply
#41
evernessince
FordGT90ConceptGameworks isn't as successful as you think. Case in point: if the game isn't TWIMTBP logo'd, GameWorks is rarely supported. Again, we're only going to see mGPU in games that AMD/NVIDIA pay for it to be included.
That's likely because GameWorks doesn't have a good track record. I can't think of one time GameWorks made me go "wow" in a positive way. I might go "wow, that really tanks FPS", though. Not only does GameWorks bork AMD cards and prevent AMD from optimizing for the game, it screws over previous-gen Nvidia cards as well. The GTX 900 series comes out, Nvidia way over-tessellates grass in Crysis 2, and what do you know, the 700 series takes a huge performance hit with tessellation enabled. The 900 series can handle a massive amount of tessellation with little issue, and Nvidia exploited that.

GameWorks is more a case of "we pay the devs so they put in features that heavily favor our new cards, to push sales".
Posted on Reply
#42
FordGT90Concept
"I go fast!1!11!1!"
GameWorks only exists to convince developers to get into the NVIDIA ecosystem, not unlike what Microsoft did with DirectX. There are open-source tools (some by AMD) and engine tools (by like-minded developers) available to do everything GameWorks does without being trapped in an ecosystem.
Posted on Reply
#43
londiste
FordGT90ConceptGameWorks only exists to convince developers to get into the NVIDIA ecosystem not unlike what Microsoft did with DirectX. There's open source (some by AMD) or engine tools (by like-minded developers) available to do everything GameWorks does without being trapped in an ecosystem.
developer.nvidia.com/what-is-gameworks
Posted on Reply
#44
bug
evernessinceThat's likely because GameWorks doesn't have a good track record. I can't think of one time GameWorks made me go "wow" in a positive way. I might go "wow that really tanks FPS" though. Not only does GameWorks bork AMD cards and prevent AMD from optimizing for the game but it screws over previous gen Nvidia cards as well. GTX 900 series comes out, Nvidia way over tessellates grass in Crysis 2, what do you know the 700 series takes a huge performance hit with tessellation enabled. The 900 series can handle a massive amount of tessellation with little issue, Nvidia exploited it.

GameWorks is more a "We pay the devs so they can put in features that heavily favor our new cards to push sales".
Apart from a few titles, maxing out whatever GameWorks brings to the table also brings the current high-end cards to their knees. And guess what, that's no different from maxing out settings in games without GameWorks.
Sure, GameWorks puts additional stress on the GPU (additional features do that), but inferring that GameWorks was developed solely as a means to push newer cards is a little out there, imho. Especially since the competition at the high end has been pretty much MIA for a few years now.

PS: Witcher 3 is a GameWorks title that wows. But then again, The Witcher 2 wasn't GameWorks and it still wowed in its time (many couldn't believe it was "only" a DX9 title).
Posted on Reply
#45
Frick
Fishfaced Nincompoop
What about Hybrid Crossfire? APU+GPU never really worked well, but I like the idea, especially if the APUs become more powerful.
Posted on Reply
#46
bug
FrickWhat about Hybrid Crossfire? APU+GPU nevee really worked well, but I like the idea, especially if the APUs become more powerful.
Think about it for a bit. When you upgrade your video card from one generation to the next and get 30-50% more horsepower, best case, a game that ran at 20 fps will run at 30 fps, 40 becomes 60, 60 becomes 90 and 100 becomes 150. Going from 20 to 30 makes an unplayable title possibly playable (if it's a TBS or something), 40 to 60 is much better (though an average of 60 means it will still be below that ~50% of the time), and in the other cases the games remain playable but you can now turn up the settings.
This is an acceptable gain (though in practice I find myself often skipping a generation). But an IGP will not add nearly as much horsepower to your rig, so you'll gain a lot less. Is it worth the hassle?
Posted on Reply
#47
Prima.Vera
If now we have to wait for devs to develop multi GPU engines....then we have to wait. Probably for a looooooooooooooooooooooooooooooooooooooooong time.
Posted on Reply
#48
Frick
Fishfaced Nincompoop
bugThink about it for a bit. When you upgrade your video card from one generation to the next and get 30-50% more HP, best case scenario, a game that ran at 20 fps will run at 30 fps, 40 becomes 60, 60 becomes 90 and 100 becomes 150. Going from 20 to 30 makes an unplayable title possibly playable (if it's a TBS or smth), 40 to 60 is much better (though an avg of 60 means it will still be below that ~50% of the time) and in the other cases the games remain playable, but you can now up the settings.
This is an acceptable gain (though in practice I find myself often skipping a generation). But an IGP will not add as much HP to your rig, thus you'll gain a lot less. Is it worth the hassle?
Yeah, if it'd work well. And it would be nice to get the same boost by just adding a GPU that costs a lot less than you'd otherwise have to spend to get the same performance.
Posted on Reply
#49
bug
FrickYeah, if it'd work well. And it would be nice to get the same boost just adding a GPU that cost a lot less than you'd have to spend to otherwise get the same performance.
Well, if you upgrade, you get to sell your old video card, so upgrading is already quite cheap.
Posted on Reply
#50
GhostRyder
FordGT90ConceptGameWorks only exists to convince developers to get into the NVIDIA ecosystem not unlike what Microsoft did with DirectX. There's open source (some by AMD) or engine tools (by like-minded developers) available to do everything GameWorks does without being trapped in an ecosystem.
But I like upgrading my graphics card just so I can run the main character's hair on Ultra settings :P
evernessinceThat's likely because GameWorks doesn't have a good track record. I can't think of one time GameWorks made me go "wow" in a positive way. I might go "wow that really tanks FPS" though. Not only does GameWorks bork AMD cards and prevent AMD from optimizing for the game but it screws over previous gen Nvidia cards as well. GTX 900 series comes out, Nvidia way over tessellates grass in Crysis 2, what do you know the 700 series takes a huge performance hit with tessellation enabled. The 900 series can handle a massive amount of tessellation with little issue, Nvidia exploited it.

GameWorks is more a "We pay the devs so they can put in features that heavily favor our new cards to push sales".
Yea, I can only think of maybe one GameWorks game that didn't just feel like a detriment to run on any system, for graphics that (while they looked great) were not that special. PhysX was a good idea in the beginning, in my opinion, only because I liked the idea of having it done by a separate piece of hardware to reduce stress on the main GPU, but even that was nothing amazing.

I guess the question is what's next. If Nvidia and AMD are not going to be making profiles anymore (or at least fewer than before), are they going to focus on new ecosystems for developers, or something else?
Posted on Reply