Sunday, July 7th 2019

AMD Radeon RX 5700 "Navi" Graphics Cards Lack CrossFire Support

In a comical turn of events, while NVIDIA restored NVLink SLI support for its RTX 2070 Super graphics card by virtue of it being based on the "TU104" silicon, AMD did the opposite with "Navi." The Radeon RX 5700 XT and RX 5700 lack support for AMD CrossFire. If you put two of these cards on, say, a swanky X570 motherboard that splits PCIe gen 4.0 into two x8 slots with bandwidth comparable to PCIe gen 3.0 x16, you won't see an option to enable CrossFire. AMD, responding to our question on CrossFire compatibility, clarified that it dropped CrossFire support for "Navi" in favor of the DirectX 12 and Vulkan "explicit" multi-GPU mode. The older "implicit" multi-GPU mode (CrossFire), used by DirectX 11, DirectX 9, and OpenGL games, is not supported. The AMD statement follows.
Radeon RX 5700 Series GPU's support CrossFire in 'Explicit' multi-GPU mode when running a DX12 or Vulkan game that supports multiple GPU's. The older 'implicit' mode used by legacy DX9/11/OpenGL titles is not supported.
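For readers wondering what "explicit" multi-GPU actually asks of game developers, the sketch below (ours, not AMD's or any engine's code) uses the standard DXGI/Direct3D 12 calls to enumerate every adapter and create a device on each. From that point on the application itself must split the work, copy resources between the GPUs, and synchronize them; the driver no longer pairs the cards behind the game's back the way implicit CrossFire did. Error handling is omitted for brevity.

// Minimal sketch: enumerating adapters and creating one D3D12 device per GPU,
// the starting point of any "explicit" multi-GPU renderer.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicePerGPU()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device); // one independent device per physical GPU
    }
    // Work distribution, cross-adapter copies and fences are entirely up to
    // the application from here; nothing happens automatically.
    return devices;
}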

27 Comments on AMD Radeon RX 5700 "Navi" Graphics Cards Lack CrossFire Support

#1
RH92
Anti-consumer, right? Oh wait, I'm silly, it's not Nvidia, it can't be... or can it?
#2
Zubasa
RH92: Anti-consumer, right? Oh wait, I'm silly, it's not Nvidia, it can't be... or can it?
To be fair, who runs SLI on a 2060 or 2070 anyway?
I doubt anyone with the 2070 Super (that actually supports SLI) runs SLI.
With the lack of multi-GPU support in games, they might as well stop wasting time on it.

Also, performance per dollar already sucks on this generation of GPUs; who would want to pay double for a performance increase they might or might not get?
#3
W1zzard
As mentioned in my reviews, I think this is a good thing, because it frees AMD from wasting time on a feature that nobody really uses, so they can assign the resources to more useful tasks.
#4
RH92
W1zzard: As mentioned in my reviews, I think this is a good thing, because it frees AMD from wasting time on a feature that nobody really uses, so they can assign the resources to more useful tasks.
Zubasa: To be fair, who runs SLI on a 2060 or 2070 anyway?
I know, I know, my comment was just a shout-out to all those giving Nvidia flak for not supporting SLI with the 2060/2070, which ironically they now do with the 2070 Super.

Personally, I've never been a big fan of multi-GPU configurations; I would rather buy a single, more powerful GPU.
#5
the54thvoid
Super Intoxicated Moderator
RH92: Personally, I've never been a big fan of multi-GPU configurations; I would rather buy a single, more powerful GPU.
It's why I bought my current card. And with this showing from AMD, I'm not sure Nvidia needs to worry. Which is unfortunate for all of us. It consumes more power than the 12nm Nvidia equivalent, with more heat and noise.
#6
Darmok N Jalad
I know it's probably a long shot for gaming cards, but I am curious to see how the Vega X2 performs in the new Mac Pro in terms of GPU to GPU communication. It's a different take on multi-GPU, as it uses Infinity Fabric to connect the 2 GPUs instead of the classic bridge chip concept. It certainly suggests that IF AMD wanted to support a multi-GPU configuration, it would skip SLI and just launch a card with 2 GPUs on it.
#7
z1n0x
I personally don't care about multi-GPU gaming, and from what I read it's nothing but trouble no matter the GPU brand.
Also, according to AMD, under 1% of users run multi-GPU setups for gaming.
#8
Unregistered
My thing with AMD and multi-GPU is that they clearly cannot compete with Nvidia on the high end, so they have to find another way to do it. Multiple mid-tier GPUs working together seems like a possible route. I hope they find a way.
#9
Aerpoweron
Razrback16: My thing with AMD and multi-GPU is that they clearly cannot compete with Nvidia on the high end, so they have to find another way to do it. Multiple mid-tier GPUs working together seems like a possible route. I hope they find a way.
There is a way: developers can use the DX12 or Vulkan multi-GPU route, as stated in the article above. Hitman and Hitman 2 already offer that option.
But it is on the game developers now, not the GPU companies.

In my experience, CrossFire/SLI lately causes more trouble in games than it is worth. Now it is up to the 3DMark programmers to give us a DX12 and Vulkan demo that allows multi-GPU support.
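To make that route concrete, here is a minimal sketch (ours, not taken from any shipping game, and assuming the application has already created a VkInstance targeting Vulkan 1.1 or later) of the Vulkan side: the application asks the driver for linked "device groups" and then has to drive them itself, for example by passing a VkDeviceGroupDeviceCreateInfo when it creates the logical device. Error handling is omitted.

// Minimal sketch: discovering linked GPUs through Vulkan 1.1 device groups,
// the explicit replacement for driver-managed CrossFire/SLI.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

void ListDeviceGroups(VkInstance instance)
{
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);

    std::vector<VkPhysicalDeviceGroupProperties> groups(groupCount);
    for (auto &g : groups)
        g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (const auto &g : groups)
    {
        // A group reporting more than one physical device means the driver can
        // link the GPUs; distributing frames or workloads across them is still
        // the application's job.
        std::printf("Device group with %u physical device(s)\n",
                    g.physicalDeviceCount);
    }
}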
#10
64K
CrossFire and SLI are irrelevant these days. The last time I checked the Steam Hardware Survey, a year ago, the share of people using more than one GPU was less than 2% and declining.
#11
yotano211
When I had SLI on a laptop some years ago (970M SLI), it sucked. There were just so many issues with drivers.
#12
dinmaster
One of the things I like about AMD is the approach they take to things, really dragging the industry forward. I was a bleeding-edge multi-monitor guy before Eyefinity came out and was super thankful it did. I also ran CrossFire/SLI in a few systems, and it's more trouble than it's worth. If adoption of DX12 and Vulkan can spread more, this will become a non-issue. Their CPUs are good too. Why have a GPU on a CPU when you're using a video card... doesn't make sense to me. Don't butter up something and make it bloatware. Keep it basic, because in the end it's a GPU and that's all we want it to be.
#13
Grog6
I last ran CrossFire with two 7970s, and that was a room heater. :)

I game at 1080p, so an RX 480 is doing fine, and it won't even run the fans until I load a game.

I may add another RX 480 from eBay at some point, but I don't see any lag, so I probably don't need it at this resolution.

I played Q2 at 30 fps back in the day on an 800x600 monitor; it's been a long time since I saw that much lag, lol.
#14
Vayra86
W1zzard: As mentioned in my reviews, I think this is a good thing, because it frees AMD from wasting time on a feature that nobody really uses, so they can assign the resources to more useful tasks.
It is a good thing, for now, but at the same time it also pushes people more readily toward very expensive higher-end cards. Bit of a double-edged blade in a way.
#15
medi01
the54thvoid: It consumes more power than the 12nm Nvidia equivalent, with more heat and noise.
What consumes more than the 12nm Nvidia equivalent? Are you from planet Earth?

RH92: Anti-consumer, right? Oh wait, I'm silly, it's not Nvidia, it can't be... or can it?
Dual-card usage has dropped sharply in recent years, not least because of NV's move.
Given how small AMD is, spending resources on a handful of percent (less than 1?) of the market would not be wise.
#16
Imsochobo
RH92: Anti-consumer, right? Oh wait, I'm silly, it's not Nvidia, it can't be... or can it?
AMD fanboys will complain regardless of what Nvidia and AMD do.
If the RX 5700 were 100 USD, it'd still be too expensive according to them.
Also, if CrossFire were supported, they'd complain about bad support and call it pointless, so might as well remove it.
If it's removed, they complain it's such an important feature, even though less than 0.1% actually use it.
#17
Tapion
medi01: What consumes more than the 12nm Nvidia equivalent? Are you from planet Earth?

Performance per watt doesn't show actual power consumption.

Here is the relevant chart.

#18
Darmok N Jalad
So how does the 5700 XT compare to the VII in terms of performance? From a quick review of the results I've seen, the 5700 XT isn't too far off from the VII. It certainly has much better performance per watt, suggesting that it's not all 7nm magic in the 5700 XT. I guess HBM and the interposer might contribute to the VII's higher power demand?
#19
RH92
Darmok N Jalad: I guess HBM and the interposer might contribute to the VII's higher power demand?
Actually, it's the opposite: if it didn't have HBM, it would have consumed even more!
#20
moproblems99
Darmok N Jalad: So how does the 5700 XT compare to the VII in terms of performance? From a quick review of the results I've seen, the 5700 XT isn't too far off from the VII. It certainly has much better performance per watt, suggesting that it's not all 7nm magic in the 5700 XT. I guess HBM and the interposer might contribute to the VII's higher power demand?
If you game at 1080p. For those of us with higher resolutions, the VII is still the way to go.
medi01: What consumes more than the 12nm Nvidia equivalent? Are you from planet Earth?
Well, if you look at your chart, the 5700 XT is at 86% while the 2070 is at 95%. That tells me the 5700 XT uses more power relative to its performance than the 2070 does.
Tapion: Performance per watt doesn't show actual power consumption.

Here is the relevant chart.
That would be relevant if they had identical performance. But since they don't, performance per watt is the relevant chart.
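A quick back-of-the-envelope sketch of that reading (ours, purely illustrative): since power equals performance divided by performance per watt, the ratio of two cards' power draw is their performance ratio divided by their perf-per-watt ratio. The equal-performance assumption below is hypothetical, just to show the arithmetic.

// Illustrative sketch: turning relative perf-per-watt figures into a relative
// power estimate. The values 0.86 and 0.95 are the chart figures quoted above.
#include <cstdio>

int main()
{
    const double ppw_5700xt = 0.86; // relative performance per watt
    const double ppw_2070   = 0.95;

    // Hypothetical assumption: both cards deliver roughly equal performance.
    const double perf_ratio = 1.00; // perf_5700xt / perf_2070

    const double power_ratio = perf_ratio * (ppw_2070 / ppw_5700xt);
    std::printf("5700 XT draws about %.0f%% of the 2070's power\n",
                power_ratio * 100.0); // ~110% under these assumptions
}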
#21
Fluffmeister
Nice improvements from AMD, and good to finally see the back of the awful Polaris and Vega cards.
#22
medi01
Tapion: Performance per watt doesn't show actual power consumption.
Actual power consumption doesn't show where that power goes, stranger.
Performance per watt shows you the most relevant power metric imaginable in this context.
#23
CSJr
There are a lot of detractors of CrossFire. "Nobody uses it", "cards are sufficient for X resolution"... Well, some people still use it. I am due for a GPU upgrade soon and have utilized more than one AMD GPU for a while now. It has kept my system relevant from 1080p gaming up to 4K gaming just by adding 'old' cards, despite scaling. But my current cards will not handle the next big thing, ray tracing, which will be prevalent next year.

Those that say it is of no use are really playing into AMD's hand, as they want you to purchase the newest and greatest to attempt 4K and even 8K gaming in the coming years. Why not just add an old card for cheap for better resolution support? Why? Because AMD doesn't get your money if you do that.

Unless all future games support DX12 multi-GPU and, less likely, all previous games patch in DX12 multi-GPU, the extra PCIe slots in your motherboard are going to waste, as you cannot use multiple 5700 XTs for any of those titles. In that case, motherboard manufacturers should just stop making motherboards with multiple PCIe x16 slots, right?

I am all for customization and the freedom to upgrade, and I wish people would not lie down and get run over as they submit to less flexibility. Needless to say, AMD's offerings have reluctantly dropped off my radar. However, a couple of 2070 Supers would look more enticing over the next 6 months, if they supported PCIe 4.0.
#24
Unregistered
CSJr: There are a lot of detractors of CrossFire. "Nobody uses it", "cards are sufficient for X resolution"... Well, some people still use it. I am due for a GPU upgrade soon and have utilized more than one AMD GPU for a while now. It has kept my system relevant from 1080p gaming up to 4K gaming just by adding 'old' cards, despite scaling. But my current cards will not handle the next big thing, ray tracing, which will be prevalent next year.

Those that say it is of no use are really playing into AMD's hand, as they want you to purchase the newest and greatest to attempt 4K and even 8K gaming in the coming years. Why not just add an old card for cheap for better resolution support? Why? Because AMD doesn't get your money if you do that.

Unless all future games support DX12 multi-GPU and, less likely, all previous games patch in DX12 multi-GPU, the extra PCIe slots in your motherboard are going to waste, as you cannot use multiple 5700 XTs for any of those titles. In that case, motherboard manufacturers should just stop making motherboards with multiple PCIe x16 slots, right?

I am all for customization and the freedom to upgrade, and I wish people would not lie down and get run over as they submit to less flexibility. Needless to say, AMD's offerings have reluctantly dropped off my radar. However, a couple of 2070 Supers would look more enticing over the next 6 months, if they supported PCIe 4.0.
I agree. If I were in a situation where a pair of 5700 XTs was looking like a potential upgrade for me, the fact that they nixed XFire support would send me to team green or to wait for Intel. While I haven't been very pleased with Nvidia for a good bit now, I will at least give them credit for maintaining functional SLI support; it still runs absolutely fantastic in all of the demanding AAA games I play that support it, which is about 98% of the ones I own.
#25
Cryio
This more or less confirms that, for the moment, AMD will work with the few devs implementing DX12/Vulkan in their AAA games. For smaller devs that launch DX9/10/11/OpenGL games, CrossFire will work only on GPUs from the architectures prior to RDNA, and only as long as there is developer interest.

So for the blockbuster titles, nothing really changes going forward (Ubisoft should really start implementing DX12/Vulkan in all their incredibly CPU-bound games, though). The percentage of people using multi-GPU setups has always been incredibly small.