
AMD CrossFireX Scaling is Actually Pretty Terrible with Mixed RX Vega

If CrossFireX were working correctly there would be no graphics corruption, and I'm not sure it's a driver problem.
 
The thing is, I simply cannot get CrossFire to work under DX12 for my 290X and 390.
In DX11 the cards work pretty well, but in most DX12 games the 390 pretty much sits idle.
In fact, not even Time Spy works for my 290X + 390 setup.
The same is true with Sniper Elite 4: CrossFire works in DX11 but not in DX12, which is about the dumbest thing ever, since I get either DX12 with async compute or DX11 with CrossFire.

In DX12 it's up to the ISV to implement any mGPU support.

I think Microsoft, Nvidia and AMD have repeated it like 1 million times and people still don't get it.
 
Clearly a driver issue, guys. As always with AMD, they're late in ironing out the quirks. Vega is very different from previous architectures in the way boost is applied. CrossFire between twin 56s or 64s will show that clearly.
 
In 2015, right after the Fury X launch, RTG was spun off from AMD and led by Raja. Now, in 2017, with Raja having delivered two consecutive flops in Polaris and Vega, he's being shown the door. I hope Lisa Su can correct RTG's path before it's too late.

I'd argue Polaris was not a flop. The 4 GB RX 480 had a nice bang-for-buck ratio and was my go-to recommendation for anything below a GTX 1070/1080, right up until the mining craze.

The perplexing part is why they're using Vega instead of Polaris in Raven Ridge. I'd be interested in seeing a budget Vega card with GDDR5.
 
Don't put all your eggs in the basket of a GDDR5-based Vega, as AMD's RTG might never make one just to compete in NVIDIA's sub-$300 market. Sad to see them like this when all we want is good competition in the GPU market... It was lucky for AMD that they showed Raja the door before it got worse.
 
Yeah, well, they're basically on par once the 56 is flashed with the Vega 64 BIOS. It's really close.

Vega's big problem is that its original design was aimed at compute, not gaming. The 56 still makes a good competitor to the 1070.

You still don't get it. It's not about how close they perform; it's about what the GPU is. Different shader counts mean there will be syncing differences, and that certainly contributes to poor performance. It has always been ideal for both cards to be the identical model. One can be factory overclocked or from another vendor, and the driver will just adjust clocks down to the slower card. You can't adjust shader counts.
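To make that pacing point concrete, here's a toy back-of-envelope model (my own assumption, not measured data): with alternate-frame rendering and ideal frame pacing, each GPU delivers every other frame, so the sustainable frame interval is set by the slower card. Mixing a Vega 64 with a Vega 56 then buys nothing over two 56s; the frame times below are hypothetical, simply scaled by shader count (4096 vs 3584).

```python
# Toy model (an assumption for illustration, not real driver behavior):
# with paced alternate-frame rendering, each GPU renders every other
# frame, so the steady frame interval is max(t_a, t_b) / 2.

def afr_fps(frame_ms_a, frame_ms_b):
    """Paced AFR FPS for two GPUs with the given per-frame render times."""
    return 2000.0 / max(frame_ms_a, frame_ms_b)

# Hypothetical per-frame times scaled by shader count (4096 vs 3584):
t_v64 = 16.0
t_v56 = 16.0 * 4096 / 3584   # ~18.3 ms

print(f"V64 + V64: {afr_fps(t_v64, t_v64):.1f} FPS")   # 125.0
print(f"V64 + V56: {afr_fps(t_v64, t_v56):.1f} FPS")   # 109.4
print(f"V56 + V56: {afr_fps(t_v56, t_v56):.1f} FPS")   # 109.4
```

Under this model the mixed pair performs exactly like a pair of the slower card, which is the "you can't adjust shaders" argument in numbers.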
 
ATI died in 2006. AMD killed it. And nearly themselves in the process.

ATI didn't die, AMD just bought them out. Then they died, like you say. Such a shame hey.

I used to love my ATI cards damn it.
 
They have a long way to go with their drivers before they get a dual Vega setup working in the slightest. They should start working on it if they want to make a dual Vega card at some point, which I'm sure they will, because they have nothing better or in the same league as the Titan X.
 
You still don't get it. It's not about how close they perform; it's about what the GPU is. Different shader counts mean there will be syncing differences, and that certainly contributes to poor performance. It has always been ideal for both cards to be the identical model. One can be factory overclocked or from another vendor, and the driver will just adjust clocks down to the slower card. You can't adjust shader counts.

You're thinking of Nvidia cards in SLI; AMD/ATI cards have always been fine with combining different sub-models (e.g. 7970 + 7950, 290X + 290, etc.). This issue simply comes down to bugs in a beta driver.
 
Why on Earth would you mix a Vega 64 and a 56? Just because it works doesn't mean it'll work well. Dual-card setups NEVER worked well with two different cards. Why is this a shock to people some 20 years after dual-card setups first appeared?
I wouldn't sign off on that. I know people who used setups like an HD 5970 + 5870 or an HD 6990 + 6970, for example. Hell, I remember that with a little tweaking, even an HD 2900 XT + HD 3870 worked just fine.

In DX12 it's up to the ISV to implement any mGPU support.

I think Microsoft, Nvidia and AMD have repeated it like 1 million times and people still don't get it.
Yep. At first I was like WTF when my 970 SLI setup used just one card in BF1; then when I switched to DX11 the #2 card woke up.
 
Multi-GPU tests without frame latency data are useless.

Witcher 3 probably never got real CrossFire support.
I wasted a lot of time trying to make it work flawlessly and found no solution for the blinking XP meter, for example.
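On the frame-latency point: average FPS alone can hide exactly this kind of multi-GPU misbehavior. A minimal sketch with made-up numbers, showing two runs with identical average FPS but very different 99th-percentile frame times:

```python
# Sketch (hypothetical numbers): why averages hide microstutter.

def summarize(frametimes_ms):
    """Return (average FPS, crude 99th-percentile frame time in ms)."""
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    p99 = sorted(frametimes_ms)[int(0.99 * len(frametimes_ms)) - 1]
    return avg_fps, p99

smooth  = [20.0] * 100       # steady 20 ms frames
stutter = [10.0, 30.0] * 50  # same average, alternating microstutter

for name, run in [("smooth", smooth), ("stutter", stutter)]:
    fps, p99 = summarize(run)
    print(f"{name}: {fps:.0f} avg FPS, {p99:.0f} ms 99th-percentile frame time")
```

Both runs report 50 average FPS, but the second one's 99th-percentile frame time is 50% worse, which is what you actually feel in game.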
 
While the article is informative and good, and bravo for doing it, I don't understand why the author keeps judging a FIRST BETA driver as if it were the FINAL driver. The way this article is written, it's like: "This is it, folks. This is the best it can ever get. And it's disappointing."
 
Good thing they're showing Raja the door. Every day Vega becomes uglier and uglier. Polish a turd 1000 times and it will still be a turd.

For people saying hybrid CrossFire isn't a thing: you are dead wrong. People even CrossFired the Fury X and Fury, which differ in clock speed as well as in shader and texture unit counts.


You want proof? Here you go, from AMD RTG employee Matt himself on AMD's own community forum:

https://community.amd.com/thread/186648


More proof:
http://www.overclock.net/t/1611844/r9-fury-x-r9-fury-crossfire/0_100


Something is seriously wrong with Vega's design. I'm sure at this point there's no use patching a sinking ship.


In 2015, right after the Fury X launch, RTG was spun off from AMD and led by Raja. Now, in 2017, with Raja having delivered two consecutive flops in Polaris and Vega, he's being shown the door. I hope Lisa Su can correct RTG's path before it's too late.

You're giving Raja way too much credit for everything he did at RTG these last few years. May I remind you, for the 100th time, that both Polaris and Vega were already in the development pipeline before he came back to AMD. And at the end of the day, his role is managerial; he may have had some involvement in designing the GPUs, but he was not a pivotal force in that regard.
 
ATI died in 2006. AMD killed it. And nearly themselves in the process.
Also, it's what gives them a unique advantage against Nvidia or Intel. ATI alone would have been killed by Nvidia; I don't think ATI would have been as competitive as it was 12 years ago. Jen-Hsun Huang is a very innovative person, always finding new markets for his GPUs, and I doubt ATI would have done anything more than follow Nvidia. AMD alone, on the other hand, would have been killed by Intel: no consoles to save it, no APUs to differentiate it from Intel or give it any advantage.
AMD and ATI together, well, they have an advantage as long as x86 is the highest-performing architecture.
 
You're giving Raja way too much credit for everything he did at RTG these last few years. May I remind you, for the 100th time, that both Polaris and Vega were already in the development pipeline before he came back to AMD. And at the end of the day, his role is managerial; he may have had some involvement in designing the GPUs, but he was not a pivotal force in that regard.
He was there to validate the Polaris design and end up with a product described as a danger to the motherboard's PCIe slot, and he was there to approve that "poor Volta" campaign. While a GPU person, he failed to predict the negative publicity of the PCIe issue with Polaris, and he either failed to estimate Vega's gaming performance or agreed to push a misleading campaign. On the other hand, in those rare cases where she was involved with GPUs, Lisa hasn't done everything perfectly either. She was the one calling Fury an "overclocker's dream", and she was the one showing a CrossFire demo of Vegas in presentations before Vega's official announcement. She was probably following the marketing department's suggestions there, but still, if it had been Raja in her place, people probably wouldn't have given him any excuse.
 
Whoever considered multi-GPU / mixed-GPU a great idea while we're transitioning from one API to another, with the extra diversity of Vulkan on top, hasn't been paying attention to the business or to game development.

DX12 is first and foremost a way to cut costs through homogenization with consoles AND to enable higher CPU performance for lower-segment CPUs as core counts go up while clock speeds stay low. The rest? That's just a bonus we may or may not get.
 
The thing is, I simply cannot get CrossFire to work under DX12 for my 290X and 390.
In DX11 the cards work pretty well, but in most DX12 games the 390 pretty much sits idle.
In fact, not even Time Spy works for my 290X + 390 setup.
The same is true with Sniper Elite 4: CrossFire works in DX11 but not in DX12, which is about the dumbest thing ever, since I get either DX12 with async compute or DX11 with CrossFire.
You can try flashing the 390X BIOS onto the 290X. Anyway, I'm successfully running an asymmetric CrossFire setup with a 7950 flashed to a 280 and a 280X Toxic. I made a BIOS with modified frequencies to get exactly the same output from the two cards.
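For anyone wanting to try the same clock-matching trick: a rough first-order estimate (assuming throughput scales with shaders × clock; memory clocks and boost behavior are ignored) of the core clock that levels two different Tahiti cards. The 925 MHz figure is just an example, not the poster's actual setting.

```python
# First-order model: shader throughput ~ shader_count * core_clock.
# Shader counts for the Tahiti cards mentioned in the post above.
SHADERS = {"7950/280": 1792, "280X": 2048}

def matching_clock(ref_card, ref_mhz, other_card):
    """Clock (MHz) for other_card that matches ref_card's throughput."""
    return ref_mhz * SHADERS[ref_card] / SHADERS[other_card]

# Example: run the flashed 7950 at 925 MHz; the wider 280X then needs
# a lower clock to produce "the exact same output from the two".
print(round(matching_clock("7950/280", 925, "280X")))  # 809
```

This only equalizes the shader side; in practice you'd also want to match memory frequencies, as the poster did in the modified BIOS.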
 
Proof that SLI and CrossFire are a dying breed.

Why people still bother these days is beyond me.
 
He was there to validate the Polaris design and end up with a product described as a danger to the motherboard's PCIe slot, and he was there to approve that "poor Volta" campaign. While a GPU person, he failed to predict the negative publicity of the PCIe issue with Polaris, and he either failed to estimate Vega's gaming performance or agreed to push a misleading campaign. On the other hand, in those rare cases where she was involved with GPUs, Lisa hasn't done everything perfectly either. She was the one calling Fury an "overclocker's dream", and she was the one showing a CrossFire demo of Vegas in presentations before Vega's official announcement. She was probably following the marketing department's suggestions there, but still, if it had been Raja in her place, people probably wouldn't have given him any excuse.

Polaris was described as a danger for motherboards by the press, which historically gave AMD crap for everything, no matter how ridiculous. Many cards exceed the PCIe spec, and pretty much all of them do once overclocked, yet damaged PCIe slots were and still are so rare that it often comes down to defective hardware. There was just one report from one guy and the press gobbled it up like crazy. Call me a conspiracy theorist and an AMD fanboy, but that whole charade served no purpose other than to diminish the value of a very compelling product, because it sure as hell didn't save millions from burning their motherboards, which is how this whole thing was supposed to go down according to everyone.

At the end of the day, there was nothing to predict because it was a known thing. The disproportionate response to it was what was unpredictable.

I don't know what misleading campaigns you've seen about Vega and gaming performance; if anything, the amount of information they released was minimal at best. I don't remember any false statements on that matter, and the CrossFire demo was done without any performance metrics or other claims, as far as I remember.
 
Polaris was described as a danger for motherboards by the press, which historically gave AMD crap for everything, no matter how ridiculous. Many cards exceed the PCIe spec, and pretty much all of them do once overclocked, yet damaged PCIe slots were and still are so rare that it often comes down to defective hardware. There was just one report from one guy and the press gobbled it up like crazy. Call me a conspiracy theorist and an AMD fanboy, but that whole charade served no purpose other than to diminish the value of a very compelling product, because it sure as hell didn't save millions from burning their motherboards, which is how this whole thing was supposed to go down according to everyone.

At the end of the day, there was nothing to predict because it was a known thing. The disproportionate response to it was what was unpredictable.

I don't know what misleading campaigns you've seen about Vega and gaming performance; if anything, the amount of information they released was minimal at best. I don't remember any false statements on that matter, and the CrossFire demo was done without any performance metrics or other claims, as far as I remember.
Yes, I know. I was almost screaming at sites (TPU and PCPer) to test overclocked cards, or specifically a GTX 950, if I remember correctly, that had no external power connector and was probably drawing much more than 75 W from the PCIe bus. Almost all sites were testing with a GTX 960 that DID have an extra PCIe connector, conveniently finding nothing wrong with that card.

But on the other hand, it's one thing to have a card stressing the bus after overclocking and another to have a card stressing the bus at its defaults. Polaris should have come out with an 8-pin or two 6-pin connectors, or at lower frequencies. The tech press's response to anything less than perfect from AMD was already known; it wasn't something to predict, and they should have avoided it.

Vega's campaign was also a mistake. AMD is NOT a football team; it shouldn't run a marketing campaign about "poor Volta" as if addressing the typical brainless football fanatic. The CrossFire demo in presentations was also misleading and had people questioning Vega's performance well before its official announcement. Throwing "poor Volta" and CrossFire demos around at the same time sent the opposite signals to people waiting for Vega and backfired in the end. I'm considered an AMD fan, and I was getting attacked by another member on a Greek forum for not believing Vega would come close to the 1080 Ti. Today, AMD fans like that guy have lost all credibility in the eyes of other members, because they believed the "poor Volta" campaign.
 
Yes, I know. I was almost screaming at sites (TPU and PCPer) to test overclocked cards, or specifically a GTX 950, if I remember correctly, that had no external power connector and was probably drawing much more than 75 W from the PCIe bus. Almost all sites were testing with a GTX 960 that DID have an extra PCIe connector, conveniently finding nothing wrong with that card.

But on the other hand, it's one thing to have a card stressing the bus after overclocking and another to have a card stressing the bus at its defaults. Polaris should have come out with an 8-pin or two 6-pin connectors, or at lower frequencies. The tech press's response to anything less than perfect from AMD was already known; it wasn't something to predict, and they should have avoided it.

But just as you said with the 950, this was a known thing that nobody paid attention to previously, so how the hell was AMD supposed to know that this time people would lose their shit? I just find this tremendously ridiculous; AMD is expected not only to deliver performance with no issues but also to be psychic.

I do agree on one thing, though: they do know how the press reacts to everything they put out, but they never took that into account. Unfortunately, when that finally happens, they'll just end up mirroring what Nvidia does, and I'm not so sure the consumer will benefit from that.
 
But just as you said with the 950, this was a known thing that nobody paid attention to previously, so how the hell was AMD supposed to know that this time people would lose their shit? I just find this tremendously ridiculous; AMD is expected not only to deliver performance with no issues but also to be psychic.

I do agree on one thing, though: they do know how the press reacts to everything they put out, but they never took that into account. Unfortunately, when that finally happens, they'll just end up mirroring what Nvidia does, and I'm not so sure the consumer will benefit from that.
As I said, it's one thing to have a card overclocked by the user going over the 75 W limit and another to have a card going over the 75 W limit from the factory. That's where the press found the excuse to attack AMD (again). The 950 was a 75 W card at its limit, but if I remember correctly, in TPU's review they managed to overclock it and get 20% extra performance. Well, you can't have more performance without higher power consumption, can you? From the replies I got here, you can!!! Because it's Nvidia, of course. Anyway, the point is that Polaris went over the PCIe limits at its defaults. There's no excuse for that, and you don't have to be psychic to understand that it could end in bad publicity. AMD wanted to offer a card with only a single 6-pin PCIe connector to make it look as power efficient as Nvidia's cards in the eyes of the consumer. They took that risk, and the press found a nice subject to write about.

The press's behavior has changed since Polaris. It's less negative toward AMD. There are exceptions, like the way this article is written, but they are exceptions, not the rule as they were before Polaris.
 
Because it's an advertised feature, maybe?

Just because it's advertised as "functional" doesn't mean it'll perform well. It's not as if multi-GPU was released just last year; it has been around for decades. And we all know it only really works well when identical cards are paired.
 
Just because it's advertised as "functional" doesn't mean it'll perform well. It's not as if multi-GPU was released just last year; it has been around for decades. And we all know it only really works well when identical cards are paired.
It works best that way, sure. However, it typically never scaled negatively. This IS worse than previous mixed scaling, with the Fury and Fury X for example. Look up some testing. ;)
 