# AMD Radeon R9 290X CrossFire



## W1zzard (Oct 24, 2013)

In this review, we put two AMD Radeon R9 290X cards into a CrossFire configuration, delivering massively improved framerates that will easily handle the latest and greatest titles. With the pair of cards retailing at just $100 more than a single GTX Titan, this is also quite an affordable combination.



----------



## Slomo4shO (Oct 24, 2013)

I was hoping for 5760x1080P performance benchmarks. Thanks for the writeup.


----------



## The Von Matrices (Oct 24, 2013)

I think you need a new thumbnail now.  This just doesn't seem appropriate anymore since the cards no longer need crossfire bridges:


----------



## Supercrit (Oct 24, 2013)

70% for Titan and 100% for 290x crossfire, that's not a mere 30% increase of performance. From Titan's viewpoint, that's 100%/70% which gives *42.8%* increase, it's 30% decrease only from 290x CF viewpoint.


----------



## The Von Matrices (Oct 24, 2013)

I guess you don't have the setup to test, but I'm curious if frame pacing was fixed with this card.


----------



## 15th Warlock (Oct 24, 2013)

As soon as I read the first paragraph my heart was filled with joy: $549! 

Thanks for ending this madness AMD!!!


----------



## Kissamies (Oct 24, 2013)

Awesome performance, and it's sure to get better after a few driver updates


----------



## ChaoticG8R (Oct 24, 2013)

I feel like the bang-for-buck factor goes down significantly using 2 in CrossFire; the performance gains are not very linear and seem to top out at about a 30-70% increase (depending on the title).


----------



## mrwizard200 (Oct 24, 2013)

Blah.

I have this in my cart, but I don't know if my 500W PSU will hold up for long ://


----------



## hardcore_gamer (Oct 24, 2013)

Wizz,

Please test it at 4K.


----------



## Jetster (Oct 24, 2013)

Power consumption? A lot?


----------



## D1RTYD1Z619 (Oct 24, 2013)

Ah I do miss my x1900xts.


----------



## fullinfusion (Oct 24, 2013)

Supercrit said:


> 70% for Titan and 100% for 290x crossfire, that's not a mere 30% increase of performance. From Titan's viewpoint, that's 100%/70% which gives *42.8%* increase, it's 30% decrease only from 290x CF viewpoint.


I agree, but only a great AMD driver can fix the issue.. Still, I'm glad to keep what I got.. Nice try AMD, but not from this guy.


----------



## The Von Matrices (Oct 24, 2013)

On other review sites it seems that the 290X fixed the CrossFire Eyefinity/4K split-screen frame pacing issues.  The 280X and lower still have to deal with it, though, until a proper driver update is released.


----------



## W1zzard (Oct 24, 2013)

hardcore_gamer said:


> Please test it at 4K.



Please send me a monitor 

Right now I think it's stupid to buy a 4K monitor. New models will come out with HDMI 2.0, prices will go down a lot, and NVIDIA G-Sync could make it into new models.


----------



## fullinfusion (Oct 24, 2013)

W1zzard said:


> Please send me a monitor
> 
> Right now I think it's stupid to buy a 4K monitor. New models will come out with HDMI 2.0, prices will go down a lot, and NVIDIA G-Sync could make it into new models.


I bought a LG 55" for 1200 bucks and it's 4k

two credit cards thick and lol it's amazing... roll it up baby!!!


----------



## buggalugs (Oct 24, 2013)

Why write off the card based on the reference cooler? They have always been crap.

 Non-reference 290X will change the game and make the temp/noise issues irrelevant. Or the card is cheap enough to make buying an aftermarket cooler worthwhile.

Not only that, with a decent cooler, I'm sure this card will overclock something awesome and really pull ahead of Titan.


----------



## HTC (Oct 24, 2013)

buggalugs said:


> Why write off the card based on the reference cooler? They have always been crap.
> 
> Non-reference 290X will change the game and make the temp/noise issues irrelevant. Or the card is cheap enough to make buying an aftermarket cooler worthwhile.
> 
> *Not only that, with a decent cooler, I'm sure this card will overclock something awesome and really pull ahead of Titan.*



Why doesn't the card come with a better cooler out of the box?

The fact that it doesn't is what creates these temp/noise issues to begin with.


Dunno about you, but if I were to purchase this sort of card (a high-end card), *which I'm not*, I'd gladly pay more (within reason) if it came with proper cooling out of the box, but that's me ...

EDIT

How much more does a proper cooler cost anyways? $10? $15? $20? If that brought both noise and temps to a more acceptable level, wouldn't you pay the extra money?


----------



## repman244 (Oct 24, 2013)

HTC said:


> Why doesn't the card come with a better cooler out of the box?



So it costs less monies and you can decide yourself which one to use.
Unless they put the void warranty sticker somewhere...


----------



## Jetster (Oct 24, 2013)

They're also not going to wait for aftermarket coolers to release the card


----------



## The Von Matrices (Oct 24, 2013)

HTC said:


> How much more does a proper cooler cost anyways? $10? $15? $20? If that brought both noise and temps to a more acceptable level, wouldn't you pay the extra money?



To dissipate 300W you need a pretty massive cooler.  They're usually around $75 retail.

But open air coolers are cheaper to produce than blower coolers, and since the resellers wouldn't be paying money for the first heatsink, I wouldn't expect them to cost any more than the blower.  Manufacturers typically prefer blowers for reference cards because their cooling performance is less affected by case airflow, especially in multi-card scenarios.  I think AMD didn't like the outcome of the HD 7990, which suffered in two card scenarios because of its open air cooling.

The delay in custom cooling the R9 290X probably just comes down to AMD's core design.  AMD for some reason likes to recess their cores below the shims.  Each model has a different core and shim size, and because of this each AMD card needs a specifically designed cooler with a special base made for it rather than a universal cooler that can be used for all cards.


----------



## HTC (Oct 24, 2013)

The Von Matrices said:


> *To dissipate 300W you need a pretty massive cooler.  They're usually around $75 retail.*
> 
> But open air coolers are cheaper to produce than blower coolers, and since the resellers wouldn't be paying money for the first heatsink, I wouldn't expect them to cost any more than the blower.  Manufacturers typically prefer blowers for reference cards because their cooling performance is less affected by case airflow, especially in multi-card scenarios.  I think AMD didn't like the outcome of the HD 7990, which suffered in two card scenarios because of its open air cooling.
> 
> The delay in custom cooling the R9 290X probably just comes down to AMD's core design.  AMD for some reason likes to recess their cores below the shims.  Each model has a different core and shim size, and because of this each AMD card needs a specifically designed cooler with a special base made for it rather than a universal cooler that can be used for all cards.



The question was meant as how much more *compared* to the already in use cooler.

Also, it's completely different if the cooler manufacturer produces something that *may end up being used in some cards* as opposed to something that *will be used on all reference cards*: volume drives prices down.


Let's say, for the sake of argument, a cooler which provided both acceptable noise and temps for the card's normal mode (disregarding the uber mode here) costs $30 more than the one included, and that drives the price to $579 for the reference model.

Now, redo the math for all the price-related graphs with $579 instead of $549, keeping in mind that noise and temps will be better and that there will be fewer drawbacks in the conclusion (from noise and temps), and answer me this: is the extra $30 worth it?
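The what-if above is easy to run: the $30 premium and the $549/$579 prices come from the posts in this thread, while the flat 100-point performance index is purely a placeholder assumption for illustration (a quieter cooler would, if anything, let the card boost higher):

```python
# Hypothetical check: what does +$30 for a better cooler do to
# performance-per-dollar? The 100-point performance index is an
# arbitrary placeholder, not a review figure.
perf_index = 100.0
ref_price, better_cooler_price = 549.0, 579.0

ppd_ref = perf_index / ref_price
ppd_better = perf_index / better_cooler_price

# Relative change in performance-per-dollar from the $30 premium.
change_pct = (ppd_better / ppd_ref - 1) * 100
print(f"perf/$ changes by {change_pct:.1f}%")
```

So the premium costs roughly 5% in perf/$ before accounting for any gains from reduced throttling; whether that trade is worth it is exactly the question being asked.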


----------



## W1zzard (Oct 24, 2013)

buggalugs said:


> Non-reference 290X will change the game and make the temp/noise issues irrelevant



so far, everybody I talked to said that AMD doesn't allow custom designs for the 290X. This will probably change soon; maybe AMD just wants to sell more of their ref boards asap


----------



## Buff Hamster (Oct 24, 2013)

I think I'll keep with my 7990 @ 1440p


----------



## tt_martin (Oct 24, 2013)

> At 2560x1600 and in its "Uber" mode, the R9 290X CrossFire offers 30% higher performance than the GTX TITAN and 26% higher performance than the GTX 690


> At 2560x1600 and in its "Uber" mode, the R9 290X CrossFire offers *59%* higher performance than the GTX TITAN and *32%* higher performance than the GTX 690


Fixed that for you. Take some math classes.


----------



## AlderaaN (Oct 24, 2013)

Thank you W1zzard, for this review (and the non-CF one).

It baffles me why AMD puts its cooling solution's noise output in the 'Absolutely Not A Priority' category yet again.

The same was the case with its CrossFired HD 7990 card, even though that cooling solution features a triple-fan configuration resembling that of several AIBs.


Regards,


----------



## 1d10t (Oct 24, 2013)

W1zz, there are a couple of things I wanna ask you:


1. This card shows 315W maximum in another thread. Is it a fair assumption that two of these could take 630W?
2. Do these cards require DisplayPort connected to the 3rd monitor to run Eyefinity properly, like previous generations?
3. About CrossFire profiles, does AMD support PLP (Portrait-Landscape-Portrait) Eyefinity mode, which was lacking in the previous generation?

Many thanks in advance


----------



## Supercrit (Oct 24, 2013)

fullinfusion said:


> I agree, but only a great AMD driver can fix the issue.. Still, I'm glad to keep what I got.. Nice try AMD, but not from this guy.



I don't know if you understand what I meant.

Titan is 30% slower than dual 290x.
Dual 290x is 42.8% faster than Titan.

70 x 1.428 =100 
100-(100 x 30%) = 70


----------



## draecko (Oct 24, 2013)

Where is the Titan SLI in this review? 780 SLI? Hello?


----------



## haswrong (Oct 24, 2013)

The Von Matrices said:


> I think you need a new thumbnail now.  This just doesn't seem appropriate anymore since the cards no longer need crossfire bridges:
> 
> http://tpucdn.com/reviews/AMD/R9_290X_CrossFire/images/small.gif



cross-water bridge 




draecko said:


> Where is the Titan SLI in this review? 780 SLI? Hello?


hmm, and wheres the evga 780 classy review?


----------



## Elmo (Oct 24, 2013)

A lot of people want 4K benches, but a lot of people don't have 4K


----------



## HD64G (Oct 24, 2013)

Fix it, Wiz. 42.8% better than Titan is much better than the 30% that you wrote; it absolutely changes the picture of the performance margin over any other GPU. And why don't you make a chart for Eyefinity? You have the numbers.


----------



## NeoXF (Oct 24, 2013)

Okay 2 things that are deeply deeply wrong with this review, okay 3.

1. Are you seriously benching 1600x900 for this setup? Even 1920x1080 is a joke; hitting CPU bottlenecks and seeing most of the cards do ~200fps makes one scratch their head... You could've at least excluded the tests where it's clearly CPU-bound.

2. Z87 without a PLX chip? Really? I hope you realize how much 8x/8x can limit multi-GPU setups, right? Especially now that CrossFire needs the bandwidth most, with the removal of the bridge changing up the multi-GPU game.

Here's a pertinent example (there are more...):






3. No frame-time measurements and such. Really TPU... get with the times...


----------



## W1zzard (Oct 24, 2013)

1d10t said:


> W1zz there's a couple thing a wanna ask you:
> 
> 
> This card shows 315W maximum in another thread. Is it a fair assumption that two of these could take 630W?
> ...



1) yes, that might be possible. more realistic is probably slightly above 600 W because CF doesn't scale 100%
2) well yes, because there is no 3rd place to put it. but you don't need an active adapter anymore, the cheapest passive one will do (they now added a 3rd DVI/HDMI clock generator)
3) did not test


----------



## AlderaaN (Oct 24, 2013)

W1zzard said:


> 2) well yes, because there is no 3rd place to put it. but you don't need an active adapter anymore, the cheapest passive one will do (they now added a 3rd DVI/HDMI clock generator)


That's actually sweet.


----------



## Intel God (Oct 24, 2013)

Wizzard when does the TRI crossfire review come out?


----------



## W1zzard (Oct 24, 2013)

Intel God said:


> Wizzard when does the TRI crossfire review come out?



no plans; I have neither 3 cards nor a platform that supports them.


----------



## TheoneandonlyMrK (Oct 24, 2013)

alderaan said:


> thank you w1zzard, for this review (and the non-cf one).
> 
> It baffles me why amd puts its cooling solution's noise output at the 'absolutely not a priority' category yet again.
> 
> ...



+1


----------



## refillable (Oct 25, 2013)

I remember they promised 'almost 100%' increase; we didn't see it here... We saw about 50% on Uber.

Thinking to myself, this review has several major flaws. You didn't include Titan and 780 SLI, which would be helpful when you see them dropping in price. Second, two of these is a total beast (by today's standards), let alone 3 or 4. 1920x1200 will definitely not cut it, let alone 1680x1050. Bottlenecks will definitely occur. So without testing Eyefinity 5760x1080 and perhaps QuadFullHD 3840x2160, you are not seeing the full potential of the cards. In conclusion, the increase shown in this review could be somewhat misleading.

Oh yes, next time, would you also test frame time variance for us? This is critical in CrossFire or SLI because it is the only way you can tell if a card 'micro-stutters' or not.

Just some constructive criticism, guys; ya need to work harder on the next review (perhaps the 780 Ti, a dumb card, or is it?)!


----------



## 1d10t (Oct 25, 2013)

W1zzard said:


> 1) yes, that might be possible. more realistic is probably slightly above 600 W because CF doesn't scale 100%
> 2) well yes, because there is no 3rd place to put it. but you don't need an active adapter anymore, the cheapest passive one will do (they now added a 3rd DVI/HDMI clock generator)
> 3) did not test



1. Crap... this card is obviously limited by my old P760 
2. HDMI in the middle? Curious how AMD now handles Eyefinity with HDMI enabled. But if they only require a passive adapter... that's good news 
3. So did AMD; they still haven't responded to my last email about that 

But thanks anyway 



refillable said:


> I remember they promised 'almost 100%' increase; we didn't see it here... We saw about 50% on Uber.



Yeah, definitely not all of it...

Battlefield 3
94.6 fps - 47 fps = 47.6 fps, or 101% more

Call of Juarez
174.7 fps - 127.3 fps = 47.4 fps, or 37% more

Crysis 3
34.5 fps - 18.2 fps = 16.3 fps, or 89% more

Far Cry 3
41.5 fps - 19.5 fps = 22 fps, or 112% more

please do some research
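The per-game scaling quoted above can be reproduced directly (fps pairs as posted; rounding may differ by a point here and there):

```python
# (single-card fps, CrossFire fps) pairs quoted from the review charts.
results = {
    "Battlefield 3":  (47.0, 94.6),
    "Call of Juarez": (127.3, 174.7),
    "Crysis 3":       (18.2, 34.5),
    "Far Cry 3":      (19.5, 41.5),
}

for game, (single, crossfire) in results.items():
    gain_pct = (crossfire / single - 1) * 100  # scaling over one card
    print(f"{game}: +{gain_pct:.0f}%")
```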


----------



## Raghar (Oct 25, 2013)

Are a heatsink and water plumbing included? They were lucky with the release date; winter begins in the Northern hemisphere. They can try CrossFire, then sell one card before summer arrives.


----------



## Blín D'ñero (Oct 25, 2013)

The picture is not showing up on the Bioshock page. The HTML shows up instead; you can see the problem is a missing bracket at the end.




----------



## Blín D'ñero (Oct 25, 2013)

W1zzard, thank you for this:  





> DirectX 11.2 is not supported on Windows 7 and requires Windows 8.1, yet Windows 8 and its successor, Windows 8.1, are incredibly unpopular with PC gamers (at least our forum members) due to the idiotic user-interface that insults our intelligence.


  Well-said.

And a question: what is your Crysis 3 testing method? I can't find it in the review. I get the same or slightly higher performance in-game with my HD 7970 CF @ 2560x1600, DX11 Very High, 2x AA, though it fluctuates throughout the game (between 40 ~ 80). 
I suppose you used some benchmark?

I didn't expect one R9 290X to match/beat HD7970 CF in Crysis 3, but it seems in crossfire they perform equally (in Crysis 3 @ 2560x1600).


----------



## Solaris17 (Oct 26, 2013)

missing an ending bracket on this page 4th picture


http://www.techpowerup.com/reviews/AMD/R9_290X_CrossFire/5.html


----------



## Blín D'ñero (Oct 26, 2013)

Hello!


----------



## buggalugs (Oct 27, 2013)

W1zzard said:


> so far, everybody I talked to said that AMD doesn't allow custom designs for the 290X. This will probably change soon; maybe AMD just wants to sell more of their ref boards asap



 I've heard that too, at least for the initial release. I don't think it will be too long before we see them.


----------



## Aerpoweron (Oct 31, 2013)

First, hi @ all, nice forum here as well as very good reviews 

I still like the reference cooler design, since it does not just circulate the hot air. In recent years there are no aftermarket coolers which do that. It is a great advantage in a CrossFire/SLI setup, because the sandwiched cooler does not overheat the card. But still a shame that the cooler is so noisy.

The card seems nice so far; the only problem is that the double precision power is cut down. I do distributed computing on a double precision project, and I run near the Furmark power draw because of that.
I had to send in my dual-fan cards twice, because the fan bearings just gave up (Twin Frozr II design) in SLI.

4K tests would have been nice, but given the prices for these displays (not televisions), it is just not practical yet. I would really love to see a DisplayPort-only monitor. HDMI is just crappy for a computer display, since it often does some scaling up or down.

Again, great review Wizzard.

Edit:

It is about time that VGA is phased out. I can remember Intel wanted to do that in 2013, but they pushed it further into the future.


----------



## MxPhenom 216 (Oct 31, 2013)

Aerpoweron said:


> First, hi @ all, nice forum here as well as very good reviews
> 
> I still like the reference cooler design, since it does not just circulate the hot air. In recent years there are no aftermarket coolers which do that. It is a great advantage in a CrossFire/SLI setup, because the sandwiched cooler does not overheat the card. But still a shame that the cooler is so noisy.
> 
> ...



It's the standard blower design that has been done on reference cards from both AMD and Nvidia for years. There is also a vendor that does aftermarket blower designs for Nvidia: EVGA, on their Classified cards.

Most non-reference cards don't have the blower design because it doesn't work as well for cooling and it's typically a lot louder. If you have a case with good airflow, the heat that's released into the case from such designs is not much of a problem.


----------



## Aerpoweron (Oct 31, 2013)

> Most non-reference cards don't have the blower design because it doesn't work as well for cooling and it's typically a lot louder. If you have a case with good airflow, the heat that's released into the case from such designs is not much of a problem.



I have to agree, but only for a non-CrossFire or SLI setup. The card with the fan or fans between the cards always gets hotter. What is also an issue for me: the exhaust heat of the non-blower cards goes upwards inside the case and heats up the CPU cooler as well, before the heat gets blown out. Not a problem for gaming, but if you run GPU and CPU at full speed, you can hear the difference 

By the way, did the question ever get answered whether you need 1 or 2 CrossFire bridges to connect 2 cards? I always used one without problems. But now we don't need them any more


----------



## MxPhenom 216 (Oct 31, 2013)

Aerpoweron said:


> I have to agree, but only for a non-CrossFire or SLI setup. The card with the fan or fans between the cards always gets hotter. What is also an issue for me: the exhaust heat of the non-blower cards goes upwards inside the case and heats up the CPU cooler as well, before the heat gets blown out. Not a problem for gaming, but if you run GPU and CPU at full speed, you can hear the difference
> 
> By the way, did the question ever get answered whether you need 1 or 2 CrossFire bridges to connect 2 cards? I always used one without problems. But now we don't need them any more



The top card will be hotter than the bottom regardless when it comes to air cooling. Also, CPUs run so cool these days that heat from a GPU, if it makes its way through the CPU heatsink, will not increase temps all that much. The key is case airflow.

Not to mention the 290X reference cooler is junk. A card shouldn't be hitting 95°C under load or making 50 dB of noise at the same time. 

For 2 cards, 2 bridges make no difference compared to 1.


----------



## Aerpoweron (Oct 31, 2013)

> For 2 cards 2 bridges does not make a difference compared to 1



Yes, but I've seen threads pages long, with different statements from MSI and Sapphire.

With the power draw of GPUs rising, I wonder what cooling solutions will be used by the GPU manufacturers in the future.

My brother and I play in an average room on midrange computers. But it is enough to heat the room.

When I look at your specs, your PC also has a noticeable heat output under load.
Oh, where can I enter my PC specs? I didn't find it in my profile.


----------



## MxPhenom 216 (Oct 31, 2013)

Aerpoweron said:


> Yes, but I've seen threads pages long, with different statements from MSI and Sapphire.
> 
> With the power draw of GPUs rising, i wonder what cooling solutions will be used by the GPU manufacturers in the future.
> 
> ...



The power draw isn't rising (well, in some cases it is). It's actually going down; 20nm GPUs should have considerably lower TDP and power draw.


----------



## The Von Matrices (Oct 31, 2013)

MxPhenom 216 said:


> The power draw isn't rising (well, in some cases it is). It's actually going down; 20nm GPUs should have considerably lower TDP and power draw.



A more detailed explanation is that the cards' power draw is already at its limit, and unless there is a monumental advance in air cooling we won't see cards go over 300W.

20nm should theoretically bring lower power draw, but not for the reason you think.  In the past a lower transistor size meant less leakage (wasted power) per transistor, which resulted in less power draw.  However, the most modern silicon manufacturing processes are having trouble controlling leakage as transistor size decreases, so today leakage is more or less held constant as transistors shrink.  The problem with smaller and smaller transistors but constant leakage per transistor is that heat becomes more concentrated.  You actually get higher temperatures for the same power draw.  In order to keep the chip from overheating, you have to ratchet down the power draw.

It's actually going to become harder and harder to make a power guzzling card in the future because of this; the chip will overheat before it is able to draw that much power.
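The thermal-density argument can be illustrated with rough numbers. The die areas below are hypothetical (a ~440 mm² Hawaii-class die idealized to shrink to half its area), chosen only to show the trend, not taken from any spec sheet:

```python
# Same power budget on a die of roughly half the area: the heat the
# cooler must pull out per square millimetre nearly doubles.
power_w = 250.0                      # assumed GPU power budget
area_28nm_mm2 = 440.0                # hypothetical Hawaii-class die size
area_20nm_mm2 = area_28nm_mm2 / 2    # idealized full-node shrink

density_28 = power_w / area_28nm_mm2
density_20 = power_w / area_20nm_mm2

print(f"28 nm: {density_28:.2f} W/mm^2, 20 nm shrink: {density_20:.2f} W/mm^2")
```

To hold the hotspot temperature constant, the shrunk chip has to give back some of that power budget, which is the point made above.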


----------



## Steevo (Oct 31, 2013)

The Von Matrices said:


> A more detailed explanation is that the cards' power draw are already at their limit, and unless there is a monumental advance in air cooling we won't see cards go over 300W.
> 
> 20nm should theoretically bring lower power draw, but not for the reason you think.  In the past a lower transistor size meant less leakage (wasted power) per transistor, which resulted in less power draw.  However, the most modern silicon manufacturing processes are having trouble controlling leakage as transistor size decreases, so today leakage is more or less held constant as transistors shrink.  The problem with smaller and smaller transistors but constant leakage per transistor is that heat becomes more concentrated.  You actually get higher temperatures for the same power draw.  In order to keep the chip from overheating, you have to ratchet down the power draw.
> 
> It's actually going to become harder and harder to make a power guzzling card in the future because of this; the chip will overheat before it is able to draw that much power.



Thermal density: the same issue Nvidia faced with the 4xx series. The real issue is that the advanced SOI process is too expensive to use for a GPU. When we make a GPU core efficient enough and the die size small enough, power consumption will drop dramatically.

Imagine either camp's core running at 4GHz instead of 1 with the same thermal load?


----------



## Aerpoweron (Nov 3, 2013)

*Double precision*

Thanks for the info, Steevo and The Von Matrices.

But I have another question. I have done some searching and I can't find any sources for the double precision compute power of the 290X. Some sites say 1/4 of single precision and others say 1/8th.

If someone is running Milkyway@home, it would be nice to see some times from the work units. If 1/8, then the times are comparable to the 6970's. If 1/4, then it should be about 30 to 40% faster than on the 7970.

Since I don't have the money at the moment, I can't test it myself.


----------



## Steevo (Nov 4, 2013)

I have seen numbers from F@H and it seems about 20% faster than a 7970 with decent temps; I am not sure if F@H uses DP or SP, however.


If it was cut to 1/8, it was due to temps and power draw, to keep it from burning up. This GPU is extremely efficient per mm² for performance; I hope they find some power savings in a future spin and push it further.


----------



## reedy777 (Nov 12, 2013)

Supercrit said:


> 70% for Titan and 100% for 290x crossfire, that's not a mere 30% increase of performance. From Titan's viewpoint, that's 100%/70% which gives *42.8%* increase, it's 30% decrease only from 290x CF viewpoint.



True that. Also, at the moment these cards are limited in CrossFire to 67% due to heat, according to other reviews. It would be nice to see some water-cooled benchies; from what I've seen, the 290 pars the 780 Ti, max OC for max OC.


----------



## reedy777 (Nov 12, 2013)

reedy777 said:


> True that. Also, at the moment these cards are limited in CrossFire to 67% due to heat, according to other reviews. It would be nice to see some water-cooled benchies; from what I've seen, the 290 pars the 780 Ti, max OC for max OC.



That's the 24/7 OC, and I mention it because I recently upgraded to 2560x1440 and I specifically want to know if you can run Crysis 3 at a smooth 60fps. It took me 3 680's @ 1150 to do it at 1080p, but I think the 2 gb ram might be holding me back


----------



## Blín D'ñero (Nov 28, 2013)

"2 GB RAM" you have??? Much too little, and the shortage problem gets exponentially bigger whether it's XP x86 or Windows 7 x64.

I have an average of 50 ~ 60 fps in Crysis 3 Very High / 4 x AF @ 2560 x 1600 / 60 on crossfire 7970 and more than enough RAM with 16 GB.


----------



## BIGARC (Jan 25, 2014)

Blín D'ñero said:


> "2 GB RAM" you have??? Much too little, and the shortage problem gets exponentially bigger whether it's XP x86 or Windows 7 x64.
> 
> I have an average of 50 ~ 60 fps in Crysis 3 Very High / 4 x AF @ 2560 x 1600 / 60 on crossfire 7970 and more than enough RAM with 16 GB.


 

I'm pretty sure he meant 2 GB VRAM, mate, lol, as the 680 comes with 2 GB VRAM... and your 16 GB system RAM won't make a difference. And you're using 4x AF, not 16x, and what about MSAA x8? I'm not having a go, I'm just saying. I have an XFX R9 290X with an EK waterblock clocked at 1130 core / 1380 mem; it gets very, very close to 60 fps average @ 1080p in Crysis 3 maxed with 8x MSAA etc.


----------

