# AMD Radeon R9 295X2 8 GB



## W1zzard (Apr 2, 2014)

Today, AMD is launching their Radeon R9 295X2, a dual-GPU card based on two fully unlocked, fully clocked Hawaii graphics processors. As a result, the card delivers impressive numbers in 4K and EyeFinity. But with a price of $1500, it is certainly not cheap, no matter how you look at it.

*Show full review*


----------



## Fluffmeister (Apr 8, 2014)

Impressive performance, but I just couldn't entertain that sort of power consumption.


----------



## Xzibit (Apr 8, 2014)

*The AMD Radeon™ R9 295X2 graphics card. It's too fast.*


----------



## W1zzard (Apr 8, 2014)

the real video you should watch instead of amd propaganda


----------



## Ed_1 (Apr 8, 2014)

What was used for load in the vid? The core clocks and voltage are all over the place, like 920 to 1000.
Man, those VRMs are getting hot, and I agree it should have been 3x 8-pins.


----------



## sweet (Apr 8, 2014)

Fluffmeister said:


> Impressive performance, but I just couldn't entertain that sort of power consumption.


That power consumption is needed to power a pair of Hawaii XT chips at 1018 MHz, and it's worth it. This card beats 780 Ti SLI, even on the newest "wonder driver", which means it will destroy the Titan Z and become the fastest single card.


----------



## Sasqui (Apr 8, 2014)

What a monster!  W1zzard, when you have the time (lol), it would be more than interesting to compare 290x in CF with newer drivers against this (and at higher resolutions) than the original review:  http://www.techpowerup.com/reviews/AMD/R9_290X_CrossFire/


----------



## msamelis (Apr 8, 2014)

^ lol at that video Xzibit, I haven't seen something so cheesy for a while.

Considering the card, I find the way the stock cooler is implemented curious and interesting, but at the same time, I find it a little ridiculous to have to go to such lengths to keep the temperatures down. Let's not even talk about the price; I don't see the point at all. Then again, it has always been like this for over-the-top cards.

Edit: Thanks for the second video, W1zzard. So the cards don't run as hot, but the VRMs are glowing hot while the whole thing is very loud... That's a little disappointing.


----------



## W1zzard (Apr 8, 2014)

Ed_1 said:


> What was used for load in the vid? The core clocks and voltage are all over the place, like 920 to 1000.



not furmark, but something that mimics the peak load of the most demanding games


----------



## sweet (Apr 8, 2014)

Sasqui said:


> What a monster!  W1zzard, when you have the time (lol), it would be more than interesting to compare 290x in CF with newer drivers against this (and at higher resolutions) than the original review:  http://www.techpowerup.com/reviews/AMD/R9_290X_CrossFire/


You can find it here. 295x2 is faster than CF 290x 
http://www.tomshardware.com/reviews/radeon-r9-295x2-review-benchmark-performance,3799-17.html


----------



## darkangel0504 (Apr 8, 2014)

a monster card


----------



## Fluffmeister (Apr 8, 2014)

sweet said:


> That power consumption is needed to power a pair of Hawaii XT chips at 1018 MHz, and it's worth it. This card beats 780 Ti SLI, even on the newest "wonder driver", which means it will destroy the Titan Z and become the fastest single card.



I don't disagree, but that doesn't change the fact it's far too juicy for me.

As for the 337.50 driver update:

http://www.guru3d.com/articles_pages/amd_radeon_r9_295x2_review,29.html


----------



## Overclocker_2001 (Apr 8, 2014)

The VRMs and PCI-E plugs will die in less than 3 years.
Sure, whoever buys this card has no money problems and will just buy another powerful card next year, but if I have a 25°C room temperature (35°C inside the case), those VRMs will throttle for sure, or die in too little time.
Reminds me of the sky-high VRM temperatures on the X1950 XTX and 8800 Ultra (with tons of RMAs).


----------



## Xzibit (Apr 8, 2014)

Overclocker_2001 said:


> The VRMs and PCI-E plugs will die in less than 3 years.
> Sure, whoever buys this card has no money problems and will just buy another powerful card next year, but if I have a 25°C room temperature (35°C inside the case), those VRMs will throttle for sure, or die in too little time.
> Reminds me of the sky-high VRM temperatures on the X1950 XTX and 8800 Ultra (with tons of RMAs).



Who cares about the card.

I want that Haswell processor he was using that never hit 20C to even register in the FLIR


----------



## W1zzard (Apr 8, 2014)

Xzibit said:


> I want that Haswell processor he was using that never hit 20C to even register in the FLIR



and the fan on the CPU heatsink was disconnected, too.


----------



## LeonVolcove (Apr 8, 2014)

I don't know what else to say EXCEPT that the price seems too high.


----------



## Sasqui (Apr 8, 2014)

sweet said:


> You can find it here. 295x2 is faster than CF 290x
> http://www.tomshardware.com/reviews/radeon-r9-295x2-review-benchmark-performance,3799-17.html




Eww, I feel violated!  I just clicked on a TH link.  Ick.

Seriously though, the few gaming benchmarks I see show them about neck and neck.


----------



## THE_EGG (Apr 8, 2014)

omnomnom. This looks like a tasty video card. Too bad it is WAY out of budget for me haha. Nice thermal imaging part of the review too.

Although I will say that the lack of fan control is a little lame.


----------



## Spaceman Spiff (Apr 8, 2014)

sweet said:


> That power consumption is needed to power a pair of Hawaii XT chips at 1018 MHz, and it's worth it. This card beats 780 Ti SLI, even on the newest "wonder driver", which means it will destroy the Titan Z and become the fastest single card.



Do you have a source for this? "This card beats SLI 780Ti" 
TPU does not have a review of Ti's in SLI and looking at anandtech's review, I see the 780Ti SLI and 295x2 trading places everywhere, game and resolution dependent.


----------



## btarunr (Apr 8, 2014)

Spaceman Spiff said:


> Do you have a source for this? "This card beats SLI 780Ti"
> TPU does not have a review of Ti's in SLI and looking at anandtech's review, I see the 780Ti SLI and 295x2 trading places everywhere, game and resolution dependent.



http://www.anandtech.com/show/7930/the-amd-radeon-r9-295x2-review

They also used 337.50 for NVIDIA. So even with the best driver, 780Ti SLI is slower than 295X2. Which means TITAN-Z will suck, especially at its $3000 price.


----------



## ShurikN (Apr 8, 2014)

Spaceman Spiff said:


> Do you have a source for this? "This card beats SLI 780Ti"
> TPU does not have a review of Ti's in SLI and looking at anandtech's review, I see the 780Ti SLI and 295x2 trading places everywhere, game and resolution dependent.


If you bothered to open the guru3d link you would see that the 295 is faster than 780ti sli (with new driver) in 6 out of 8 games. Taking 4k res into consideration ofc.
Tom's showed it  as well.


----------



## Durvelle27 (Apr 8, 2014)

THE_EGG said:


> omnomnom. This looks like a tasty video card. Too bad it is WAY out of budget for me haha. Nice thermal imaging part of the review too.
> 
> Although I will say that the lack of fan control is a little lame.


There will be soon, as it's a driver limitation.


----------



## birdie (Apr 8, 2014)

An excellent review, thanks!

Finally with 4k.

I'm kinda confused that 780(ti) SLI and 290(x) crossfire results are missing in the charts. Hopefully you'll include them when reviewing titan z.


----------



## FreedomEclipse (Apr 8, 2014)

> Uncompromised Philosophy
> Uncompromised Power
> Uncompromised Performance



Compromised CrossFire performance - for lack of a better permanent fix (and a better driver dev team)

---

They keep promising permanent fixes, but the current one is still temporary. It still works, but it's like using a band-aid in place of stitches: you can't seal the wound without the proper tools/materials.


----------



## Xzibit (Apr 8, 2014)

btarunr said:


> http://www.anandtech.com/show/7930/the-amd-radeon-r9-295x2-review
> 
> They also used 337.50 for NVIDIA. So even with the best driver, 780Ti SLI is slower than 295X2. Which means TITAN-Z will suck, especially at its $3000 price.



Over at *[H]ardocp - AMD Radeon R9 295X2 Video Card Review* they also tested with 337 and noticed this.



			
Hardocp said:

> With the GeForce GTX 780 Ti we found the peak consistent clock speed on both GPUs went up to 1019MHz while gaming. This is higher than the boost clock on a GTX 780 Ti which is 928MHz. As we posted on the previous page, this seems slightly higher than we've tested in the past. Normally we've seen the GPU hit 1006MHz while gaming, but now it is at 1019MHz with this newest driver. We also noticed the temperature of the GPU was higher, at 87c, versus 84c on previous drivers. This higher temperature threshold has allowed the frequency to go higher, hence the 1019MHz. In any case, this means the GTX 780 Ti SLI configuration was providing us higher performance for this round of testing, so it definitely got to give us its best shot at stock performance without overclocking.



Seems the new 337 drivers might be upping the thermal threshold on NVIDIA cards, letting them run faster.


----------



## birdie (Apr 8, 2014)

W1zzard said:


> the real video you should watch instead of amd propaganda



Hey, you should have spoken a few words during filming this cause it's hard to compare to anything since there are no other audio sources we can relate to.


----------



## darkangel0504 (Apr 8, 2014)




----------



## btarunr (Apr 8, 2014)

darkangel0504 said:


>



The background score in that video sounds like something out of Brazzers.


----------



## W1zzard (Apr 8, 2014)

birdie said:


> Hey, you should have spoken a few words during filming this cause it's hard to compare to anything since there are no other audio sources we can relate to.


I can't speak out of the video card (to have the proper noise levels you could calibrate on)


----------



## TheBrainyOne (Apr 8, 2014)

With single GPU cards reaching 300W TDPs, I guess it's not unexpected that dual GPU cards are reaching 500W levels. Still, 2 pre-overclocked GTX 780 Ti cards would be a *much* better buy.



> The card requires two 8-pin PCI-Express power connectors. Normally this configuration would be good for up to 375 W of power draw. AMD however has chosen to exceed the specifications, citing a minimum of 28A (=336 W) for each connector, which brings the total to around 750 W.



Is AMD crazy or what?! Was it that hard to put another 8-pin connector on there?!
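For reference, the quoted figures work out with some simple arithmetic; here is a minimal sketch of that math, using only the 75 W slot / 150 W connector spec values and the 28 A @ 12 V assumption from the quote above:

```python
# Power budget math for the R9 295X2's two 8-pin PCIe connectors.
# Spec values: each 8-pin PEG connector is rated for 150 W, and the
# PCIe slot itself supplies up to 75 W.
SLOT_W = 75
EIGHT_PIN_SPEC_W = 150

spec_total = SLOT_W + 2 * EIGHT_PIN_SPEC_W
print(f"In-spec budget: {spec_total} W")            # 375 W

# AMD instead assumes each connector's 12 V wiring carries at least 28 A:
AMPS_PER_CONNECTOR = 28
VOLTS = 12
per_connector = AMPS_PER_CONNECTOR * VOLTS          # 336 W per connector
amd_total = SLOT_W + 2 * per_connector
print(f"AMD's assumed budget: ~{amd_total} W")      # ~747 W, i.e. "around 750 W"
```

So each 8-pin is being asked to deliver more than double its 150 W rating, which is the crux of the complaint.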


----------



## Xzibit (Apr 8, 2014)

TheBrainyOne said:


> With single GPU cards reaching 300W TDPs, I guess it's not unexpected that dual GPU cards are reaching 500W levels. Still, 2 pre-overclocked GTX 780 Ti cards would be a *much* better buy.
> 
> 
> 
> Is AMD crazy or what?! Was it that hard to put another 8-pin connector on there?!



They ran out of room.

I believe it's the same length as the 7990. They would have had to make it longer if they did.


----------



## TheBrainyOne (Apr 8, 2014)

Xzibit said:


> I believe it's the same length as the 7990. They would have had to make it longer if they did.



Yeah, because a guy purchasing a 500W, $1500 piece of hardware won't have a big enough case. Stupid AMD.


----------



## Delta6326 (Apr 8, 2014)

Great card! Sadly the whole thing is ruined by coil whine!


----------



## newtekie1 (Apr 8, 2014)

W1zzard said:


> the real video you should watch instead of amd propaganda



Look at how hot those 8-Pin connectors get, and I think we can even see the 24-pin in there and it is getting super hot too.  I have a feeling this card is going to melt connectors.


----------



## Mistral (Apr 8, 2014)

That thing is a beastly monster, and it serves to underline one thing: *we needed freaking 20nm months ago*...


----------



## The Von Matrices (Apr 8, 2014)

You have to give AMD's engineers credit for this one.  They crammed a 1024 bit memory bus and enough circuitry to provide 500W of power all on a normal height reasonably long PCB.  One other site said the PCB is 14 layers, which is quite an achievement.


----------



## 15th Warlock (Apr 8, 2014)

I tip my hat to both AMD's and Asetek's engineering teams. They have accomplished what IMHO was impossible: taming two fully featured Hawaii XT cores with a single 120mm radiator, and designing a single board with a combined 1024-bit GDDR5 memory bus. They clearly had a no-compromise design in mind, and they obviously achieved that goal.

Amazing performance, low noise levels and temperatures; what a big difference investing in a premium cooling solution has made for the beastly 290X core. Still, IMHO, if you have the room for two cards, that's a better deal than going for a dual-GPU single card. But boy, what a fantastic piece of engineering this card is!

The writing is on the wall: this card is probably going to be as fast as, if not faster than, the Titan Z, and at half the price. See, boys and girls? This is why we need a healthy AMD to bring the heat to NVIDIA and create a competitive environment. At $1500 I would personally not buy this card, but you can bet the bean counters at the green team are on full alert trying to figure out how to compete with this card at this price point. A 780X2? Who knows.



The Von Matrices said:


> You have to give AMD's engineers credit for this one.  They crammed a 1024 bit memory bus, and enough circuitry to provide 500W of power all on a normal height reasonably long PCB.  One other site said the PCB is 14 layers, which is quite an achievement.



You posted this while I was writing my post, it's eerie how similar our conclusions on this card are, you know what they say about great minds....


----------



## Xzibit (Apr 8, 2014)

btarunr said:


> The background score in that video sounds like something out of *Brazzers*.



TMI

Now we know where to find you when the news is delayed.


----------



## the54thvoid (Apr 8, 2014)

In all honesty, I can't be impressed, but that is down to personal preference. If I wanted to go with a dual-GPU solution, I'd buy two standard 290Xs and put blocks on them. I've not yet checked out the 290X CrossFire comparisons, but without water won't those be downclocking, making an apples-to-apples comparison difficult?

This is simply what the 6990 and 7990 were before, except this time they've had to slap a hybrid water solution on it. For this price, a more genuine approach would be an EKWB dual full-cover block with an integrated pump/rad combo. Given the current pricing on a single 290X, I don't think the price is 'unrealistic' (you want that, look to the Titan Z), but I think the hybrid solution is a cheap one and the card is pretty ugly (not that it matters).

The power consumption is absolutely irrelevant in comparison, as a 780 Ti SLI solution draws the same (according to Guru3D). Performance-wise, it looks like AMD has clinched the single-card crown back: Titan Z results are yet to come out, but NV looks likely to lose (going by the 780 Ti comparison).

If NV released a price-comparable card (GTX 790, reduced compute, 6GB combined memory) then it would be a very good fight, but as it is, in the dual-GPU battle, the $3000 Titan Z isn't even a competitor against a card half that price and (arguably) better performing.

They're still both crap compared to a sensible single-GPU/single-screen combo choice (he says, stroking his Classified 780 Ti).

And FTR, the Guru3D benches at 4K (on the new 337 driver page) make the 4GB-versus-3GB arguments look very tired.


----------



## Casecutter (Apr 8, 2014)

W1zzard said:


> not furmark, but something that mimics the peak load of the most demanding games



Yeah, that's what I wanted to know; nice to see it's not a full-on stress test. Although, perhaps just as a "one-time" baseline, it might be nice to see what it looks like in a real "worst case": actually gaming for that same amount of time. Once we see that, I/we can say "al-righty-then," the simulation appears to mimic what real-world gaming stresses.

And while such results are "glowing," these being the first, it's hard to say they have any true implication; we'll need to balance that against more cards/results.

As for this card... like all dual-chip solutions, it isn't my cup of tea. But at $1500, more or less just knocking off an ARES II, it's not as refined, evolutionary, or truly/adequately engineered. It almost feels like marketing drove this and shaped what we got, which is always bad. Almost as if someone in engineering said the PCB and layout were done and hanging around, but the cooling needed "sophistication," which meant time and money, and marketing and the bean counters said to just get it checked off the list, quickly and with no extra cost. Sadly, I see it all too often, and it actually works against them.


----------



## W1zzard (Apr 8, 2014)

Casecutter said:


> Yeah, that's what I wanted to know; nice to see it's not a full-on stress test. Although, perhaps just as a "one-time" baseline, it might be nice to see what it looks like in a real "worst case": actually gaming for that same amount of time



Drivers can detect Furmark and throttle the card, and people would cry "unrealistic" either way. I think the test I have is quite good and kinda represents worst case in realistic usage. Furmark would be just to show everything "omgz hot", which is not the point of this test. It would also affect the noise recordings, which are there to provide additional insight, because dBA numbers are not so easy to grasp


----------



## ISI300 (Apr 8, 2014)

Such a great review. Thanks, Wizz. Also thank god those cables didn't catch fire!


----------



## mr2009 (Apr 8, 2014)

why not run the middle tube water through the copper channel and back into the 2nd water pump?


----------



## ISI300 (Apr 8, 2014)

Good point, well made.


----------



## TheBrainyOne (Apr 8, 2014)

mr2009 said:


> why not run the middle tube water through the copper channel and back into the 2nd water pump?



Single 120*120*50 mm rad. There is only so much a single rad can do.


----------



## W1zzard (Apr 8, 2014)

mr2009 said:


> why not run the middle tube water through the copper channel and back into the 2nd water pump?


Because they'd actually have to engineer something that you can't just buy from Asetek. And I agree, that would be a much better solution.


----------



## Rahmat Sofyan (Apr 8, 2014)

Xzibit said:


> Over at *[H]ardocp - AMD Radeon R9 295X2 Video Card Review* they also tested with 337 and noticed this.
> 
> 
> 
> Seems the new 337 drivers might be upping the thermal threshold on NVIDIA cards, letting them run faster.



So actually, with the 337 drivers NVIDIA just overclocked their cards silently? BS about the DX11 optimization?


----------



## JBVertexx (Apr 8, 2014)

Do you know if there was a difference in temperatures between the two GPUs?  On the thermal image, one looks hotter than the other.  Given they are connected to the CLC loop in series, you would expect that.  I'm just wondering which GPU the card temperature reading is measuring.


----------



## W1zzard (Apr 8, 2014)

JBVertexx said:


> Do you know if there was a difference in temperatures between the two GPUs?  On the thermal image, one looks hotter than the other.  Given they are connected to the CLC loop in series, you would expect that.  I'm just wondering which GPU the card temperature reading is measuring.


The temperatures are slightly different indeed, refer to page 28. The sensor readings in the video are from the primary (hotter) GPU


----------



## Hilux SSRG (Apr 8, 2014)

Great review, W1zzard. Overall I think AMD did a good job for an x2 card. Shame about the coil noise and the lack of a 3rd 8-pin. AMD botched it there.


----------



## btarunr (Apr 8, 2014)

Xzibit said:


> TMI
> 
> Now we know where to find you when the news is delayed.



Rrrr...research.


----------



## JBVertexx (Apr 8, 2014)

W1zzard said:


> The temperatures are slightly different indeed, refer to page 28. The sensor readings in the video are from the primary (hotter) GPU


Well, it certainly helps to read the chart, eh? I was too busy trying to discern the difference in temps in the video - ha!

Only a 2-degree difference is surprising. I was skeptical that a 120mm radiator could handle cooling these two GPUs, but it certainly does it well.

BTW, you're the only site that goes into this level of detail, so Thanks!


----------



## hanzawhtet7 (Apr 8, 2014)

This is amazing! I might get two of these super GPUs.


----------



## Rahmat Sofyan (Apr 8, 2014)

Where is the Titan Black BTW?


----------



## Fluffmeister (Apr 8, 2014)

Rahmat Sofyan said:


> Where is the Titan Black BTW?



It's not been reviewed here so as such isn't listed.

No need really, just look at the 780 Ti, and we already know custom 780 Ti's are faster than the one used in the results.


----------



## Assimilator (Apr 8, 2014)

From an engineering viewpoint this card is an impressive feat. Two full-fat Hawaii GPUs on a single card? THIS IS SPARTA!

However as usual, AMD botched the implementation. The idle fan noise is poor, and the VRM heat is worrying, but it's the coil whine that kills the card for me. If I'm gonna fork out fifteen hundred smackers for a graphics card, it better be fast AND quiet.

Overloading the PEG power connectors (by more than 100%!!!) is asking for trouble, you just know someone is gonna buy one of these cards and hook it up to a cheapy PSU and then blame AMD when the whole shebang catches fire. However I'm not sure what else AMD could've done, short of putting quad 8-pin connectors on, and thus making the card that much longer.

The increasing number of cards requiring more than 300 watts PEG power suggests that a serious amendment is required to the ATX and/or PCIe specs. At the very least I think we need to see max PCIe slot power draw (from the motherboard) doubled to 150W and a new PEG power connector (10-pin?) that can deliver 250W... hence a 2x 10-pin card would be able to draw 150 + 250 + 250 = 650W. Considering that 8+8 pin PEG power (375W total draw) is only slated for ratification in the PCIe 4.0 spec, which is still at least a year from being official, I'm thinking that AMD's next dual-GPU card might need to include its own PSU!
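The budget in that proposal can be sanity-checked with the same back-of-the-envelope arithmetic; note the 150 W slot and 250 W per-connector figures below are the post's hypothetical numbers, not part of any ratified spec:

```python
# Hypothetical future PEG power budget from the post above.
# These are the poster's proposed values, not an actual PCIe spec.
PROPOSED_SLOT_W = 150        # doubled from today's 75 W slot limit
PROPOSED_10PIN_W = 250       # hypothetical new 10-pin connector rating

proposed_total = PROPOSED_SLOT_W + 2 * PROPOSED_10PIN_W
print(f"Proposed 2x 10-pin budget: {proposed_total} W")   # 650 W

# For comparison, the 8+8-pin budget slated for PCIe 4.0 ratification:
current_max = 75 + 2 * 150
print(f"Current 8+8-pin budget: {current_max} W")         # 375 W
```

Even that proposed 650 W ceiling would still sit below the ~750 W this card can actually draw, which is the poster's point about cards needing their own PSUs.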


----------



## erocker (Apr 8, 2014)

I'd like to buy one without the cooler on it. Those VRM's need some water treatment too.


----------



## the54thvoid (Apr 8, 2014)

erocker said:


> I'd like to buy one without the cooler on it. Those VRM's need some water treatment too.



You know EKWB will make a block for it.  It is inevitable.


----------



## erocker (Apr 8, 2014)

I know it, I just would like to save a few hundred bucks. I'm no longer in my "well off" financial situation, from blowing my money on expensive computer hardware that will be outdated in a year.


----------



## Kaotik (Apr 8, 2014)

W1zzard said:


> not furmark, but something that mimics the peak load of the most demanding games


Do you have any more details on this? I mean, it's quite a bit different from your official "GPU temperature" numbers after all and awfully close to Furmark numbers you reported?
http://www.techpowerup.com/reviews/AMD/R9_295_X2/28.html Are the temps reported here using Metro Last Light like power consumption tests use, or Battlefield 3 like OC tests use?


----------



## v12dock (Apr 8, 2014)

Incredible card! Well done AMD


----------



## Casecutter (Apr 8, 2014)

W1zzard said:


> Drivers can detect Furmark and throttle the card, and people would cry "unrealistic" either way. I think the test I have is quite good and kinda represents worst case in realistic usage. Furmark would be just to show everything "omgz hot", which is not the point of this test. It would also affect the noise recordings, which are there to provide additional insight, because dBA numbers are not so easy to grasp


 
Oh no, you're correct, and it's nice that it isn't Furmark. But what I was saying is: run the same X minutes of Crysis 3, just to see if the VRMs (or whatever) indeed heat up the same in that time frame. Just as a way to know that the simulated run-through is indicative of actual hard gaming.


----------



## Xzibit (Apr 8, 2014)

*Guru3D - AMD Radeon R9-295x2 Review - Graphics Card Thermal Imaging Temperature Measurements*

His stress-test FLIR run hit 70.8°C on the VRM area during his testing.

Not sure about the difference in equipment, but you can tell his is picking up the MB heat as well, so it might be more sensitive.


----------



## Fluffmeister (Apr 8, 2014)

So, any money on who'll be the first person here to own one of these bad boys?

I'm gonna go for Xzibit; that guy has been drooling over it for months.


----------



## HumanSmoke (Apr 8, 2014)

sweet said:


> You can find it here. 295x2 is faster than CF 290x


Faster than two 290X Lightnings that are $100 less? The base clock on the MSI card isn't far off the OC of the 295X2.


Overclocker_2001 said:


> vrm and pci-e plug will die in less than 3 years


Interesting to note that ComputerBase measured current draw for the cables. No surprise that it breached the PCI-SIG spec (meh), but it also exceeded the AWG18 electrical specification. Maybe Asetek can design a water jacket for PSU cables!

btarunr said:


> They also used 337.50 for NVIDIA. So even with the best driver, 780Ti SLI is slower than 295X2. Which means TITAN-Z will suck, especially at its $3000 price.


Seems to vary by game title - optimization, Mantle enabled/not supported, Crossfire/SLI profiles, game studio ties, game settings etc. For the most part it seems that the 295X2 is king of the hill at the expense of power draw (not a new concept), but I wouldn't consider it a slam dunk. Tech Report, ComputerBase, and PC Per had a lot of variation in benchmarks. ComputerBase had the 295X2 besting the 780 Ti SLI by 6% (3840x2160 at playable settings), but losing by 3% to the same setup when overclocking entered the equation for both- the situation moved more to the 780Ti's favour at 2560x1600.

From a marketing viewpoint it's job done 10/10.
I still wouldn't buy one over two vendor designed OC'ed cards, or even reference cards and waterblocks. For me, the card seems more an extension of the Asus ROG boards than a reference SKU (with a price to match).


----------



## Xzibit (Apr 8, 2014)

Fluffmeister said:


> So, any money on who'll be the first person here to own one of these bad boys?
> 
> I'm gonna go for Xzibit; that guy has been drooling over it for months.



Have I?

I don't have the need or urge to upgrade.

My money would be on someone who harbors a deep-seated love-hate relationship with them and expresses it at any opportunity.


----------



## Fluffmeister (Apr 8, 2014)

Xzibit said:


> Have I?
> 
> I don't have the need or urge to upgrade.
> 
> *My money would be on someone who harbors a deep-seated love-hate relationship with them and expresses it at any opportunity.*



So again that begs the question, why not?


----------



## sweet (Apr 9, 2014)

W1zzard said:


> the real video you should watch instead of amd propaganda



W1zzard, there is something seriously wrong with your setup. Why did you put the radiator THAT CLOSE to the VRM cooling???

Those guys on Guru3D just put the radiator far away from the card, and the VRMs never hit 80°C.


----------



## LAN_deRf_HA (Apr 9, 2014)

So is this the most inefficient card ever released? That power page is nuts. On another note, why is it so impossible to get rid of coil whine? Motherboard makers did it shortly after the Core 2 Duo.


----------



## sweet (Apr 9, 2014)

LAN_deRf_HA said:


> So is this the most inefficient card ever released? That power page is nuts. On another note, why is it so impossible to get rid of coil whine? Motherboard makers did it shortly after the Core 2 Duo.



TPU saying it has coil whine doesn't mean your card will definitely have coil whine. My 7990 is dead silent, for example.


----------



## badtaylorx (Apr 9, 2014)

erocker said:


> I know it, I just would like to save a few hundred bucks. I'm no longer in my "well off" financial situation, from blowing my money on expensive computer hardware that will be outdated in a year.


Yeah!!!  PLEEEEEEAAAAAASE, AMD, give us a $1350 "Naked Edition".

I can't wait to see somebody overclock this once a custom water block gets released.



sweet said:


> TPU saying it has coil whine doesn't mean your card will definitely have coil whine. My 7990 is dead silent, for example.



Powercolor glues their chokes down to cut down on coil whine.  I wonder what kind of (non-conductive) glue they use???


----------



## W1zzard (Apr 9, 2014)

sweet said:


> W1zzard, there is something seriously wrong with your setup. Why did you put the radiator THAT CLOSE to the VRM cooling???



Huh? The radiator has nothing to do with it; it is placed above the card, and it's just 60°C. If you assume there is a thermal connection between the card and the radiator, which there is not, the radiator would actually help with cooling.

The radiator is where it is so that you can see and hear it in the video.

I can't tell you why it's only 70°C in the Guru3D picture; maybe they used some lighter test? Or loaded only one GPU? Note how their 2nd GPU is much cooler than the 1st one. The 48°C radiator temperature suggests the same; it will get much warmer when properly loaded.



Xzibit said:


> Not sure about the difference in equipment, but you can tell his is picking up the MB heat as well, so it might be more sensitive.


Mine is picking up the motherboard too; otherwise it would be completely black. The scale I've set also matters, as does the ASUS TUF Armor, which keeps the mobo heat from distracting the viewer (as designed).


----------



## HumanSmoke (Apr 9, 2014)

W1zzard said:


> I can't tell you why it's only 70°C in the Guru3D picture; maybe they used some lighter test? Or loaded only one GPU? Note how their 2nd GPU is much cooler than the 1st one.


At least one other site reported temps of 80-90°C which pretty much splits the difference between Hilbert and yourself. Possible variance in heatsink fitting/securing?


----------



## W1zzard (Apr 9, 2014)

HumanSmoke said:


> At least one other site reported temps of 80-90°C which pretty much splits the difference between Hilbert and yourself. Possible variance in heatsink fitting/securing?


Could be, the thermal tape looks rather thick, and the VRMs output quite some heat.


----------



## Steevo (Apr 9, 2014)

I like it. I think the score is a little too low; perhaps a 9.2 or so would be fair. The coil noise could simply be an unglued press-sample thing, knowing you guys would take the cards apart. The scaling is good, the performance good, and the power draw expected.

Good job on the video setup, looking forward to seeing it in more new reviews.


----------



## W1zzard (Apr 9, 2014)

Steevo said:


> I think the score is a little too low; perhaps a 9.2 or so would be fair



I had something above 9 at first, but then thought about the price, the coil noise, the form factor inconvenience (when looking from a beginner's perspective).


----------



## Steevo (Apr 9, 2014)

I agree with that, but I still believe any user capable of, or with enough understanding to, purchase this should be able to use it properly, much like buying a fast car. And yet idiots buy those too, I guess.


----------



## Recus (Apr 9, 2014)

Rahmat Sofyan said:


> So actually, with the 337 drivers NVIDIA just overclocked their cards silently? BS about the DX11 optimization?



No. Paid troll quoting a pro-AMD site.

As for the card, I think custom R9 290X CF would be better and cheaper, but since AMD is not robbing their customers, it's OK.


----------



## W1zzard (Apr 9, 2014)

sweet said:


> W1zzard, there is a serious wrong thing with your setup. Why did you put the radiator THAT CLOSE to the VRM cooling??? Those guys on Guru3d just put the radiator far away from the card, and VRM never hit 80 C.



I redid the test without the radiator on top:







Same result


----------



## Xzibit (Apr 9, 2014)

W1zzard

Did AMD provide specs on the radiator fan? It looks like a generic one, like the ones that come in last-gen Antec AIO CPU coolers.


----------



## LAN_deRf_HA (Apr 9, 2014)

sweet said:


> TPU said it has coil whine doesn't mean your card will definitely get coil whine. My 7990 is dead silent for example.



That's not typically how it works. You probably just have one that's pitched too high for you to hear, or you constantly have v-sync on. Look at the threads and reviews where people RMA over coil whine and every new card they get still has it. Even Asus cards, with their bullshit about eliminating coil whine, still have it. Maybe some are higher pitched, which cuts down on the percentage of people who hear it, but it's there. There are a few reviewers out there whom manufacturers listen to, and I wish they'd start pushing this issue more vigorously, because manufacturers seem to ignore consumer complaints, and frankly it's absurd that this problem has gone on for so many years.


----------



## W1zzard (Apr 9, 2014)

Xzibit said:


> Did AMD provide specs on the radiator fan? It looks like a generic one, like the ones that come in last-gen Antec AIO CPU coolers.


Haven't seen anything; maybe it's part of the bundle that Asetek provides to AMD.


----------



## Ferrum Master (Apr 9, 2014)

W1zzard said:


> Haven't seen anything; maybe it's part of the bundle that Asetek provides to AMD.



FIRST DEAD


----------



## Mathragh (Apr 9, 2014)

It's nice to see them fulfilling their promise on CrossFire scaling and, for a change, also providing a truly adequate cooling solution on a stock card.
I doubt many people a year ago would've believed you if you'd told them AMD would successfully make a dual-GPU card with proper scaling and insane power use, but still decent cooling/noise.


----------



## DRDNA (Apr 9, 2014)

I want two of these. I would also like to see a hi speed controllable fan mod for the VRMs, which should be pretty darn easy. Nice review W1zzard as usual!


----------



## yogurt_21 (Apr 9, 2014)

Well, it seems you've found the new "yeah, but..." in the Crysis 3 test: "Yeah, but can it play Crysis 3 at 4K?"


----------



## ndtoan (Apr 9, 2014)

W1zzard said:


> I redid the test without the radiator on top:
> 
> 
> 
> ...



Why did you not lock the temperature at 95 degrees?


----------



## the54thvoid (Apr 9, 2014)

badtaylorx said:


> yeah!!!  PLEEEEEEAAAAAASE  AMD,  give us a $1350 "Naked Edition"
> 
> I cant wait to see somebody overclock this once a custom water block gets released
> 
> ...



Not on all their cards, unless it's a new thing.  My LCS 7970 from Powercolor had pretty bad whine - I RMA'd my first because of it - 2nd had it but not quite as bad.  Cheap components and poor power circuitry can cause it.  Hell, my Titan had/has a smidge of it.


----------



## radrok (Apr 9, 2014)

I enjoyed the review, fap fap fap.

In all seriousness though, I would have loved to see a waterblocked version from AMD; the majority of people buying this will be using a custom water loop.

54thvoid, my Titans yell like schoolgirls under load, probably because I abused them with voltage.


----------



## btarunr (Apr 10, 2014)

radrok said:


> In all seriousness though, I would have loved to see a waterblocked version from AMD; the majority of people buying this will be using a custom water loop.



You may hear about a PowerColor LCS+ card soon.


----------



## radrok (Apr 10, 2014)

btarunr said:


> You may hear about a PowerColor LCS+ card soon.



Yeah, I figured, but PowerColor usually charges twice what a waterblock would cost on top of the graphics card alone.


----------



## Tonduluboy (Apr 10, 2014)

W1zzard said:


> I had something above 9 at first, but then thought about the price, the coil noise, the form factor inconvenience (when looking from a beginner's perspective).



W1zzz, obviously this card is not for a beginner... anybody buying a card this expensive will not care about the price or how much their electricity bill goes up; they only care about how many FPS they can get out of it. It's like buying a Ferrari: you can't complain about the insurance fees or the gas usage...


----------



## btarunr (Apr 10, 2014)

Tonduluboy said:


> W1zzz, obviously this card is not for a beginner...



Rich kids?


----------



## refillable (Apr 10, 2014)

Well, niche cards like these come with the expected trade-offs. It's a power sucker, runs noisy and hot (although the hybrid cooling now keeps temps around 70°C and noise down), is blazing fast, and has tons of CF scaling issues. Yes, it's faster than the pointless Titan Z. Perhaps the only thing that worries me is the VRM temps, which of course affect lifespan. But then, only rich freaks buy these, and they'd replace their cards before they wear out. We will stick with the 290 (hooray for no price jumps outside the USA!).

And anyone who's buying this for 1080p gaming is a noob.


----------



## cadaveca (Apr 10, 2014)

refillable said:


> Well, niche cards like these come with the expected trade-offs. It's a power sucker, runs noisy and hot (although the hybrid cooling now keeps temps around 70°C and noise down), is blazing fast, and has tons of CF scaling issues. Yes, it's faster than the pointless Titan Z. Perhaps the only thing that worries me is the VRM temps, which of course affect lifespan. But then, only rich freaks buy these, and they'd replace their cards before they wear out. We will stick with the 290 (hooray for no price jumps outside the USA!).
> 
> And anyone who's buying this for 1080p gaming is a noob.




I am neither rich nor a noob. I will buy one, just to show it off and piss off people like you. I already have dual 780 Tis pushing a single panel @ 1200p. I use a 4960X and spent $800 on 32 GB of custom-built Avexir RAM. All I do is surf the internet... I don't even PLAY games anymore. So what?

Why anyone would care is beyond me. AMD and other tech companies make products like this for marketing, and because they WILL sell, just like the original Titan and the forthcoming Titan Z.

The cost of things doesn't matter unless money matters to you, and although I have very little money, and no job, I still get the things I want, no problem. You won't hear me complaining about what someone decides something is worth, or calling people who buy expensive hardware noobs...

High-power computing today is excessive, and as such, it SHOULD be priced high and stupid. No one needs even a single 290, never mind a dual-GPU card... not one person. Nobody needs to play video games, watch TV, or troll the internet. Yet here we are...


----------



## Suka (Apr 10, 2014)

That is an awesome overclock, particularly the memory. If only the VRMs were cooler it would be perfect, no? Or maybe if it were $1200.


----------



## sweet (Apr 11, 2014)

Suka said:


> That is an awesome overclock, particularly the memory. If only the VRMs were cooler it would be perfect, no? Or maybe if it were $1200.


AMD's engineers need to feed their wife  

At least they didn't rob our kidneys with a 3000$ card


----------



## HumanSmoke (Apr 11, 2014)

sweet said:


> AMD's engineers need to feed their wife


AMD's engineers share a wife?
You'd think with all that extra spare time they could have come up with more than two GCN 2.0 based GPUs 


sweet said:


> At least they didn't rob our kidneys with a 3000$ card


Technically, it would be kidney sale. Theft implies the deceitful removal of property. As far as I'm aware, to obtain a $3000 card you'd have to willingly part with the cash (biologic equivalent?) rather than have body organs removed against your will. A bad deal? Yes. Theft? No.


----------



## valentyn0 (Apr 11, 2014)

Rahmat Sofyan said:


> So actually, with the 337 drivers Nvidia just overclocked their cards silently? BS about the DX11 optimization?



That's BS on your part. I got a 100% boost in some games, and I tested this on different cards. I monitored the max GPU frequency with the last WHQL and the new beta; the clocks are all the same, but the gains differ, BUD!


----------



## Vario (Apr 11, 2014)

This card with a full waterblock would be quite nice.  Dual-GPU cards always have problems with heat.

They could have fit a 3rd PCI-E power connector by stacking the connectors, like on my PNY 770.


----------



## LAN_deRf_HA (Apr 12, 2014)

cadaveca said:


> I will buy one



Just one?


----------



## cadaveca (Apr 13, 2014)

LAN_deRf_HA said:


> Just one?


Only one slot on mITX boards... and this is the only way to get CrossFire 290 performance in that form factor. So yeah, just one. The VGA will cost more than the rest of the rig combined.


----------



## Suka (Apr 16, 2014)

valentyn0 said:


> That's BS on your part. I got a 100% boost in some games, and I tested this on different cards. I monitored the max GPU frequency with the last WHQL and the new beta; the clocks are all the same, but the gains differ, BUD!


100%?


----------



## radrok (Apr 16, 2014)

Suka said:


> 100%?



From 2 to 4 fps


----------



## Suka (Apr 16, 2014)

radrok said:


> From 2 to 4 fps


lolz


----------



## UaVaj (Apr 17, 2014)

This was recently posted in the wrong subforum and someone got soft. I was told to repost here to get the answer.

This is a GTX 690 question regarding this review, NOT a question about the 295X2 or the 290X.

Review - 04/08/14 - AMD Radeon R9 295X2 8 GB
http://www.techpowerup.com/reviews/AMD/R9_295_X2/9.html

*** shows the 690 with 335.23 in BF4 at 5760x1080 4xAA at 36.3 fps


Review - 03/04/14 - PowerColor R9 290X PCS+ 4 GB
http://www.techpowerup.com/reviews/Powercolor/R9_290X_PCS_Plus/9.html

*** shows the 690 with 331.82 in BF4 at 5760x1080 4xAA at 26.1 fps


What caused this 39% boost? Is the 36.3 fps a typo for 26.3 fps?
What happened here? What is the correct value?
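For the record, the 39% figure is just the ratio of the two published numbers; checking the arithmetic says nothing about which result is correct:

```python
# Relative gap between the two published GTX 690 results
# (BF4, 5760x1080 4xAA): 26.1 fps vs 36.3 fps.
old_fps = 26.1
new_fps = 36.3
boost = (new_fps - old_fps) / old_fps * 100
print(f"{boost:.1f}%")  # 39.1%
```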


----------



## HTC (Apr 17, 2014)

UaVaj said:


> This was recently posted in the wrong subforum and someone got soft. I was told to repost here to get the answer.
> 
> This is a GTX 690 question regarding this review, NOT a question about the 295X2 or the 290X.
> 
> ...



http://www.techpowerup.com/reviews/Powercolor/R9_290X_PCS_Plus/5.html
http://www.techpowerup.com/reviews/AMD/R9_295_X2/5.html

Notice the differences in the test setup for both reviews: that's the reason.


----------



## 64K (Apr 17, 2014)

UaVaj said:


> This was recently posted in the wrong subforum and someone got soft. I was told to repost here to get the answer.
> 
> This is a GTX 690 question regarding this review, NOT a question about the 295X2 or the 290X.
> 
> ...



I had a couple of thoughts that I was going to pose to you yesterday.

What CPU are you using? The reason I ask is that BF4 is well known to be CPU-intensive as well. W1zzard used an i7-4770K @ 4.2 GHz. Perhaps in addition to the improved Nvidia drivers between tests there were also some further optimizations from DICE on the CPU side.

Another thing: did you record your FPS in the exact same part of the game as W1zzard did in the 3/4 and 4/8 benchmarks? DICE could have made some tweaks in that area of the game that factored into these tests.

It's not possible to go back and rerun the 3/4 benchmarks even by rolling back the drivers, because we can't undo any optimizations that DICE has made since then. And even if you could get W1zzard to rerun both benchmarks, what difference would it make? I sense that you are frustrated with the performance of BF4, which is understandable because, judging by the posts all over the internet, so many players are. But getting W1zzard to say he made a typo, if he did, isn't going to alleviate your frustration. Right?


----------



## UaVaj (Apr 17, 2014)

I have already jumped ship. 290X x3 (the lesser evil) is en route, and one of the three 680s is already sold. 780 Ti x3 was my first choice but was ruled out because it is a stutter fest in BF4.
I can't say the same for other BF4 players who are less fortunate, without the financial backing to also jump ship. I do share their pain at not being able to enjoy BF4 as it was intended.

SLI is simply broken in BF4. I clearly did not see that 39% boost, or any boost, in any part of BF4.
39% is not exactly a testing margin of error; that is huge.

Hence I'm here asking if W1zzard made a typo.


The CPU is an i7-3770K OC'd to 4.7 GHz.


----------



## FX-GMC (Apr 17, 2014)

UaVaj said:


> This was recently posted in the wrong subforum and someone got soft. I was told to repost here to get the answer.
> 
> This is a GTX 690 question regarding this review, NOT a question about the 295X2 or the 290X.
> 
> ...



Many patches and many driver updates have happened. They are *both* correct for the time at which they were published.


----------



## 64K (Apr 18, 2014)

Patches have been updated. No doubt you have noticed the improvement yourself, as *FX-GMC* has pointed out.


----------



## 64K (Apr 19, 2014)

UaVaj said:


> I have already jumped ship. 290X x3 (the lesser evil) is en route, and one of the three 680s is already sold. 780 Ti x3 was my first choice but was ruled out because it is a stutter fest in BF4.
> I can't say the same for other BF4 players who are less fortunate, without the financial backing to also jump ship. I do share their pain at not being able to enjoy BF4 as it was intended.
> 
> SLI is simply broken in BF4. I clearly did not see that 39% boost, or any boost, in any part of BF4.
> ...



Well, post back here if you want to. There are a lot of BF4 players here that could possibly benefit from your experience. Maybe not 3X R9 290 crossfire. That's exotic...but interesting.


----------



## Bytales (Apr 19, 2014)

Why the shitty frames in Diablo 3?
I was looking to upgrade my video card, and I have been waiting for the dual-GPU AMD card for what feels like 2 years now, only to find it falls badly short in the game I play the most.
CrossFire 290X seems to scale OK. Is this supposed to be fixed, or what is really wrong?

How is Diablo 3 actually tested? Can anyone clarify?


----------



## Bitech (Apr 19, 2014)

The reviewer kinda screwed up somewhere when measuring the power consumption of these cards. The inconsistency between the games used for measuring performance and those used for power consumption caused some of these cards' efficiency bars to go all over the place.

The patches might not be part of the problem; notice how the GTX 760, 780, and Titan greatly improved in efficiency, but the 770 and 780 Ti stayed the same.

And half of the AMD cards have either improved slightly or severely dropped in efficiency.


----------



## HumanSmoke (Apr 19, 2014)

Bitech said:


> View attachment 56158
> The reviewer kinda screwed up somewhere when measuring the power consumption of these cards. The inconsistency between the games used for measuring performance and those used for power consumption caused some of these cards' efficiency bars to go all over the place.


What also could have changed the paradigm:
Thief was part of the benchmark suite in one graph and wasn't used in the other.
Diablo III: RoS was part of the benchmark suite in one graph and wasn't used in the other.
Call of Juarez: Gunslinger was part of the benchmark suite in one graph and wasn't used in the other.
16 games were benchmarked in one graph, 15 in the other.
Driver improvements - case in point, BF4 in both of the reviews you're pinpointing. Not huge framerate increases for some cards (the GTX 690's SLI config excepted), but marked in percentage terms relative to the cards that don't benefit.
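To make the suite-composition point concrete, here is a toy sketch. Every fps number below is invented purely for illustration, and the simple mean stands in for whatever aggregation the site actually uses:

```python
# Toy illustration of how changing the benchmark suite shifts a
# performance/efficiency index even when the card itself is unchanged.
# All fps values are made up for demonstration purposes.

def perf_index(suite):
    # Plain arithmetic mean of per-game fps; real review indices are
    # computed differently, but adding/removing a title has the same
    # kind of effect in principle.
    return sum(suite.values()) / len(suite)

with_thief = {"BF4": 60.0, "Crysis 3": 45.0, "Thief": 70.0}
without_thief = {"BF4": 60.0, "Crysis 3": 45.0, "Metro: Last Light": 40.0}

print(f"{perf_index(with_thief):.1f}")     # 58.3
print(f"{perf_index(without_thief):.1f}")  # 48.3
```

The card's fps in the two shared titles didn't change, yet the index moved by ten points; swap three titles across sixteen games and the efficiency bars reshuffle in exactly this way.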


----------



## Chetkigaming (Apr 20, 2014)

Thanks for the review. The video from the noise test is very cool!


----------



## nem (Apr 30, 2014)

Hi bros, I just have a question: is it possible to make a tri-CrossFire setup out of 295X2s? I did hear something about the new CrossFireX XDMA standard making it possible to CrossFire more than 4 GPUs... Cheers, I just have that doubt.


----------



## radrok (May 1, 2014)

I wouldn't even imagine the load of issues you'd be running into.


----------



## W1zzard (May 2, 2014)

Bitech said:


> View attachment 56158
> The reviewer kinda screwed up somewhere when measuring the power consumption of these cards. The inconsistency between the games used for measuring performance and those used for power consumption caused some of these cards' efficiency bars to go all over the place.
> 
> The patches might not be part of the problem; notice how the GTX 760, 780, and Titan greatly improved in efficiency, but the 770 and 780 Ti stayed the same.
> ...


We switched from Crysis 2 to Metro: Last Light for the typical gaming power consumption measurement; the list of games has also changed.


----------



## Rejkt (Oct 7, 2014)

Hi, I recently made a rookie mistake. I bought this GPU, and this PSU: http://www.overclockers.co.uk/showproduct.php?prodid=CA-025-SS. I had heard that Seasonic is one of the best makes, and that this is a high-wattage, high-cost PSU that would definitely be compatible with this GPU. I've just read online that the PSU for this card needs to be able to handle 28 amps on a single rail, and I am unsure whether this PSU can. Can anyone ease my mind and tell me if that's the case, or if I have purchased a PSU that won't work like a moron? Thank you.
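For a rough sense of scale, and assuming that 28 A figure refers to the +12 V rail, it converts to watts straightforwardly:

```python
# 28 A drawn from a +12 V rail corresponds to:
volts = 12.0
amps = 28.0
print(volts * amps)  # 336.0 (watts)
```

So, for that particular requirement, the PSU needs to sustain roughly 336 W on the 12 V side, plus headroom for the rest of the system.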


----------



## Steevo (Oct 7, 2014)

It will be fine.


----------



## P-40E (Jan 3, 2015)

If I were to go all out with money on a GPU, this would be the card I would get! Nvidia cards like the Titan, for example, are over $1000 and not even as good as a regular R9 290X, so the R9 295X2 for under $800 would be my pick. For under $800 this GPU puts Nvidia to shame with their overpricing. Not to mention Nvidia has nothing yet that outperforms the 295X2, so the R9 295X2 is currently the fastest graphics card in the world!

I also love the water cooling, and I love how it runs well under 70°C. That is impressive. Nvidia may have slightly better efficiency, but AMD once again beats them in top performance. Don't get me wrong, I have owned many Nvidia cards and I like them, but AMD has more logical prices. And I do not care if I get a card that runs a few degrees warmer or uses a few more watts; that is meaningless anyway. You are only going to save about a dollar a year in electricity, so I do not understand why anyone would buy a GPU based on lower wattage. Besides, you are supposed to have a good-quality PSU anyway.

Unless you are dumb enough to buy an Alienware PC. But then they never have cards like this one; they are a ridiculous setup, like an i7-4790K with an OEM GTX 645, LOL, and a crap Hipro-built PSU that will destroy the system not long after it messes up the BIOS, causing it to have to be constantly reset.


----------

