
AMD Radeon R9 295X2 8 GB

The temperatures are slightly different indeed; refer to page 28. The sensor readings in the video are from the primary (hotter) GPU.
Well, it certainly helps to read the chart, eh? I was too busy trying to discern the difference in temps in the video - ha!

Only a 2 degree difference is surprising. I was skeptical that a 120mm radiator could handle the cooling of these 2 GPUs, but it certainly does it well.

BTW, you're the only site that goes into this level of detail, so thanks!
 
Where is the Titan Black BTW?
 
Where is the Titan Black BTW?

It hasn't been reviewed here, so it isn't listed.

No need really; just look at the 780 Ti, and we already know custom 780 Tis are faster than the one used in the results.
 
From an engineering viewpoint this card is an impressive feat. Two full-fat Hawaii GPUs on a single card? THIS IS SPARTA!

However, as usual, AMD botched the implementation. The idle fan noise is poor and the VRM heat is worrying, but it's the coil whine that kills the card for me. If I'm gonna fork out fifteen hundred smackers for a graphics card, it better be fast AND quiet.

Overloading the PEG power connectors (by more than 100%!!!) is asking for trouble; you just know someone is gonna buy one of these cards, hook it up to a cheapy PSU, and then blame AMD when the whole shebang catches fire. However, I'm not sure what else AMD could've done, short of putting quad 8-pin connectors on and thus making the card that much longer.

The increasing number of cards requiring more than 300 watts of PEG power suggests that a serious amendment to the ATX and/or PCIe specs is needed. At the very least I think we need to see the maximum PCIe slot power draw (from the motherboard) doubled to 150 W, plus a new PEG power connector (10-pin?) that can deliver 250 W; a 2x 10-pin card could then draw 150 + 250 + 250 = 650 W. Considering that 8+8-pin PEG power (375 W total draw) is only slated for ratification in the PCIe 4.0 spec, which is still at least a year from being official, I'm thinking AMD's next dual-GPU card might need to include its own PSU!
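For anyone who wants to play with those numbers, here's a minimal Python sketch of the budget math above; the slot and connector limits are the usual PCIe CEM ratings, and the 10-pin figure is just the hypothetical connector from this post:

```python
# Board power budgets: current spec limits vs. the hypothetical
# 150 W slot + 250 W 10-pin connector proposed above.
SLOT_W = 75          # PCIe x16 slot limit (current spec)
PEG_8PIN_W = 150     # 8-pin PEG connector rating
PEG_10PIN_W = 250    # hypothetical 10-pin connector

def board_budget(slot_w, *connectors):
    """Sum the slot allowance and each connector's rated limit."""
    return slot_w + sum(connectors)

# R9 295X2 as configured: 75 W slot + 2x 8-pin = 375 W on paper,
# well below what the card actually pulls under load.
print(board_budget(SLOT_W, PEG_8PIN_W, PEG_8PIN_W))   # 375
# The proposal above: 150 W slot + 2x 250 W 10-pin = 650 W.
print(board_budget(150, PEG_10PIN_W, PEG_10PIN_W))    # 650
```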
 
I'd like to buy one without the cooler on it. Those VRMs need some water treatment too.
 
I'd like to buy one without the cooler on it. Those VRMs need some water treatment too.

You know EKWB will make a block for it. It is inevitable.
 
I know it, I just would like to save a few hundred bucks. I didn't get to my "well off" financial situation by blowing my money on expensive computer hardware that will be outdated in a year. :)
 
not furmark, but something that mimics the peak load of the most demanding games
Do you have any more details on this? I mean, it's quite a bit different from your official "GPU temperature" numbers, after all, and awfully close to the Furmark numbers you reported.
http://www.techpowerup.com/reviews/AMD/R9_295_X2/28.html Are the temps reported here using Metro Last Light, like the power consumption tests, or Battlefield 3, like the OC tests?
 
Drivers can detect Furmark and throttle the card, and people would cry "unrealistic" either way. I think the test I have is quite good and kinda represents the worst case in realistic usage. Furmark would be just to show everything "omgz hot", which is not the point of this test. It would also affect the noise recordings, which are there to provide additional insight, because dBA numbers are not so easy to grasp.
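On that last point: dBA is a logarithmic scale, so a difference that looks small on the chart is a large change in emitted sound power. A quick Python illustration of the standard conversions (the 2x-per-10 dB perceived-loudness rule is just the usual psychoacoustic rule of thumb, not something from the review):

```python
# How to read a dBA delta: sound power is logarithmic, and perceived
# loudness roughly doubles every +10 dB (rule of thumb).

def sound_power_ratio(delta_db):
    """Relative emitted sound power for a difference of delta_db."""
    return 10 ** (delta_db / 10)

def perceived_loudness_ratio(delta_db):
    """Approximate perceived loudness ratio (~2x per +10 dB)."""
    return 2 ** (delta_db / 10)

# A card reading 3 dBA higher emits about twice the sound power,
# but sounds only ~23% louder to the ear.
print(round(sound_power_ratio(3), 2))         # 2.0
print(round(perceived_loudness_ratio(3), 2))  # 1.23
```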

Oh no, you're correct, and it's nice that it isn't Furmark. What I was saying is: run the same X minutes of a Crysis 3 session, just to see if the VRMs (or whatever) heat up the same in that time frame. Just as a way to know that the simulated run-through is indicative of actual hard gaming.
 
So money on the first person here to own one of these bad boys?

I'm gonna go for Xzibit, that guy has been drooling over it for months.
 
You can find it here. The 295X2 is faster than CF 290X.
Faster than two 290X Lightnings that are $100 less? The base clock on the MSI card isn't far off the OC of the 295X2.
The VRM and PCI-E plug will die in less than 3 years
Interesting to note that ComputerBase measured current draw for the cables. No surprise that it breached the PCI-SIG spec (meh), but it also exceeded the AWG18 electrical specification. Maybe Asetek can design a water jacket for PSU cables!
[image: ComputerBase cable current draw measurements]
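Rough back-of-the-envelope math on why per-wire current matters here, assuming the usual three +12 V conductors in an 8-pin PEG cable; the 220 W figure is a hypothetical load, not ComputerBase's measurement:

```python
# Per-conductor current for a given connector load, assuming the +12 V
# load of an 8-pin PEG cable is split across three conductors.
RAIL_V = 12.0
WIRES_PER_8PIN = 3   # +12 V conductors in a standard 8-pin PEG cable

def amps_per_wire(connector_watts, wires=WIRES_PER_8PIN, volts=RAIL_V):
    """Current through each +12 V conductor for a given connector load."""
    return connector_watts / volts / wires

# Hypothetical 220 W through one 8-pin connector -> ~6.1 A per AWG18 wire.
# Whether that breaches the wire's rating depends on which ampacity
# standard is applied, which is exactly what the chart above is about.
print(round(amps_per_wire(220), 1))   # 6.1
```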

They also used 337.50 for NVIDIA. So even with the best driver, 780Ti SLI is slower than 295X2. Which means TITAN-Z will suck, especially at its $3000 price.
Seems to vary by game title - optimization, Mantle enabled/not supported, Crossfire/SLI profiles, game studio ties, game settings, etc. For the most part it seems that the 295X2 is king of the hill at the expense of power draw (not a new concept), but I wouldn't consider it a slam dunk. Tech Report, ComputerBase, and PC Per had a lot of variation in benchmarks. ComputerBase had the 295X2 besting 780 Ti SLI by 6% (3840x2160 at playable settings), but losing by 3% to the same setup when overclocking entered the equation for both; the situation moved more to the 780 Ti's favour at 2560x1600.

From a marketing viewpoint it's job done, 10/10.
I still wouldn't buy one over two vendor-designed OC'd cards, or even reference cards and waterblocks. For me, the card seems more an extension of the Asus ROG boards than a reference SKU (with a price to match).
 
So money on the first person here to own one of these bad boys?

I'm gonna go for Xzibit, that guy has been drooling over it for months.

Have I?

I don't have the need or urge to upgrade.

My money would be on someone that harbors a deep-seated love-hate relationship with them and expresses it at any opportunity. :D
 
Have I?

I don't have the need or urge to upgrade.

My money would be on someone that harbors a deep-seated love-hate relationship with them and expresses it at any opportunity. :D

So again that begs the question, why not? :D
 

The real video you should watch instead of AMD propaganda

W1zzard, there is something seriously wrong with your setup. Why did you put the radiator THAT CLOSE to the VRM cooling???

Those guys on Guru3D just put the radiator far away from the card, and the VRM never hit 80 °C.

[Guru3D thermal camera images]
 
So is this the most inefficient card ever released? That power page is nuts. On another note, why is it so impossible to get rid of coil whine? Motherboard makers did shortly after Core 2 Duo.
 
So is this the most inefficient card ever released? That power page is nuts. On another note, why is it so impossible to get rid of coil whine? Motherboard makers did shortly after Core 2 Duo.

TPU saying it has coil whine doesn't mean your card will definitely get coil whine. My 7990 is dead silent, for example.
 
I know it, I just would like to save a few hundred bucks. I didn't get to my "well off" financial situation by blowing my money on expensive computer hardware that will be outdated in a year. :)
yeah!!! PLEEEEEEAAAAAASE AMD, give us a $1350 "Naked Edition"

I can't wait to see somebody overclock this once a custom water block gets released.

TPU saying it has coil whine doesn't mean your card will definitely get coil whine. My 7990 is dead silent, for example.

Powercolor glues their chokes down to cut down on coil whine. I wonder what kind of (non-conductive) glue they use???
 
W1zzard, there is something seriously wrong with your setup. Why did you put the radiator THAT CLOSE to the VRM cooling???

Huh? The radiator has nothing to do with it; it is placed above the card and sits at just 60°C. If you assume there is a thermal connection between the card and the radiator, which there is not, the radiator would actually help with cooling.

The radiator is where it is, so that you can see and hear it in the video.

I can't tell you why it's only 70°C in the Guru3D picture; maybe they used some light test, or loaded only one GPU? Note how their 2nd GPU is much cooler than the 1st one. The 48° radiator temperature suggests the same; it will get much warmer when properly loaded.

Not sure on the difference in equipment, but you can tell his is picking up the MB heat as well, so it might be more sensitive.
Mine is picking up the motherboard too, otherwise it would be completely black. The scale I've set also matters, as does the ASUS TUF Armor, which keeps mobo heat from distracting the viewer (as designed).
 
I like it. I think the score is a little too low; perhaps a 9.2 or so would be fair. The coil noise could simply be an unglued-choke issue on a press sample, knowing you guys would take the cards apart. The scaling is good, the performance is good, and the power draw is as expected.

Good job on the video setup, looking forward to seeing it in more new reviews.
 