# PowerColor R9 290X OC 4 GB



## W1zzard (Nov 4, 2013)

PowerColor's Radeon R9 290X is built on AMD's reference design. With a 30 MHz overclock on the GPU and reference-design pricing, the decision whether this card is worth the money is really a no-brainer.

*Show full review*


----------



## W1zzard (Nov 4, 2013)

I uploaded both the original review sample's BIOS and the 2 retail BIOSes here: http://www.techpowerup.com/vgabios/...or&model=R9+290X&interface=&memType=&memSize=


----------



## mame (Nov 4, 2013)

Could you please check if the die is at the same height as the metal shim?
I would like to use my old cooler that has a flat surface.


----------



## Casecutter (Nov 4, 2013)

Interesting: it appears the only difference is that these use the International Rectifier IR3567B controller rather than the CHiL controllers. Even so, we see what appears to be 8% lower "peak" power in gaming with no difference in dBA, all while achieving a 9% bump in performance at 2560x in what I take to be quiet mode?

If we are talking quiet-to-quiet, the perf/watt graph at 2560x is showing a 14% improvement?

Wow


----------



## W1zzard (Nov 4, 2013)

Same IRF controller on the 290X. The BIOS is "uber" mode + 30 MHz


----------



## Casecutter (Nov 4, 2013)

W1zzard said:


> Same IRF controller on the 290X. The BIOS is "uber" mode + 30 MHz


Ok, I was checking the original 290X review; I see it now.

So everywhere you have it in graphs it's considered "uber"... Okay, not that big of a deal!
Thanks


----------



## LTUGamer (Nov 4, 2013)

Quiet and effective custom cooling or gtfo -_-


----------



## W1zzard (Nov 4, 2013)

Casecutter said:


> Okay, not that big of a deal!



yeah, it's a small difference; basically you are paying nothing for a guaranteed 30 MHz OC out of the box, with warranty.


----------



## Hades (Nov 4, 2013)

W1zzard said:


> yeah, it's a small difference; basically you are paying nothing for a guaranteed 30 MHz OC out of the box, with warranty.



That is not bad at all, considering the heat and power consumption from such an aggressive card...


----------



## lZKoce (Nov 4, 2013)

Very sexy card in my opinion. However, I don't count the lack of analog VGA output as a con. The standard is set to expire in 2015, and it's late 2013, so there are barely two years before it passes into history. I don't think it affected the score in any way; it's just that I personally don't consider it a CON.


----------



## jihadjoe (Nov 4, 2013)

Has anyone tried a memory-only OC on a 290X to see if it makes any difference? I'm guessing the 512-bit bus is plenty wide enough and it would be limited almost exclusively by the GPU.
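For a rough sense of why that 512-bit bus leaves so much headroom, here's a back-of-the-envelope sketch in Python. The helper function is just illustrative (my own naming); the 512-bit / 5 Gbps GDDR5 figures are the 290X's published specs, and the 10% overclock is a hypothetical example:

```python
def memory_bandwidth_gbs(bus_width_bits: int, effective_clock_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s: bytes per transfer x transfers per second."""
    return (bus_width_bits / 8) * effective_clock_gbps

# R9 290X stock: 512-bit bus, 5 Gbps effective GDDR5
stock = memory_bandwidth_gbs(512, 5.0)        # 320.0 GB/s
# A hypothetical 10% memory OC adds bandwidth the GPU may not be able to use
overclocked = memory_bandwidth_gbs(512, 5.5)  # 352.0 GB/s
print(stock, overclocked)
```

With 320 GB/s already on tap at stock, a memory overclock mostly adds bandwidth the shader core can't consume, which fits the guess that the card is GPU-limited.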


----------



## W1zzard (Nov 4, 2013)

lZKoce said:


> However, I don't count the lack of analog VGA output as a con. The standard is set to expire in 2015, and it's late 2013, so there are barely two years before it passes into history. I don't think it affected the score in any way; it's just that I personally don't consider it a CON.



I completely agree, but I mention it for completeness, and there are many people who still use analog monitors. Those same people often don't read the whole review.


----------



## claylomax (Nov 4, 2013)

jihadjoe said:


> Has anyone tried a memory-only OC on a 290X to see if it makes any difference? I'm guessing the 512-bit bus is plenty wide enough and it would be limited almost exclusively by the GPU.



Overclocking the memory will give at most 1 fps while increasing heat and power consumption; I've never gotten extra performance from raising memory clocks on video cards, but maybe it's just me.


----------



## DeadSkull (Nov 4, 2013)

290x needs lightning / matrix pcb and cooling with no voltage or temperature restrictions via drivers.



claylomax said:


> Overclocking the memory will give at most 1 fps while increasing heat and power consumption; I've never gotten extra performance from raising memory clocks on video cards, but maybe it's just me.



Memory OC helps in synthetic benchmarks but very rarely in gaming. Gaming is all about core clock speeds.


----------



## Aquinus (Nov 4, 2013)

I would hardly call

> No analog VGA outputs

a con. If you're using one of these, it really begs the question: why the hell are you using VGA?


----------



## Hades (Nov 4, 2013)

Aquinus said:


> I would hardly call "No analog VGA outputs" a con. If you're using one of these, it really begs the question: why the hell are you using VGA?



Exactly!


----------



## W1zzard (Nov 4, 2013)

Aquinus said:


> If you're using one of these, it really begs the question; why the hell are you using VGA?



- poor
- don't know difference between analog and digital
- that's the machine my boss gave me


----------



## Xzibit (Nov 4, 2013)

W1zzard said:


> - *poor*
> - don't know difference between analog and digital
> - that's the machine my boss gave me



If you're poor, why are you buying a $550 GPU?

If that's the machine your boss gave you, wouldn't the $550 be better spent upgrading something else?

It's like you're depicting the mentality of a console gamer rather than a PC enthusiast, where only seeing pretty things on screen matters.

Just odd justifications.


----------



## W1zzard (Nov 4, 2013)

Xzibit said:


> If you're poor, why are you buying a $550 GPU?



spent all my money on GPU, can't buy a monitor now

if all we need to debate in the review is the analog vga outputs, then my review must be awesome


----------



## Xzibit (Nov 4, 2013)

W1zzard said:


> spent all my money on GPU, can't buy a monitor now
> 
> *if all we need to debate in the review is the analog vga outputs, then my review must be awesome*



Might want to read through some of the posts in your reviews.


----------



## lZKoce (Nov 4, 2013)

W1zzard said:


> if all we need to debate in the review is the analog vga outputs, then my review must be awesome



Aaaaah, some self-irony. I feel you, man, but as a writer you should know better how these things go. I usually don't comment on high-end cards, because it's not my territory. I don't need one, and I definitely can't afford one.


----------



## NeoXF (Nov 4, 2013)

So will you be bringing the R9 290 review shortly as well? If so, I hope you use the launch drivers; otherwise, you might as well have reviewed it a week ago.

Also, I suggest removing Skyrim (which seems broken in your test suite anyway...) and adding Battlefield 4 instead.

So at any moment we should be hearing announcements about custom models, right? ASUS DCUII TOP being the first one, I'm guessing. Also, maybe details/tests on whatever "Tahiti XTL" is...


----------



## LAN_deRf_HA (Nov 4, 2013)

I'm glad you pointed out that the power efficiency doesn't make sense. Even with the silicon lottery, it's weird to see it get better with an overclock. It's weird even on the first card that silent mode is less efficient than uber.


----------



## Strider (Nov 4, 2013)

It's supposed to run at those temps. =]

If you don't like the noise, the temps, or the OC headroom, just wait for the non-reference designs. That's what most people are doing, including myself. 

Also, no analog output is a con? Really? Who the hell buys a top-tier card and still uses VGA? Yeah, that needs to not even be a consideration anymore. At all. 

lol


----------



## Jack1n (Nov 4, 2013)

If you realize that the no-VGA con is not really a con, then it probably was not written for you. For example, I always ignore load noise because I use headphones, so even if it's listed in the cons I just ignore it.


----------



## Solaris17 (Nov 5, 2013)

Strider said:


> It's supposed to run at those temps. =]
> 
> If you don't like the noise, the temps, or the OC headroom, just wait for the non-reference designs. That's what most people are doing, including myself.
> 
> ...




i was confused here too brb using my 15" 1280x1024 view sonic with my $500 video card #YOLO


----------



## Lionheart (Nov 5, 2013)

Good review, Wizz... That memory overclock is nice... Would like to see the overclocked results at a higher resolution, just to see if that extra memory bandwidth helps at all...


----------



## TheDeeGee (Nov 5, 2013)

Really?

Let's OC 30 MHz and label it OC...


----------



## Aquinus (Nov 5, 2013)

W1zzard said:


> - poor



Then why do you have a 500 USD video card? Someone's inability to choose the proper parts for his or her budget shouldn't be reflected in the review of a video card that really should only be bought by people who know what they're getting.



W1zzard said:


> - don't know difference between analog and digital


While I kind of understand this one, since my displays only came with VGA cables (stupid Dell), I would hardly call not having a VGA output a flaw for that reason. DVI-to-VGA adapters still exist, and the real question is: do you want to waste panel space on a VGA port that hardly anyone will ever use, when you can have something better that can still give you a VGA signal through an adapter? I think we can all agree that every port this card has is better than VGA, and that they can all drive VGA through an adapter, with the exception of HDMI.



W1zzard said:


> - that's the machine my boss gave me


Awfully nice gift from the boss, jeez.
Either way, you already have the video card, so what does it matter if it doesn't have VGA? Be grateful that you got it from your boss, you bastard.



Solaris17 said:


> i was confused here too brb using my 15" 1280x1024 view sonic with my $500 video card #YOLO



Clearly you need the extra power to drive that 85Hz tube.


----------



## xenocide (Nov 6, 2013)

TheDeeGee said:


> Really?
> 
> Let's OC 30 MHz and label it OC...



They're technically not lying; it is overclocked. It just gives them a reason to push it to 50 MHz and label it "superclocked", then 100 MHz and label it "maximum OC" or something similarly stupid...


----------



## Wittermark (Nov 6, 2013)

Even if they were to put a VGA port on it, where would they find the space for it, though? The output panel looks pretty crowded as is, and they've only got half a slot left for exhaust vents, which is already pretty small. Putting a VGA port on it would mean completely blocking the exhaust vents, meaning no openings to dump the heat from an already very hot card. It just doesn't make sense.


----------



## W1zzard (Nov 6, 2013)

that's what the 4 little analog VGA pins are used for in the DVI connector


----------



## Aquinus (Nov 6, 2013)

W1zzard said:


> http://img.techpowerup.org/131106/Capture2949.jpg
> 
> that's what the 4 little analog VGA pins are used for in the DVI connector



So why, again, do you list the lack of a VGA port as a con?


----------



## mastershake575 (Nov 6, 2013)

I love how people are crying that he listed no VGA output when in reality I doubt he even took points off for it... (so what in the hell are you crying/whining about?)


----------



## erocker (Nov 6, 2013)

If you don't use a VGA port it's not a con for you. So how does this matter?


----------



## Aquinus (Nov 6, 2013)

erocker said:


> If you don't use a VGA port it's not a con for you. So how does this matter?



It doesn't matter. It just adds to the cons list and makes it look like there is more wrong with this GPU than there actually is. Perceptually, if the list of things wrong with a card is closer in size to the list of things that are good about it, the card doesn't look as good as it would with one less con listed. It's not a con for the majority of people looking at buying this card, so I don't see why it needs to be there in the first place. I just find it strange that it's important enough to warrant mentioning when in reality it doesn't deserve to be mentioned at all.

Complaining about lacking VGA on this card is like complaining about there not being an AGP variant... Whoop de doo.


----------



## eidairaman1 (Nov 7, 2013)

If I need VGA, I just get a DVI-to-VGA adapter (the Radeon 9700 Pro came with one).


----------



## TheinsanegamerN (Nov 8, 2013)

eidairaman1 said:


> If I need VGA, I just get a DVI-to-VGA adapter (the Radeon 9700 Pro came with one).



Nope, won't work. The card has DVI-D ports; no VGA adaptation is possible unless you go the DisplayPort-to-VGA route.

Ignoring all of this VGA nonsense, I would love to see this GPU with the GeForce 780 Ti's cooler on it. That would be amazing...


----------



## TheoneandonlyMrK (Nov 8, 2013)

TheinsanegamerN said:


> Nope, won't work. The card has DVI-D ports; no VGA adaptation is possible unless you go the DisplayPort-to-VGA route.
> 
> Ignoring all of this VGA nonsense, I would love to see this GPU with the GeForce 780 Ti's cooler on it. That would be amazing...



This has been said a lot. Do you realise both how wrong you are and how inept that even sounds? The R9 die is far denser than a GK110 in its shader arrays, has more on-die cache in almost every area, and packs essentially more of everything into a smaller space, creating more heat with less area to remove it from. An NVIDIA blower is not that much better than an AMD one, as the load noise levels of a typical 780 Ti clearly show.
Anyway, move along to argument 2, the VGA dongle thing, because AIB custom-cooled R9s are on the way now that the 780 Ti is out.


----------



## eidairaman1 (Nov 10, 2013)

So you're saying those aren't DVI-I ports?



TheinsanegamerN said:


> Nope, won't work. The card has DVI-D ports; no VGA adaptation is possible unless you go the DisplayPort-to-VGA route.
> 
> Ignoring all of this VGA nonsense, I would love to see this GPU with the GeForce 780 Ti's cooler on it. That would be amazing...


----------



## TheinsanegamerN (Nov 12, 2013)

eidairaman1 said:


> So youre sayin those arent dvi-I ports?



Correct. Those are both full DVI-D ports; they don't have the four pins on the side needed for VGA adapters.


----------

