
PowerColor R9 290X OC 4 GB

W1zzard

PowerColor's Radeon R9 290X is built on AMD's reference design. With a 30 MHz overclock on the GPU at reference-design pricing, the decision of whether this card is worth the money is really a no-brainer.

 
Could you please check if the die is at the same height as the metal shim?
I would like to use my old cooler that has a flat surface.
 
Interesting; it appears the only difference is that these use the International Rectifier IR3567B controller rather than the CHiL controllers. With that, we see what looks like 8% lower "peak" power draw in gaming, with no difference in dBA, all while achieving a 9% bump in performance at 2560x1600 in what I take to be quiet mode?

If we are talking quiet to quiet, the perf/watt graph at 2560x1600 is showing a 14% improvement?

Wow :toast:
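
For anyone wanting to sanity-check those percentages, here's a rough sketch of how a performance gain and a power drop compound into a perf/watt change. The 8% and 9% are eyeballed from the graphs, not exact review data, and the 14% figure compares different modes, so these illustrative numbers won't necessarily reproduce it:

```python
# Rough perf/watt arithmetic: a performance gain and a power drop
# multiply rather than add. Figures are eyeballed from the graphs.
perf_ratio = 1.09   # ~9% higher performance
power_ratio = 0.92  # ~8% lower peak gaming power

gain = perf_ratio / power_ratio - 1
print(f"perf/watt improvement: {gain:.1%}")  # ~18.5%
```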
 
Same IRF controller on the 290X. The BIOS is "uber" mode + 30 MHz
 
Same IRF controller on the 290X. The BIOS is "uber" mode + 30 MHz
Ok, I was checking the original 290X review; I see it now.

So in all the places you have it in the graphs, it's considered "uber"... Okay, not that big of a deal!
Thanks
 
Quiet and effective custom cooling or gtfo -_-
 
Okay, not that big of a deal!

Yeah, it's a small difference; basically you are paying nothing for a guaranteed 30 MHz OC out of the box, with warranty.
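
To put "paying nothing" in numbers, a quick sketch, assuming the 290X's reference 1000 MHz boost clock (which the +30 MHz sits on top of):

```python
# Factory OC as a fraction of the reference clock. 1000 MHz is the
# stock 290X boost clock; the factory card adds 30 MHz on top.
reference_mhz = 1000
factory_mhz = reference_mhz + 30
print(f"factory OC: {factory_mhz / reference_mhz - 1:.1%}")  # 3.0%
```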
 
Yeah, it's a small difference; basically you are paying nothing for a guaranteed 30 MHz OC out of the box, with warranty.

That is not bad at all, considering the heat and power consumption from such an aggressive card...
 
Very sexy card in my opinion. However, I don't see the lack of analog VGA output as a con. The standard is set to expire in 2015, and we're in late 2013, so there are barely two years before it goes into history. I don't think it affected the score in any way; it's just that I personally don't find it a con.
 
Has anyone tried a memory-only OC on a 290X to see if it makes any difference? I'm guessing the 512-bit bus is plenty wide enough and it would be limited almost exclusively by the GPU.
 
However, I don't see the lack of analog VGA output as a con. The standard is set to expire in 2015, and we're in late 2013, so there are barely two years before it goes into history. I don't think it affected the score in any way; it's just that I personally don't find it a con.

I completely agree, but I mention it for completeness, and there are many people who still use analog monitors. Those same people often don't read the whole review :)
 
Has anyone tried a memory-only OC on a 290X to see if it makes any difference? I'm guessing the 512-bit bus is plenty wide enough and it would be limited almost exclusively by the GPU.

Overclocking the memory will give at most 1 fps while increasing heat and power consumption; I've never gotten extra performance from raising memory clocks on video cards, but maybe it's just me. :confused:
 
The 290X needs a Lightning/Matrix PCB and cooler, with no voltage or temperature restrictions via drivers.

Overclocking the memory will give at most 1 fps while increasing heat and power consumption; I've never gotten extra performance from raising memory clocks on video cards, but maybe it's just me. :confused:

Memory OC helps in synthetic benchmarks but very rarely in gaming. Gaming is all about core clock speeds.
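
For a rough sense of why that is, a back-of-the-envelope bandwidth calculation, using the 290X's 512-bit bus at its stock 5 Gbps effective GDDR5 rate (the +10% OC figure is purely illustrative):

```python
# GDDR5 bandwidth: bus width in bits / 8 (bytes) * effective data rate (Gbps).
def bandwidth_gbs(bus_width_bits: int, effective_gbps: float) -> float:
    return bus_width_bits / 8 * effective_gbps

stock = bandwidth_gbs(512, 5.0)  # 320 GB/s at stock
oc = bandwidth_gbs(512, 5.5)     # hypothetical +10% memory OC
print(f"stock: {stock:.0f} GB/s, +10% OC: {oc:.0f} GB/s")
```

Even a 10% memory OC only moves 320 GB/s to 352 GB/s, and if the core can't consume the bandwidth it already has, the extra headroom goes unused, which lines up with the "at most 1 fps" experience above.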
 
I would hardly call "No analog VGA outputs" a con. If you're using one of these, it really raises the question: why the hell are you using VGA? :p
 
I would hardly call "No analog VGA outputs" a con. If you're using one of these, it really raises the question: why the hell are you using VGA? :p

Exactly!
 
If you're using one of these, it really raises the question: why the hell are you using VGA?

- poor
- don't know difference between analog and digital
- that's the machine my boss gave me
 
- poor
- don't know difference between analog and digital
- that's the machine my boss gave me

If you're poor, why are you buying a $550 GPU? :confused:

If that's the machine your boss gave you, wouldn't the $550 be better spent upgrading something else?

It's like you're depicting the mentality of a console gamer rather than a PC enthusiast, where only seeing pretty things on screen matters.

Just odd justifications.
 
If you're poor, why are you buying a $550 GPU?

spent all my money on GPU, can't buy a monitor now

if all we need to debate in the review is the analog vga outputs, then my review must be awesome
 
spent all my money on GPU, can't buy a monitor now

if all we need to debate in the review is the analog vga outputs, then my review must be awesome

Might want to read through some of the posts in your reviews.
 
if all we need to debate in the review is the analog vga outputs, then my review must be awesome

Aaaaah, some self-irony :). I feel you, man, but as a writer you should know how these things go. I usually don't comment on high-end cards because it's not my territory. I don't need one, and I definitely can't afford one.
 
So will you be bringing the R9 290 review shortly as well? If so, I hope you use the launch drivers; otherwise, you might as well have reviewed it a week ago.

Also, I suggest removing Skyrim (which seems broken in your test suite anyway...) and adding Battlefield 4 instead.

So at any moment we should be hearing announcements about custom models, right? ASUS DCUII TOP being the first one, I'm guessing. Also, maybe details/tests on whatever "Tahiti XTL" is...
 
I'm glad you pointed out that the power efficiency doesn't make sense. Even with the silicon lottery, it's weird to see it get better with an overclock. It's weird even on the first card that silent mode is less efficient than uber.
 
It's supposed to run at those temps. =]

If you don't like the noise, the temps, or the OC headroom, just wait for the non-reference designs. That's what most people are doing, including myself.

Also, no analog output is a con? Really? Who the hell buys a top-tier card and still uses VGA? Yeah, that shouldn't even be a consideration anymore. At all.

lol
 
If you realize that the no-VGA con is not really a con, then it probably wasn't written for you. For example, I always ignore load noise because I use headphones, so even if it's listed in the cons, I just ignore it.
 