
AMD Radeon R9 290X 4 GB

HTC

Joined
Apr 1, 2008
Messages
4,664 (0.77/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS

Once again, you are using the word massacre to describe a few fps / ~10% difference,
and once again, you are selectively pulling benchmarks that AMD typically does well in (this time from Tom's, while rejecting ones from TweakTown) to favor your argument.

I see no massacre here. All I'm seeing is exactly what a slightly OC'd 780 or Titan would do, and they would probably still run quieter and cooler than the 290X.

If the price difference vs. the Titan weren't almost a factor of 2, I would agree this was no massacre, but achieving this performance at a bit over half the price, I call that a massacre, and I would still call it a massacre if instead of a ~10% difference it were 0%.
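To put a rough number on that claim, here is a quick perf-per-dollar sketch; the $549/$999 launch MSRPs and the ~10% performance gap are the figures being argued in this thread, not official benchmarks:

```python
# Rough perf-per-dollar sketch using the numbers under debate:
# Titan assumed ~10% faster at $999 MSRP; 290X as the $549 baseline.
titan_perf, titan_price = 1.10, 999
r290x_perf, r290x_price = 1.00, 549

titan_value = titan_perf / titan_price   # performance per dollar
r290x_value = r290x_perf / r290x_price

print(f"290X delivers {r290x_value / titan_value:.2f}x the performance per dollar")
```

Even granting the Titan its full ~10% lead, the 290X comes out at roughly 1.65x the performance per dollar, which is the crux of the "massacre" argument.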


Still, I think AMD shot themselves in the foot with a cannonball here!

The way I see it, they are trying to sell more cards by shipping the reference models now with the piece-of-garbage cooler that only suits those who will either water cool or don't care about noise (considering this, there won't be many of the latter), and then allowing non-reference cards with a better cooler which, by itself, should increase performance, as in, without increasing any other spec than the cooler.

What I think they fail to realize is that, if they had a better cooler, and I'm talking about one that could make the card hover around the 1 GHz mark (the supposed default speed) as opposed to dropping into the 850 MHz range with even worse 650 MHz dips, as evidenced by the graph below (by hover, I mean it could drop like 50-70 MHz, but not 350 MHz), they would sell WAY more cards now (which would compensate, and then some, for the higher cost of the cooler) and, when the non-reference cards hit the market with even better coolers, sell a shitload more of them.

 
Joined
Feb 8, 2012
Messages
3,014 (0.64/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Baracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
What I think they fail to realize is that, if they had a better cooler, and I'm talking about one that could make the card hover around the 1 GHz mark (the supposed default speed) as opposed to dropping into the 850 MHz range with even worse 650 MHz dips, as evidenced by the graph below (by hover, I mean it could drop like 50-70 MHz, but not 350 MHz)

Yeah, funny thing is, this card would perform much better in just 4 °C cooler ambient. But then again it would heat up the room pretty fast and throttle down soon after.
Better aftermarket coolers that would keep it under the 90 °C mark are often not rear-exhaust (blow-out) designs, and that would raise case temperature and speed up the case fans.
It's hard to balance noise and temps when you deal with hot hardware on air. You often end up using headphones.
For this card on air, a triple-slot vapour-chamber blow-out-style cooler is the way to go, IMO.
 

HTC

Yeah, funny thing is, this card would perform much better in just 4 °C cooler ambient. But then again it would heat up the room pretty fast and throttle down soon after.
Better aftermarket coolers that would keep it under the 90 °C mark are often not rear-exhaust (blow-out) designs, and that would raise case temperature and speed up the case fans.
It's hard to balance noise and temps when you deal with hot hardware on air.
You often end up using headphones.
For this card on air, a triple-slot vapour-chamber blow-out-style cooler is the way to go, IMO.

But the stock cooler wouldn't have to be top-of-the-line: just not bottom-of-the-line.

As I said, it could still dip 50-70 MHz or so below the advertised base speed: dipping 350 MHz is way over the top, IMO.

Alternatively, they could have a 900 MHz base clock coupled with a better cooler, and I think it would still perform better in quiet mode, though not as well in uber mode. But that could be fixed easily by having a quiet-mode speed of 900 MHz and an uber-mode speed of 1000 MHz.

As it stands, even in uber mode at default fan speed, the card throttles down because of high temps, as seen in the graph below, reaching 600 MHz, and that's a throttle-down of "just 40%" ...
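For reference, the throttle figures quoted in this thread work out as follows (a sketch using the 1000 MHz advertised clock and the dips reported from the review graphs):

```python
def throttle_pct(base_mhz: float, observed_mhz: float) -> float:
    """Percentage drop from the advertised base clock."""
    return (base_mhz - observed_mhz) / base_mhz * 100

print(throttle_pct(1000, 850))  # typical quiet-mode dip: 15.0
print(throttle_pct(1000, 650))  # worst quiet-mode dip: 35.0
print(throttle_pct(1000, 600))  # uber-mode floor: 40.0
```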



It's a shame a time scale isn't included on W1zzard's graphs: I'd like to know how long it took for the card to throttle down heavily in both modes (not the beginning of the throttling: the part where it starts being more severe).


A question for @W1zzard: is it possible to run some tests (no need for all of them: a few should suffice) the way you ran them, but with the base speed in both modes at ... say ... 900 MHz? I'm wondering if it performs better by not throttling constantly due to temps.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
Gibbo - Overclockers UK

**UNDER-VOLTING**

Guys, it seems the current drivers/BIOS have protection built in, so the card won't let you under-volt by any usable amount, unfortunately.

Anything below 1225 mV seems to trigger the protection, meaning it won't load to the Windows desktop until you do a full power off and on.

However, going from 1250 mV down to 1225 mV has reduced temperature by 5 °C under load with no ill effect on stability at stock speeds, with the card still set at +50 power and 1000 MHz core speed throughout.


If this protection gets removed by later drivers/BIOS, the card in theory should run fine around 1150-1200 mV, which should be good for a 10 °C+ reduction in temps over stock. Seems you can get them to run pretty cool.

In fact, it's turning out to be quite a fun card for the tweaker/overclocker.
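As a back-of-the-envelope check on why even a 25 mV undervolt helps: dynamic power scales roughly with the square of voltage at a fixed clock (a sketch of that rule of thumb; it ignores leakage and assumes the clocks hold steady):

```python
def power_ratio(v_new_mv: float, v_stock_mv: float) -> float:
    """Approximate dynamic-power ratio for a voltage change at a fixed clock (P ~ V^2)."""
    return (v_new_mv / v_stock_mv) ** 2

# Gibbo's working undervolt: 1250 mV -> 1225 mV
print(f"{power_ratio(1225, 1250):.1%}")  # ~96.0% of stock power
# Hypothetical 1150 mV, if the driver protection were lifted
print(f"{power_ratio(1150, 1250):.1%}")  # ~84.6% of stock power
```

A ~4% power cut lining up with a 5 °C drop, and ~15% with 10 °C+, is consistent with the temperature deltas reported above.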
 
Joined
May 21, 2011
Messages
660 (0.13/day)
System Name Tiger1-Workstation
Processor Intel XEON E3-1275V2 / E3-1230V3
Motherboard ASUS SABERTOOTH Z77 / AsRock H87 Performance
Cooling Corsair H80i Watercooling
Memory 32GB Corsair Dominator Platinum 2400
Video Card(s) Inno3D GTX 780 Ti
Storage 2TB SSD(4X OCZ vertex 4 256GB LSI RAID0 + Crucial M550 1TB)
Display(s) 2x Dell U3011 30" IPS
Case Silverstone Raven 03
Audio Device(s) Xonar Essence STX--> Xonar Essence One --> SPL Auditor -->Hivi X6
Power Supply Corsair AX860i Platinum
Software Windows 8.1 Enterprise
If the price difference vs. the Titan weren't almost a factor of 2, I would agree this was no massacre, but achieving this performance at a bit over half the price, I call that a massacre, and I would still call it a massacre if instead of a ~10% difference it were 0%.


Still, I think AMD shot themselves in the foot with a cannonball here!

The way I see it, they are trying to sell more cards by shipping the reference models now with the piece-of-garbage cooler that only suits those who will either water cool or don't care about noise (considering this, there won't be many of the latter), and then allowing non-reference cards with a better cooler which, by itself, should increase performance, as in, without increasing any other spec than the cooler.

What I think they fail to realize is that, if they had a better cooler, and I'm talking about one that could make the card hover around the 1 GHz mark (the supposed default speed) as opposed to dropping into the 850 MHz range with even worse 650 MHz dips, as evidenced by the graph below (by hover, I mean it could drop like 50-70 MHz, but not 350 MHz), they would sell WAY more cards now (which would compensate, and then some, for the higher cost of the cooler) and, when the non-reference cards hit the market with even better coolers, sell a shitload more of them.

http://tpucdn.com/reviews/AMD/R9_290X/images/analysis_quiet.gif

Why are you comparing this to the Titan? This is exactly the inconsistency/bias I talked about that people use to further their argument. A better comparison would be against the GTX 780, which should be coming down in price soon.
I'll re-paste my previous post again in case you have reading comprehension issues:

First, they'll start off with the argument that the 290X is about ~$100 cheaper than the GTX 780 (and that the price killed the 780),

and then they'll say "Oh, who cares about temp, noise and power consumption", but you know what? Most people do! Then after that, they'll say "Just throw a waterblock on it".

Umm, if you truly didn't care about noise and temp, why are you throwing a waterblock on it, and how much does a waterblock cost? Around $150, so basically a 290X + waterblock costs around $700, and that's only if you already have a water loop set up; if you don't, you'll likely run into $1000+ to have everything set up just to run this card within acceptable parameters.

And don't give me this "everyone who buys high end already has a water loop setup"; that is just BS. Most people don't run water loops, those who do are a small minority, and the people who make these claims don't even have water loops themselves.

And when all arguments fail, they'll bring out the performance card, where they claim the 290X "absolutely destroyed" the Titan. Why are you comparing to the Titan in the first place? (It LOST to the Titan in silent mode, and if you are comparing in uber mode, don't give me "it's only 2 dB louder", because it's 12 dB louder in uber mode.) It barely even "destroyed" the 780; they are well within 1-5% of each other, trading blows. That is not what "destroy" means; "destroy" means beating something by at least 15-20%. Otherwise, a few fps of difference is barely even noticeable to the human eye.

So, to sum it up:
1) They claim it's cheaper.
2) They ignore noise, temp and power consumption.
3) Then they say throw a water block on it, ignoring the price of a water-cooling kit (+$150 for the block and +$400 for the whole setup).
4) Selective/inconsistent comparison of the 290X's performance (in uber mode), while using silent mode to compare noise and temp.
5) Overhype with words such as "destroy", "kill" and "massacre" when benchmarks show they are fairly evenly matched.
6) Selective/inconsistent comparison of price to the Titan when the argument favors them, but comparison to the 780 when the situation fits them (performance).
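On point 4, the uber-mode noise penalty is larger than the raw numbers suggest: a common rule of thumb is that perceived loudness doubles roughly every 10 dB, so +12 dB is more than twice as loud (a sketch of that rule, not a measurement):

```python
def loudness_ratio(delta_db: float) -> float:
    """Perceived-loudness ratio per the ~10 dB-per-doubling rule of thumb."""
    return 2 ** (delta_db / 10)

print(f"+2 dB:  {loudness_ratio(2):.2f}x as loud")   # ~1.15x
print(f"+12 dB: {loudness_ratio(12):.2f}x as loud")  # ~2.30x
```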
 

HTC

Why are you comparing this to the Titan? This is exactly the inconsistency/bias I talked about that people use to further their argument. A better comparison would be against the GTX 780, which should be coming down in price soon.
I'll re-paste my previous post again in case you have reading comprehension issues:

Doesn't this card trade blows (mostly in uber mode) with the Titan? It wins some and loses some. It barely beats the 780 in quiet mode, but it does beat it, and that's with a piss-poor cooler which, I'm sure you agree, restrains the performance of this card. And it DOES beat the Titan when in uber mode, according to W1zzard's graph below.



Since you're not the only one who can re-paste your previous posts ...

If the price difference vs. the Titan weren't almost a factor of 2, I would agree this was no massacre, but achieving this performance at a bit over half the price, I call that a massacre, and I would still call it a massacre if instead of a ~10% difference it were 0%.

Do you see my point?
 
Joined
May 21, 2011
Messages
660 (0.13/day)
Doesn't this card trade blows (mostly in uber mode) with the Titan? It wins some and loses some. It barely beats the 780 in quiet mode, but it does beat it, and that's with a piss-poor cooler which, I'm sure you agree, restrains the performance of this card. And it DOES beat the Titan when in uber mode, according to W1zzard's graph below.

http://tpucdn.com/reviews/AMD/R9_290X/images/perfrel.gif

Since you're not the only one who can re-paste your previous posts ...



Do you see my point?

Because the Titan is not in direct competition with the 290X: with a portion of Titan users buying the card for company/workstation use, the Titan can be seen as a "best of both worlds" card between a gaming card and a professional card (which costs significantly more than gaming cards).

Another factor is that the Titan, and to that extent the GTX 780, is much better in terms of noise, temperature and efficiency. From a technical point of view, it is much more difficult to design a product that does well in all criteria than one focused on just a single criterion. In this case, Nvidia had to design a graphics card that kept noise, temperature and power consumption well under control while trying to maximize performance; do you understand that this is a lot harder to achieve than brute-force performance? I would imagine the R&D cost would be much higher too, which is reflected in the premium. To give an example: I'm an architectural designer. If I were asked to design a very cheap building or a very efficient building, it would be relatively easy, but if I were asked to design a building that is cheap, elegant and energy-efficient all at once, I would probably charge a hell of a lot more to come up with the design. Do you get my drift?

I also stated that the 290X has just released while the Titan/780 have been out for half a year, and Nvidia will be adjusting their pricing very soon; it's too early to be calling "massacres" and "obliterations" at this point.

What bugs me (or scares me) more than anything at this point is that, with the release of the 290X, many of the "Nvidia naysayers" are suddenly out of the woodwork creating this imaginary Titan/780 hybrid, a card with the price tag of the Titan but the performance of the 780, along with another imaginary card with the performance of 290X uber mode and the noise/temp of silent mode, and selectively pitting these two imaginary cards against each other to further their agenda. Don't get me wrong, I'm not taking sides here, but I would like to get the facts straight and see some consistency in their arguments.
 
Joined
Sep 28, 2012
Messages
980 (0.22/day)
System Name Poor Man's PC
Processor waiting for 9800X3D...
Motherboard MSI B650M Mortar WiFi
Cooling Thermalright Phantom Spirit 120 with Arctic P12 Max fan
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) XFX Merc 310 Radeon RX 7900 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Xiaomi G Pro 27i MiniLED + AOC 22BH2M2
Case Asus A21 Case
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?
http://i1241.photobucket.com/albums/gg515/Elang_Mahameru/furmark_6mins.png
Even my GTX 680 DC II T, FurMark'ed, wouldn't touch 50 °C.
So, is it safe to say that this card wouldn't hit 65 °C under water?

Trust me mate... that's why they call me Elders or opa on kaskus :laugh:

Errrm... Thing is, I'm not asking about TDP (TDP != maximum power draw), and W1zzard didn't even say anything about TDP (he said power configuration); what I'm asking about is exactly that, the power configuration.
Why, with the same power configuration (one 6-pin + 8-pin, or 2 x 6-pin), do AMD cards have a higher theoretical maximum power draw than NVIDIA's? Is the difference coming from them (a special power setup/setting from AMD)? Or is it coming from the PCI-E 3.0 standard?

TDP is a value constrained directly by the VRM design :)

Let me elaborate...




The Titan and GTX 780 only have 6 Foxconn-made chokes with a single-channel driver + MOSFET per phase, R22 rated at 35 A each at OC (operating condition) of 90 °C with 10-15% tolerance. That translates to only about 200 W delivered; add the 75 W of PCI-E slot power and you have roughly 275 W.

While R290X...


It has 6 Coiltronics-made chokes with a dual-channel driver + MOSFET per phase, R15 rated at 50 A each at OC (operating condition) of 110 °C with 5-10% tolerance. That translates to about 300 W delivered; add the 75 W of PCI-E slot power and you have 375 W :)
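The arithmetic behind those two figures can be sketched like this; the per-phase current ratings come from the post above, while the ~1 V effective output voltage is my assumption (the post derates slightly, hence its ~275 W figure for the Titan/780):

```python
PCIE_SLOT_W = 75  # PCI-E slot power contribution

def vrm_budget_w(phases: int, amps_per_phase: float, vout: float = 1.0) -> float:
    """Rough board power: VRM current capacity times output voltage, plus slot power."""
    return phases * amps_per_phase * vout + PCIE_SLOT_W

print(f"Titan/780: ~{vrm_budget_w(6, 35):.0f} W")  # 6 x 35 A -> ~285 W before derating
print(f"290X:      ~{vrm_budget_w(6, 50):.0f} W")  # 6 x 50 A -> 375 W
```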

I totally understand, but the amount of ignorance here is killing me, though.
First they throw out the price argument by saying the 290X is A LOT cheaper (which it is not; the money saved couldn't even pay for a waterblock)

It's just the nature of the internet :laugh:
You can always just ignore them though; basically they have no valid grounds to make such statements, which makes it easy to spot which is which. It's hilarious to see anyone debating an enthusiast card he himself has never touched, tried, tested or even owned :laugh:
Just like debating that the Viper is ridiculously inefficient, the Beemer has utterly crap materials, and the big-cc Mercedes can't run faster than yo mama riding a wheelchair, while he only has a Prius :laugh:
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,178 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
This thread just gets better and better :toast:

All said and done, Nvidia clearly has the upper hand; they can now adjust their range of highly competitive cards (and introduce SKUs to fit) accordingly.

Let's disregard the Titan completely for argument's sake. Nvidia did it because they could; like Intel with their extreme chips, they can sell one for a grand because AMD can't, and AMD did similarly back in the day with the old FXs too. Anyway:

Yes it is $100 more than a 290X, but not for long, as we already know, and it hasn't even been 48 hours yet.

And that $100 more bought you this level of performance up to 6+ months ago; remember that AMD didn't have a card this fast then.

This card is good because it will alter how much performance you get for ~$500-$600, and that's about it. This level of performance was new 9 months ago, even longer if you look at a 690, which I'm sure people who have them are quite content with. Again, yes, it was $1000, but it was $1000 when it launched a year and a half ago.

Last but not least, all the talk about 4K: who on earth has a 4K monitor, pray tell? That's what I thought. The vast majority of us game at 1080p/1440p/1600p, the cards are a heck of a lot closer there, and let's face it, even one 780 or 290X is overkill for 1080p alone.

Let's keep things in perspective.

Today's 2 cents.
 
Joined
Oct 1, 2013
Messages
250 (0.06/day)
Trust me mate... that's why they call me Elders or opa on kaskus :laugh:



TDP is a value constrained directly by the VRM design :)

Let me elaborate...


The Titan and GTX 780 only have 6 Foxconn-made chokes with a single-channel driver + MOSFET per phase, R22 rated at 35 A each at OC (operating condition) of 90 °C with 10-15% tolerance. That translates to only about 200 W delivered; add the 75 W of PCI-E slot power and you have roughly 275 W.

While R290X...


It has 6 Coiltronics-made chokes with a dual-channel driver + MOSFET per phase, R15 rated at 50 A each at OC (operating condition) of 110 °C with 5-10% tolerance. That translates to about 300 W delivered; add the 75 W of PCI-E slot power and you have 375 W :)



It's just the nature of the internet :laugh:
You can always just ignore them though; basically they have no valid grounds to make such statements, which makes it easy to spot which is which. It's hilarious to see anyone debating an enthusiast card he himself has never touched, tried, tested or even owned :laugh:
Just like debating that the Viper is ridiculously inefficient, the Beemer has utterly crap materials, and the big-cc Mercedes can't run faster than yo mama riding a wheelchair, while he only has a Prius :laugh:

Oh, you showed him this picture. My plan was to humiliate the Titan/780 later on for their poor VRM design. In short, except for the flashy cover, the board and the chip of the Titan/780 are far inferior to the 290X's counterparts. No surprise that the 290X has been breaking records with LN2; it is truly a beast.
 
Joined
Feb 8, 2005
Messages
1,675 (0.23/day)
Location
Minneapolis, Mn
System Name Livingston
Processor i7-4960HQ
Motherboard MacBook Pro Retina
Cooling Alphacool NexXxoS Monsta (240mm x 120mm x 80mm)
Memory 16Gb
Video Card(s) Zotac Arctic Storm Nvidia 980ti
Display(s) 1x Acer XB270HU, 1x Catleap, 1x Oculus
Benchmark Scores http://www.3dmark.com/fs/770087
All the talk about 4K: who on earth has a 4K monitor, pray tell? That's what I thought. The vast majority of us game at 1080p/1440p/1600p, the cards are a heck of a lot closer there, and let's face it, even one 780 or 290X is overkill for 1080p alone.

Let's keep things in perspective.

todays 2 cents.

<-- There are lots of surround and 3D Vision gamers. Speaking as one of them, I don't think we could go wrong with a 690 or two 290Xs at this point. :ohwell:
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,006 (2.50/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
Oh, you showed him this picture. My plan was to humiliate the Titan/780 later on for their poor VRM design. In short, except for the flashy cover, the board and the chip of the Titan/780 are far inferior to the 290X's counterparts. No surprise that the 290X has been breaking records with LN2; it is truly a beast.

I don't think nvidia has ever had good reference designs. They just save that for their board partners to do nice non-reference designs.
 
Joined
Apr 19, 2011
Messages
2,198 (0.44/day)
Location
So. Cal.
If some AIB said they would offer a three-slot version that was rear-exhaust and quieter, while letting it run at 1000 MHz more often, at $550 it might really pique my interest. Sure, perhaps not a CrossFire card... but not having all that heat floating inside the chassis would be nice.

You could then duct it all outside with a vent/fan in summer, and inside for winter! Necessity is the mother of invention.

I would still like to see a thorough teardown of the 290X cooler/fan vs. how the stock Titan was built. It would be interesting to mount a Titan cooler on the 290X and see if it was able to cool it better.
 
Joined
Mar 10, 2013
Messages
59 (0.01/day)
Processor i5 3570K @ 4.3 GHz | i7 10700k @ stock
Motherboard Gigabyte GA-Z77X-D3H | MSI Z490 Gaming Plus
Memory 18 GB DDR3 | 32 GB DDR4
Video Card(s) GTX 1070 | RTX 3070
Display(s) BenQ XL2411T @120HZ | ASUS VG258QR @165HZ
Power Supply XFX Pro Black Edition 750W | Seasonic Ultra 850W Gold
Software Win10 PRO | OL 8.9 and 9.x
I don't think nvidia has ever had good reference designs. They just save that for their board partners to do nice non-reference designs.

What? The reference cooler of the Titan and 780 is great. It's slick, silent and does the job pretty well.

If you adjust the fan speed accordingly, you can even OC the reference model beyond the 902 MHz "limit". I have seen a REFERENCE-cooled 780 with a custom fan curve boost to 993 MHz at 67 °C, and the noise is practically the same. After almost 3 hours, temps were about 69 °C max, with a constant 993 MHz clock. It only throttles down when it reaches 80 °C, as you may be aware if you read the 780 review; I only saw this with default fan settings, where it would drop to the reference 863 MHz but never below that, of course.
 
Joined
Apr 17, 2008
Messages
3,935 (0.65/day)
Location
West Chester, OH
Relying on price as your sole saving grace isn't a good idea. That only works if nVidia can't lower prices. We already know both the Titan and 780 were very high-margin parts, so nVidia's answer only has to be a price reduction. Yeah, it is great that there is finally competition to drive down the insane prices, but this isn't a groundbreaking card any way you look at it.



No, they charge a premium for the performance. The Titan and 780 were expensive because there was nothing that could compete with them; now that there is, we should see a price drop on the 780, as well as the 780 Ti being priced pretty similarly to the 290X.

So AMD will lower the price of the R290X; they'll play the price war just as much as NVidia MIGHT, to ensure sales. This isn't anything new. I think people are missing the point here... AMD just released a $549 MSRP card that competes with NVidia's $999 MSRP card. It's nearly half the price.

Lets make a scenario and think of things a little differently. What if AMD had released the R290X when NVidia released the Titan and it was $999. So, we're reversing the releases.

Would the $999 R290X sell as well as the Titan did... yes.
Why? Because it would have been the most powerful card available. Hardcore enthusiasts would be the only ones buying them because it costs a thousand flipping dollars. No one would be complaining about temps/noise/power consumption when they just spent $1k on a card. Why? Because it beats anything else available with ease and is clearly the king of the GPU crop. They would be praising its unparalleled performance and gawking at its frame rates and whatever else enthusiasts do.

Now, continuing our reversal scenario, NVidia would release the Titan and it would cost $549. NVidia would then have the best bang for your buck along with the overall best card. Now AMD would look silly with their insanely priced R290X, and NVidia would be the saving grace of the affordable top-tier card market.

Moral of my scenario: enthusiasts with 1,000 dollars to spend don't care about cost. They want the best card money can buy. They don't care about blowing money, and both AMD and NVidia don't mind charging that price. But here is what they DO care about... SALES and moving product.

If they make cards that are awesome that people aren't buying, then what is the point? Why bother making awesome cards if no one is going to buy them? Why would I waste my R&D on something no one is going to buy? In other words, stop whining about the R290X when it's clear as crystal that it IS the best bang for your buck ATM. How can people have anything to complain about when a $550 card is competing with a card that goes for 1,000 dollars?

So now people are saying, "Well, I could spend an extra $50-100 for a GTX 780 and it will blow away a R290X if I get one that's non-reference." Well, *LIGHTBULB*, what about non-reference R290Xs? Like a Lightning from MSI, or something from Gigabyte or ASUS. C'mon... I haven't seen so many NVidia-biased responses to a video card review in a long time... this is silly business.
 

MxPhenom 216

ASIC Engineer
What? The reference cooler of the Titan and 780 is great. It's slick, silent and does the job pretty well.

If you adjust the fan speed accordingly, you can even OC the reference model beyond the 902 MHz "limit". I have seen a REFERENCE-cooled 780 with a custom fan curve boost to 993 MHz at 67 °C, and the noise is practically the same. After almost 3 hours, temps were about 69 °C max, with a constant 993 MHz clock. It only throttles down when it reaches 80 °C, as you may be aware if you read the 780 review; I only saw this with default fan settings, where it would drop to the reference 863 MHz but never below that, of course.

I'm talking about board designs, not the cooler.
 
Joined
Feb 8, 2005
Messages
1,675 (0.23/day)
Would the $999 R290X sell as well as the Titan did... yes.
Why? Because it would have been the most powerful card available. Hardcore enthusiasts would be the only ones buying them because it costs a thousand flipping dollars. No one would be complaining about temps/noise/power consumption when they just spent $1k on a card. Why? Because it beats anything else available with ease and is clearly the king of the GPU crop. They would be praising its unparalleled performance and gawking at its frame rates and whatever else enthusiasts do.

Now, continuing our reversal scenario, NVidia would release the Titan at $549. NVidia would then have the best bang for your buck along with the overall best card, AMD would look silly with their insanely priced R290X, and NVidia would be the saving grace of the affordable top-tier card market.

As others said in this thread (it's a long thread at this point), I don't think the Titan would exist if this (the 290X) had been released back then. There would have been a gaming card without the Titan's compute performance, and maybe a more affordable version of the Tesla for professional use.
 
Joined
Jan 2, 2012
Messages
1,079 (0.23/day)
Location
Indonesia
Processor AMD Ryzen 7 5700X
Motherboard ASUS STRIX X570-E
Cooling NOCTUA NH-U12A
Memory G.Skill FlareX 32 GB (4 x 8 GB) DDR4-3200
Video Card(s) ASUS RTX 4070 DUAL
Storage 1 TB WD Black SN850X | 2 TB WD Blue SN570 | 10 TB WD Purple Pro
Display(s) LG 32QP880N 32"
Case Fractal Design Define R5 Black
Power Supply Seasonic Focus Gold 750W
Mouse Pulsar X2
Keyboard KIRA EXS
TDP is directly constrained by the VRM design :)

Let me elaborate...


The Titan and GTX 780 only have 6 Foxconn-made chokes with a single-channel driver + MOSFET per phase, R22-rated at 35 A each at an operating condition of 90°C with 10-15% tolerance. That translates to only about 200 W deliverable; add the 75 W of PCI-e slot power and you have a fair 275 W.

While R290X...


The R290X has 6 Coiltronics-made chokes with a dual-channel driver + MOSFET per phase, R15-rated at 50 A each at an operating condition of 110°C with 5-10% tolerance. That translates to about 300 W deliverable; add the 75 W of PCI-e slot power and you have 375 W :)
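The arithmetic behind those figures can be sketched in a couple of lines (the ~1.0 V core voltage is my own assumption, and the poster also knocks a bit off the Titan's raw 210 W phase total for tolerance, landing on ~275 W board power):

```python
def vrm_power_budget(phases, amps_per_phase, vcore=1.0, slot_w=75):
    """Rough deliverable board power: total phase current times an
    assumed ~1.0 V core voltage, plus the 75 W the PCI-e slot supplies."""
    return phases * amps_per_phase * vcore + slot_w

# Titan / GTX 780: 6 phases x 35 A -> 285 W by this formula
# (the poster derates the VRM side for tolerance, giving his ~275 W)
print(vrm_power_budget(6, 35))

# R9 290X: 6 phases x 50 A -> 375 W, matching the poster's figure
print(vrm_power_budget(6, 50))
```

This is only a back-of-the-envelope ceiling, not a measured TDP; it ignores VRM efficiency, transient headroom, and the auxiliary 6/8-pin connector limits, which is why reviews report lower sustained draw.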

Makes sense, I understand now, thank you.
But I'm still curious why W1zzard didn't write about how this difference in VRM design can cause a different maximum power draw despite both cards using exactly the same power connector configuration.
When I first saw this I thought it was a typo, but after reading all the R9 reviews (280X, 270X, and now 290X) I assume there must be something going on.

From the TPU picture I see that the 290X is using R23 chokes, not R15 (R15 is on the 7970).
Is there any difference between R23 and R15?
All I can find is this datasheet : http://www.cooperindustries.com/con...-datasheets/Bus_Elx_DS_4341_FP1007_Series.pdf
 
Joined
Apr 17, 2008
Messages
3,935 (0.65/day)
Location
West Chester, OH
As others said in this thread (its a long thread at this point), I don't think the Titan would exist if this (290x) had been released then, there would have been a gaming card that did not have the compute performance of the titan, and maybe a more affordable version of the tesla that cost less for professional use.

Because they would magically change their entire R&D plans midway through making the Titan... yeah, OK. And what about the GTX 4xx series? They should have done it then, but they didn't. Moot.
 
Joined
Feb 8, 2005
Messages
1,675 (0.23/day)
Location
Minneapolis, Mn
System Name Livingston
Processor i7-4960HQ
Motherboard macbook prp retina
Cooling Alphacool NexXxoS Monsta (240mm x 120mm x 80mm)
Memory 16Gb
Video Card(s) Zotac Arctic Storm Nvidia 980ti
Display(s) 1x Acer XB270HU, 1x Catleap, 1x Oculus
Benchmark Scores http://www.3dmark.com/fs/770087
Because they would magically change their entire R&D plans midway through making the Titan... yeah, OK. And what about the GTX 4xx series? They should have done it then, but they didn't. Moot.

You were the one who postulated a before-the-Titan-was-released scenario. You can fine-tune your logic after the fact to suit your needs as much as you want, but it really just means no one will want to talk to you, because you constantly rescope the debate. Enjoy talking to yourself.
 
Joined
May 8, 2013
Messages
84 (0.02/day)
Better aftermarket coolers that would keep it under the 90°C mark are often not blower-type, and that would raise case temperatures and speed up case fans.

But it's a lot easier to get good performance out of case fans while retaining low noise than it is with a small, cheaply-made blower-type fan. It's the same reason why closed-loop liquid coolers work well: you're not eliminating the need for ventilation, but moving it to a location where you can use larger (and multiple) fans which move more air at lower noise levels.
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,058 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
Trust me mate...that's why they called me Elders or opa in kaskus :laugh:



TDP is straight value constraint with VRM design :)

Let me elaborate...


The Titan and GTX 780 only have 6 Foxconn-made chokes with a single-channel driver + MOSFET per phase, R22-rated at 35 A each at an operating condition of 90°C with 10-15% tolerance. That translates to only about 200 W deliverable; add the 75 W of PCI-e slot power and you have a fair 275 W.

While R290X...


The R290X has 6 Coiltronics-made chokes with a dual-channel driver + MOSFET per phase, R15-rated at 50 A each at an operating condition of 110°C with 5-10% tolerance. That translates to about 300 W deliverable; add the 75 W of PCI-e slot power and you have 375 W :)



It's just the nature of the internet :laugh:
You can mind them if you like, but basically they have no valid grounds for making such statements, let alone for telling which is which. It's hilarious to see someone debating an enthusiast card he himself has never touched, tried, tested, or even owned :laugh:
Just like arguing that the Viper is ridiculously inefficient, the Beemer uses utterly crap materials, and a big-cc Mercedes can't run faster than yo mama riding a wheelchair, while he only owns a Prius :laugh:

Excellent info on the chokes, thanks. Makes me more interested in it now, knowing it has a solid voltage base.
 
Joined
Oct 26, 2011
Messages
3,145 (0.66/day)
Processor 8700k Intel
Motherboard z370 MSI Godlike Gaming
Cooling Triple Aquacomputer AMS Copper 840 with D5
Memory TridentZ RGB G.Skill C16 3600MHz
Video Card(s) GTX 1080 Ti
Storage Crucial MX SSDs
Display(s) Dell U3011 2560x1600 + Dell 2408WFP 1200x1920 (Portrait)
Case Core P5 Thermaltake
Audio Device(s) Essence STX
Power Supply AX 1500i
Mouse Logitech
Keyboard Corsair
Software Win10
Excellent info on the chokes, thanks. Makes me more interested in it now, knowing it has a solid voltage base.

AMD/ATI has always had the best reference designs.

I've been most impressed by the 6990's Volterra PWM; it could withstand more than 100°C on stock cooling at 1.3 V+.

Do that to a Titan and you'll end up with a $1k paperweight.
 

Am*

Joined
Nov 1, 2011
Messages
332 (0.07/day)
System Name 3D Vision & Sound Blaster
Processor Intel Core i5 2500K @ 4.5GHz (stock voltage)
Motherboard Gigabyte P67A-D3-B3
Cooling Thermalright Silver Arrow SB-E Special Edition (with 3x 140mm Black Thermalright fans)
Memory Crucial Ballistix Tactical Tracer 16GB (2x8GB 1600MHz CL8)
Video Card(s) Nvidia GTX TITAN X 12288MB Maxwell @1350MHz
Storage 6TB of Samsung SSDs + 12TB of HDDs
Display(s) LG C1 48 + LG 38UC99 + Samsung S34E790C + BenQ XL2420T + PHILIPS 231C5TJKFU
Case Fractal Design Define R4 Windowed with 6x 140mm Corsair AFs
Audio Device(s) Creative SoundBlaster Z SE + Z906 5.1 speakers/DT 990 PRO
Power Supply Seasonic Focus PX 650W 80+ Platinum
Mouse Logitech G700s
Keyboard CHERRY MX-Board 1.0 Backlit Silent Red Keyboard
Software Windows 7 Pro (RIP) + Winbloat 10 Pro
Benchmark Scores 2fast4u,bro...
The GK104 has been a massive success for Nvidia, and as you said yourself it was purely focused on gaming, a job it has done and still does brilliantly. No one gives a toss about compute benchmarks they never run. Can't say I've missed out on anything running my little GTX 670; in fact, it does me proud every day.

At least give them credit where it is due.

And yet GK110-powered cards like the K20 and K20X have been on the market for ages; the chip that powers the Titan is literally over a year old.

At least give them credit where it is due.

Can't argue with that, but you make your bed and you lie in it.

Way to contradict yourself, bud. You attempt to argue that compute performance is worthless, then proceed to bring up compute cards as your prime examples. Umm... kay.

I will give Nvidia credit where it's due: Nvidia have done well to milk their obscenely overpriced range of cards for this long, and a fully ungimped and functioning GK110 still has a lot of potential as a GeForce card against the R290X, if Nvidia are willing to release it soon and for a comparable price. If they still decide not to, Nvidia can prepare to delay Maxwell or scrap their costly GK110 salvage parts and/or sell them at give-away prices.

There are some things you seem to be forgetting:

These cards are being sold mostly to gamers, so no one cares how good they are at GPU compute tasks. You're making the same argument people made to defend Fermi. AMD should have learned from nVidia's mistake: GPU compute doesn't sell desktop GPUs.

Of course, if the 7970 had been faster, nVidia wouldn't have been able to use GK104 to compete, but it wasn't faster. I might as well say that if the GTX 580 had been a little bit faster, nVidia wouldn't have needed to release GK104 at all. You can live in a land of IFs all you want, but it doesn't help you make a point.

The fact is, no one but nVidia knows how ready GK110 was. We certainly know the fab was capable of producing GK110 when GK104 launched, and we know the architecture was ready. So it really comes down to yields, and since nVidia knew what they had to compete with, they went with GK104 because it had much better yields. But I bet GK110 would have been possible in place of the GK104 cards we got, if nVidia had needed to use it.

And no, AMD cards don't compete mainly against their own previous generation; they compete against nVidia. And we shouldn't be cutting AMD slack just because they mismanaged their company and now have no R&D funds.

I guess you missed the part where I said I liked the flagship Fermi parts and would prefer a fully functioning (if overheating) Fermi equivalent with un-gimped compute over a cut-down, gimped, self-throttling Kepler built to skew benchmarks.

I never supported AMD's horrible management that cut some 30% of its engineering force. I was merely pointing out that AMD during its best days, versus Nvidia during their worst recent days, cannot remotely compare in R&D budget or any other financial stat, seeing as they are competing (or at least attempting to compete) against both Intel AND Nvidia. It's a miracle that AMD are actually managing it in the costly top-of-the-line GPU market, instead of abandoning discrete GPUs entirely and becoming another APU/SoC-only company chasing Intel's most sought-after market (which may be a reality for AMD sooner rather than later). By all means feel free to show me info that says otherwise, but if you're going to argue against them even attempting to compete, don't bother. Enjoy your brownie points from the green-favouring zealots on here and move along; I'm not going to attempt to prove you (or them) wrong in this respect.

...so to sum it up:
1) they claim it's cheaper
2) they ignore noise, temps and power consumption
3) then they say "throw a water block on it", ignoring the price of a water-cooling kit (+$150 for the block, +$400 for the whole setup)
4) selective/inconsistent comparisons: using the 290X's uber-mode performance, but its silent mode for noise and temps
5) overhype with words such as "destroy", "kill" and "massacre" when benchmarks show the cards are fairly evenly matched
6) selective/inconsistent price comparisons: against the Titan when the situation favors them, but against the 780 when other situations (performance) fit them better

1) People claim it's cheaper because it is; get over it.
2) That's because 3rd-party designs are already on the way, which is never going to happen for the Titan.
3) See the above post. Even so, the R290X plus watercooling, or any other 3rd-party cooling kit you try grasping at straws over, comes out cheaper, whether it's up against a Titan or a 780.
4) Again, see point 2. It will take MSI/Gigabyte five minutes to drop the silent dual/triple-fan coolers they already have on their 7000/700-series cards onto this R290X, which will make your point moot. Reference-cooler reviews sometimes don't mean shit (go look at the GTX 770 stock-cooler reviews with the Titan cooler -- that version is almost not sold anywhere and is therefore pointless).
5) That's because it's true. The R290X wins against whatever you want to compare it to. Is it cheaper? Yep. Does it perform better? Yup. At lower res (a 290X disadvantage -- little use for the huge 512-bit bus/ROPs)? Yep. At higher res (a huge 290X disadvantage vs the Titan with its 2 GB more VRAM)? Still yes. You cannot spin it in Nvidia's favour in any way, other than the facts that Nvidia enjoyed the early lead and has lower power consumption/thermals. And all this excludes the fact that the R290X is on very early drivers, which WILL gain performance from AMD -- not so with Nvidia, who have had a 9-month head start already -- and that it is temperature-throttling on that shitty stock cooler, so it will get a big boost once those Twin Frozr/Windforce-like designs drop.

Yes, quite some massacre. It always looks good when you leave out the benches that don't look so good -- let me guess, you left out the Skyrim bench by accident? I bet George Armstrong Custer is rueing the fact that he couldn't "rt click>save as" the Lakota Sioux he wanted to fight.
Seeing as you posted so many benches, I assume you were going for completeness, so here's the TESV bench and the CFX/SLI 7680x1440 results:
http://img.techpowerup.org/131025/tomshw.jpg

Can you even read what you're posting? The R290X won 6 of the 8 single-card benchmarks you posted (WITH A FREAKING 2 GB VRAM deficit, no less), so as a last desperate attempt you have to drag in the CrossFire support of a 1-day-old self-throttling card versus 9+-month-old Nvidia cards -- a comparison that relies ENTIRELY on several months of stable post-release driver support -- in your pathetic attempt to grasp at your green-coloured straws?

And DIRECTX 9? :roll: :roll: :roll: :roll:

Thanks for the good laugh, ya crazy Nvidia zealot, but you invalidated your opinion the minute you brought up a shitty, old, horribly ported DX9 game in the review of what is now AMD's 4th-gen DX11 flagship card. :nutkick:
Please leave and take your fail with you, and while you're at it, bring the old DX7/DX8 titles -- Quake III, HL1 and Unreal Tournament -- into the mix for good measure, because if Nvidia isn't winning, you gotta keep digging for those prehistoric benchmarks nobody gives a flying shit about!

Please feel free to flame me with your predictable "AMD fanboy" comments though, despite the fact that all my current PCs run Intel/Nvidia GPUs -- I can always use a good laugh. Nvidia zealots are getting too predictable these days.

P.S. 4K benchmarks DO matter, because they show exactly how future-proof a GPU is. How many of you were screaming "1080p benches are worthless" 10 years ago, when we were still on our 1280x1024 CRTs? 4K is on its way to being relevant over the next 2 years; the 7970 was AMD's flagship for nearly 2 years, so 4K benches sure as hell matter, to show progress in future GPUs if nothing else.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,582 (2.86/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
So how about that rad i740 huh? I really like how that is turning out.
 