# AMD Radeon R9 290X 4 GB



## W1zzard (Oct 24, 2013)

AMD's new Radeon R9 290X launches today. Based on the new Hawaii GPU, the card promises record-breaking performance. Performance isn't the only thing that impresses: at $549, it is also the most affordable high-performance option, which will certainly put massive pressure on NVIDIA.

*Show full review*


----------



## Novulux (Oct 24, 2013)

I think I know what card I'm getting


----------



## Ravenas (Oct 24, 2013)

Let me checkout please.


----------



## natr0n (Oct 24, 2013)

The lack of analog VGA outputs is a moot point for a thumbs down. Just about all cards come with a VGA adapter.


----------



## radrok (Oct 24, 2013)

The voltage part of this review just sold the card for me. 

Can't wait.


----------



## Slomo4shO (Oct 24, 2013)

$549 is a good price


----------



## Supercrit (Oct 24, 2013)

GG titan


----------



## crazyeyesreaper (Oct 24, 2013)

Meh. Loud, obnoxious, and no real killer bundle just yet. As it sits, it falls between the 780 and Titan; not exactly compelling, and only the price makes it a decent purchase. For those that plan to water cool the card, no big deal. As a daily driver, as is, this card isn't very impressive. This is a card that should have launched with aftermarket coolers.


----------



## The Von Matrices (Oct 24, 2013)

This is AMD's version of Fermi/GF100.

It seems like history keeps repeating itself.  Sure, you get a card that is cheaper than its competitors for the same performance but you pay for it in power consumption and noise.

It's the same as the HD 2900 XT and GTX 480 (and more recently the 7970GE, to a lesser extent).


----------



## dir_d (Oct 24, 2013)

These must do well under water


----------



## terrastrife (Oct 24, 2013)

natr0n said:


> The lack of analog VGA outputs is a moot point for a thumbs down. Just about all cards come with a VGA adapter.



There is no DAC therefore no analogue output AT ALL.



The Von Matrices said:


> It seems like history keeps repeating itself.  Sure, you get a card that is cheaper than its competitors for the same performance but you pay for it in power consumption and noise.
> 
> It's the same as the HD 2900 XT and GTX 480 (and more recently the 7970GE, to a lesser extent).



I had both the 2900XT and GTX 480 XD


----------



## de.das.dude (Oct 24, 2013)

so it beats the titan in high res in all games lol. XD


----------



## LAN_deRf_HA (Oct 24, 2013)

Nothing too surprising given the leaks. Though I was expecting better overclocking performance with that huge bus. Overclocked 780 and Titan owners will be feeling justified, not counting price. Wonder how people will feel about "uber" mode. That's really loud.


----------



## Kissamies (Oct 24, 2013)

The Von Matrices said:


> It seems like history keeps repeating itself.  Sure, you get a card that is cheaper than its competitors for the same performance but you pay for it in power consumption and noise.
> 
> It's the same as the HD 2900 XT and GTX 480 (and more recently the 7970GE, to a lesser extent).



Except that the HD 2900 XT was much slower than the 8800 GTX, about the same level as the 8800 GTS. The GTX 480 was faster than the HD 5870, but yeah, the power consumption was much worse.


I can't say that missing the analog outputs should be a negative thing, because if someone buys a card like this, I think that his/her monitor isn't some analog monitor from the stone age..

(oh gosh I hate my English..)


----------



## xkm1948 (Oct 24, 2013)

Definitely getting this!  


Time to retire my old qx9650+5870! Whole new rig baby!


----------



## Naito (Oct 24, 2013)

Seems like it'll do well with custom cooling


----------



## natr0n (Oct 24, 2013)

terrastrife said:


> There is no DAC therefore no analogue output AT ALL.
> 
> 
> 
> I had both the 2900XT and GTX 480 XD



I overlooked that.


----------



## etayorius (Oct 24, 2013)

Holy crap it thrashes the titan even at Metro Last Light.

Anyone gaming at anything lower than 1920x1080 with this GPU is definitely doing it wrong.


----------



## Whilhelm (Oct 24, 2013)

Can't wait to get one, toss the stock cooler in the garbage and see what it can do with a full cover water block strapped to it.


----------



## Xzibit (Oct 24, 2013)

LAN_deRf_HA said:


> Nothing too surprising given the leaks. Though I was expecting better overclocking performance with that huge bus. Overclocked 780 and Titan owners will be feeling justified, not counting price. Wonder how people will feel about "uber" mode. That's really loud.



It seems the cut-off is 95°C. As long as you can keep it cooler than that, it will keep overclocking itself.


----------



## H82LUZ73 (Oct 24, 2013)

Excellent review Wizz. Shame it's no different in power consumption during Blu-ray playback than my 6970. But at $549, the performance is, or might be, a slight upgrade for me. Also, do the new AM3+ AMD boards even support CrossFire over PCIe? Goes to read the CrossFire review... One thing before I go, actually two: it was great to see some newer games in your benchmarks. Is BF4 going to replace BF3 in the tests and overclocking tests? Also, how much of the power consumption could be fixed in drivers, if at all?


----------



## manofthem (Oct 24, 2013)

290x + water block = manofthem 's next purchase(s) 
(may have to wait a tad though for money's) 

Thanks W1zz


----------



## 15th Warlock (Oct 24, 2013)

$549!! All I can say is thank you so much AMD, Christmas is coming early this year! 

Just got an auto-notify email alert from Newegg, but cards are still not available. Does anyone have a direct link to preorder?


----------



## Ravenas (Oct 24, 2013)

Does anyone know if the never settle bundle is also included?


----------



## natr0n (Oct 24, 2013)

Ravenas said:


> Does anyone know if the never settle bundle is also included?



Think it's just BF4


----------



## W1zzard (Oct 24, 2013)

natr0n said:


> Think it's just BF4



i'm not even sure if you get bf4 for 549. AMD was very vague about that.



9700 Pro said:


> I can't say that missing the analog outputs should be a negative thing, because if someone buys a card like this, I think that his/her monitor isn't some analog monitor from the stone age..



completely agree, but need to make people aware of that fact too. there's countless people out there who bought cheap 1080p monitors with analog vga only


----------



## The Von Matrices (Oct 24, 2013)

natr0n said:


> Think it's just BF4



For the first time NVidia is actually offering a better game bundle than AMD.


----------



## btarunr (Oct 24, 2013)

Ta-Ta Titan. GTX 780 Ti stillborn?


----------



## Norton (Oct 24, 2013)

15th Warlock said:


> $549!! All I can say is thank you so much AMD, Christmas is coming early this year!
> 
> Just got an auto-notify email alert from Newegg, but cards are still not available. Does anyone have a direct link to preorder?



XFX R9-290X ENFC Radeon R9 290X 4GB 512-bit GDDR5 ...


----------



## Xzibit (Oct 24, 2013)

The Von Matrices said:


> For the first time NVidia is actually offering a better game bundle than AMD.



At a much higher price. The bundle they offer only lasts 1 month. The Never Settle bundle lasted much longer and is confirmed to be coming back soon.


----------



## Delta6326 (Oct 24, 2013)

Norton said:


> XFX R9-290X ENFC Radeon R9 290X 4GB 512-bit GDDR5 ...



You beat me to it. Have fun


----------



## The Von Matrices (Oct 24, 2013)

Xzibit said:


> At a much higher price. The bundle they offer only lasts 1 month. The Never Settle bundle lasted much longer and is confirmed to be coming back soon.



I wouldn't discount NVidia yet.  I would be surprised if there wasn't an NVidia press release today or tomorrow with price changes.


----------



## erocker (Oct 24, 2013)

W1zzard said:


> i'm not even sure if you get bf4 for 549. AMD was very vague about that.



$30 more for the BF4 bundled version.

Tiger Direct:


----------



## mrwizard200 (Oct 24, 2013)

Guys,

use newegg mobile app and use code MBLEMC10G 
5% off!!!


----------



## 15th Warlock (Oct 24, 2013)

Norton said:


> XFX R9-290X ENFC Radeon R9 290X 4GB 512-bit GDDR5 ...



Awesome, I'm cancelling my preorder for BF4  

You wouldn't happen to have the link for the MSi card would you? 



mrwizard200 said:


> Guys,
> 
> use newegg mobile app and use code MBLEMC10G
> 5% off!!!



How can you look for the card on the mobile app, they all show coming soon..


----------



## TheGuruStud (Oct 24, 2013)

Check out Anand's made up scores lol. They're not even close to these.

When will people learn?


----------



## mrwizard200 (Oct 24, 2013)

15th Warlock said:


> Awesome, I'm cancelling my preorder for BF4
> 
> 
> 
> How can you look for the card on the mobile app, they all show coming soon..



just click on the item and it will show as add to cart


----------



## 15th Warlock (Oct 24, 2013)

mrwizard200 said:


> just click on the item and it will show as add to cart



It only shows auto notify


----------



## mrwizard200 (Oct 24, 2013)

15th Warlock said:


> It only shows auto notify



SAPPHIRE 100361BF4SR Radeon R9 290X 4GB GDDR5 PCI ...


----------



## The Von Matrices (Oct 24, 2013)

TheGuruStud said:


> Check out Anand's made up scores lol. They're not even close to these.
> 
> When will people learn?



Please, point out the discrepancies.  A few examples would be helpful.


----------



## erocker (Oct 24, 2013)

TheGuruStud said:


> Check out Anand's made up scores lol. They're not even close to these.
> 
> When will people learn?



Z87 vs. X79? Some scores seem way off, but some, like Metro, are pretty close.


----------



## Pixrazor (Oct 24, 2013)

Very, very bad cooler from AMD, as always. Temps are a real issue here; the clock speed drops.
We really need some custom coolers from AIBs,
and I want to see some tests with watercooling.


----------



## 15th Warlock (Oct 24, 2013)

mrwizard200 said:


> SAPPHIRE 100361BF4SR Radeon R9 290X 4GB GDDR5 PCI ...


I ordered the XFX one, you just saved me $29! Thanks so much!

I'm basically getting BF4 for free


----------



## de.das.dude (Oct 24, 2013)

The Von Matrices said:


> I wouldn't discount NVidia yet.  I would be surprised if there wasn't an NVidia press release today or tomorrow with price changes.



I doubt even with price cuts they will be able to do $500. I don't think they can afford anything under $700 on the Titan.

And slashing the price will put a frown on the faces of people who have already got it. That will show how much profit Nvidia was screwing out of people.

Bet the people at Nvidia are going crazy and bald now XD


----------



## Ravenas (Oct 24, 2013)

W1zzard said:


> i'm not even sure if you get bf4 for 549. AMD was very vague about that.



The BF4 bundle retails at $579.


----------



## mrwizard200 (Oct 24, 2013)

15th Warlock said:


> I ordered the XFX one, you just saved me $29! Thanks so much!
> 
> I'm basically getting BF4 for free



Awesome!

on topic:
Excellent review!
Question: where is the total system power consumption? I see the card, but not the total system.
I ask because I have a 500W PSU with a 4770K, one HD, and nothing else.


----------



## The Von Matrices (Oct 24, 2013)

de.das.dude said:


> I doubt even with price cuts they will be able to do $500. I don't think they can afford anything under $700 on the Titan.
> 
> And slashing the price will put a frown on the faces of people who have already got it. That will show how much profit Nvidia was screwing out of people.
> 
> Bet the people at Nvidia are going crazy and bald now XD



*NVidia is not going to reduce the price on Titan.  Titan is a compute card that AMD can't match in DP floating-point performance, and the people who can use its compute capabilities are the only people who should have been buying it.*  It was a fluke that Titan ended up being a high-end gaming card as well.  That compute niche is restored now that high-end gaming cards like the R9 290X have arrived, and it will be further reinforced by the GTX 780 Ti.  There is still no DP compute card that can compete with Titan for the price.



mrwizard200 said:


> Awesome!
> 
> on topic:
> Excellent review!
> ...



You'd better have a good PSU then, because the R9 290X uses 300+ W.  A stock 4770K is around 100 W, and the auxiliary components are probably another 50 W.  Your power supply at 90% load will probably be louder than the 290X ever will be.  If you overclock your CPU, you'd better get a larger power supply.
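The arithmetic above can be sketched quickly; note the wattages are just this post's rough estimates (300 W GPU, 100 W CPU, 50 W auxiliary), not measured figures:

```python
# Back-of-the-envelope PSU headroom check using the rough figures from this
# post (300 W GPU, 100 W CPU, 50 W auxiliary) -- estimates, not measurements.

def psu_load(gpu_w: int, cpu_w: int, aux_w: int, psu_rating_w: int):
    """Return (total draw in watts, load as a fraction of the PSU rating)."""
    total = gpu_w + cpu_w + aux_w
    return total, total / psu_rating_w

total, load = psu_load(gpu_w=300, cpu_w=100, aux_w=50, psu_rating_w=500)
print(f"Estimated draw: {total} W ({load:.0%} of a 500 W PSU)")  # 450 W, 90%
```

Running a 500 W unit at 90% of its rating leaves essentially no headroom for overclocking or capacitor aging, which is the point being made.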


----------



## Ravenas (Oct 24, 2013)

The Von Matrices said:


> I wouldn't discount NVidia yet. I would be surprised if there wasn't an NVidia press release today or tomorrow with price changes.



After Nvidia raped the wallets of the enthusiast consumer base for ~7 months, you wouldn't discount Nvidia?


----------



## jihadjoe (Oct 24, 2013)

de.das.dude said:


> I doubt even with price cuts they will be able to do $500. I don't think they can afford anything under $700 on the Titan.



I'm confident they can. GK110's die size isn't really that far from GF110's, and they were able to sell the 580 at $500. The GTX 580 even has a better power section than Titan. (srsly, 6 phases on a $1000 card?)




de.das.dude said:


> And slashing the price will put a frown on the faces of people who have already got it. That will show how much profit Nvidia was screwing out of people.



THIS is their real problem.


----------



## esrever (Oct 24, 2013)

mrwizard200 said:


> Awesome!
> 
> on topic:
> Excellent review!
> ...



Total system power on AnandTech shows 400W in Furmark, so you should be perfectly fine.


----------



## LAN_deRf_HA (Oct 24, 2013)

It's funny; technically speaking, Nvidia already has the answer to this on the market. They can match the performance and beat AMD in heat, power, noise, and overclocking, but there's one little thing preventing them from doing it: hubris. They could drop the Titan to $550 today and utterly crash AMD's party. It would be the smart move. They just won't do it, though. Another card with fewer shaders than the Titan, priced $100 above the 290X? That's a pretty arrogant response. I want to call it stupid too, but we won't know that until we see how it sells, since they have more brand exposure.


----------



## HumanSmoke (Oct 24, 2013)

de.das.dude said:


> And slashing the price will put a frown on the faces of people who have already got it. That will show how much profit Nvidia was screwing out of people.



I think most enthusiasts are well aware that the higher up the product stack you go, and the more you pay, the more you lose as it depreciates. Prime examples off the top of my head...the HD 7990 seems apropos.


de.das.dude said:


> bet the people at nvidia are going crazy and bald now XD


Jen Hsun's been crazy for a while, and the hairline is at the low tide mark already 

It's all swings and roundabouts in the GPU game. I'm sure Nvidia made a few bucks from the GTX 780 -both in sales and reflected glory for the lesser cards. My guess is that 780 sales stalled out once we got close to Hawaii's announcement (I know I put off my purchases). Somehow I don't think the loss of a few high end $650 sales will impact the company too severely. Nvidia would be more worried if AMD had launched a mobile GPU jihad.


----------



## bim27142 (Oct 24, 2013)

THAT power consumption... even if someone gave this one to me for free, I'd still accept it, but I'd surely sell it for something else...


----------



## The Von Matrices (Oct 24, 2013)

Whatever NVidia does with pricing someone is going to complain.  NVidia made the mistake of touting Titan as a high end gaming card when they really shouldn't have.  If they drop the price of Titan (I don't think they should) then they will have complaints from current owners.  If they don't drop the price, then the enthusiast community will continue to complain about it.  Titan is a niche card.

My predictions:

NVidia will drop the GTX 780 to around $579, $30 over the R9 290X and the same price as the R9 290X BF4 bundle.  The GTX 780's better power consumption, better thermals and noise, and game bundle make up for the performance difference.  

Titan will stay where it is because it's a compute card, not a gaming card.  For the people complaining about the price, it's like complaining that Quadros or FirePros are too expensive; gaming performance is not its primary purpose.

The GTX 780 Ti will be faster than the R9 290X for the same or less power and noise and will be priced around $700.


----------



## PopcornMachine (Oct 24, 2013)

Looking forward to 290.  Should still be quite powerful, less juice needed, less heat, and even less money.


----------



## Bansaku (Oct 24, 2013)

Great price, excellent performance, yet I am still not impressed. It's one thing to beat out nVidia and their $1000 behemoth, but at the cost of 94°C and 100% fan speed?!? No thanks. What impresses me are my HIS IceQ X2 HD 7950 Boost cards in CFX, OC'd to 1100MHz/1300MHz; the temps on both GPUs never exceed 60°C, and the Noctua 1200rpm 140mm I added to my PSU can be heard over BOTH cards at full load.

For those either needing to upgrade, or those who simply want the latest and greatest, I suggest waiting for the 3rd parties to release their non-reference-cooled 290Xs, even if it means paying the extra $50 or so. 94°C, 100% fan, and the card throttling down by upwards of 30%? WTF, AMD. Bundle the 290X with an FX-9590 and who needs the furnace on this upcoming winter!


----------



## 15th Warlock (Oct 24, 2013)

I'm just happy to see AMD pricing this card right, and for the first time in years I'm excited about an Ati card (yes, to me they never stopped being Ati) they are poised to make a come back big time, I long for the good old 9700Pro days.

Nvidia produces awesome cards too, but lately they have been pricing them way too high, yes, this coming from a dual Titan owner, don't take me wrong, I've thoroughly enjoyed these cards (and will continue doing so for the foreseeable future) but I sincerely hope this release starts a new era of price wars that will favor all of us as costumers, I for one would love to see Titan priced at $549 or lower, but I don't see it happening soon, we'll see what happens with the 780Ti.

For now lets allow AMD/Ati to bask in the limelight, they truly deserve it


----------



## Frizz (Oct 24, 2013)

As I was looking at the benchmarks I was like... "meh I'll stick to my 7970s".

Then I saw the price and was like  Do want.


----------



## esrever (Oct 24, 2013)

Bansaku said:


> Great price, excellent performance, yet I am still not impressed. It's one thing to beat out nVidia and their $1000 behemoth, but at the cost of 94°C and 100% fan speed?!? No thanks. What impresses me are my HIS IceQ X2 HD 7950 Boost cards in CFX, OC'd to 1100MHz/1300MHz; the temps on both GPUs never exceed 60°C, and the Noctua 1200rpm 140mm I added to my PSU can be heard over BOTH cards at full load.
> 
> For those either needing to upgrade, or those who simply want the latest and greatest, I suggest waiting for the 3rd parties to release their non-reference-cooled 290Xs, even if it means paying the extra $50 or so. 94°C, 100% fan, and the card throttling down by upwards of 30%? WTF, AMD. Bundle the 290X with an FX-9590 and who needs the furnace on this upcoming winter!



The fan speed is locked at 40% or 55%; the temps are mostly due to the stock heatsink.


----------



## MxPhenom 216 (Oct 24, 2013)

Pretty good price and performance, right where I was expecting: in between the 780 and Titan, sometimes above both or losing to both. Time for Nvidia price drops and the 780 Ti to shake things up a bit.


----------



## hardcore_gamer (Oct 24, 2013)

LAN_deRf_HA said:


> It's funny; technically speaking, Nvidia already has the answer to this on the market. They can match the performance and beat AMD in heat, power, noise, and overclocking, but there's one little thing preventing them from doing it: hubris. They could drop the Titan to $550 today and utterly crash AMD's party. It would be the smart move. They just won't do it, though. Another card with fewer shaders than the Titan, priced $100 above the 290X? That's a pretty arrogant response. I want to call it stupid too, but we won't know that until we see how it sells, since they have more brand exposure.



Then AMD can price it lower again, because the 290X die is smaller than Titan's.


----------



## Sempron Guy (Oct 24, 2013)

This review also shows what a good value the 7970/280X are.


----------



## hardcore_gamer (Oct 24, 2013)

The Von Matrices said:


> Titan is a niche card.



It no longer is...


----------



## chinmi (Oct 24, 2013)

bim27142 said:


> THAT power consumption... even if someone gave this one to me for free, I'd still accept it, but I'd surely sell it for something else...



When you can buy a $549 video card, you surely can pay an extra $30 per year for the extra electricity bill...


----------



## HumanSmoke (Oct 24, 2013)

hardcore_gamer said:


> It no longer is...


Really? What other card has CUDA support, a large framebuffer for graphics visualization, and unhandicapped double precision, yet doesn't cost $2.5K+?

I'd still call the Titan niche unless you can provide an alternative.


----------



## HalfAHertz (Oct 24, 2013)

Good price for the awesome performance, but considering you'd need to spend $100 on a water cooling solution to match the 780's noise and temps, I don't think we'll see any serious price drops from Nvidia any time soon. My expectation is that the 780 Ti will slot in at $650 and the 780 will drop to $599.


----------



## Xzibit (Oct 24, 2013)

erocker said:


> $30 more for the BF4 bundled version.



Why haven't you asked for the driver yet? 13.11 Beta 5


----------



## Mathragh (Oct 24, 2013)

Lol: "My neighbors actually complained, asking why I used power tools that late at night."
Is this really a true story?

Also, great to see more competition once again. AMD's decision to go with this cooler is a bit weird, though. It seems to me they could've had a win on multiple fronts if they had given just a bit more attention to the cooler. Firstly, of course, the noise could be lessened, but the card also could've performed better if they had simply slapped on a better cooler.
They're kinda shooting themselves in the foot here; I'm sure they could've made a cooler that at least kept the card from overheating at stock clocks, while also not directly competing with 3rd-party cooler makers.


----------



## hardcore_gamer (Oct 24, 2013)

HumanSmoke said:


> Really? What other card has CUDA support, a large framebuffer for graphics visualization, and unhandicapped double precision, yet doesn't cost $2.5K+?
> 
> I'd still call the Titan niche unless you can provide an alternative.



These things don't matter for gaming. The GeForce GTX Titan is a gaming card after all, and this review thread is about gaming cards. As far as gaming is concerned, the GeForce GTX Titan is no longer a niche card.

Nvidia can keep it as a niche card at $1000. But in that case, it will be a pointless niche card for gamers.


----------



## fullinfusion (Oct 24, 2013)

About Efn time!! Thanks for the great review W1zz!! Cancelled my order for two of these!

Maybe next time, AMD... kinda the same shit, different pile. I mean, the 5.0 GHz AMD proc for a grand..

Give me a $380 price mark for each and I'm in, but if not, GFU AMD... not interested...


----------



## Nirutbs (Oct 24, 2013)

Good luck to you all, but the price is very high here. Nice review; almost perfect card.


----------



## RCoon (Oct 24, 2013)

Those temperatures are absurd. How could they honestly put a cooler that bad on their flagship card, knowing full well it's going to reach almost 100 degrees -_-
Nice card and all, but they done goofed with their cooler


----------



## dj-electric (Oct 24, 2013)

Put a waterblock on this motherf$%ker and you'll violently vomit rainbows


----------



## silapakorn (Oct 24, 2013)

I wonder how they would name the dual GPU version. R9 299X maybe?


----------



## Sihastru (Oct 24, 2013)

With the exception of noise, power consumption, and heat, this is one killer card.

But I do have one little reservation. Looking at the GPU clock/temperature graphs, it seems that even though the card is sold as a 1000MHz GPU, after a few minutes it throttles heavily. I mean, it even goes under 650MHz. A HUGE drop.

So, with that in mind, I have to wonder: if the card is used to test/benchmark a game with a time-limited sequence of gameplay (a few minutes) or a built-in benchmark sequence (also a few minutes)... wouldn't the card run that short sequence at a higher clock speed than it would typically run during a gaming session?

*So wouldn't the results the card gets in reviews be a lot higher* (1000MHz and slightly down) *than the actual results you get while actually gaming for a few hours* (650MHz and up) *and not just a few minutes?*

Because it seems to me that *this is actually a 650MHz core card that turbos to 850MHz when you're actually gaming...*







Maybe I need to re-read the review a few times...


----------



## alwayssts (Oct 24, 2013)

btarunr said:


> Ta-Ta Titan. GTX 780 Ti stillborn?



Yes about Titan.  Titan never made any sense as a general consumer card.

As for 780ti, it's a very interesting question.

Let's say the average game effectively uses 2816sp (counting special function) with 48 ROPs, and AMD hit the nail squarely on the head to compete with that limitation from Nvidia.  It's pretty damn close, but let's give them the benefit of the doubt.

13 SMX is similar to 2912sp (2496sp + 416sfu)... so perhaps slightly less efficient, but pretty darn close.  But then you also have 3GB of memory (4 fewer chips) and 16 fewer ROPs being fed, which are hardly a detriment in most single-card and/or single-monitor scenarios, so it will help power consumption in relative terms.

It could run up to around ~1020MHz before it is restrained by bandwidth, as bandwidth is consumed in conjunction with the shader processors.

(2912 × ~1020) > (2816 × ≤1000) any way you cut it, and I will be beyond surprised if that is not what Nvidia does.  The clocks will of course be listed as 9xx/9xx, but it will turbo and play games at precisely that level so all memory bandwidth is used, at least at 384-bit/6008MHz.

You then have a card that is realistically faster than the 290X at 1080p/2560.

While it may only scale as well as memory bandwidth allows (say they overclock the memory to 6.6GHz; then it will be bottlenecked at ~1130MHz), the Radeon has its heat issues that will also come into play around a similar performance level.

They, like all their other ~300W cards, will be close.  You're essentially trading clock speed for units (for compute) on Nvidia's side, with a slight increase in efficiency (the 780 is similar to 2688sp), as the 780 can clock pretty high and get close to saturating its overclocked bandwidth.  That said, if they use a slightly higher TDP and/or 7GHz RAM, that could conceivably change things slightly.  Should be interesting, if only academically.

The 290X is a hell of a deal, relatively speaking, and I look forward to seeing how these run with better coolers and the PowerTune limit jacked up (to 50%/375W?!  Good call, AMD!  Power back to the people!).  Even with the power consumption being high, if they can maintain stable clocks it's a pretty great design.  Would love to see one maxing out the process/memory controller at somewhere close to 1300/6400... sucker would be a sight, and likely a sound, to be reckoned with.  Wonder if anywhere around that level will be realistic with a good aftermarket (AC) air/(whatever) water cooler.
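The throughput comparison above can be sketched numerically. Note that the 780 Ti SP counts and clocks here are this post's own speculation about an unreleased card, not confirmed specs:

```python
# Speculative shader-throughput comparison using the figures from this post:
# a hypothetical 13-SMX GTX 780 Ti with 2912 effective SPs (2496 SP + 416 SFU)
# at ~1020 MHz, vs. the R9 290X with 2816 SPs at its 1000 MHz cap.

def throughput(shaders: int, mhz: int) -> float:
    """Crude throughput proxy in SP-GHz (shaders x clock); ignores IPC, ROPs, bandwidth."""
    return shaders * mhz / 1000

ti_780 = throughput(2912, 1020)   # speculated 780 Ti
r9_290x = throughput(2816, 1000)  # R9 290X

print(f"780 Ti (speculated): {ti_780:.0f} SP-GHz vs 290X: {r9_290x:.0f} SP-GHz")
# -> the speculated 780 Ti comes out ahead, matching the post's conclusion
```

This is only a first-order proxy; as the post itself notes, real performance also hinges on memory bandwidth, ROPs, and thermal throttling.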


----------



## The Von Matrices (Oct 24, 2013)

Sihastru said:


> With the exception of noise, power consumption, and heat, this is one killer card.
> 
> But I do have one little reservation. Looking at the GPU clock/temperature graphs, it seems that even though the card is sold as a 1000MHz GPU, after a few minutes it throttles heavily. I mean, it even goes under 650MHz. A HUGE drop.
> 
> ...



It's a good question.  I wish W1zzard had posted a scale on the X-axis so we could get some perspective on the time scale.

This makes testing just as confusing for AMD cards as it has been for NVidia cards with GPU Boost, because a case with better airflow will result in a card with better performance.

This is no different from what smartphone manufacturers do.  They advertise SoC clock speeds fully knowing that those clock speeds are only attainable for a few minutes until the SoC overheats and throttles to a much lower clock speed.


----------



## Recus (Oct 24, 2013)

Sihastru said:


> With the exception of noise, power consumption and heat, this is one killer card.



It's not. Just a 1-3 fps increase.
-------
Looks like someone is upset.
http://www.xtremesystems.org/forums...ands-details&p=5212736&viewfull=1#post5212736


----------



## W1zzard (Oct 24, 2013)

Mathragh said:


> Lol :"My neighbors actually complained, asking why I used power tools that late at night."
> Is this really a true story?



yes



Sihastru said:


> Because it seems to me that this is actually a 650MHz core card that turbos to 850MHz when you're actually gaming...



no, amd does not have any turbo. they reduce the clock


----------



## HumanSmoke (Oct 24, 2013)

Nirutbs said:


> good luck to u all but the price is very high here. nice review almost perfect card.


Price here is insane as well, but AMD cards seem to attract a price premium here in relation to other markets, although 2.5 times the price of an Asus 280X DC2T is taking the piss even for our price gougers.







Sihastru said:


> With the exception of noise, power consumption and heat, this is one killer card.


Sounds as though you're channelling the GTX 480 review thread


----------



## Sihastru (Oct 24, 2013)

The Von Matrices said:


> This makes testing just as confusing for AMD cards as it has been for NVidia cards with GPU boost because a case with better airflow will result in a card with better performance.



Actually, nVidia cards have a base clock that they don't go under. They advertise the cards as having the GPU running at that base clock. The boost clocks (turbo) are an added bonus.



Recus said:


> It's not. Just 1-3 fps increase.



Yes, but I'm also taking price into consideration when stating that.


----------



## the54thvoid (Oct 24, 2013)

The Von Matrices said:


> ...This makes testing just as confusing for AMD cards as it has been for NVidia cards with GPU boost because a case with better airflow will result in a card with better performance.



You need to ignore quiet mode and just look at the uber mode results.  Who'd have thought that wasn't a boost BIOS.

Basically, without aftermarket cooling, the card as a complete entity is 'poor'.  Unless you like that level of noise.  Something so loud his neighbours complained?  Come on guys, that's indefensible.

BUT..... Give it proper cooling it looks to be an absolute beast.  It's going to break records, that's for sure.

Amused, though, that nobody has noticed the role reversal: Nvidia now have the 'civilised' architecture and design, and AMD have created a furnace the likes of which would shame Fermi.

But really, that's academic to me - I like to water cool my gfx cards so I'll need to think about how good this will be at 1440p and higher clocks...

Time to check a bunch of other reviews for a whole world approach...


----------



## Sihastru (Oct 24, 2013)

W1zzard said:


> no, amd does not have any turbo. they reduce the clock



I understand that. It's just "seems" like that.


----------



## The Von Matrices (Oct 24, 2013)

Sihastru said:


> Actually, nVidia cards have a base clock that they don't go under. They advertise the cards as having the GPU running at that base clock. The boost clocks (turbo) are an added bonus.





the54thvoid said:


> Time to check a bunch of other reviews for a whole world approach...



The advertising might be false but the conclusion is the same - a cooler test platform will result in a card with better performance.  This makes it much harder to compare review sites with different platforms.  Earlier in this thread people were criticizing Anandtech for having lower performance scores than TPU for the same card; maybe W1zzard just had a colder environment in which to run his card, in which case both sites' scores would be valid.


----------



## Yellow&Nerdy? (Oct 24, 2013)

It's a good card, but I'm still not quite convinced that it will be a "smash hit". Price/performance is great compared to Titan and GTX 780. If I were to get this card, I'd probably keep it on quiet; the 5% increase in performance is not worth blowing your ears out.

The problem is, Nvidia can (and probably will) drop the price of the GTX 780, and release the GTX 780 Ti at a price point between the GTX 780 and Titan. If Nvidia drops the price of the GTX 780, I would see little to no point in buying the R9 290X (other than Mantle), because overclocking the GTX 780 can give you similar/better performance to the R9 290X on "Uber" while being quieter and cooler at the same time. Based on that, Nvidia wouldn't even need to drop the full $75; they could just drop it by $25-50. 

Now I'm interested in seeing if vendors can put some serious 3rd party coolers on this card and make it run quieter, cooler, and give it better OC-potential.


----------



## the54thvoid (Oct 24, 2013)

The Von Matrices said:


> The advertising might be false but the conclusion is the same - a cooler test platform will result in a card with better performance.  This makes it much harder to compare review sites with different platforms.  Earlier in this thread people were criticizing Anandtech for having lower performance scores than TPU for the same card; maybe W1zzard just had a colder environment in which to run his card, in which case both sites' scores would be valid.



Nothing wrong with Anandtech.  People criticise when they don't like what they hear.  Anand pretty much recommends it anyway, and he doesn't find the noise as bad as W1zz does, though he still recommends active noise-cancelling headphones.

I doubt there is a single site that will recommend the 780 over this.  Titan is irrelevant as a gaming item.  I couldn't use this card without a waterblock though - the fan noise would be a deal breaker.


----------



## HTC (Oct 24, 2013)

The Von Matrices said:


> The advertising might be false but the conclusion is the same - a cooler test platform will result in a card with better performance.  *This makes it much harder to compare review sites with different platforms.* *Earlier in this thread people were criticizing Anandtech for having lower performance scores than TPU for the same card; maybe W1zzard just had a colder environment in which to run his card, in which case both sites' scores would be valid.*



Agreed.

That could actually be done on purpose: they could show benches with much better performance just because they were taken outdoors in Finland or Sweden or something, and those would be valid too.


It seems it hasn't sunk into AMD's heads that the cooler is almost as important as the card, and non-water-cooling-oriented folks may very well decide against this card because of the noise, which, to me, seems like shooting yourself in the foot with a cannonball ...


In my case, I would never opt for this kind of card because:

A - i don't game often
B - noise (lack of it) is important for me
C - consumption is as important as noise, if not more important

The only good thing IMO about this card is the price: though expensive, compared to other cards with similar performance, it's way cheaper and, for those not caring for noise and consumption, it's a clear winner.


----------



## nem (Oct 24, 2013)

Come on, just 9.3? That would be blasphemy :shadedshu


----------



## The Von Matrices (Oct 24, 2013)

HTC said:


> It seems it hasn't sunk into AMD's heads that the cooler is almost as important as the card, and non-water-cooling-oriented folks may very well decide against this card because of the noise, which, to me, seems like shooting yourself in the foot with a cannonball ...



I agree, although in the past the coolers were good enough to sustain the reference clock speed.  *This is the first graphics card that cannot sustain its advertised clock speed under any scenario with the reference cooler.*  If AMD had just called this a "boost" or even published a base clock this would be different.

Just for comparison, load noise for the GTX 480 and the R9 290X in "uber" mode _both_ come in at 50 dB in TPU's reviews.

For all the complaints that NVidia got about GTX 480's heat and power consumption I hope AMD gets just as many for the R9 290X.


----------



## Tatty_One (Oct 24, 2013)

Seems to be plenty available already in the UK...

http://www.overclockers.co.uk/showproduct.php?prodid=GX-092-HS&groupid=701&catid=56&subcat=1752

That price, however, converts to $760.


----------



## buggalugs (Oct 24, 2013)

The power/temps situation doesn't bother me, I haven't bought a reference card for a couple of generations for that reason. Those blower coolers are always noisy to me, even on less powerful cards.

 It's a bit of a worry that AMD said there won't be non-reference cards for a while, but I have a feeling it won't be too long. AMD want to sell a bunch of reference cards first, giving people the impression non-reference is a loooong time away... they would be well aware that reviews will criticize temps/noise and many reviewers will recommend waiting for non-reference coolers.

 This card would be great on watercooling though or for people who don't care about fan noise so much.

 Overall an awesome card though, for that price. The non-reference versions should be awesome too, like an MSI Lightning or Asus DCUII with awesome overclocking potential.

 Edit: My local store in Australia is selling them from $649 for PowerColor to $699 for Sapphire/Gigabyte, but they are already SOLD OUT. They should get some Asus and other brands soon though.


----------



## RCoon (Oct 24, 2013)

the54thvoid said:


> I doubt there is a single site that will recommend 780 over this.  Titan is irrelevant as a gaming item.  I couldn't use this card without a waterblock though- the fan noise would be a deal breaker.



I'd recommend a 780 over a 290X to anyone that isn't watercooling. These temperatures on the stock cooler are terrible, and that's an understatement. It throttles, it's loud as hell, and AMD essentially just released a card that only holds its defined base clock of 1000 MHz if it's watercooled (because there are no aftermarket solutions being sold).

I think there's a reason this card is only $549... Maybe people are so hyped up over the last few months that they are ignoring the temperatures and noise, and I don't understand why. It's an awesome chip, but a very very bad card for non-W/C gamers.


----------



## Tatty_One (Oct 24, 2013)

RCoon said:


> I'd recommend a 780 over a 290X to anyone that isn't watercooling. These temperatures on the stock cooler are terrible, and that's an understatement. It throttles, it's loud as hell, and AMD essentially just released a card that only holds its defined base clock of 1000 MHz if it's watercooled (because there are no aftermarket solutions being sold).
> 
> I think there's a reason this card is only $549... Maybe people are so hyped up over the last few months that they are ignoring the temperatures and noise, and I don't understand why. It's an awesome chip, but a very very bad card for non-W/C gamers.



It's an enthusiast's card; many who buy it will watercool, many will wait until non-reference cooler designs are available, some of the others won't care, and some will care and will take up other options.


----------



## erixx (Oct 24, 2013)

I will wait for a polished or next version. Please AMD: no heat, noise, or driver issues.


----------



## buggalugs (Oct 24, 2013)

RCoon said:


> I think there's a reason this card is only $549... Maybe people are so hyped up over the last few months that they are ignoring the temperatures and noise, and I don't understand why. It's an awesome chip, but a very very bad card for non-W/C gamers.



 Blower coolers always suck (...or blow). Just like previous generations, those of us wanting good temps, noise performance and overclocking will go for a non-reference design or water. No big deal. I wouldn't write off the card based on the reference cooler.

 Anyway, AMD have priced the card cheap enough to buy the reference version and get an aftermarket cooler. Anybody who is serious about overclocking a high-end card doesn't run a stock blower cooler imo. Get an aftermarket cooler or water cooling.


----------



## RCoon (Oct 24, 2013)

buggalugs said:


> Anybody who is serious about overclocking a high-end card doesn't run a stock blower cooler imo.



Um, both my old 780's with reference coolers ran at 1200mhz core and stopped dead at 69 degrees... NVidia's reference cooler was viable for high end cards and for overclocking, with decent noise output. Feel free to check out my 780 overclocking guide in the forums.

People are making excuses that we should expect the reference cooler to be bad, to throttle the card to kingdom come, and not be a viable choice. But we shouldn't. Reference coolers should be (and a lot ARE) viable options for both noise and cooling potential. Especially in SLI/Crossfire circumstances.

EDIT: Not to mention this card DOES NOT maintain its advertised base clock over extended periods of time. WHY AREN'T PEOPLE ANGRY ABOUT THIS???


----------



## arterius2 (Oct 24, 2013)

IMO, this card is an engineering failure: it draws almost 50 W more than a GTX 780/Titan, and what do I get? Only 1-3 avg FPS more? That's a major roadblock for me to even consider purchasing this card. I would much rather buy a much quieter/cooler card that performs at almost the same level, and I'm willing to accept the premium. This day and age is all about efficiency, efficiency! The age of brute force is long past.

When people say "oh, who cares, I'm just gonna watercool this bitch" - umm, no. 300+ W of heat is still 300+ W of heat; it doesn't matter whether you water cool it or not, that same amount of heat is still dumped into the surroundings and it's still consuming the same amount of energy. And doesn't a water block cost money too? Did you add that to the total cost during the comparison? From a technical point of view, this card is very inefficient and inelegant. To put it into perspective, it's the same as OCing a GTX 780 by 15% and selling it as the R9 290X. There is no innovation here, just pure brute force, people. Nothing to see here, move along.


----------



## VulkanBros (Oct 24, 2013)

Tatty_One said:


> It's an enthusiasts card, many who buy will watercool, many will wait until non reference cooler designs are available, some of the others won't care, some will and will take up other options.



+1 - I think you hit the nail on the head


----------



## the54thvoid (Oct 24, 2013)

RCoon said:


> People are making excuses that we should expect the reference cooler to be bad



AMD haven't really pushed much for a quieter blower solution in the past few years.  The 5850 was excellent, but the 6xxx series was not so good and the 7970 blower was loud; this is the expected progression, I guess.  However, the 7990 cooler was excellent by all accounts, but on the 290X that design would blow the 90+ degree heat into the case.



RCoon said:


> EDIT: Not to mention this card DOES NOT maintain its advertised baseclock over extended periods of time. WHY ARENT PEOPLE ANGRY ABOUT THIS???



It will in Uber mode, but that is the noise penalty you pay for it.  In Guru3D's review he quotes AMD as saying 95 degrees is the unit's normal operating temperature - it can run at that for its lifetime.  If users want a cooler experience they can get it by lowering performance or running higher fan speeds.

Anand also stated the lack of a comprehensive new cooler (like Titan's) helps to keep the end cost down.  In all fairness we'll get a slew of reviews and there will be varying acceptance of the noise.  Guru 3D says it's fine - he tested on quiet mode as he stated there was no performance difference and that was the default setting.


----------



## RCoon (Oct 24, 2013)

the54thvoid said:


> it can run at that for its lifetime.



A lifetime of 50+ dB, 95 degrees, 315 W power draw, and repetitive throttling?
It's essentially selling a delicious bacon and cheese filling in a stale bagel. The baker is saying it's still edible, and the customers are saying "oh, it's fine, just put it in a new bagel".

I don't get this. I really don't.

EDIT: I'll also note that the 290X is £479. For £479 you can also buy a Gigabyte 780 OC WindForce 3X.


----------



## sweet (Oct 24, 2013)

All hail the new performance king 
About the noise + heat: the card can simply be underclocked if quiet gaming is preferred. You can achieve the 780's performance + temp + noise for $550, still a win 
For the guys who wear headphones when gaming, tweak the fan limit to 90-100% and enjoy the Titan-killer performance, again for $550.

Oh wait, who still cares about the 780/Titan anyway


----------



## arterius2 (Oct 24, 2013)

Tatty_One said:


> It's an enthusiasts card, many who buy will watercool, many will wait until non reference cooler designs are available, some of the others won't care, some will and will take up other options.



Did you consider the price of a watercooling kit (+$100-200!)? Add that on top of the card and it costs a lot more than a GTX 780. 

And why do I have to buy a card and THEN watercool it for it to even be competitive with the other card?

It's like selling me a brand new car with a bad transmission, and then charging me a premium to fix it so I can drive it off the lot.


----------



## arterius2 (Oct 24, 2013)

sweet said:


> All hail the new performance king
> About the noise + heat: the card can simply be underclocked if quiet gaming is preferred. You can achieve the 780's performance + temp + noise for $550, still a win
> For the guys who wear headphones when gaming, tweak the fan limit to 90-100% and enjoy the Titan-killer performance, again for $550.
> 
> Oh wait, who still cares about 780/Titan anyway



The card draws ~25% more power for a ~5% performance gain; underclocking the card will do nothing to even the odds. At best it will run at the same temp as a GTX 780, but with lower performance.


----------



## Xzibit (Oct 24, 2013)

For those who got their e-penis cut off and are complaining about clocks, you might want to read how the card works...

[PCPER] AMD Radeon R9 290X Hawaii Review - Taking on the TITANs







AMD advertises the R-Series with "*GPU Clock Speed - Up-To 1GHz*"


----------



## The Von Matrices (Oct 24, 2013)

Xzibit said:


> For those who got their e-penis cut off and are complaining about clocks, you might want to read how the card works...
> 
> [PCPER] AMD Radeon R9 290X Hawaii Review - Taking on the TITANs
> 
> ...



That article makes a point I think is especially poignant

To quote the article:



> All this means now is that we needed to "warm up" the GPU each time we were ready to benchmark it.  I tended to sit in the game for at least 5 minutes before running our normal test run and I think that is plenty of time to get the GPU up to its 95C operating temperature and push clocks to realistic levels.  *Be wary of benchmarks results that DO NOT take that into account as they could be 10%+ faster than real-world results would indicate.*



I would like to know if W1zzard took this into account in his review.


----------



## Mathragh (Oct 24, 2013)

Benchies with apparently lower ambient temperatures here.

Seems to make quite a difference.


----------



## Zen_ (Oct 24, 2013)

I can understand the complaints about noise and high temps, but 50 W of electricity? Please. I'm all for higher efficiency, but this is like complaining that a 600 HP Ferrari gets poor fuel economy. Even if you gamed 40 hours a week, that 50 W difference is only 2 kWh of energy. Where I live that is 24 cents, or about $1 a month, or $12 a year.
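For what it's worth, the arithmetic checks out. A quick sketch (the electricity rate is my assumption, roughly $0.12/kWh, which is what makes the quoted 24-cents figure work):

```python
# Extra energy cost of ~50 W more draw while gaming.
# The $/kWh rate is an assumed figure, not from the review.
EXTRA_WATTS = 50        # extra draw vs. a GTX 780/Titan under load
HOURS_PER_WEEK = 40     # heavy gaming schedule
RATE_PER_KWH = 0.12     # assumed electricity rate in $/kWh

kwh_per_week = EXTRA_WATTS * HOURS_PER_WEEK / 1000  # Wh -> kWh
cost_per_week = kwh_per_week * RATE_PER_KWH
cost_per_year = cost_per_week * 52

print(f"{kwh_per_week} kWh/week, ${cost_per_week:.2f}/week, ${cost_per_year:.2f}/year")
# 2.0 kWh/week, $0.24/week, $12.48/year
```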


----------



## the54thvoid (Oct 24, 2013)

sweet said:


> All hail the new performance king



All hail the 7990 
Oh, you meant the single-GPU king.  Okay, it's only king when it's really loud; if you use the quiet BIOS it's not king.




sweet said:


> About the noise + heat: the card can simply be underclocked if quiet gaming is preferred. You can achieve the 780's performance + temp + noise for $550, still a win



It doesn't really even do quiet gaming.  It's 4 dBA louder than a 780 in quiet mode.  That's roughly two and a half times the sound intensity.  You'd need to downclock seriously to match 'quiet' gaming levels.
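For reference, the standard decibel conversions behind that kind of comparison (a quick sketch; the 4 dBA gap is the figure from the review):

```python
def pressure_ratio(delta_db: float) -> float:
    """Sound pressure ratio: pressure doubles every ~6 dB."""
    return 10 ** (delta_db / 20)

def intensity_ratio(delta_db: float) -> float:
    """Sound intensity (power) ratio: intensity doubles every ~3 dB."""
    return 10 ** (delta_db / 10)

delta = 4.0  # 290X quiet mode vs. GTX 780, in dBA
print(round(pressure_ratio(delta), 2))   # 1.58 -> ~1.6x the pressure
print(round(intensity_ratio(delta), 2))  # 2.51 -> ~2.5x the intensity
```

So a 4 dBA gap is about 1.6x the sound pressure and 2.5x the radiated sound power; perceptually (by the usual 10 dB = twice-as-loud rule of thumb) it's on the order of a third louder.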



sweet said:


> For the guys who wear headphones when gaming, tweak the fan limit to 90-100% and enjoy the Titan-killer performance, again for $550.



Yup, I think we've covered that. Kind of funny what people put up with to "win".  I'm old, I like refinement.  And about 3% isn't really killer.



sweet said:


> Oh wait, who still cares about 780/Titan anyway



If you have a Titan already - I think a lot of people are going, oh well, I can hang on to my card for now.  If you care about noise - 780 is better choice.  If you like custom cooling - EKWB have blocks for sale for the card now 

All in all, the card is an exceptional performer but it's let down by a shit cooler.  I think it's plain to see who can logically see the flaws of the card and who can see its merits.  The linear voltage overclock scaling is absolutely fantastic.  I can see it pounding 690's and 7990's to dust with proper cooling.

The chip is brilliant (but hot) the price is excellent considering Nvidia's levels and it restores AMD to what I would say is joint top - not king.  But without proper modification, it loses when you start to try overclocking.  I'd like to see a baseline overclocking comparison for 290X, 780 and Titan, using stock coolers.  That would be quite useful.


----------



## buggalugs (Oct 24, 2013)

RCoon said:


> Um, both my old 780's with reference coolers ran at 1200mhz core and stopped dead at 69 degrees... NVidia's reference cooler was viable for high end cards and for overclocking, with decent noise output. Feel free to check out my 780 overclocking guide in the forums.
> 
> People are making excuses that we should expect the reference cooler to be bad, to throttle the card to kingdom come, and not be a viable choice. But we shouldn't. Reference coolers should be (and a lot ARE) viable options for both noise and cooling potential. Especially in SLI/Crossfire circumstances.
> 
> EDIT: Not to mention this card DOES NOT maintain its advertised base clock over extended periods of time. WHY AREN'T PEOPLE ANGRY ABOUT THIS???



 Well, it might have kept it to 69 degrees, but the noise levels are still way too high for me; looking at the benchmarks, 48 dB is still loud. I will admit Nvidia's most recent 7-series coolers are OK compared to most blower coolers, but they still don't compare to decent aftermarket coolers. As I said, I haven't bought a reference design for a few years because of that.

 Sure, this card is under-equipped with the reference cooler, but the card is cheap; either buy an aftermarket cooler or wait for non-reference, which I would do anyway.

 With a decent aftermarket cooler, overclocking potential is awesome and should pull right away from Titan. Overall, I think the card is a winner because of that price-to-performance.

 Have a look at current graphics cards at online stores now; most versions have aftermarket coolers, maybe with the exception of Titan (and only because Nvidia won't allow it). Reference designs are just the base model; with a decent cooler it will be a game changer.


----------



## sweet (Oct 24, 2013)

arterius2 said:


> did you consider the price of a watercooling kit?(+100~200$!) add that on top of the card it runs alot more than gtx780.
> 
> and why do I have to buy a card and THEN watercool it to be even competitive with the other card?
> 
> its like selling me a brand new car with bad transmission, and then charging me a premium to fix it so I can drive it off the lot.



LOL, if this card is run under watercooling, it will blow both 780 and Titan away. 






Based on the figure above, the reference solution really holds the card down. However, the watercooling approach will let the card operate at 1000 MHz, or even more if your PSU is capable, consistently. So the whole bundle of 290X + watercooling will cost around the same as a custom 780, with superior performance.


----------



## arterius2 (Oct 24, 2013)

sweet said:


> LOL, if this card is run under watercooling, it will blow both 780 and Titan away.
> 
> http://tpucdn.com/reviews/AMD/R9_290X/images/analysis_uber.gif
> 
> Based on the figure above, the reference solution really holds the card down. However, the watercooling approach will let the card operate at 1000 MHz, or even more if your PSU is capable, consistently. So the whole bundle of 290X + watercooling will cost around the same as a custom 780, with superior performance.



Why are you comparing a watercooled card vs. a non-watercooled card? AMD fanboys never seem to show any sense of logic.
What's preventing anyone from watercooling a GTX 780 and upping its clocks to meet the R9 290X while still running cooler and quieter?

Use some logic next time you post, or you come off as a clown.


----------



## the54thvoid (Oct 24, 2013)

arterius2 said:


> Why are you comparing a watercooled card vs. a non-watercooled card? AMD *fanboys* never seem to show any sense of logic



Let's not start using that bloody term please :shadedshu


----------



## sweet (Oct 24, 2013)

arterius2 said:


> Why are you comparing a watercooled card vs. a non-watercooled card? AMD fanboys never seem to show any sense of logic



Because the price of the two systems is the same, mate. Do the math: 290X + water cooling = $650-750 = a high-end custom 780.
If you don't prefer the WC solution, just wait for a custom version. Around $600 for a world beater.


----------



## buggalugs (Oct 24, 2013)

sweet said:


> LOL, if this card is run under watercooling, it will blow both 780 and Titan away.
> 
> Based on the figure above, the reference solution really holds the card down. However, the watercooling approach will let the card operate at 1000 MHz, or even more if your PSU is capable, consistently. So the whole bundle of 290X + watercooling will cost around the same as a custom 780, with superior performance.



 I agree, but it doesn't even need water cooling; a decent non-reference cooler like the one on MSI's Lightning or Asus's DCUII will be all that's needed.

 Who knows, Asus and MSI etc. might have a newer generation of coolers that will perform even better than we expect, but at the very least they will get rid of the noise and high temps.

 For the price, there are plenty of options.


----------



## HumanSmoke (Oct 24, 2013)

Mathragh said:


> Benchies with apparently lower ambient temperatures here.
> Seems to make quite a difference.



Might be a case of wallpapering over the cracks. Unless you're dedicated to running AC or plan on only cranking the card up in the cooler months, I'm not convinced that a lower ambient temp does that much to mitigate the issue. The other side of that particular coin is... what about people who live in warm climates? My country is just coming into spring and summer, where ambient can get to 30+°C and 90% humidity.
If cold-weather users can get a performance boost, what happens at the other end of the scale for us in the tropics?

BTW: Hardware France have thermographs of the card. Interesting comparison


----------



## The Von Matrices (Oct 24, 2013)

sweet said:


> Because the price of the two systems is the same, mate. Do the math: 290X + water cooling = $650-750 = a high-end custom 780.
> If you don't prefer the WC solution, just wait for a custom version. Around $600 for a world beater.





buggalugs said:


> I agree but not even water cooling, a decent non-reference cooler like on MSI's lightning or Asus DCUII will be all that's needed.
> 
> Who knows, Asus and MSI etc might have a newer generation coolers that will perform even better than we expect, but at the very least it will get rid of noise and high temps.
> 
> For the price, there are plenty of options.



While I understand your desire to compare with the GTX 780 today, you are neglecting that the GTX 780 Ti will be here in three weeks, which is sooner than custom cooled R9 290X's will arrive.  

The GTX 780 Ti will force down the price of the GTX 780 to near R9 290X levels, invalidating the argument that custom cooled R9 290X's are cheaper than the GTX 780.


----------



## Frick (Oct 24, 2013)

TheGuruStud said:


> Check out Anand's made up scores lol. They're not even close to these.
> 
> When will people learn?



In all those benchmarks the 290x consistently beats Titan. The FPS numbers might differ, but it still shows the 290x as being the better card, and he concludes that the 780 is a hard sell now except on acoustic grounds, which seems to be very true.

So what are you on about exactly?


----------



## Xaser04 (Oct 24, 2013)

sweet said:


> LOL, if this card is run under watercooling, it will blow both 780 and Titan away.
> 
> 
> Based on the figure above, the reference solution really holds the card down. However, the watercooling approach will let the card operate at 1000 MHz, or even more if your PSU is capable, consistently. So the whole bundle of 290X + watercooling will cost around the same as a custom 780, with superior performance.



And if you watercool a 780/Titan they will also perform faster.... (after all its boost clock is linked to the temperature). 

The temperature & throttling seen makes me wonder how these cards will perform in-game in a "normal" case scenario. What performance would we see in, say, Crysis 3 after 2 hours of 99% load? 

The 290X does appear to perform well (in a benchmark scenario at least) and is priced aggressively, but for me the noise & temperature are unacceptable (especially compared to the excellent Titan / 780 blower).


----------



## LTUGamer (Oct 24, 2013)

The blower fan is crap... Why does AMD ban normal cooling solutions!?


----------



## sweet (Oct 24, 2013)

Xaser04 said:


> And if you watercool a 780/Titan they will also perform faster.... (after all its boost clock is linked to the temperature).
> 
> The temperature & throttling seen makes me wonder how these cards will perform in game in a "normal" case scenario. What performance would we see in say Crysis 3 after 2 hours of 99% load?
> 
> The 290x does appear to perform well (in a benchmark scenario at least) and is priced aggressively, but for me the noise & temperature are unnacceptable (especially compared to the excellent Titan / 780 blower).



If you can't see my point about *the price*, I think we can stop talking about wc now.

From the quiet mode results, the card still offers impressive performance even when throttling. You can see the answer to your question in the picture below: in quiet mode the 290X spends most of its time in the 650-850 MHz region, and still outperforms the 780.





Another note for you: the blower on the Titan/780 actually benefits more from the testbench scenario. They use an aluminum cover, which helps transfer the heat better than the plastic cover used on the 290X, especially in a testbench setup where you don't have to worry about the heat in your closed case.


----------



## Mathragh (Oct 24, 2013)

Looks like we have a new LN2 winner as well!

With 1,435/1,650 MHz clocks, I suppose there is a lot of headroom left for high-end air and water set-ups as well.


----------



## TheMailMan78 (Oct 24, 2013)

As with all GPUs nowadays, it seems overpriced for what you can get for half the price. Overall it's a nice card, but it seems late to the party, as usual for ATI. Curious to see how the 780 Ti stacks up in price and performance. I love a good price war... even if it's more than I can afford lol.


----------



## Frick (Oct 24, 2013)

TheMailMan78 said:


> As with all GPUs nowadays, it seems overpriced for what you can get for half the price.



There. This has always been the case.


----------



## HammerON (Oct 24, 2013)

Mathragh said:


> Looks like we have a new LN2 winner as well!
> 
> With 1,435/1,650 MHz clocks, I suppose there is a lot of headroom left for high-end air and water set-ups as well.



Wow - I am liking these GPU's


----------



## Mathragh (Oct 24, 2013)

Another interesting write-up on the architecture, quite detailed. 


One of the things explained is the new Crossfire circuitry.
Apparently the Crossfire engine is vastly improved, which should result in way better frame synchronisation, timing and display.
Also, the limit of "just four cards" is gone, at least theoretically.

I don't really see any CPU sufficiently feeding a quad dual-gpu set-up though 

Edit: a fragment about Crossfire:





> Enter XDMA, exit connectors, all needed data is now pulled over the PCIe bus via DMA. A requesting card can reach out to the required target and essentially grab, or at least ask nicely, for what it needs. This also allows for an unlimited number of cards in Crossfire but we forgot to ask how many. In any case eight cards should be doable if something else doesn’t bottleneck things first.
> 
> What is the XDMA engine? It is really just a display controller connected to the display engine so it can be timing aware but unlike the older way it isn’t timing dependent. This is a subtle but key difference. The card with the monitor connected is now able to prefetch the data it needs long before it needs it. The old Crossfire basically requested when it was needed, not before, and if anything burped things got ugly. The new XDMA version has much more flexible algorithms to both predict and fetch what it needs long before it needs the data. Storage is also larger and more flexible, again based on algorithmic control rather than fixed.
> 
> In short as soon as a render surface is enabled on the displaying card the data can be requested from the source. The storage space is also guaranteed to be adequate as well and nothing is just in time delivery so minor bumps in the night don’t turn in to missed frames. This should not only allow more scaling for card count, display counts, and screen sizes, but also higher reliability too. It will be interesting to see if this opens up multi-card, multi-monitor Crossfire solutions, with a bit of clever software hacks, multiple cards each with connected monitors is at least theoretically possible. Hmmm, Pi.


----------



## BiggieShady (Oct 24, 2013)

Temperatures and dynamic clocks on this thing are hilarious. It runs at over 1 GHz for a couple of minutes, then settles at 650 MHz at a whopping 94°C.
Meanwhile, competing products keep their max clocks indefinitely with temperature headroom - often more than 10°C below their throttling temperature of 80°C.
No wonder the price is so attractive.
I had my experiences with both overly hot and overly noisy cards, never again :shadedshu
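The behaviour described above is easy to picture as a temperature-target control loop: hold the top clock until the temperature target is hit, then step the clock down toward a floor. A toy sketch (all numbers are illustrative guesses, not AMD's actual PowerTune parameters):

```python
# Toy model of temperature-target boost on the 290X, per the reviews:
# "up to" 1000 MHz, a 94-95 C target, and a ~650 MHz floor under load.
TEMP_TARGET = 94   # deg C throttle point (the reported target)
MAX_CLOCK = 1000   # MHz advertised "up to" clock
MIN_CLOCK = 650    # MHz floor seen under sustained load in reviews
STEP = 13          # MHz per adjustment interval (made-up value)

def next_clock(clock: int, temperature: int) -> int:
    """One control step: back off when hot, recover headroom when cool."""
    if temperature >= TEMP_TARGET:
        return max(MIN_CLOCK, clock - STEP)
    return min(MAX_CLOCK, clock + STEP)

# A heat-soaked card drifts down to the floor and stays there:
clock = MAX_CLOCK
for _ in range(100):
    clock = next_clock(clock, temperature=95)
print(clock)  # 650
```

This is also why warm-up matters for benchmarking, as PCPer noted: a cold card sits at the top clock for the first few minutes, so short runs overstate sustained performance.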


----------



## adulaamin (Oct 24, 2013)

Thanks for the review, Wizz! I'm gonna wait for a non-ref cooler, if there ever is one. I can't see myself using this reference design unless I go water.


----------



## the54thvoid (Oct 24, 2013)

Mathragh said:


> Another interesting read up on the architecture, quite detailed.



I used to enjoy S|A but it all got too much with CD's bile for Nvidia.  He's also very hit and miss with info - sometimes spot on and sometimes very wrong indeed.  Remember back when he said there'd never be a working Fermi chip (after the GTX 480's issues), and Nvidia brought out the GTX 580 with all cores enabled.  He also said Kepler was going to win this one (it turned out to be the 680 versus the 7970) on _all_ metrics, but arguably the 7970 was equal to the 680, if not faster clock for clock.


----------



## Mathragh (Oct 24, 2013)

the54thvoid said:


> I used to enjoy S|A but it all got too much with CD's bile for Nvidia.  He's also very hit and miss with info - sometimes spot on and sometimes very wrong indeed.  Remember in the days he said there'd never be a working Fermi chip (after GTX 480's issues) and Nvidia brought out the GTX 580 with all cores enabled.  Also said Kepler was going to win this one (turned out 680 versus 7970) on _all_ metrics but arguably, the 7970 was equal to the 680, if not faster clock for clock.



I agree with you that he's been hit-and-miss on a couple of occasions.
I can see that he was at least partially right about Fermi, since they indeed did not bring out a fully enabled part without a respin. Furthermore, AMD did need to put out a GHz Edition for the 7970 to be on par. He's, however, also definitely been wrong about other things, and at least in the past (it seems to have slackened off a bit recently) he's always the first to crack down on Nvidia (also not always without reason).

All those points are a bit moot when it comes to what I linked, though, IMHO, since he's just doing an article on facts that are out there. The only thing he adds to these facts are explanations that let less tech-savvy users understand what's going on. Granted, there is also a bit of speculation going on, but it's mainly just explaining what this new architecture is about. For that I cannot really fault him, and for myself at least, it was quite an enjoyable article to read, without noticing any real bias in what he was talking about either.


----------



## birdie (Oct 24, 2013)

It's the first time I'm truly disappointed with a TPU review - was it really necessary to test the 290X at resolutions below 2560x1600?

This card is meant for 4K or dual-monitor resolutions, and it's a monster at them.

Its results would have been so much better if tested properly.

Wizzard, it's time to update your setup and rewrite this review.


----------



## LiveOrDie (Oct 24, 2013)

Haha, I love two-for-one deals: buy a GPU and get a free heater or leaf blower.


----------



## amschip (Oct 24, 2013)

I rarely post but this really made my day W1zzard 



> Windows 8.1, are incredibly unpopular with PC gamers (at least our forum members) due to the idiotic user-interface that insults our intelligence.



Great review btw


----------



## the54thvoid (Oct 24, 2013)

birdie said:


> It's the first time I'm truly disappointed with a TPU review - was it really necessary to test 290X at resolutions below 2560x1600?
> 
> This card is meant for 4K or dual monitors resolution, and it's a monster at them.
> 
> ...



Conduct a poll and see what resolution monitor people predominantly use.  Also, some people like to game quite competitively, and using a 1080p screen with super-fast frame rates is very useful to them.

He does test Eyefinity (triple-screen setups), so I really don't know what the crap you're complaining about.  

How many people will own a 4K monitor in the next 2 years?  He tests 1600p and triple-screen gaming.  That's sufficient.


----------



## W1zzard (Oct 24, 2013)

You are free to send me a 4k monitor so I can start testing, I'll even add a thank you note with your name/company name on the test setup page.

If you expect me to buy the monitor with my own $$, you'll have to wait until HDMI 2.0 is available in monitors and they've come down a bit more in pricing. I'm quite positive next year we'll add 4k testing.


----------



## Tatty_One (Oct 24, 2013)

arterius2 said:


> did you consider the price of a watercooling kit?(+100~200$!) add that on top of the card it runs alot more than gtx780.
> 
> and why do I have to buy a card and THEN watercool it to be even competitive with the other card?
> 
> its like selling me a brand new car with bad transmission, and then charging me a premium to fix it so I can drive it off the lot.



I have not considered the price of watercooling; those enthusiasts that watercool GPUs watercool NVidia products also, so it's not really any different - people watercool 780s too, and they DON'T have to watercool either card.  From what I read in the review, it competed with the reference cooler.

Now to be fair, I agree: I don't like loud cooler noise, I don't like excessive temps, and I always buy non-reference design cards for the better cooling options. If I bought one of these I would do the same, therefore in my simple view, for me things are not substantially different enough to put me off the card.  Yes, it may take longer to get non-reference designs, but for these reasons I am personally judging the GPU and not specifically the PCB and/or cooler etc.  For those that only buy reference designs... they may be put off, and therefore I cannot disagree.


----------



## Aquinus (Oct 24, 2013)

A single-slot water cooled variant of the R9 290X might be enough to sell me on going under water and to replace my 6870s. I love the numbers and cost, I dislike the temps and noise.


----------



## ChristTheGreat (Oct 24, 2013)

High temperatures - it was written in the sky. I am sure this chip was designed for 20 nm, but they have done it on 28 nm..

Anyway, good GPU, good price. I will be waiting for something other than the reference card, as I am not doing any watercooling.

This will drop prices for sure. It is 45% more powerful than a stock HD 7950, in quiet mode. So overclocked, if I can grab a cheap 7950, let's say $150, that could be worth it xD


----------



## birdie (Oct 24, 2013)

W1zzard said:


> You are free to send me a 4k monitor so I can start testing, I'll even add a thank you note with your name/company name on the test setup page.
> 
> If you expect me to buy the monitor with my own $$, you'll have to wait until HDMI 2.0 is available in monitors and they've come down a bit more in pricing. I'm quite positive next year we'll add 4k testing.



Point taken, but I believe the TPU website earns some money, not to mention that many companies know you personally and are probably willing to lend you or give you that 4K Asus for testing. P.S. I didn't mean to insult you or anything.


----------



## Doc41 (Oct 24, 2013)

Damn it, and I was seriously considering getting this card....
Its price looks good, but I know I'll never be able to afford a custom loop after buying it.
*IMO* those temps and that power consumption ruined it for me and are not worth it over the 780, especially for the "not so much FPS" advantage - mine is a 1080p monitor anyway.

C'mon, let's see them price drops


----------



## 1d10t (Oct 24, 2013)

Definitely this is not my card. My current PSU could handle two 7970s (non-GE), but as I overlooked the maximum power draw, I believe two of these will choke my PSU before entering any game 
As for power, heat and noise... history repeats itself... this card is just like Fermi in the old days, but not quite: old Fermi used a blower-type cooler and a massive chunk of heatsink with 5 heatpipes, yet still touched the 96°C mark.
Compared to nVidia, AMD is still doing fairly well with their cheap@$$ cooling 
As for me, I'm definitely waiting for Wizz's next review of the R9 290 (non-X) 

-= edited =-

Nearly forgot... waiting for these people to share their reactions 



EarthDog said:


> Id crap if the 290X was $550.... Perhaps the 290, but not the X...





Dj-ElectriC said:


> You'd crap? i'll vomit rainbows.





Sihastru said:


> And I will burp unicorns...


----------



## Deleted member 24505 (Oct 24, 2013)

People who can afford this card probably already have a water loop, so they won't care about the cost of the block. If you can buy a $600 card, surely you're not going to be bothered about an extra $100 for a water block.


----------



## LiveOrDie (Oct 24, 2013)

tigger said:


> People who can afford to buy this card probably already have a water loop, so won't care about the cost of the block. If you can buy a $600 card you are not going to be bothered about an extra 100 for a water block for it surely.



Not really - a lot of people don't like water cooling. It's fine if you don't move your PC around a lot, but I'd rather have air cooling for my LAN rig.


----------



## Domokun (Oct 24, 2013)

Thanks for the review, W1zzard. On the second page of the review, you mentioned the following:



> "Hawaii" takes advantage of the PCI-Express Gen 3.0 x16 bus interface.


Is there actually an advantage to using PCIE3 (as opposed to PCIE2) with an AMD R9 290X? Or is it a similar situation to every video card before it (i.e. minimal improvement)?


----------



## GSG-9 (Oct 24, 2013)

Domokun said:


> Thanks for the review, W1zzard. On the second page of the review, you mentioned the following:
> 
> 
> Is there actually an advantage to using PCIE3 (as opposed to PCIE2) with an AMD R9 290X? Or is it a similar situation to every video card before it (i.e. minimal improvement)?



I would also like to know this. In my head I am guesstimating it's a 9 fps difference (similar to running SLI in dual PCI-E 8x vs. dual 16x back in the day), but I would like real data on the card running PCIE2.


----------



## Tatty_One (Oct 24, 2013)

Domokun said:


> Thanks for the review, W1zzard. On the second page of the review, you mentioned the following:
> 
> 
> Is there actually an advantage to using PCIE3 (as opposed to PCIE2) with an AMD R9 290X? Or is it a similar situation to every video card before it (i.e. minimal improvement)?



I can only guess that he means that the maximum ceiling of PCI-E 2.0 x16 can potentially be exceeded by this card; of course PCI-E 3.0 x16 gives more headroom... which to date has not really been exploited. It may be, though, that those with multi-GPU setups whose PCI-E slots revert to 8x8 or 16x4 see more of a benefit from PCI-E 3.0 than previously.
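For context, the theoretical per-direction ceilings being discussed come straight from the published PCIe signaling rates and encodings (a back-of-the-envelope sketch; real-world throughput is lower due to protocol overhead):

```python
# Theoretical PCIe bandwidth per direction, from published spec figures.
def pcie_bandwidth_gbs(gen, lanes):
    # (transfer rate in GT/s, encoding efficiency)
    spec = {2: (5.0, 8 / 10),      # PCIe 2.0 uses 8b/10b encoding
            3: (8.0, 128 / 130)}   # PCIe 3.0 uses 128b/130b encoding
    gt_s, eff = spec[gen]
    return gt_s * eff * lanes / 8  # bits -> bytes

print(round(pcie_bandwidth_gbs(2, 16), 2))  # 8.0   GB/s for Gen2 x16
print(round(pcie_bandwidth_gbs(3, 16), 2))  # 15.75 GB/s for Gen3 x16
print(round(pcie_bandwidth_gbs(3, 8), 2))   # 7.88  GB/s: Gen3 x8 ~ Gen2 x16
```

So a Gen3 x8 slot (the common dual-card split) lands almost exactly at the Gen2 x16 ceiling, which is why multi-GPU setups are where Gen3 is most likely to matter.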


----------



## msamelis (Oct 24, 2013)

BiggieShady said:


> Temperatures and dynamic clocks on this thing are hilarious. It runs at over 1 GHz for couple of minutes then it settles at 650 MHz at whopping 94 C.
> Meanwhile, competing products keep their max clocks indefinitely with a temperature headroom - often more than 10 C below throttling temperature of 80 C.
> No wonder price is so attractive.
> I had my experiences with both overly hot and overly noisy cards, never again :shadedshu



Very well said, I feel the same way. It also looks very ugly; it should have a much better quality look at that kind of money. Unless we see some non-stock coolers - which AMD won't allow until perhaps even 2014 - I don't see the point of getting this.


----------



## swirl09 (Oct 24, 2013)

The Von Matrices said:


> This is AMD's version of Fermi/GF100.



I thought the very same, although Fermi 1 was, IMO, worse - the cards were going to 100°C+ and nVidia kept saying that's fine (^_^)

Nice card, nice price (hope that part doesn't do a Kepler on us in a fortnight). _NEEDS _a seriously good custom cooler.


----------



## Solaris17 (Oct 24, 2013)

I just died


----------



## Prima.Vera (Oct 24, 2013)

I will wait for a custom cooler, something like a Vapor-X or similar. This cooler must be the worst and cheapest in ATI's history.


----------



## BarbaricSoul (Oct 24, 2013)

Well it looks like I'm finally going full water-cooling if non-reference isn't released by Christmas. I love the performance numbers, not concerned about the energy usage, but the temperatures and noise levels are definitely a problem.


----------



## Solaris17 (Oct 24, 2013)

I'm not worried about noise or power draw, but I've run cards that hot - 470, 260 - and all have died. What is the thermal junction limit of these cards? 105?


----------



## repman244 (Oct 24, 2013)

Wiz, can you somehow measure the FPS over time - like at the start when the GPU has a low temperature, and at the end when it starts throttling?
I mean, if the performance hit due to throttling is big, the card doesn't look so good anymore...


----------



## Fourstaff (Oct 24, 2013)

Hot, fast and noisy. Welcome to 2010. Still faster than Titan, and price competitive against 780, so all in all an enthusiast card.


----------



## theJesus (Oct 24, 2013)

I told y'all it would sound like a jet engine. 


natr0n said:


> No analog VGA outputs is a moot point for a thumb down. All cards just about come with a vga adapter.


Those adapters only work with DVI-I (I for integrated, meaning analog signal integrated).  The ports on this card are DVI-D (D for digital, meaning digital signal only).  You can tell this by looking at them.  Also, if you had actually read the review, you'd see that W1zzard even mentions this specifically.


----------



## Xzibit (Oct 24, 2013)

Fourstaff said:


> Hot, fast and noisy. Welcome to 2010. Still faster than Titan, and price competitive against 780, so all in all an *enthusiast card*.



Now we go LIVE! to get a reaction of Nvidia TITAN owners after showing them the R9 290X priced at $549.99


----------



## GSG-9 (Oct 24, 2013)

I am very happy I kept my 690 right now. 

Either Titans go cheap, something faster comes out from camp N, or in a generation AMD and Nvidia will be competing again as they release their flagship products (in price and performance).


----------



## mmaakk (Oct 24, 2013)

Great review boss, thanks!

I dunno why, but looking at the GPU core brought back memories of the R600 (also a hot card), with the same 512-bit wide memory bus.


----------



## Nortrop (Oct 24, 2013)

That thing sure is loud when not crippled by the 'quiet' BIOS setting. The cooler really lets it down.

http://youtu.be/1W5OXvIzRKc?t=2m40s

Power draw is also on the high side. Some of the other tests I've come across show above 300 watts at the uber setting. Guess that's the price you pay for 6.2B transistors.

I'm not completely sold on the whole AMD brute-forcing it deal; it kinda leaves the image of going backwards instead of actually progressing  

I sure do like the price though, it definitely seems to be the card's best feature atm.


----------



## JDG1980 (Oct 24, 2013)

msamelis said:


> Very well said, I feel the same way. It also looks very ugly, it should have a much better quality look at that kind of money. Unless we see some non stock coolers - which AMD won't allow until perhaps even 2014 - I don't see the point of getting this.



I don't understand why AMD would want to cripple their flagship card in this way. With Nvidia's Titan, it makes sense that they would want to enforce the stock design because it was actually good. But all the reviews conclude that the stock 290X cooler sucks. Why wouldn't they let companies like Asus and MSI use their superior cooling solutions?


----------



## wolf (Oct 24, 2013)

So it beats the 780 and Titan at the cost of a fair amount more power consumption, heat output and noise...

So ~2 years after the 7970 we get a card that is in the ballpark of ~25% faster, with the trade-offs I just mentioned. Gotta say I'm not impressed.

The impressive part is the price/perf, as it will force Nvidia to re-jig their lineup a bit to remain more competitive, which they inevitably will. Yes, you may have paid a grand for a Titan, but how long ago were you enjoying that single-card awesomeness? It's hard to compare the price now because Nvidia will obviously change it.

Given the power consumption/heat output, Nvidia have a lot of room to clock up GK110 and throw one back in there that matches the R9 290X, priced the same, or perhaps $50 more if it's cooler, quieter and chews a bit less power.

All in all, good and bad - sure as heck not enough to get me to leave a GTX 670 OC or leave Nvidia, which is an ecosystem I am invested in.

I really hoped for more; the next 3 months will see some great repositioning in line-ups and great prices for everyone.


----------



## razaron (Oct 24, 2013)

arterius2 said:


> and why do I have to buy a card and THEN watercool it to be even competitive with the other card?



I'm not sure about other people that WC, but if I were to get a new graphics card it would be WCed regardless of what card it is (other than backup cards, obviously). 
So, for me, I'm not comparing a WCed 290X vs an air-cooled 780. I'm comparing a WCed, overvolted, max 24/7 OC 290X vs a WCed, overvolted, max 24/7 OC 780. Both at 2560x1600.

It would be nice to see some WCed results within the week.


----------



## punani (Oct 24, 2013)

Noise and heat .... meh.


speed and pricetag


----------



## GSG-9 (Oct 24, 2013)

razaron said:


> I'm not sure about other people that WC, but, if I was to get a new graphics card it would be WCed regardless of what card it is (other than backup cards, obviously).
> So, for me, I'm not comparing a WCed 290x vs a ACed 780. I'm comparing a WCed, overvolted, max 24/7 OC 290X vs a WCed, overvolted, max 24/7 OC 780. Both at 2560x1600.



I run my card close to stock on water, but my goal is to keep all my fans @ 800rpm, and the GPU under 45c. I have grown into a stickler for noise.


----------



## Xzibit (Oct 24, 2013)

JDG1980 said:


> I don't understand why AMD would want to cripple their flagship card in this way. With Nvidia's Titan, it makes sense that they would want to enforce the stock design because it was actually good. But all the reviews conclude that the stock 290X cooler sucks. Why wouldn't they let companies like Asus and MSI use their superior cooling solutions?



Strategy...

Once AMD lets them put on better cooling solutions, the sustained clocks go up.  It's the equivalent of an overclock.

Instead of being around 800-900 MHz, these cards will sustain a higher clock during gaming/benchmarks depending on the cooler.

Look at the difference a mere 75 MHz boost made from the 7970 to the 7970 GHz.  The limiting factor is temperature, so you can overclock as far as 95°C lets you - 1000 MHz+. The way PowerTune works on these cards now, they might not need different BIOSes for each manufacturer either (except for identification purposes).

I suspect the AIB solutions will come in late November, maybe.
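The cooler-determines-clock argument can be put in rough numbers with a toy PowerTune-style model (all the constants here are invented for illustration, not AMD's actual figures):

```python
# Toy model of temperature-limited boost (PowerTune-style): the sustained
# clock is whatever keeps the GPU at its temperature target, so a better
# cooler directly buys a higher clock. All numbers are made up.

def sustained_clock(cooler_w_per_c, ambient_c, target_c=95,
                    base_mhz=727, boost_mhz=1000, w_per_mhz=0.3):
    """Highest clock (MHz) whose heat the cooler can remove at the target
    temperature, clamped between base and boost clocks."""
    dissipatable_w = cooler_w_per_c * (target_c - ambient_c)
    clock = dissipatable_w / w_per_mhz
    return max(base_mhz, min(boost_mhz, clock))

print(sustained_clock(3.8, 25))  # weak reference cooler: settles below boost
print(sustained_clock(5.0, 25))  # better AIB cooler: sustains full 1000 MHz
```

Same silicon, same 95°C limit; only the watts-per-degree of the cooler changes, and the sustained clock follows it.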


----------



## HD64G (Oct 24, 2013)

The Von Matrices said:


> Whatever NVidia does with pricing someone is going to complain.  NVidia made the mistake of touting Titan as a high end gaming card when they really shouldn't have.  If they drop the price of Titan (I don't think they should) then they will have complaints from current owners.  If they don't drop the price, then the enthusiast community will continue to complain about it.  Titan is a niche card.
> 
> My predictions:
> 
> ...



In which world do you live? Did you read the review carefully? Did you see the difference in FPS in most games between the 290X and green's top GPUs? It beats the $900 TITAN comfortably with a BETA driver. And if TITAN is a compute card, you have no idea how much more the 290X is with a 512-bit memory bus... 

And if you actually read the review: if you buy a later custom model of the 290X with better cooling, it clocks much better and beats every GPU on the planet today even more easily...


----------



## TheMailMan78 (Oct 24, 2013)

Ya know, you can get two 670s for less money and they will blow the 290X out of the water. Not seeing a point to this new generation from NVIDIA and ATI.


----------



## btarunr (Oct 24, 2013)

TheMailMan78 said:


> Curious to see how the 780ti stacks up in price and performance.



GTX 780 Ti will have to be faster than Titan, and cheaper than GTX 780 to stand a chance against R9 290X.


----------



## RCoon (Oct 24, 2013)

btarunr said:


> GTX 780 Ti will have to be faster than Titan, and cheaper than GTX 780 to stand a chance against R9 290X.



Or NVidia will likely use the excuse that AMD Engineers could have taken a shit on the 290X and it would have cooled it more effectively. NVidia are usually blissfully arrogant when it comes to their cards price/performance, and will just rearrange prices and probably still be worse in terms of AMD price/performance anyway, but you know, they're elitist jerks like that.
That being said, I'd rather have a reference 780 than a reference 290X.


----------



## GSG-9 (Oct 24, 2013)

HD64G said:


> In which world do you live? Did you read the review carefully? Did you see the difference in FPS in most games between 290X and green's Top GPUs? It beats the $900 TITAN comfortly with a BETA driver. Amd if TITAN is a compute card, you have no idea how much more the 290X is with 512bit memory bus...
> 
> And if you actually read the review, if you buy a later custom model of 290X with better cooling, it clocks much better and wins easier every GPU on planet today...



Did YOU see the difference in FPS between the top GPUs at every resolution?
At some resolutions/programs you are correct, at some you are not. I am not sure a single 290X with water cooling will close the 20 fps gap to the 690 at ultra-high resolutions.

But I hope it does.


----------



## refillable (Oct 24, 2013)

One word for AMD, BRAVO.

One of the best cards, if not the best card, I've seen in terms of price/performance.

Thermals and power-wise: not good, but that doesn't really matter to me anyway. I hope it'll be fixed when they move to the 20nm process.

Gamers, this is the right card for you...

Just wait for the Titanfall.


----------



## TheMailMan78 (Oct 24, 2013)

btarunr said:


> GTX 780 Ti will have to be faster than Titan, and cheaper than GTX 780 to stand a chance against R9 290X.



In the overall market I cannot argue with that. I'm all about best bang for the buck, and if I NEEDED a new card the 290X would have my attention. Problem is, NVIDIA's last generation was so overpriced that it makes the 290X the better deal only by being less overpriced. At the end of the day you can get way better performance for less money going one generation back. That's my biggest problem with the 290X. It should be $420, not $550.


----------



## morphy (Oct 24, 2013)

refillable said:


> Just Wait for the Titanfall.



Not sure if the pun was intended or not but I like it


----------



## refillable (Oct 24, 2013)

Actually this is great for me, where prices favour AMD so much.


----------



## JDG1980 (Oct 24, 2013)

Here's how I think Nvidia is going to respond:
* Drop the existing 780 to $549 to match the 290X
* Introduce the new 780 Ti at the old 780 price point ($649). Give this card the same shader performance (2688 cores) as the Titan, but it will only have 3GB RAM, and the usual crippling of FP64 performance.
* Replace the Titan with the Titan Plus (or whatever they want to call it) which would be a full GK110 part with no disabled SMXs. This would up the core count from 2688 to 2880. This would still be a compute-oriented card and the price would remain the same.


----------



## dwade (Oct 24, 2013)

Great card and all, but damn GTX Titan is one efficient card.


----------



## MxPhenom 216 (Oct 24, 2013)

AMD has pretty much done exactly what Nvidia did when they released the 680. Performs a little better, and cheaper.


----------



## the54thvoid (Oct 24, 2013)

Xzibit said:


> Now we go LIVE! to get a reaction of Nvidia TITAN owners after showing them the R9 290X priced at $549.99



Now now Mr Xzibit, less of the trolling please 

You could also rightfully use the same video with the quote,



> How a new R9 290X owner reacted when he turned the boost on thinking the card would go faster.  Seconds later his neighbours complained about the noise, and they were crying too.



In all seriousness, it's a brilliant piece of engineering and the price for it is absolutely fantastic.  If you already have a loop, then this is a dead cert.  But to build a loop for it alone, well, that's adding a couple hundred bucks on top for that heat dissipation.

But I assure you Mr Xzibit, as a Titan owner I am not crying, I am smiling because AMD have the means to make Nvidia rethink their strategy  and that for all of us is a pretty good thing.
Also, it was mostly enthusiasts that bought Titans, _not_ green eyed zealots.  With that in mind, Titan owners might well end up grabbing these too.  We have the cash, we buy wtf we want


----------



## douglatins (Oct 24, 2013)

People are praising this? It's a GTX 480 all over again...
Only this time with bad drivers...


----------



## MxPhenom 216 (Oct 24, 2013)

douglatins said:


> people are praising this? Its a GTX480 all over again...
> Only this time with bad drivers...



They are praising it for the price. Not to mention it will force Nvidia to respond with something that will benefit all of us currently in the market for a card. I for one wouldn't touch this card until I had access to a waterblock right away.


----------



## the54thvoid (Oct 24, 2013)

douglatins said:


> Only this time with bad drivers...



What's wrong with the drivers?  They've introduced PCI-E lane Crossfire capability to get past the frame-pacing issues.  The single-GPU driver performance has never been an issue.  You're stuck in the past, get over it.  

Nothing wrong with AMD drivers, as much as there's nothing wrong with Nvidia drivers.


----------



## GSG-9 (Oct 24, 2013)

the54thvoid said:


> Nothing wrong with AMD drivers, as much as there's nothing wrong with Nvidia drivers.


Let me tell you, fair price or not, if one of them would release an Eyefinity/Surround driver that allowed 2+ monitors rotated in any configuration (per display), they would have me. (Are you listening, graphics overlords? I want you to take my money, just build it already!)


----------



## newtekie1 (Oct 24, 2013)

natr0n said:


> No analog VGA outputs is a moot point for a thumb down. All cards just about come with a vga adapter.



VGA adapters won't work on this card.  But at this point, if you are buying this class of card and still using a VGA monitor, you're doing it wrong.


----------



## Ravenas (Oct 24, 2013)

Everyone is freaking out that this thing is 2 dBA louder than the Titan under load... If you enable Uber Mode, which gives a slight increase in performance and obviously generates more heat, the card jumps 10 dBA... People's expectations are both unrealistic and on the verge of Nvidia fanboyism. If you've got a problem with the noise or cooling, that's exactly why they make aftermarket coolers.


----------



## erasure (Oct 24, 2013)

http://cdn.meme.li/i/p9kxn.jpg

----------



## Lionheart (Oct 24, 2013)

douglatins said:


> people are praising this? Its a GTX480 all over again...
> Only this time with bad drivers...


----------



## wolf (Oct 24, 2013)

Did you even read the review? Also, given the smorgasbord of reviews, it is far louder and hotter than a Titan, not just 2 dB...



erasure said:


> http://cdn.meme.li/i/p9kxn.jpg


----------



## NeoXF (Oct 24, 2013)

GPU-wise I only need to wait for 2 things... to see how the 290 performs (and its price... $400 PLZ!) and for non-reference models of the 290X...

Money-wise... I need to wait for 3-4 things... and that's months, LOL, before I can scrape up the cash for one of these.


----------



## Slizzo (Oct 24, 2013)

HD64G said:


> In which world do you live? Did you read the review carefully? Did you see the difference in FPS in most games between 290X and green's Top GPUs? It beats the $900 TITAN comfortly with a BETA driver. Amd if TITAN is a compute card, you have no idea how much more the 290X is with 512bit memory bus...
> 
> And if you actually read the review, if you buy a later custom model of 290X with better cooling, it clocks much better and wins easier every GPU on planet today...



I'm sorry, WHAT?

You realize that 512-bit memory bus has almost NOTHING to do with the compute performance of the R9 290X yes? The R9 290X doesn't have anywhere near the double precision performance that the Titan has.


----------



## sweet (Oct 24, 2013)

^ AMD's drivers are fine now and updated regularly. The "AMD drivers" story is old now, mate.


----------



## Ravenas (Oct 24, 2013)

wolf said:


> Did you even read the review? also given the smorgasbord of reviews, it is far louder and hotter than a titan, not just 2db..



http://www.techpowerup.com/reviews/AMD/R9_290X/26.html



> Idle noise levels are decent, almost quiet. You can barely hear the card when it is installed in a case.



Load noise: 40dBa vs. 38dBa

Wow did a nuclear bomb just go off inside your head?

P.S.: There are no other reviews. IMO, the only reviews that exist are made by W1z and cadaveca, the rest are trash.


----------



## wolf (Oct 24, 2013)

Ravenas said:


> http://www.techpowerup.com/reviews/AMD/R9_290X/26.html



Oh that's right, I forgot that we game in idle mode!

You buy this card because it's fast; 90%+ of people will use uber mode - 50 dB.

It is hotter and louder, period.


----------



## NeoXF (Oct 24, 2013)

wolf said:


> Did you even read the review? also given the smorgasbord of reviews, it is far louder and hotter than a titan, not just 2db..



2 dB, like Ravenas linked. +/- 1 dB in Quiet Mode... as shown by the billion reviews I've already looked through (they can't all lie, can they?). And in Uber mode it's pretty loud, but still under HD 7970 GHz stock cooler levels.

But I get it though, some people just can't be objective, and even less so, happy that we're finally getting another great round of high-end GPU/price wars. And that's just sad.



Ravenas said:


> P.S.: There are no other reviews. IMO, the only reviews that exist are made by W1z and cadaveca, the rest are trash.



Yeeeeah... you lost me here.


----------



## Ravenas (Oct 24, 2013)

wolf said:


> Oh that's right, I forgot that we game in idle mode!
> 
> You buy this card because it's fast, 90%++ of people will use uber mode, 50dB.



So whenever I do anything on my computer my CPU, GPU, and memory are ALWAYS overclocked? Really??


----------



## wolf (Oct 24, 2013)

Ravenas said:


> P.S.: There are no other reviews. IMO, the only reviews that exist are made by W1z and cadaveca, the rest are trash.



I appreciate your loyalty to TPU, and they are fantastic reviews, but there are at least 2 others I can think of that do a darn good job too: Guru3D and TechReport. Reading one review is just narrow.



Ravenas said:


> So whenever I do anything on my computer my CPU, GPU, and memory are ALWAYS overclocked? Really??


----------



## springs113 (Oct 24, 2013)

At the end of the day, I will be getting this monster... As for my 780 Hydro Copper, I don't know whether to sell it, give it to my wife, or move it to my HTPC. I game at 1440p, so this card is pretty much what I was expecting. I don't care about the power usage; my electric bill doesn't pass $65 if I don't go mad using the AC. I will be throwing this under water; I just wish there was a model like the Komodo from Swiftech available.


----------



## arterius2 (Oct 24, 2013)

Ravenas said:


> Everyone is freaking out this thing is 2dBa higher noise than the Titan under load...




It seems that some people here don't understand the concept of decibels; it's not a linear scale. A difference of 10 dB means a tenfold difference in sound power, and a difference of 3 dB roughly doubles it.


----------



## Frick (Oct 24, 2013)

Guys. It is loud and hot. Whats to discuss?


----------



## MxPhenom 216 (Oct 24, 2013)

When you put the 290X in uber mode, it's 12 dBA louder, hitting 50. And in "quiet" mode it's still louder than the Titan.


----------



## wolf (Oct 24, 2013)

Frick said:


> Guys. It is loud and hot. Whats to discuss?



But I was having fun arguing like a kid!


----------



## raptori (Oct 24, 2013)

6°C more to reach boiling temp.


----------



## arterius2 (Oct 24, 2013)

The difference between 40 decibels and 50 decibels is TEN (10) TIMES the sound power.


----------



## Frick (Oct 24, 2013)

raptori said:


> 6°C more to reach boiling temp.



It would have boiled a long time ago if you live in Quito.


----------



## SIGSEGV (Oct 24, 2013)

oh crap...


----------



## springs113 (Oct 24, 2013)

Lol @ the temp complaints - Haswell hits those numbers on air cooling. For what a GPU does, it is OK in my book.


----------



## sweet (Oct 24, 2013)

wolf said:


> Did you even read the review? also given the smorgasbord of reviews, it is far louder and hotter than a titan, not just 2db..



I really feel sorry for your math teacher, mate


----------



## arterius2 (Oct 24, 2013)

sweet said:


> I really feel sorry for your math teacher, mate



please go understand the decibel scale
http://en.wikipedia.org/wiki/Decibel

in quiet mode, the 290X is FOUR (4) times louder than the GTX 780, and TWO (2) times louder than Titan
in uber mode, the 290X is FOURTEEN (14) times!!! louder than the GTX 780, and TWELVE (12) times louder than Titan


----------



## Ravenas (Oct 24, 2013)

wolf said:


> I appreciate your loyalty to TPU and they are fantastic reviews, but there are at least 2 others I can think of that do a darn good job too: Guru3D and TechReport. Reading one review is just narrow.



So whenever you jump in your car to go to work in the morning, you turn all of your tuners up to max and then press the pedal to the floor until you get to work?


----------



## GSG-9 (Oct 24, 2013)

springs113 said:


> Lol @the temp complaints, Haswell hits those numbers on air cooling, for what a gpu does it is ok in my book.



Yeah, look at Sandy Bridge instead and you are talking 50°C on water...


----------



## Crap Daddy (Oct 24, 2013)

Noise? What did you expect from a cheap $550 card? If you want silence it'll cost you $1000!


----------



## crazyeyesreaper (Oct 24, 2013)

springs113 said:


> Lol @the temp complaints, Haswell hits those numbers on air cooling, for what a gpu does it is ok in my book.



True enough, but Haswell only throttles when it hits those temps in extreme situations, aka Prime95 / OCCT / IBT, etc.

It's now common knowledge that the 290X could perform far better with aftermarket cooling.

Many sites show average clock speeds at 880-920 MHz, and over longer periods of time performance suffers by 2-4 FPS.

Stock cooler = forget about overclocking.

Once aftermarket-cooled cards are available, the 290X will shine; right now it's more of a lame duck. Lots of potential that's currently wasted unless you slap a water block on it, oh wait, they are all sold out of said blocks.

The sooner ASUS / MSI / Sapphire etc. release their DirectCU II / Twin Frozr / Vapor-X versions of these cards, the better.


----------



## claylomax (Oct 24, 2013)

At the current exchange rate: 

   $550 = £340

   plus VAT (20%) = £408


So why is it more expensive here? Will it go down in the next few weeks? :shadedshu

http://www.scan.co.uk/shop/computer-hardware/all/gpu-amd/radeon-r9-290x-pci-e-(2816-streams)
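The conversion above works out as follows (a minimal sketch; the ~0.618 GBP per USD rate is the one implied by the post's own figures, not an official quote):

```python
# Rough sketch of the US-to-UK price conversion in the post above.
USD_TO_GBP = 0.618  # assumed exchange rate, late-2013 levels
VAT_RATE = 0.20     # UK VAT

usd_price = 550
gbp_before_vat = usd_price * USD_TO_GBP           # ~ £340
gbp_with_vat = gbp_before_vat * (1 + VAT_RATE)    # ~ £408

print(round(gbp_before_vat), round(gbp_with_vat))
```

Anything a retailer charges above that second figure is margin rather than tax or exchange rate.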


----------



## Ravenas (Oct 24, 2013)

crazyeyesreaper said:


> True enough but Haswell only throttles when it hits those temps in *extreme situations* aka Prime 95 / OCCT / IBT etc



When would you need Uber mode other than extreme situations?


----------



## sweet (Oct 24, 2013)

arterius2 said:


> please go understand the decibel scale before commenting.
> http://en.wikipedia.org/wiki/Decibel
> 
> in quiet mode, 290x is FOUR(4) times louder than gtx780, and TWO(2) times louder than titan
> in uber mode, 290x is FOURTEEN (14) times!!! louder than gtx780, and TWELVE(12)times louder than titan



LOL, don't teach me about the decibel scale. And I'm talking to the guy who claims that 40 dB (290X quiet mode) minus 38 dB (Titan) is far bigger than 2 dB.


----------



## crazyeyesreaper (Oct 24, 2013)

So you're telling me you buy computing products so they can not perform at the level they should?

Since when did we reward downclocking during our gameplay? Since when did enthusiasts find it okay to lose performance, not because of a bad product, but because of a shitty cooler design?

Oh look, it does 50 FPS in game X; oh wait, if I play for more than 5 minutes it's really just 45 FPS and has the exact same performance as the 780, yet is loud as fuck and is a space heater. Since when did this = AWESOME?

AMD dropped the ball with the cooler. Just like they did with the 6970 reference design, where they had to shave off the plastic around the PCIe power plugs to make it fit.

As it stands, if the card ran at 1000 MHz, performance could be 5-9% better than it is. Instead, downclocking during gameplay is now an awesome feature that means lower performance on reference cards, something that won't exist for owners with water-cooled cards. Everyone else gets to wait 3 months for a proper 290X.

People bitched about the GTX 480 being a space heater, etc., yet it stomped AMD into the ground performance-wise. Now AMD's GPU is much the same way, but it's okay? Yeah, no. People need to stop drinking the vendor Kool-Aid.


----------



## arterius2 (Oct 24, 2013)

claylomax said:


> At the current exchange rate:
> 
> $550 = £340
> 
> ...



everything is expensive in the UK, man; that's what happens when you manufacture nothing and sit on an island.


----------



## GSG-9 (Oct 24, 2013)

Ravenas said:


> When would you need Uber mode other than extreme situations?



"The situation you described does not fit me, therefore your point is invalid."

Grow up. How about someone gaming @ 7680x1440? A 690 does not even run some games as well as a user would like.


----------



## claylomax (Oct 24, 2013)

arterius2 said:


> everything is expensive in the UK man, that's what happens when you manufactures nothing and sits on an island.



Oh never mind, found it cheaper here: 

http://www.overclockers.co.uk/productlist.php?groupid=701&catid=56&subid=1752


----------



## sweet (Oct 24, 2013)

crazyeyesreaper said:


> So your telling me you buy computing products so they can not perform at the level they should?
> 
> Since when did we reward downclocking during are gameplay? Since when did enthusiasts find it okay to lose performance thats there not because of a bad product but because of a shitty design for a cooler?
> 
> ...



The 290X in quiet mode spends most of its gaming time in the 650-850 MHz region, and it still beats the 780 by a fair margin in most games.


----------



## W1zzard (Oct 24, 2013)

Domokun said:


> Thanks for the review, W1zzard. On the second page of the review, you mentioned the following:
> 
> 
> Is there actually an advantage to using PCIE3 (as opposed to PCIE2) with an AMD R9 290X? Or is it a similar situation to every video card before it (i.e. minimal improvement)?



I'm looking at doing a PCIe scaling article with the 290X in the next few weeks.

http://www.techpowerup.com/reviews/#scaling
is the best data I can provide at this time.


----------



## Ravenas (Oct 24, 2013)

GSG-9 said:


> "The situation you described does not fit me, therefore your point is invalid."
> 
> Grow up, How about someone gaming @ 7680x1440. A 690 does not even run some games as well as a user would like.



Yes because so many people game at Ultra resolutions.


----------



## GSG-9 (Oct 24, 2013)

W1zzard said:


> i'm looking at doing a PCIe scaling article with 290X in the next weeks.
> 
> http://www.techpowerup.com/reviews/#scaling
> is the best data I can provide at this time


Thank you!



Ravenas said:


> Yes because so many people game at Ultra resolutions.


"The situation you described does not fit me, therefore your point is invalid."

That being said, I would water cool a 290 if I had one, so it is not an issue or concern to me. That does not change that your argumentative logic is dismissive of facts, and therefore biased.


----------



## springs113 (Oct 24, 2013)

crazyeyesreaper said:


> True enough but Haswell only throttles when it hits those temps in extreme situations aka Prime 95 / OCCT / IBT etc
> 
> Its now common knowledge that 290X could perform far better with aftermarket cooling.
> 
> ...



My Haswell, before going under water, used to hit 67°C with an H100 at stock, just idling and surfing the internet.


----------



## arterius2 (Oct 24, 2013)

sweet said:


> The 290X in quiet mode spends most of its gaming time in the 650-850 MHz region, and it still beats the 780 by a fair margin in most games.



Where is this "fair margin" you are talking about? Are we even looking at the same graph?
At 4 times the noise and 25% more power draw, the 290X is about 1% faster than the GTX 780 in silent mode at 1920x1080, and 4% faster at 2560x1600.


----------



## wolf (Oct 24, 2013)

sweet said:


> I really feel sorry for your math teacher, mate



Everyone is really keen on looking at idle mode, aren't they?



arterius2 said:


> I feel the same for your science teacher
> 
> please go understand the decibel scale before commenting.
> http://en.wikipedia.org/wiki/Decibel
> ...



arguing my point? good work on that one 



Ravenas said:


> SO whenever you jump in your car to go to work in the morning you turn on all of your tuners to max and then press the peddle to the floor until you get to work?



Thats the way we do it here mate.






Love how you pick one completely irrelevant example to suit your purpose too; unfortunately that's not how gfx cards work, and I can do the same easily, shall we?

So your partner snores; not often, mind you, perhaps only for 10 minutes as she goes to sleep. Most of the time she's breathing it's quiet, not annoying at all, but when she snores it grinds your gears. Please don't make me connect the dots.

Easy to make an example that fits what I want too, mate.

Why is everyone so stuck on idle? The thing is loud under load, and the difference grows to insane in Uber Mode. It is louder across the board, period.

This is my only point.


----------



## Ravenas (Oct 24, 2013)

GSG-9 said:


> Thank you!
> 
> 
> "The situation you described does not fit me, therefore your point is invalid."



Yeah, the situation you're describing is like comparing a NASCAR race to a highway joyride.


----------



## KainXS (Oct 24, 2013)

It's a bit too hot and loud for my taste, but all I care about is it driving prices down.


----------



## crazyeyesreaper (Oct 24, 2013)

sweet said:


> The 290X in quiet mode spends most of its gaming time in the 650-850 MHz region, and it still beats the 780 by a fair margin in most games.



I don't see it.

1% and 4% over a stock reference 780. Here's the kicker: most 780s on the market beat the Titan as is. On top of that, a reference 780 is slower than the aftermarket-cooled, custom-PCB 780s at the same clock speeds. So in quiet mode the 290X = 780; the margin of error in benchmarks is 3%, so they are tied, with a slight overall edge to the 290X in Uber Mode.

Let's see where a custom 780 card comes in out of the box:

At 1920 the custom 780 is on par with the 290X.
At 2560 the custom 780 is 2-3% slower than the 290X, which falls into the margin of error.

I fail to see, across a broad set of games, how the 290X is that amazing. With clock scaling being what it is, if the card pegged 1000 MHz during gaming, it would be on average another 5-7% faster, giving it a greater lead, and one that can be felt, not just seen in benchmarks on the first run.

As it stands, knowing performance drops slightly over time means that in reality quiet mode = 780 and Uber Mode = Titan. It's not so much faster as on par, and everyone acts like beating Titan is some insane feat? Every GTX 780 reviewed on TPU other than the reference card beats Titan.

The reference 290X is not impressive. Aftermarket-cooled cards will show what the 290X is actually capable of; right now the Hawaii silicon looks fantastic, but the cooler / BIOS / reference settings, not so much.

So looking at the current data, a GTX 780 is $100 more expensive.

290X + waterblock = cost of a GTX 780
290X + aftermarket air cooler like an Accelero ($70-80) = $20-30 cheaper than a GTX 780 for the same performance, without the noise / heat or clock throttling.


----------



## GSG-9 (Oct 24, 2013)

Ravenas said:


> Yeah your situation being described is like comparing a Nascar race to a highway joyride.



There are different levels of overclocking. When I run 3DMark, that's extreme; that's the NASCAR race. Those tuned settings I do not use day to day.

I play games; I overclock for that, and the profiles only kick in when in 3D mode. But it is a day-to-day overclock (a small overclock, with limited to no voltage increase). I want it quiet more than loud. You can look at my specs; I am not arguing just to argue, those are the specs I game with. The 290 is damn good; I think it IS a 480. I am excited to see this tech optimized in the future.

I just don't see why you think you can dismiss other people's use case in a thread because it is not your use case.


----------



## Aquinus (Oct 24, 2013)

arterius2 said:


> where is this "far margin" you are talking about? are we even looking at the same graph?
> at 4 times the amount of noise and 25% more power draw, the 290x is about 1% faster than 780gtx in silent mode at 1920x1080, and 4% more in 2560x1600
> 
> http://tpucdn.com/reviews/AMD/R9_290X/images/perfrel_1920.gif



Reviews have also shown that at higher resolutions (like eyefinity) the R9 290X does pretty well and scales to the higher resolutions well. At 1080p, there is a good bet that you don't need a video card this beefy. You're getting this if you're going beyond that, be it with 1440p, 4k, or with 3 displays.


----------



## Ravenas (Oct 24, 2013)

GSG-9 said:


> I just don't see why you think you can dismiss other people's use case in a thread because it is not your use case



I haven't dismissed anyone's case. I wasn't the one who came into the thread telling people to "grow up". People can have their cases and not see eye to eye, but that doesn't mean you come in saying I have dismissed everyone in this thread...


----------



## ZenZimZaliben (Oct 24, 2013)

Awesome review. Can't wait for full coverage water blocks to hit the market!! This card is gonna rock under water. 

Ahhh snap! - http://www.guru3d.com/news_story/ek_releases_water_block_for_amd_radeon_r9_290x.html


----------



## Am* (Oct 24, 2013)

The Von Matrices said:


> This is AMD's version of Fermi/GF100.
> 
> It seems like history keeps repeating itself.  Sure, you get a card that is cheaper than its competitors for the same performance but you pay for it in power consumption and noise.
> 
> It's the same as the HD 2900 XT and GTX 480 (and more recently the 7970GE, to a lesser extent).



I'll take a slightly higher-performing and hotter Fermi equivalent at $550 over a throttling Kepler at $999 any day. The key here is that the 290X is limited by temperature and not by the VRMs like the Titan -- and it raped the Titan in Nvidia-favoured games at higher res. Ask anybody here what they'd rather have -- a GPU that's temperature-limited or a PCB power-limited card -- and anyone sensible will pick the former every time; it's much easier to buy an aftermarket version of the card or a new cooler than to voltmod or BIOS-flash and risk getting a $1k paperweight. Most people here have 2 PCs or more and a tablet/phone as well, so I doubt they will give a shit about a few extra watts here and there (though that Blu-ray/multi-monitor consumption looks quite embarrassing -- curious if it can be fixed with some modest underclocking/custom profiles).

One thing is for sure though. If that 780 Ti is anything less than a fully un-gimped GHz Edition GK110 with all 2880 cores and at the same price, Nvidia can GTFO till next year when Maxwell is ready. No point in them wasting time releasing anything less than that (that rumoured 2490 core GK110 will fail epically against this card, regardless of clocks). Titans are going to collect more dust on retailers' shelves until they do some serious price cuts on it (50%-60%).

This card did a hell of a lot better than I thought it would (I expected it to slot beneath the Titan, not above it). Big props to AMD for bringing the prices back close to sanity levels. And I'm glad this card just took a huge shit on Nvidia's "boutique card" parade to be honest.

I personally will wait for the BF4 benchmarks and aftermarket versions before considering biting the bullet on this card. The beta performance was...inconclusive, to say the least, and I'm curious to know if one 290X will max BF4 or not -- and if not, how far it is from doing so.


----------



## GSG-9 (Oct 24, 2013)

Ravenas said:


> I haven't dismissed anyone's case. I wasn't the one who came in to the thread telling people to "grow up". People can have their case, and not see eye to eye, but that doesn't mean you come saying I have dismissed everyone in this thread...



I read:


> When would you need Uber mode other than extreme situations?


As dismissing anyone who would need Uber Mode most of the time. We can drop it.

*Bro hug


----------



## HalfAHertz (Oct 24, 2013)

Hey W1z, what if instead of overvolting, you try to undervolt? If it's still stable and you manage to lower the temps as well, wouldn't that mean longer time at the max boost state and better performance?


----------



## n0tiert (Oct 24, 2013)

That baby is priced at €569:

ASUS R9290X-G BF4 Edition
http://www.alternate.de/ASUS/ASUS+R9290X-G_BF4_Edition,_Grafikkarte/html/product/1105280/?

SAPPHIRE R9 290X BF4 Edition, €569 (in stock)
http://www.alternate.de/SAPPHIRE/SAPPHIRE+R9_290X_BF4_Edition,_Grafikkarte/html/product/1104772/?

Not too bad.


----------



## Domokun (Oct 24, 2013)

W1zzard said:


> i'm looking at doing a PCIe scaling article with 290X in the next weeks.


I'm looking forward to it. It (PCIe 3 vs PCIe 2) didn't appear to make too much of a difference with Nvidia's GTX Titan. However, I thought it might be a different situation with AMD's R9 290X.


----------



## the54thvoid (Oct 24, 2013)

Am* said:


> The key here is that the R290x is limited by the temperatures and not the VRMs like the Titan -- and it raped the Titan in its own Nvidia-favoured games at higher res.



Yeah, my VRMs don't help me game at 1136 core (143 MHz above the max stock boost of 993) or bench at 1202 MHz (209 MHz above stock max boost). Still......

And your definition of 'rape' is a tad off the mark. Single-digit percentages do not cry 'rape'. Though in Uber Mode you could carry out many nefarious acts and not be heard.

However, I do believe that under proper cooling the card beats Titan in all fair fights. It might use a lot more juice, but that's not really a point for enthusiasts.


----------



## Casecutter (Oct 24, 2013)

Price... nailed it weeks ago!  Performance is right where I figured.

Efficiency I'm not totally sold on, but I would like to see it over more games, and the average perf/watt. As it throttles back, we could find such results noteworthy. I honestly thought AMD would find the efficiency within a rearrangement of GCN to provide 18% more performance while not raising power usage, as with Bonaire. That said, it's not that bad: 20% faster with 5% more power (Uber Mode: 27% using 13%).

Noise is bad, but considering the watts/heat being shed from a die 30% smaller than GK110, it's not unpredicted. Could AMD have made the blower cooler better? Perhaps. Are we hitting the limit of heat transfer while maintaining a dual-slot cooler that blows all the heat out the back? Conceivably. Could AMD have gone more exotic? Imaginably, but at what cost? I would love to know the actual differences between the fans Titan and AMD use, in terms of noise, performance, and price. I suppose AMD has to displace more heat, so if the competing coolers and fans are basically equal, the R9 290X has to push more air, so 2 dBA isn't bad. As a release/reference cooler it's not great; that wouldn't be a big issue, but it becomes one if the AIBs aren't able to get their designs out in just a few weeks.

AMD did this, although I don't think their plan was ever to do it, and it's the whole reason for the "No New GPUs from AMD for the Bulk of 2013" back in early February. I think they thought they could hold out for 20 nm on risk production in Q3 2013, but that went bye-bye with TSMC, and they had to counter GK110 with something. They realized that a high-performance design that stays significantly economical (they don't have the luxury of geldings from the professional line) while maintaining power was no "walk in the park" on 28 nm. This is what we get, and why neither party has really done exceptionally better than rebranding 28 nm production.

I think AMD had to make do with design compromises, except in one place: price, and on that they have a success.

What will Nvidia do? Drop the 780 to $580 and bring the 780 Ti in at $650, but other than initial review samples, all those 780s will have AIB dual-fan-style coolers. Titan will stay at $1000, but in most terms it has already vanished. Those who need it for DP compute have pretty much snapped one up; for gaming there's no point in paying the extra.

What's next? "Non-X" versions for $450 that are 3-4% below a 780 while besting the 770 by 10%. That might be a good contest by the Christmas season.


----------



## newtekie1 (Oct 24, 2013)

Ravenas said:


> Everyone is freaking out this thing is 2dBa higher noise than the Titan under load... If you enable Uber Mode, which has a slight increase in performance, and which would obviously generates more heat, the card jumps 10dBa... People's expectations are both unrealistic and on the verge of Nvidia fanboy. If you got a problem with noise or cooling, that's exactly why they make aftermarket coolers.



To expect the card to give the same performance with at least similar power consumption, heat output, and noise level as the competition isn't being unrealistic. Especially not when the competition has been on the market for 9 months.

The card has potential for sure, but AMD really should have worked on the stock cooler more; there is no getting around that. The price is the only saving grace, and since rumor has it the 780 Ti will just be a Titan with 3GB of RAM, likely priced the same as or slightly lower than the 290X and with a stock cooler that doesn't deafen people, the price advantage will be short-lived.



arterius2 said:


> the difference between 40decible and 50 decibel is TEN (10) TIMES LOUDER



50 dBA is twice as loud as 40 dBA, but that is still quite a bit louder.


----------



## Intel God (Oct 24, 2013)

ZenZimZaliben said:


> Awesome review. Can't wait for full coverage water blocks to hit the market!! This card is gonna rock under water.
> 
> Ahhh snap! - http://www.guru3d.com/news_story/ek_releases_water_block_for_amd_radeon_r9_290x.html



Frozencpu already has them available for order.


----------



## GC_PaNzerFIN (Oct 24, 2013)

Very much the AMD equivalent of the "GTX 480" launch. Performance is more than OK, but the stock cooler is awful and power consumption is all over the place. The price would be nice if it had a proper cooler, but as it is, it just makes the GTX 780 DCII OC look like an even better buy.

Btw, "A change in power by a factor of 10 is a 10 dB change in level." -Wikipedia

50 dBA is indeed 10x the power of 40 dBA.


----------



## xorbe (Oct 24, 2013)

arterius2 said:


> the difference between 40decible and 50 decibel is TEN (10) TIMES LOUDER



10 dB is about twice as loud to the human ear. Sound power and the ear's perception of it are different things.
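The distinction being argued over in the posts above can be sketched numerically (a minimal sketch; the doubling-of-loudness-per-10-dB figure is a common psychoacoustic rule of thumb, not an exact law):

```python
def power_ratio(delta_db):
    # A 10 dB increase corresponds to a 10x increase in sound power.
    return 10 ** (delta_db / 10)

def loudness_ratio(delta_db):
    # Rule of thumb: perceived loudness roughly doubles every 10 dB.
    return 2 ** (delta_db / 10)

# The 40 dBA vs 50 dBA case from the thread:
print(power_ratio(10))     # 10.0 — ten times the sound power
print(loudness_ratio(10))  # 2.0 — but only about twice as loud to the ear
```

So both camps are describing the same 10 dB gap; one in power, the other in perceived loudness.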


----------



## TheOne (Oct 24, 2013)

The codename fits this card well, just like Fermi fit the GTX 480 well, though both could be used to describe either card, Volcanic-Fermi.


----------



## Yeoman (Oct 24, 2013)

It seems like a great card, really nice considering the price, so hopefully it will bring prices down. I wonder what effect applying some Arctic Silver/MX paste might have on the temps? I know it's not ideal, nor should it be acceptable that someone has to do that to reach acceptable temps/dB, but in my experience reapplying paste can take about 10% off the GPU temperature, which could be enough for the card to hold 1000 MHz in quiet mode?

Personally, I prefer AIB cards, simply because they have more finesse (temps/acoustics). But I've often wondered how far a bit of paste might go in terms of making a reference model a much more acceptable purchase.


----------



## boogerlad (Oct 24, 2013)

w1zzard; said:
			
		

> there is the data, linear fit is y = 1.1975x + 215.29
> 
> basically for every °C that the card runs hotter it needs 1.2W more power to handle the exact same load



Got that from a thread on XS. I'm pretty sure that if it's watercooled, power consumption will be much, much lower.
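The quoted linear fit can be evaluated directly (a minimal sketch; the coefficients come from the quoted post, while the example temperatures are illustrative, not measured):

```python
def power_at_temp(temp_c):
    # Linear fit from the quoted measurements: y = 1.1975x + 215.29,
    # where x is GPU temperature in °C and y is board power in watts.
    return 1.1975 * temp_c + 215.29

# Same load at stock-cooler vs watercooled temperatures (illustrative):
hot = power_at_temp(94)   # ~327.9 W near the stock 94°C throttle point
cool = power_at_temp(45)  # ~269.2 W at a plausible watercooled temp
print(round(hot - cool, 1))  # ~58.7 W saved just from running cooler
```

Which is why a water block would cut power draw noticeably even at identical clocks: cooler silicon leaks less.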


----------



## buddatech (Oct 24, 2013)

I'm impressed!! (The price/performance ratio just rocks!!) I just built my entire rig 7 weeks ago. I may have to jump ship; what to do?


----------



## Ralfies (Oct 24, 2013)

I'm extremely disappointed with this card. I had high hopes for the 290/290X and was all ready to switch to the red team. The price and performance is awesome, but the power consumption, noise, and thermals are a deal breaker. 

Sure, the extra power consumption will have a minimal effect on my power bill, but the cost of running the AC to keep my apartment at a reasonable temperature sure will. My GTX 670 already heats up the room during long gaming sessions. Maybe I could overlook this if the card were properly cooled and the fan noise needed to keep it from throttling weren't so loud my neighbors would complain. Hopefully the 290 has slightly better power consumption and launches with aftermarket coolers. If so, it'll be a day-one purchase for me.


----------



## Ravenas (Oct 24, 2013)

newtekie1 said:


> The card has potential for sure, but AMD really should have worked on the stock cooler more. There is no getting around that. The price is the only saving grace, and since rumor has it the 780Ti will just be Titan with 3GB of RAM and likely price the same or slightly lower than 290X and with a stock cooler that doesn't deafen people, the price advantage will be short lived.



I doubt the Ti will have a slightly lower price than the 290x, but time will tell.


----------



## 15th Warlock (Oct 24, 2013)

Xzibit said:


> Now we go LIVE! to get a reaction of Nvidia TITAN owners after showing them the R9 290X priced at $549.99



LOL! 

But seriously, these jokes are getting kinda old, moving on...




the54thvoid said:


> Now now Mr Xzibit, less of the trolling please
> 
> You could also rightfully use the same video with the quote,
> 
> ...



Yup, you have to recognize this is an amazing piece of hardware. Yes, AMD had to save on the cooler, but I don't care; I know I'll put my 290X under water, and best of all, those savings were passed down to the customers.

What good is a magnesium/aluminum alloy cooler if it's going to be in a box for the rest of the card's life span? Don't get me wrong, Titan looks like a piece of modern art compared to the 290X (how does the saying go? That's a face only a mother could love ), but I paid an arm and a leg for something I used for only a few weeks.

Haters are gonna say this card is loud, hot, and inefficient; who cares? The performance is there, and for a reasonable price too! This goes on and on: Prescott, Bulldozer, GTX 480, Titan, and now the 290X... Whatever, people will always find something to complain about.

All I know is this is an awesome card, and credit goes to AMD for pricing it right, and that's alright in my book

All I know is this is an awesome card, and credit goes to AMD for pricing it right, and that's alright in my book


----------



## newtekie1 (Oct 24, 2013)

GC_PaNzerFIN said:


> Btw "A change in power by a factor of 10 is a 10 dB change in level." -wikipedia
> 
> 50dBA is indeed 10x 40dBA power.



However, it isn't 10x louder to the human ear.


----------



## buildzoid (Oct 24, 2013)

Better VRMs and cooling (silicon gets less efficient as temps go up) could easily shave 20 to 50 watts off the power consumption, so really I couldn't care less about this version's power consumption as long as custom boards come out soon.


----------



## GSG-9 (Oct 24, 2013)

newtekie1 said:


> However, it isn't 10x louder to the human ear.



This is why I think sones are a more genuine way to measure noise level; they are easier to explain to people.



buildzoid said:


> Better VRMs and cooling(silicon gets less efficient as temps go up) could shave 20 to 50 watts of the power consumption easily so really I could care less about this versions power consumption as long as custom boards come out.



Dumb question, how often do aftermarket cards have custom VRMs? Is it basically any time its a custom PCB or are they sometimes the same even on a custom PCB?


----------



## buildzoid (Oct 24, 2013)

GSG-9 said:


> Dumb question, how often do aftermarket cards have custom VRMs? Is it basically any time its a custom PCB or are they sometimes the same even on a custom PCB?



The point of custom PCBs is to do one of four things:
1 different output options
2 modified VRMs (either to improve them or make them cheaper)
3 doubled VRAM capacity
4 any combination of the above

So custom PCBs don't always have custom VRMs, but most do, because the other changes aren't really that useful except for VRAM doubling on Nvidia GPUs.
Though it is key to understand that not all custom VRMs are a good thing; for example, the stuff that XFX engineers design is usually built to minimize production costs and is terrible in every way imaginable.

You can use the EKWB cooling configurator to check what VRM design 90% of GPUs use.


----------



## HumanSmoke (Oct 24, 2013)

GSG-9 said:


> Dumb question, how often do aftermarket cards have custom VRMs? Is it basically any time its a custom PCB or are they sometimes the same even on a custom PCB?


Depends on the whim of the vendor in large part. Gigabyte for example has two versions of the Windforce3 GTX 780. Rev.1 uses the reference 6+2 layout, Rev.2 uses 8+2. Some vendors like Zotac, Inno3D, PoV, EVGA's SC use the reference layout and just raise clocks to build a product stack, while MSI, EVGA (Classified/FTW), Asus, Palit/Gainward revise the power delivery to varying extents.


----------



## broken pixel (Oct 24, 2013)

Any word on when and if there will be a VRM design change from other vendors?

290x with an EK block sounds nice x2.


----------



## theonedub (Oct 24, 2013)

At least AMD delivered, to some degree, on the performance. It would've been sad to see them concede as they did to Intel in the high-performance CPU segment.

Once the cooling gets sorted, I'm sure the card will really shine. Getting top tier card pricing back down to earth is a win for all of us- and this is a step in the right direction.


----------



## johnnyfiive (Oct 24, 2013)

To the whiners...


How much is the GTX 780 and Titan again? And how much is the R290X again? AMD is giving the people what they want... an affordable MONSTER of a card. If you're worried the cooler isn't up to par, get an extended warranty. This card is win/win/win for price/performance/features. So what if it's loud? So what if it gets hot? Who fricking cares about that, seriously? I have two 480s and they play everything great. I've had 'em over 3 years and they're still working fine. Be quiet already, whiners! The R290X is exactly what PC gamers needed, something to stop Nvidia from releasing these stupid $600+ cards that aren't worth the ridiculous asking price.


----------



## radrok (Oct 24, 2013)

I honestly don't get what all this fuss about power consumption and temperatures is.

Who on earth buys a high-end graphics card without having its cooling planned?

Reference and high-end do not exist in the same context, imho.

Either watercooling or aftermarket, pick one 

Yes, I know that Nvidia gets reference right, but they charge a damn hefty premium for it.


----------



## TheHunter (Oct 24, 2013)

Yeah, just slap on a custom cooler and it flies 

A Prolimatech cooler gets 72°C @ 1200 MHz (1.4 V); at stock, 55°C.


----------



## msamelis (Oct 24, 2013)

johnnyfiive said:


> To the whiners...
> 
> 
> How much is the GTX 780 and Titan again? And how much is the R290x again? AMD is giving the people what they want... a affordable MONSTER of a card. If you're worried the cooler isn't up to par, get an extended warranty. This card is win/win/win for price/performance/features. So what if its loud? So what if it gets hot? Who fricking cares about that, seriously? I have two 480's and it plays everything great. I've had em over 3 years and they're still working fine. Be quiet already whiners! The R290x is exactly what PC gamers needed, something to stop nvidia from releasing these stupid ridiculous $600+ cards that aren't worth the ridiculous asking price.



You can get the ASUS GTX 780 OC for just 30 quid extra and have better cooling, less noise, and better performance than the 290X


----------



## theonedub (Oct 24, 2013)

johnnyfiive said:


> To the whiners...
> 
> 
> How much is the GTX 780 and Titan again? And how much is the R290x again? AMD is giving the people what they want... an affordable MONSTER of a card. If you're worried the cooler isn't up to par, get an extended warranty. This card is win/win/win for price/performance/features. So what if it's loud? So what if it gets hot? Who fricking cares about that, seriously? I have two 480's and they play everything great. I've had 'em over 3 years and they're still working fine. Be quiet already, whiners! The R290x is exactly what PC gamers needed, something to stop nvidia from releasing these ridiculous $600+ cards that aren't worth the asking price.



I don't know about you, but $549 and $659 are both ridiculously expensive cards. It's not as if AMD just brought us back to the $399 flagship era; the card is still far from what I, and I presume other people, would consider affordable.

Like I was saying, props to AMD for taking a step in the right direction, but there is still a long way to go for both companies to get to a point where the word affordable becomes appropriate.


----------



## buildzoid (Oct 24, 2013)

msamelis said:


> You can get the ASUS GTX 780 OC for just 30 quid extra and have better cooling, less noise, and better performance than the 290X



Here where I live, the price difference between the 290X and the cheapest 780 is slightly more than the cost of an aftermarket cooler, which btw will stomp the DCII cooler. Also, the 290X in the review was probably throttling, so with a better cooler it could beat an OCed 780 at stock settings.


----------



## bananarepublic (Oct 24, 2013)

"AMD's Radeon R9 290X shows fantastic clock scaling with GPU voltage, better than any GPU I've previously reviewed. The clocks do not show any signs of diminishing returns, which leads me to believe that the GPU could clock even higher with more voltage and cooling"

Oh my, why did I not read your review of the R9 290X first? 

I had forgotten how great this place is for stuff like this; such a well balanced and informed article, I simply love it!  You guys bring up these small details that many other sites left out or perhaps did not even think about. I had an account here some years ago as well but forgot my login info and cannot find it, only the name.

I'm Swedish, so I read first at Sweclockers, which is a really great place with many great guys and superb reporters, but I just can't stand many of the comments in the forums anymore. It used to be great, but nowadays it's so full of fanboys and crazies claiming outrageous things with no backup whatsoever, even before products hit the market or the NDA is lifted. I mean, at least let the card be reviewed first before claiming to know all the details, right? Now it has been, and I for one think this looks quite promising. I cannot wait for better optimized software plus a full cover block on this GPU (or at least a non-reference cooler) to see what it really can do. Competition is a beautiful thing, so let the games begin..

Sorry about my English guys, but I do try my best here.. Just let me know if I mess up too much and I'll try to correct it.
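Aside: "no diminishing returns" in the quoted review passage just means clocks kept scaling roughly linearly with voltage. A minimal least-squares sketch of what that looks like; the (voltage, clock) pairs below are invented for illustration, NOT data from the review:

```python
# Hypothetical (voltage in V, stable clock in MHz) pairs -- NOT review data.
points = [(1.15, 1000), (1.20, 1060), (1.25, 1120), (1.30, 1180)]

# Ordinary least-squares slope: MHz gained per extra volt.
n = len(points)
sx = sum(v for v, _ in points)
sy = sum(c for _, c in points)
sxx = sum(v * v for v, _ in points)
sxy = sum(v * c for v, c in points)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)

print(f"~{slope:.0f} MHz per extra volt")  # → ~1200 MHz per extra volt
```

With real samples, a clearly sub-linear trend (smaller gains per volt at the top) would be the "diminishing returns" the review says it did not see.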


----------



## Frick (Oct 24, 2013)

bananarepublic said:


> "AMD's Radeon R9 290X shows fantastic clock scaling with GPU voltage, better than any GPU I've previously reviewed. The clocks do not show any signs of diminishing returns, which leads me to believe that the GPU could clock even higher with more voltage and cooling"
> 
> Oh my, why did I not read your review of the R9 290X first?
> 
> ...



The English is fine. 

It seems the same thing happens in most places, to be honest. Each and every time something new is released, the entire internet turns to foobar'd FUD. Speculation and excitement are one thing; name-calling and slinging made-up BS around is just annoying.

EDIT: Don't get me wrong, I've had my share of rageboners, but to see it on a large scale is interesting.


----------



## jamsbong (Oct 24, 2013)

I just scrolled through the review pretty quickly and already I'm convinced this is the best card I've seen.

I remember Titan was really fantastic, but the problem with that card is the enormous price tag. Now we have a faster card at half the price!!

The ATI boys have done it again... This will be great with proper cooling... keeping it at 60C with a high-end water cooling kit will do the trick.


----------



## N3M3515 (Oct 24, 2013)

Wew, I see a lot of butthurt in this thread.
Power and acoustics are not going to stop these cards from selling out.
Performance is excellent, and I suspect modified versions will be perfect and likely about 10% faster.



the54thvoid said:


> It will on Uber mode but that is the sound penalty you will pay for. In Guru's review he quotes AMD as saying 95 degrees is the unit's normal operating temperature - it can run at that for its lifetime. If users want a cooler experience they can make it cooler by lowering performance or raising fan speeds.


+1



Zen_ said:


> I can understand the complaints about noise and high temps, but 50w of electricity? Please. I'm all for higher efficiency, but this is like complaining that a 600 HP Ferrari gets poor fuel economy. Even if you gamed 40 hours a week, that 50w loss is 2 kWh a week. Where I live that is 24 cents, or about $1 a month, or $12 a year.


+1
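Worth noting the arithmetic in that quote holds up. A quick sketch; the 40 h/week and ~$0.12/kWh figures are the quoted post's assumptions, not universal rates:

```python
# Checking the quoted electricity math. Assumptions from that post:
# 50 W extra draw, 40 gaming hours/week, ~$0.12 per kWh (varies by region).
extra_watts = 50
hours_per_week = 40
price_per_kwh = 0.12

kwh_per_week = extra_watts * hours_per_week / 1000   # energy, not power
cost_per_year = kwh_per_week * 52 * price_per_kwh

print(f"{kwh_per_week:.1f} kWh/week, ${cost_per_year:.2f}/year")
# → 2.0 kWh/week, $12.48/year
```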



sweet said:


> Because the price of the two systems is the same, mate. Do the math: 290X + water cooling = $650-750 = a high-end custom 780.
> If you don't prefer the WC solution, just wait for a custom version. Around $600 for a world beater.


+1


----------



## bananarepublic (Oct 24, 2013)

Frick said:


> The English is fine.
> 
> It seems the same thing happens in most places, to be honest. Each and every time something new is released, the entire internet turns to foobar'd FUD. Speculation and excitement are one thing; name-calling and slinging made-up BS around is just annoying.
> 
> EDIT: Don't get me wrong, I've had my share of rageboners, but to see it on a large scale is interesting.



Thanks my friend, I often misspell things, but with correction on here it's just grammar to think about hehe. 

Perhaps you are right; maybe it's just getting worse every year, or my memory could be failing? I try to imagine I'm talking face to face when online, and it does help in those situations when I'm irritated, because you just don't talk however you want face to face with a person, in my opinion. I don't anyway, and I suspect you guys don't either. Respect seems somewhat lost these days. Also, just don't lie and make stuff up; it's not that hard, right?

Either way, sorry, it was not my intention to start OT talk as the first thing I do with my brand new techpowerup account! 

I am so excited to see what these chips can produce when cooling gets better. If I buy one it will be an investment and I have to save up the money first, so I'm really scared to make the wrong decision; it would hurt me economically and set me back a lot. So I've got to at least wait a little while before deciding, and see what Nvidia counters with, I guess.

It's going to be very interesting to see what these reference cards can do with full cover blocks on them. Sure, I know it's all guessing at this stage, but I'm still very curious, especially considering that GK110 seems like a dream for many OCers.


----------



## Tonim89 (Oct 24, 2013)

theonedub said:


> I don't know about you, but $549 and $659 are both ridiculously expensive cards. It's not as if AMD just brought us back to the $399 flagship era; the card is still far from what I, and I presume other people, would consider affordable.
> 
> Like I was saying, props to AMD for taking a step in the right direction, but there is still a long way to go for both companies to get to a point where the word affordable becomes appropriate.



I agree with you, and I really miss 2009, when the 5870 doubled the GTX 285's performance for $399 

But given the current price tags of the high end cards, this is definitely a step forward. I really think Nvidia will have to slash prices heavily. I can't wait to see the impact of the R9 290 (non-X) out there; GTX 780 performance for $450 will be insane!


----------



## newtekie1 (Oct 24, 2013)

johnnyfiive said:


> To the whiners...
> 
> 
> How much is the GTX 780 and Titan again? And how much is the R290x again? AMD is giving the people what they want... an affordable MONSTER of a card. If you're worried the cooler isn't up to par, get an extended warranty. This card is win/win/win for price/performance/features. So what if it's loud? So what if it gets hot? Who fricking cares about that, seriously? I have two 480's and they play everything great. I've had 'em over 3 years and they're still working fine. Be quiet already, whiners! The R290x is exactly what PC gamers needed, something to stop nvidia from releasing these ridiculous $600+ cards that aren't worth the asking price.



Relying on price as your sole saving grace isn't a good idea.  That only works if nVidia can't lower prices.  We already know both the Titan and 780 were very high margin parts, so nVidia's obvious answer is a price reduction.  Yeah, it is great that there is finally competition to drive down the insane prices, but this isn't a ground-breaking card any way you look at it.



radrok said:


> Yes I know that Nvidia gets reference right but they charge a well damn hefty premium for it.



No, they charge a premium for the performance.  Titan and the 780 were expensive because there was nothing that could compete with them; now that there is, we should see a price drop on the 780, as well as the 780Ti being priced pretty similar to the 290X.


----------



## Hilux SSRG (Oct 24, 2013)

theonedub said:


> I don't know about you, but $549 and $659 are both ridiculously expensive cards. It's not as if AMD just brought us back to the $399 flagship era; the card is still far from what I, and I presume other people, would consider affordable.
> 
> Like I was saying, props to AMD for taking a step in the right direction, but there is still a long way to go for both companies to get to a point where the word affordable becomes appropriate.



Agreed.  But AMD is not moving in the "right" direction; they are going in circles.

Two years ago the 7970 arrived for $549, then the 7970 GHz arrived a year later for $500.

Now the 290X arrives a year later for $549 again. What should we expect next year when 20nm isn't ready, a 290X GHz for $500?

It's a cool enough card, but it runs hot with that stock cooler, and it's not worth the price they are asking.  Nvidia is pulling the same bullshit pricing.  

Good review W1zzard.


----------



## N3M3515 (Oct 24, 2013)

Crap Daddy said:


> Noise? What did you expect from a cheap $550 card? If you want silence it'll cost you $1000!



That was hilarious 

One can also say it costs $650.


----------



## Mathragh (Oct 24, 2013)

N3M3515 said:


> That was hilarious
> 
> One can also say, it costs $650



I still think the argument about noise is daft at best.

Only 2 decibels above Titan, a card seen as fairly quiet, is not all of a sudden loud by any means.

And don't start with the DECIBELS ARE MOAR THAN +1 stuff. The decibel scale is logarithmic precisely because human hearing works in ratios: we perceive relative changes in sound energy, not absolute ones, which also compensates for how sound falls off with distance. A 2 dB increase is roughly 1.6x the sound power, but perceived loudness only doubles about every 10 dB.

So no, the 2 dB increase is not ACTUALLY OVER 9000!! In every way that matters to humans, it is barely distinguishable from Titan. This is still not loud.
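Those rules of thumb in numbers, as a rough sketch (real perception varies with frequency and listener, so treat these as approximations):

```python
# Rules of thumb for interpreting small dB differences:
#   sound power ratio   = 10 ** (delta_dB / 10)
#   perceived loudness ~=  2 ** (delta_dB / 10)   (doubles every ~10 dB)
def power_ratio(delta_db):
    return 10 ** (delta_db / 10)

def loudness_ratio(delta_db):
    return 2 ** (delta_db / 10)

for db in (2, 10):
    print(f"+{db} dB: {power_ratio(db):.2f}x power, ~{loudness_ratio(db):.2f}x as loud")
# → +2 dB: 1.58x power, ~1.15x as loud
# → +10 dB: 10.00x power, ~2.00x as loud
```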


----------



## N3M3515 (Oct 24, 2013)

Am* said:


> I'm glad this card just took a huge shit on Nvidia's "boutique card" parade to be honest.



+1 Greedy nvidia.


----------



## newtekie1 (Oct 24, 2013)

Mathragh said:


> I still think the argument about noise is daft at best.
> 
> Only 2 decibels above Titan, a card seen as fairly quiet, is not all of a sudden loud by any means.
> 
> ...



Yes, but at the 2 dBA louder setting it is also about 5% slower than Titan; performance is actually at the 780 level at that point.  For the 290X to top Titan it runs 10 dBA louder, which is very noticeable to the human ear.


----------



## Xzibit (Oct 24, 2013)

I don't know, guys.

All this sound & heat talk has me convinced that most people run an open air test bench setup between their keyboard and monitor while gaming, with the exhaust pointed at their face.


----------



## N3M3515 (Oct 24, 2013)

newtekie1 said:


> The card has potential for sure, but AMD really should have worked on the stock cooler more. There is no getting around that. The price is the only saving grace, and since rumor has it the 780Ti will just be a Titan with 3GB of RAM, likely priced the same or slightly lower than the 290X and with a stock cooler that doesn't deafen people, the price advantage will be short lived.



And when that happens, I think AMD might just lower the MSRP to $500 and modified versions to $550; then whatever happens is good for us all, lower prices!

PS: also, after seeing how good this card can be with custom versions, custom 780 Ti vs custom 290X, I bet the 290X ends on top.



15th Warlock said:


> Haters are gonna say this card is loud, hot and inefficient, who cares? the performance is there, and for a reasonable price too! This goes on and on, Prescott, Bulldozer, GTX480, Titan, and now the 290X... Whatever, people will always find something to complain about.


+1
Dude, you nailed it. People will ALWAYS find something to complain about. It's like women always complaining about something; life isn't perfect, people!


----------



## newtekie1 (Oct 24, 2013)

N3M3515 said:


> And when that happens, I think AMD might just lower the MSRP to $500 and modified versions to $550; then whatever happens is good for us all, lower prices!
> 
> PS: also, after seeing how good this card can be with custom versions, custom 780 Ti vs custom 290X, I bet the 290X ends on top.



Yes, it is nice that AMD is finally able to compete and prices will come down to reasonable levels.  Of course I'm betting this will be short lived as nVidia will probably release what they've been working on for the past year in like January or something.


----------



## buildzoid (Oct 24, 2013)

I wonder if the cooler mounting holes are positioned to support the red mod (zip-tying a CLC to the GPU core), because if they are, an H90/H70 or Antec H2O 920 should be able to cool one of these pretty well for a reasonable price.


----------



## Am* (Oct 24, 2013)

the54thvoid said:


> Yeah, my VRMs don't help me game at 1136 core (143 MHz above the max stock boost of 993) or bench at 1202 MHz (209 MHz above max stock boost).  Still......
> 
> And your definition of 'rape' is a tad off the mark.  Single-digit %'s do not cry 'rape'.  Though in uber mode you could carry out many nefarious acts and not be heard.
> 
> However, I do believe that under proper cooling the card beats Titan in all fair fights.  It might use a lot more juice but that's not really a point for enthusiasts.



But that's with your uber-expensive aftermarket watercooling setup (with blocks that cover the VRMs, I presume) on top of $1000-a-pop GPUs. Add in the fact that: 

A: GK110 had at least 6 months head start on drivers vs R290X with early release drivers 

B: The potential of Mantle (yeah yeah, highly debatable at this point, but the potential is there, unlike for Nvidia)

C: Aftermarket versions potential (beefier VRM versions imminent)

D: The R290X is nearly HALF the bloody price of the Titan, and that's even at the current ridiculous launch prices they are charging (nearly a $1=£1 conversion rate). I wouldn't care too much if they weren't priced so far apart... but to say the Titan didn't get destroyed by the R290X is more than a little unfair as well, IMHO -- everybody was shouting defeat from the rooftops when GK104 came out last year and barely matched the 7970

E: Full DirectX 11.2 hardware support (might not mean much to me, especially now, but some peeps with uber setups like yours will at least take a passing look at this)

When Nvidia decides to do the same as AMD and comes out with a $600 GPU that can beat a $999 GPU from AMD, I will be jumping on it in an instant to upgrade my current card. The last cards Nvidia released that came even close to doing this were their golden-era G80s/G90s/G92s. So far, for the past 4+ years, Nvidia have at best been matching AMD at a higher or the same price -- and absolutely wallet-raping at every tier above with their Apple-like pricing structure (the "people will pay more for our shit because we're pretentious, with loads of gimmicks under our belt and brand recognition" plan). That might work fine for their silly "fans" who don't know any better, but it doesn't mean much for the average informed gamer/enthusiast. I'm getting a bit tired of it, TBH. I was initially going to wait and see what Maxwell brings, but so far I see no reason to play the waiting game any more -- especially with Nvidia, seeing how they are just about matching everything AMD already has available and almost never trying to beat them by a noteworthy price or performance margin these days. Might make me buy this card sooner rather than later, methinks.


----------



## Hilux SSRG (Oct 24, 2013)

Am* said:


> B: The potential of Mantle (yeah yeah, highly debatable at this point, but the potential is there, unlike for Nvidia)



Mantle = Dead [at this second].  

No game is currently running on Mantle and therefore no benches to compare with DirectX/OpenGL/etc.


----------



## Assimilator (Oct 24, 2013)

While this card may be the big kahuna I am personally more interested in the 290. I have a feeling that it will boast significantly better thermals and power than 290X at only slightly lower performance.

Either way NVIDIA is between a rock and a hard place now. Maxwell is still half a year away, it will be interesting to see what Team Green does to keep Kepler competitive. 780 Ti and price cuts are obvious, but will they be enough?


----------



## Xzibit (Oct 24, 2013)

Assimilator said:


> While this card may be the big kahuna I am personally more interested in the 290. I have a feeling that it will boast significantly better thermals and power than 290X at only slightly lower performance.



Leaks of the 290 are showing up now


----------



## newtekie1 (Oct 24, 2013)

Am* said:


> The R290X is nearly HALF the bloody price of the Titan, and that's even at the current ridiculous launch prices they are charging (nearly a $1=£1 conversion rate). I wouldn't care too much if they weren't priced so far apart... but to say the Titan didn't get destroyed by the R290X is more than a little unfair as well, IMHO -- everybody was shouting defeat from the rooftops when GK104 came out last year and barely matched the 7970



So what, AMD knew the price of Titan and beat it.  Titan has sat at that price for at least 8 months now.  It is very easy for nVidia to just lower the price, they've been raking in the money with these cards.  And no one actually thinks nVidia has to sell Titan at $1000 to make money, we all know it is an extremely high margin card.

So, no Titan didn't get destroyed by the 290X.  AMD released a card that finally matches Titan, and you haven't even given nVidia 24 hours to adjust prices and you are already claiming Titan was destroyed by the 290X?  That is a little bit unfair.

And everyone was shouting defeat when GK104 came out and matched the 7970 because everyone knew GK104 was _supposed_ to be a mid-range GPU, yet it toppled AMD's best offering and allowed nVidia to hold back the release of GK110.  Oh, and GK104 also managed better power, heat, and noise too...



Assimilator said:


> While this card may be the big kahuna I am personally more interested in the 290. I have a feeling that it will boast significantly better thermals and power than 290X at only slightly lower performance.
> 
> Either way NVIDIA is between a rock and a hard place now. Maxwell is still half a year away, it will be interesting to see what Team Green does to keep Kepler competitive. 780 Ti and price cuts are obvious, but will they be enough?




Why wouldn't they be enough?  The way I see it, all they have to do is drop the 780 price $125 and release the 780Ti at $550 and they've got AMD beat.  *Assuming the 780Ti rumors are true and it is just Titan with 3GB of RAM.


----------



## buildzoid (Oct 24, 2013)

newtekie1 said:


> So what, AMD knew the price of Titan and beat it.  Titan has sat at that price for at least 8 months now.  It is very easy for nVidia to just lower the price, they've been raking in the money with these cards.  And no one actually thinks nVidia has to sell Titan at $1000 to make money, we all know it is an extremely high margin card.
> 
> So, no Titan didn't get destroyed by the 290X.  AMD released a card that finally matches Titan, and you haven't even given nVidia 24 hours to adjust prices and you are already claiming Titan was destroyed by the 290X?  That is a little bit unfair.
> 
> And everyone was shouting defeat when GK104 came out and matched the 7970 because everyone knew GK104 was _supposed_ to be a mid-range GPU, yet it toppled AMD's best offering and allowed nVidia to hold back the release of GK110.  Oh, and GK104 also managed better power, heat, and noise too...



Except Nvidia has been adopting an Apple-like policy: locking down voltage, making the card do everything for you, and even aluminium vanity coolers (at least they don't suck). They charge higher prices for the same performance as the competition (GTX 770 vs 280X), so the only thing left is to not drop the price on the Titan (very likely considering the GTX 770 price and the leaked/rumored GTX 780 Ti specs). From what I said above, it should be obvious what Nvidia is likely to do with the Titan's price.


----------



## purecain (Oct 24, 2013)

I thought I'd boost my memory subsystems while I had my wallet out... 

here's mine 


Dear purecain, Your Order has been packed and is in the process of being shipped.
----------------------------------------------------------
Goods Shipped:
£391.66 x 1 - HIS Radeon R9 290X Boost "BF4 Edition" 4096MB GDDR5 PCI-Express Graphics Card
£141.66 x 1 - Kingston HyperX Beast 16GB (2x8GB) PC3-19200C11 2400MHz Dual Channel Kit (KHX24C11T3K2/16X)

I can't wait for tomorrow morning...


----------



## radrok (Oct 24, 2013)

If someone/something "raped" Titan then it would be Nvidia itself with its stupid reference-locking and voltage-locking policy and price, of course.

The GTX 780 has always been the better buy over Titan, especially because it was allowed to shine through AIBs customizations.

I'm completely sure that if Titan had been launched at $549 back in February, it would have been an utter success, with probably more units sold than the actual 780 and Titan combined.


Also congratulations purecain  Can't wait to see you post some scores in our valley thread


----------



## GSG-9 (Oct 24, 2013)

buildzoid said:


> ...they charge higher prices for the same performance as the competition (GTX 770 vs 280X)...



The 770 was released 5 months ago you nut, there was no 280x to compare it to.


----------



## buildzoid (Oct 24, 2013)

GSG-9 said:


> The 770 was released 5 months ago you nut, there was no 280x to compare it to.



Once the 280X did come out, they said they won't be dropping the price anyway, so my point still stands. In that same press release they also said that they will not lower the Titan's price tag. So far they have kept to the GTX 770 part of that press release; now we just have to see if Nvidia really thinks that a shiny cooler and a (IMO) really cheesy name are worth $1000.


----------



## Fluffmeister (Oct 24, 2013)

Hey, at least the GTX 480 has finally been validated.


----------



## newtekie1 (Oct 25, 2013)

buildzoid said:


> Except Nvidia has been adopting an Apple-like policy: locking down voltage, making the card do everything for you, and even aluminium vanity coolers (at least they don't suck). They charge higher prices for the same performance as the competition (GTX 770 vs 280X), so the only thing left is to not drop the price on the Titan (very likely considering the GTX 770 price and the leaked/rumored GTX 780 Ti specs). From what I said above, it should be obvious what Nvidia is likely to do with the Titan's price.



Except they haven't been charging higher prices for the same performance.  The GTX770 has pretty steadily outpaced the 7970 in price/performance for months now.



buildzoid said:


> Once the 280X did come out, they said they won't be dropping the price anyway, so my point still stands. In that same press release they also said that they will not lower the Titan's price tag. So far they have kept to the GTX 770 part of that press release; now we just have to see if Nvidia really thinks that a shiny cooler and a (IMO) really cheesy name are worth $1000.



You really think they are that stupid?  Where did they say they wouldn't drop the price of the GTX770? And even if they did, they've said one thing and done something totally different the next day before, because the industry changes that quickly.  Basically, nVidia has the upper hand right now; they can tailor their line-up to best AMD very easily.

Why do you think they haven't even bothered to release specs for the 780Ti?  I'll tell you why, because they were waiting to see what the 290X was capable of so they could tailor the 780Ti to beat it.


----------



## Am* (Oct 25, 2013)

newtekie1 said:


> So what, AMD knew the price of Titan and beat it.  Titan has sat at that price for at least 8 months now.  It is very easy for nVidia to just lower the price, they've been raking in the money with these cards.  And no one actually thinks nVidia has to sell Titan at $1000 to make money, we all know it is an extremely high margin card.
> 
> So, no Titan didn't get destroyed by the 290X.  AMD released a card that finally matches Titan, and you haven't even given nVidia 24 hours to adjust prices and you are already claiming Titan was destroyed by the 290X?  That is a little bit unfair.
> 
> And everyone was shouting defeat when GK104 came out and matched the 7970 because everyone knew GK104 was _supposed_ to be a mid-range GPU, yet it toppled AMD's best offering and allowed nVidia to hold back the release of GK110.  Oh, and GK104 also managed better power, heat, and noise too...



A few things you seem to be forgetting:

1. GK104 was a gaming-only GPU; Tahiti was AMD's one-size-fits-all GPU, which absolutely mopped the floor with GK104 in every non-proprietary compute bench (and even gave the Titan a good run for its money)

2. GK110 was nowhere near ready at the time and, had the 7970 been much faster, Nvidia would've pulled another infamous wood screws launch

3. AMD's flagship GPUs mainly supersede/compete against their own previous-gen flagship GPUs (in which case AMD are doing pretty well); Nvidia's pricing is merely a side effect of this. Outside of this, Nvidia and AMD do not compare even remotely in R&D costs, revenue, etc., so the fact that AMD manages to pull off GPUs that compete with Nvidia's on performance is astonishing to say the least, seeing how Nvidia are a GPU-only company while AMD split their already limited resources between their CPU and GPU divisions. 

At least give them credit where it is due.


----------



## The Von Matrices (Oct 25, 2013)

Xzibit said:


> I don't know, guys.
> 
> All this sound & heat talk has me convinced that most people run an open air test bench setup between their keyboard and monitor while gaming, with the exhaust pointed at their face.



You need to reread how W1zzard does his testing before you make condescending statements.  To quote a few important points:



> "The tested graphics card was installed in a system that was cooled completely passively. That is, passive PSU, passive CPU cooler, and passive cooling on the motherboard and solid state drive.
> 
> The measurement was conducted at a distance of 100 cm and 160 cm off the floor."



This is NOT an open air test bench, and the 50 dB reading is taken over 1 m away from the case.  This is the noise you would experience with the case on the floor next to a desk.  Unless you are deaf and crank your speakers up to 100 dB, this card will be audible and _very_ annoying in "uber" mode.


----------



## Fluffmeister (Oct 25, 2013)

Am* said:


> A few things you seem to be forgetting:
> 
> 1. GK104 was a gaming-only GPU; Tahiti was AMD's one-size-fits-all GPU, which absolutely mopped the floor with GK104 in every non-proprietary compute bench (and even gave the Titan a good run for its money)



The GK104 has been a massive success for Nvidia, and as you said yourself, it was purely focused on gaming, and it still does its job brilliantly. No one gives a toss about compute benchmarks they never run. Can't say I've missed out on anything running my little GTX 670; in fact, it does me proud every day.

At least give them credit where it is due.



Am* said:


> 2. GK110 was nowhere near ready at the time and, had the 7970 been much faster, Nvidia would've pulled another infamous wood screws launch



And yet GK110-powered cards like the K20 and K20X have been on the market for ages already; the chip that powers Titan is literally over a year old.

At least give them credit where it is due.



Am* said:


> 3. AMD's flagship GPUs mainly supersede/compete against their own previous-gen flagship GPUs (in which case AMD are doing pretty well); Nvidia's pricing is merely a side effect of this. Outside of this, Nvidia and AMD do not compare even remotely in R&D costs, revenue, etc., so the fact that AMD manages to pull off GPUs that compete with Nvidia's on performance is astonishing to say the least, seeing how Nvidia are a GPU-only company while AMD split their already limited resources between their CPU and GPU divisions.
> 
> At least give them credit where it is due.



Can't argue with that, but you make your bed and you lie in it.


----------



## Xzibit (Oct 25, 2013)

The Von Matrices said:


> You need to reread how W1zzard does his testing before you make condescending statements.  To quote a few important points:
> 
> 
> 
> This is NOT an open air test bench, and the 50 dB reading is taken over 1 m away from the case.  This is the noise you would experience with the case on the floor next to a desk.  Unless you are deaf and crank your speakers up to 100 dB, this card will be audible and _very_ annoying in "uber" mode.



I did read it and I wasn't talking about him but since you brought it up






Compare that to Tom's Hardware Noise test which is Open Air at half the distance.

Well, I hope you can tell the discrepancy.
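For reference, the measurement distance alone accounts for a chunk of that gap: in free-field conditions, SPL drops about 6 dB for every doubling of distance. A rough sketch (assumes an ideal point source in a non-reflective room, which a GPU in a real case isn't exactly):

```python
import math

# Free-field falloff for a point source: SPL drops 20*log10(d2/d1) dB
# going from distance d1 to d2 (about 6 dB per doubling of distance).
# Real rooms reflect sound, so treat this as an idealized estimate.
def attenuation_db(d1_m, d2_m):
    return 20 * math.log10(d2_m / d1_m)

# A meter at 0.5 m reads roughly this much higher than one at 1.0 m:
print(f"{attenuation_db(0.5, 1.0):.1f} dB")  # → 6.0 dB
```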


----------



## Disruptor4 (Oct 25, 2013)

And I thought, 2 years ago, that buying an HX 850W would be enough to future-proof a bit. Oh, how wrong I was!


----------



## HTC (Oct 25, 2013)

Xzibit said:


> I did read it and I wasn't talking about him but since you brought it up
> 
> http://tpucdn.com/reviews/AMD/R9_290X/images/kjaer_setup.jpg
> 
> ...



Is that the actual noise in uber mode? I can see why W1zzard said the neighbors complained about the noise when he was benching at night.

Calling it nasty doesn't quite cover it, IMO.


----------



## Xzibit (Oct 25, 2013)

HTC said:


> Is that the actual noise in uber mode? I can see why W1zzard said the neighbors complained about the noise when he was benching at night.



or his testing room is acting like a parabolic reflector to his neighbors.


----------



## SIGSEGV (Oct 25, 2013)

15th Warlock said:


> LOL!
> 
> Haters are gonna say this card is loud, hot and inefficient, who cares? the performance is there, and for a reasonable price too! This goes on and on, Prescott, Bulldozer, GTX480, Titan, and now the 290X... Whatever, people will always find something to complain about.



+1
you don't say..






It looks like this card somehow throttles in the 92-94°C zone. I'm afraid it would still throttle at 92-94°C even with a water cooling setup. Is there any chance this card could stay in the 75-80°C zone with driver optimization?


----------



## tacosRcool (Oct 25, 2013)

So with this at $550, what is Nvidia gonna do? The GTX 780 Ti doesn't look appealing at all.


----------



## newtekie1 (Oct 25, 2013)

Am* said:


> A few things you seem to be forgetting:
> 
> 1. GK104 was a gaming-only GPU, Tahiti was AMD's one-size-fits-all markets GPU, which absolutely mopped the floor with the GK104 Kepler in every non-proprietary compute bench (and even gave the Titan a good run for its money)
> 
> ...



There are some things you seem to be forgetting:

These cards are being sold mostly to gamers, so no one cares how good the cards are at GPU compute tasks.  You're making the same argument that people made to defend Fermi.  AMD should have learned from nVidia's mistake: GPU compute doesn't sell desktop GPUs.

Of course if the 7970 had been faster nVidia wouldn't have been able to use GK104 to compete, but it wasn't faster.  I might as well say if the GTX580 had been a little bit faster, nVidia wouldn't have needed to release GK104 at all.  You can live in a land of IF's all you want, but it doesn't help you make a point.

The fact is no one but nVidia knows how ready GK110 was.  We certainly know the fab was capable of producing GK110 when GK104 was launched, and we also know the architecture was ready.  So it really comes down to yields, and obviously, since nVidia knew what they had to compete with, they went with GK104 because it had much better yields.  But I bet GK110 would have been possible in place of the GK104 cards we got if nVidia had needed to use it.

And no, AMD cards don't compete mainly against their own previous generation; they compete against nVidia.  And should we be cutting AMD slack just because they mismanaged their company and now have no R&D funds?


----------



## Xzibit (Oct 25, 2013)

SIGSEGV said:


> +1
> you don't say..
> 
> http://s24.postimg.org/3nmdnj3d1/Screenshot_from_2013_10_25_08_01_51.png
> ...



The PowerTune is temperature-controlled, so you can adjust by temp, but the fan will speed up to maintain the low temp.

Card operates on limits:
Temp = 95°C
Silent fan = 40%
Uber fan = 55%

If you want to go beyond the fan's 55% in uber, you then become limited by temp.

It's a give and take.  With water the temp will be lower, thus telling the card not to speed up the fan, so neither the temp nor the fan limit will be hit until you start pushing it beyond the default settings. Depending on how well it's cooled, you're then looking at a chip/memory limit rather than a temp/fan limit.
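That description amounts to a simple temperature-target control loop. A rough sketch of the behavior, assuming the 95°C / 55% thresholds from the post above (the step logic and step sizes are my own illustration, not AMD's actual PowerTune firmware):

```python
def powertune_step(temp_c, fan_pct, clock_mhz,
                   temp_limit=95, fan_cap=55):
    """One control step: ramp the fan up toward its cap to hold the
    temperature target; once the fan is capped and the temp limit is
    still reached, shed clock speed instead. Thresholds per the post
    above ('uber' mode); step sizes are arbitrary placeholders."""
    if temp_c >= temp_limit:
        if fan_pct < fan_cap:
            fan_pct = min(fan_cap, fan_pct + 5)   # spin the fan up first
        else:
            clock_mhz = max(300, clock_mhz - 50)  # then throttle clocks
    return fan_pct, clock_mhz

# At the temp limit with the fan already capped, clocks drop:
print(powertune_step(95, 55, 1000))  # → (55, 950)
```

Under water the `temp_c >= temp_limit` branch is simply never taken, which is why neither limit bites until you push past the defaults.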


----------



## TheoneandonlyMrK (Oct 25, 2013)

radrok said:


> I honestly don't get what's all this fuss about power consumption and temperatures.
> 
> Who in earth buys a high end graphics card without having its cooling planned?
> 
> ...



Exactly. I would rather watercool it on my own anyway, but I'm skipping this GPU as I don't have the pixels for it.


----------



## The Von Matrices (Oct 25, 2013)

newtekie1 said:


> These cards are being sold mostly to gamers, so no one cares how good the cards are at GPU Compute tasks.  You're making the same argument that people made to defend Fermi.  AMD should have learned from nVidia's mistake and learned the GPU Compute doesn't sell desktop GPUs.



I agree that Titan was targeted to gamers as well as compute users, but your claim that Titan won't sell as a compute card by comparing it to GTX 480 is ridiculous.  GTX 480 wasn't a compute card.  The two cards are completely different.

There are three reasons to buy a Tesla over a GeForce:
1) Unrestricted double precision compute
2) Stability features like ECC RAM
3) Better drivers, which help with utilization across multiple cards and nodes

Back in the GF100 days, all three differences applied, so there was a good reason to pay $3000 for a Tesla card that could get 4x the double precision performance of a $500 GTX 480, which was also handicapped with 1.5GB of VRAM compared to 3 or 6GB on Tesla cards.  _This_ is why no one bought a GTX 480 for compute.

Today, you can get a Titan with unrestricted double precision compute; a single Titan can match a single Tesla in double precision compute.  Titan also has gobs of VRAM, which helps immensely in visualization.  If you want a single card for compute, nothing can beat Titan at $1000.

Sure, NVidia won't sell as many Titans at $1000 now that R9 290X exists, but to say that Titan is a dead product at $1000 is a very narrow viewpoint that omits how much of a steal it is for compute users compared to Tesla.


----------



## MxPhenom 216 (Oct 25, 2013)

The Von Matrices said:


> I agree that Titan was targeted to gamers as well as compute users, but your claim that Titan won't sell as a compute card by comparing it to GTX 480 is ridiculous.  GTX 480 wasn't a compute card.  The two cards are completely different.
> 
> There are two reason to buy a Tesla over a GeForce:
> 1) Unrestricted double precision compute
> ...



Meh, Titan still isn't that great in compute. There are quite a few benchmarks that still show that the 7970 can beat it by quite a bit. 

Real compute beasts from Nvidia are not affordable to the average consumer.


----------



## qubit (Oct 25, 2013)

Too little too late and that noise is a complete dealbreaker.

"AMD should have invested some time and money into developing their own high-end cooler, like NVIDIA did for the GTX Titan. The noise figures of this reference card only go on to show that AMD should urgently allow its board partners to launch cards with non-reference air coolers that can handle the heat at saner noise levels." Says it all.

It should only come in at around a 6.5-7.5 score, I think. A 9.3 and an Editor's Choice is being far too generous to this card. It might be priced a lot lower than a Titan, but then it's a lot less card too, so I dunno why wizzy is bowled over by its price and has given it such a high score.


----------



## MxPhenom 216 (Oct 25, 2013)

qubit said:


> Too little too late and that noise is a complete dealbreaker.
> 
> "AMD should have invested some time and money into developing their own high-end cooler, like NVIDIA did for the GTX Titan. The noise figures of this reference card only go on to show that AMD should urgently allow its board partners to launch cards with non-reference air coolers that can handle the heat at saner noise levels." Says it all.
> 
> It should only come in at around a 6.5-7.5 score I think. 9.3 editor's choice is being far too generous to this card. It might be priced a lot lower than a Titan, but then it's a lot less card too, so I dunno why wizzy is bowled over by its price and given it such a high score.



I don't know why you think its less of a card than Titan. Other than the heat and noise, for the price the card is a winner. Most people even considering it right now already plan to throw a waterblock on it, or are waiting for board partners to release their designs. 

There is one issue I have with the reference design, though: the fact that uber mode shouldn't be used with the stock cooler for day-to-day use, as noted by Wiz. Who wants to hold back their $550 video card purchase? Unleash that shit, but then you have to throw a $100 waterblock on it to do it comfortably, IMO.


----------



## The Von Matrices (Oct 25, 2013)

MxPhenom 216 said:


> Meh, Titan still isn't that great in compute. There are quite a few benchmarks that still show that the 7970 can beat it by quite a bit.
> 
> Real compute beasts from Nvidia are not affordable to the average consumer.



You need to do double precision for Titan to make any sense.  I agree, AMD can beat it in single precision.



qubit said:


> Too little too late and that noise is a complete dealbreaker.
> 
> "AMD should have invested some time and money into developing their own high-end cooler, like NVIDIA did for the GTX Titan. The noise figures of this reference card only go on to show that AMD should urgently allow its board partners to launch cards with non-reference air coolers that can handle the heat at saner noise levels." Says it all.
> 
> It should only come in at around a 6.5-7.5 score I think. 9.3 editor's choice is being far too generous to this card. It might be priced a lot lower than a Titan, but then it's a lot less card too, so I dunno why wizzy is bowled over by its price and given it such a high score.



I think that a lot of people see Titan and think that since R9 290X is half the price you can accept its flaws.  What I argue is that R9 290X is still a $550 card, which is a LOT of money.  You shouldn't have to put up with this when you spend that much money.  NVidia did AMD a favor, because if all that existed was the $650 GTX 780 and there was no comparison to the $1000 Titan, I think the conclusions would be much different.


----------



## sweet (Oct 25, 2013)

qubit said:


> Too little too late and that noise is a complete dealbreaker.
> 
> "AMD should have invested some time and money into developing their own high-end cooler, like NVIDIA did for the GTX Titan. The noise figures of this reference card only go on to show that AMD should urgently allow its board partners to launch cards with non-reference air coolers that can handle the heat at saner noise levels." Says it all.
> 
> It should only come in at around a 6.5-7.5 score I think. 9.3 editor's choice is being far too generous to this card. It might be priced a lot lower than a Titan, but then it's a lot less card too, so I dunno why wizzy is bowled over by its price and given it such a high score.



It only scored 9.3 because of the heat and noise. For its superior performance alone, it would have been a *10*. Furthermore, another *10* score for its ideal price would be reasonable.


----------



## MxPhenom 216 (Oct 25, 2013)

sweet said:


> It only scored 9.3 because of the heat and noise. For its superior performance alone, it would have been a *10*. Furthermore, another *10* score for its ideal price would be reasonable.



I do not think Wizzard gives 10 scores, except for the DirectCU Top GTX670 a while back, but I think that might have been a mistake.  A lot of people were buying those cards due to that score, and then a lot of them were failing IIRC.


----------



## sweet (Oct 25, 2013)

MxPhenom 216 said:


> I do not think Wizzard gives 10 scores, except for the DirectCU Top GTX670 a while back, but I think that might have been a mistake.  A lot of people were buying those cards due to that score, and then a lot of them were failing IIRC.



Completely agree! A perfect 10 should only be awarded to a flagship, not a mere harvested chip.


----------



## MxPhenom 216 (Oct 25, 2013)

sweet said:


> Completely agree! A perfect 10 should only be rewarded to a flagship, not a mere harvest chip.



I disagree. No piece of computer hardware should ever get a 10. There will always be some sort of flaw.


----------



## sweet (Oct 25, 2013)

MxPhenom 216 said:


> I disagree. No piece of computer hardware should ever get a 10. There will always be some sort of flaw.



That's the point. The flaw of any non-flagship GPU is that it will be beaten in performance. Therefore the perfect 10, if it ever exists, should only belong to the performance king.


----------



## xorbe (Oct 25, 2013)

Xzibit said:


> or his testing room is acting like a parabolic reflector to his neighbors.



This actually happened to me once.  When I tracked down the "bong" sound that started at 5:30 am for two weeks... it was a neighbor playing a video game, with a decent-sized plastic computer speaker across _two parking lots_, set at his window frame (aimed inside). It wasn't even that loud there, but it resonated in my bedroom.  True story.  He was nice and used headphones from then on.


----------



## Loosenut (Oct 25, 2013)

sweet said:


> That's the point. The flaw of any non-flagship GPU is that it will be beaten in performance. Therefore, the perfect 10, if ever exists, should only belong to the performance king.



All kings eventually lose their crowns


----------



## mastershake575 (Oct 25, 2013)

qubit said:


> Too little too late


As a current AMD card owner, even I agree.

Don't get me wrong, the 290X is fast and the price is good, but it isn't the 780 killer that most people hyped it as (hence why people are comparing it to the Titan instead of the 780). 

Stock vs stock and overclocked vs overclocked, it's basically the same damn thing as the 780 while being $75 cheaper (NVidia is dropping the price of the 780 to $550-575 in the next few weeks to pretty much cancel things out, all while releasing the 780 Ti for $650, which will take the single-GPU crown). 

I'm not extremely disappointed with this card ($550 is a good price); it's just that a $550-575 780 and a $650 780 Ti will pretty much negate all the pros of the card (a price-cut 780 will make the 290X not really a better value for the money, since it will be identical, and a 780 Ti will clearly take the single-GPU crown while not costing that much more).

After reading this review, I'm for damn sure waiting for the 20nm GPUs next year (the GTX 870/Pirate Islands equivalent should offer better performance than the 290X/780 for only $400).


----------



## okidna (Oct 25, 2013)

Good review as always, but I'm curious about something.

From 290X and 280X review :


> The card requires a *6-pin and 8-pin* PCI-Express power connector. This configuration is good for up to *375 W* of power draw.



From 270X review :


> The card requires *two 6-pin* PCI-Express power connectors. This configuration is good for up to *300 W* of power draw.



From 780 and TITAN review :


> The card requires *one 6-pin and one 8-pin* PCI-Express power cable for operation. This power configuration is good for up to *300 W* of power draw.



From 760 review :


> The card requires *two 6-pin* PCI-Express power connectors. This configuration is good for up to *225 W* of power draw.



Is there any difference between AMD's and NVIDIA's power connector configurations and specifications? Or does the difference come from the PCI-E 3.0 specification?
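For reference, the PCI Express spec allows 75 W from the slot, 75 W per 6-pin connector, and 150 W per 8-pin connector, which is how the NVIDIA figures add up. A quick sketch of that arithmetic (note the 375 W and 300 W figures quoted for the AMD cards don't match these per-spec sums, which is exactly the discrepancy being asked about):

```python
# Per-source limits from the PCI Express spec (watts).
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

def max_power(six_pins=0, eight_pins=0):
    """Theoretical maximum board power for a given connector config,
    counting the 75 W the slot itself can supply."""
    return SLOT + six_pins * SIX_PIN + eight_pins * EIGHT_PIN

print(max_power(six_pins=1, eight_pins=1))  # 6-pin + 8-pin: 300 W (the 780/Titan figure)
print(max_power(six_pins=2))                # 2 x 6-pin: 225 W (the 760 figure)
```

So by the spec alone, a 6-pin + 8-pin card tops out at 300 W; the higher AMD numbers would have to come from something other than the connector count.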


----------



## sweet (Oct 25, 2013)

mastershake575 said:


> As a current AMD card owner, even I agree
> 
> Don't get me wrong, the 290x is fast and the price is good but it isn't the 780 killer that most people hyped it as (hence why people are comparing it to the Titan instead of the 780).
> 
> ...



WRONG! They are not the same when overclocked. 290X is a beast when overclocked given that you can cool it down.






http://xtreview.com/addcomment-id-31366-view-Radeon-R9-290X-vs-GeForce-GTX-Titan-benchmarks.html

In the chart above, the MSI Lightning 780 actually runs in the 1300 MHz range, while the 290X OC at 1100 MHz often throttles down to near 1000 MHz. You can still observe a big difference in performance here.

If you believe in the TPU chart and claim that the 780 is just 1% below the 290X silent, let me tell you this: the overall chart of TPU is, in fact, deceiving. In some abnormal cases such as Starcraft 2 and Splinter Cell: Blacklist, a.k.a. games favoring old DirectX versions, AMD's cards generally don't perform at their full potential. Only modern DirectX 11 games, which AMD's GCN cards are designed for, can bring out the best of AMD. The overall result, sadly, is heavily affected by the old-tech games, and that doesn't feel right.


----------



## Xzibit (Oct 25, 2013)

mastershake575 said:


> As a current AMD card owner, even I agree
> 
> Don't get me wrong, the 290x is fast and the price is good but it isn't the 780 killer that most people hyped it as (hence why people are comparing it to the Titan instead of the 780).
> 
> ...



I'll be waiting it out as well. Usually I wait until the next revisions show up and get what I want on sale.

Seems to do just fine against a 780 HOF clocked at 1006-1059 MHz in Fire Strike, so it's not too bad, eh?
Galaxy GTX 780 HOF 3GB @ $689.99






Then you consider that the 290X's Achilles' heel is the ref cooler, which is probably throttling in the test to well under 1 GHz, more like 830-900 MHz.

I say not bad for a $140 difference.

Now if the HOF cooler was on the 290X then you'd see the 290X OC "uber" number get closer to the R280 DC2TOP CF number.

If it almost bests a 780 HOF on both Fire Strike scores while still in reference form, is it not a killer to the ref 780?  I guess it's perspective, then.


----------



## AsRock (Oct 25, 2013)

Would have been nice to see whether positional sound through HDMI works as it should, and whether you can tell if something is in front of or behind you like you can with older AMD cards.


----------



## The Von Matrices (Oct 25, 2013)

AsRock said:


> Would of been nice to see if the positional sound though HDMI worked as it should and if you could tell if some thing was inn front or behind you like you can with older AMD cards.



If you're referring to TrueAudio, AMD has no driver for it yet, so as of right now it can't be used.


----------



## arterius2 (Oct 25, 2013)

LOL people keep contradicting themselves on this forum.

First, they'll start off with the argument that the 290X is about ~$100 cheaper than the GTX 780 (and that the price killed the 780),

and then they'll say "Oh, who cares about temp, noise and power consumption" -- but you know what? Most people do! Then after that, they'll say "Just throw a waterblock on it". 

Ummm, if you truly didn't care about noise and temp, why are you throwing a waterblock on it, and how much does a waterblock cost? Around $150. So basically, a 290X + waterblock costs around $700, and that is only if you already have a water loop set up; if you don't, you are likely to run into $1000+ to have everything set up just to run this card within acceptable parameters.

And don't give me this "everyone who buys high end already has a water loop"; that is just BS. Most people don't run water loops, those who do are a small minority, and the people who make these claims often don't even have water loops themselves.

And when all arguments fail, they'll bring out the performance card, where they claim the 290X "absolutely destroyed" Titan. Why are you comparing to Titan in the first place (it LOST to Titan in silent mode, and if you are comparing in uber mode, then don't give me this "it's only 2 dB louder", because it's 12 dB louder in uber mode)? It barely even "destroyed" the 780; they are well within 1-5% of each other, trading blows. That is not what "destroy" means; "destroy" means beating something by at least 15-20%. Otherwise, a few fps of difference is barely even noticeable to the human eye.

So, to sum it up:
1) They claim it's cheaper.
2) They ignore noise, temp, and power consumption.
3) Then they say to throw a waterblock on it, ignoring the price of a water-cooling kit (+$150 for the block and +$400 for the whole setup).
4) Selectiveness/inconsistency in comparing the performance of the 290X (in uber mode) while using silent mode to compare noise and temp.
5) Overhype with words such as "destroy", "kill", "massacre" when benchmarks show they are fairly evenly matched.
6) Selectiveness/inconsistency in comparing the price to Titan when the situation favors them, but comparing to the 780 when other situations (performance) fit them.


----------



## 1d10t (Oct 25, 2013)

SIGSEGV said:


> +1
> you don't say..
> 
> http://s24.postimg.org/3nmdnj3d1/Screenshot_from_2013_10_25_08_01_51.png
> ...



Naaahh... it wouldn't touch the 92°C mark if you drowned it under water. 
A universal GPU block such as my XSPC Rasa has done a perfectly good job keeping a 7970 below 60°C... even FurMark'ed, it still wouldn't touch 80°C.



okidna said:


> Good review like always, but I'm curious about something.
> Is there any difference between AMD's and NVIDIA's power connectors configuration & specification? Or the difference coming from PCI-E 3.0 specification?



Just like their processor counterparts, AMD usually set the TDP higher. What you see is the maximum in rare occasions, such as FurMark or extreme overvolting.
Another living proof: a mere Seasonic P760W could handle my previous 7970 CF, heavily OC'ed, without hiccups 



The Von Matrices said:


> If you're referring to TrueAudio, AMD has no driver for it yet, so as of right now it can't be used.



If you're referring to TrueAudio as a codec, then it's driverless. If AMD treats TrueAudio more like a hardware audio stream, they will more likely release a driver for it. Personally, I have no clue about that, and I haven't been interested since the days of "passthrough" 

-= edited=-



arterius2 said:


> ummm, if you truly didnt' care about noise and temp why are you throwing a waterblock on it, and how much does a waterblock cost? around $150, so basically, 290x + waterblock cost around $800, that is only if you already have a waterloop setup, if you don't, well you are likely to run into the $1000s+ to have everything setup just to run this card within acceptable parameters.



Because WE LIKE IT! 
Watercooling doesn't always mean performance; it's more about aesthetics. Ask anyone around here who has drowned their rig: do they want to go back to air?


----------



## SIGSEGV (Oct 25, 2013)

arterius2 said:


> so to sum it up:
> 1) they claim its cheaper
> 2) they ignore noise, temp and power consumption
> 3) then they say throw a water block on it ignoring the price of a water-cool kit (+$150 for the block and +$400 for the whole setup)
> ...



meh..



1d10t said:


> naaahh...it wouldn't touch 92'C mark if you drowned them under water
> Universal GPU block such as my XSPC Rasa have done perfectly well job keeping 7970 below 60'C...even if you furmark'ed them they still even touch 80'C.







Even my GTX 680 DC II T, FurMark'ed, wouldn't touch 50°C 

So, is it safe to say that this card wouldn't hit 65°C under water?


----------



## NutZInTheHead (Oct 25, 2013)

Nice one AMD.
Now all Nvidia needs to do is drop the price to something equal to or a tad-bit lower than the R9 290X.
And the price of GTX 780 needs to be around $499.


----------



## sweet (Oct 25, 2013)

arterius2 said:


> LOL people keep contradicting themselves on this forum.
> 
> First, they'll start off with the argument that the 290X is about ~$100 cheaper than the GTX780 (and that the price killed the 780)
> 
> ...



Learn math, please. $550 + $150 = $700, not $800. And a $600 custom card is enough to fix the heat problem.
And I can quote my post above, since you are too lazy to scroll up the page:



sweet said:


> WRONG! They are not the same when overclocked. 290X is a beast when overclocked given that you can cool it down.
> 
> http://xtreview.com/images/radeon290X_002.jpg
> http://xtreview.com/addcomment-id-31366-view-Radeon-R9-290X-vs-GeForce-GTX-Titan-benchmarks.html
> ...



The fact is: Titan manages to beat the 290X in silent mode with the help of *old-tech games*. Starcraft 2: DirectX 9. World of Warcraft and Splinter Cell: Blacklist: DirectX 9 engines with some DX11 tweaks. If you exclude these games, Titan looks far more miserable in the overall result.


----------



## okidna (Oct 25, 2013)

1d10t said:


> Just like processor counterparts,AMD usually set the TDP higher.What you see is maximum in rare occasion,such as furmark or extreme overvolting.



Errrm... thing is, I'm not asking about TDP (TDP != maximum power draw), and W1zzard didn't even say anything about TDP (he said power configuration). What I'm asking about is exactly that: power configuration. 

Why, with the same power configuration (one 6-pin + 8-pin, or 2 x 6-pin), do AMD cards have a higher theoretical maximum power draw than NVIDIA's? Is the difference coming from them (a special power setup/setting from AMD)? Or is it coming from the PCI-E 3.0 standard?


----------



## arterius2 (Oct 25, 2013)

sweet said:


> Learn math please. 550$ + 150$ = 700$, not 800$.
> And I can quote my post above, because you are too lazy to scroll the page up
> 
> 
> ...



I'm looking at your chart, and it says higher is better. This is an efficiency chart (not a power consumption chart), so obviously a higher number = better, and it shows the 290X having one of the worst performance/watt figures in the list. You are reading it wrong! Learn math!


----------



## HumanSmoke (Oct 25, 2013)

sweet said:


> WRONG! They are not the same when overclocked. 290X is a beast when overclocked given that you can cool it down.


You can cool any card down. What's your point?

BTW: I don't know if a pre-release leak of a 1920x1080 set of benches tells the whole story. OC3D, Linus, and Tweaktown have all posted OC v OC results and none look like those numbers as an average







sweet said:


> you believe in TPU chart and claim that 780 is just 1% below 290 silent, let me tell you this fact: The overall chart of TPU is in fact, deceiving. In some adnormal case such as Starcraft 2, Splinter Cell: Blacklist, a.k.a games favoring old DirectX version, AMD's cards generally doesn't perform at their full potential. Only modern DirectX 11 games, which AMD's GCN cards are designed for, can take the best of AMD. The overall result, sadly, is heavily affected by the old tech games, and that doesn't feel right.


Testing should be representative of the gaming actually taking place, not how new the game is. DirectX 11 (or a Gaming Evolved title) doesn't automatically warrant inclusion in a bench suite.
Before you get your panties in a (further) wad, I'd see how the reviews play out, since a number of sites have OC vs OC reviews planned.


----------



## arterius2 (Oct 25, 2013)

1d10t said:


> Because WE LIKE IT!
> Watercooling doesn't always mean performance,it more like aesthetic.Ask anyone around here who drowned their rigs,did they want to go back to the air?



I totally understand, but the amount of ignorance here is killing me, though.

First they throw out the price argument by saying the 290X is A LOT cheaper (which it is not; the money saved couldn't even pay for a waterblock),

then they play the "enthusiast" card by saying that money essentially doesn't matter to them because they'll drown it under water anyway. If so, then why make it their primary argument to begin with?


----------



## sweet (Oct 25, 2013)

HumanSmoke said:


> You can cool any card down. What's your point?
> 
> BTW: I don't know if a pre-release leak of a 1920x1080 set of benches tells the whole story. OC3D, Linus, and Tweaktown have all posted OC v OC results and none look like those numbers as an average
> http://img.techpowerup.org/131025/ttoc.jpg
> ...



The first chart is WRONG; there's no way a single Titan can reach fps that high. 
About the games: they should just be the medium for comparing cards. Is there any point in using an old-tech game to bench newly released cards, which were designed to be future-proof for at least two years?



arterius2 said:


> I totally understand, but the amount of ignorance here is killing me tho.
> 
> first they throw out the price argument by saying 290x is ALOT cheaper
> 
> then they slap down the "enthusiast" card by saying that money essentially doesn't matter for them because they'll drown it underwater anyways. if so, then why make it their primary argument case to begin with?



You've got the wrong picture here!
All hail the 290X because of its *performance*. Even at its price, it wouldn't take the spotlight if it couldn't beat Titan. The heat/noise can be compensated for by the price; that makes the 290X an impressive card in many senses.
And most of the guys who buy a reference-design card will drown it in water (except for Titan, because there is no custom Titan). Normal users will buy a custom version, which can fix the heat problem. The WC solution was brought into the discussion because its cost in the 290X's case equals a high-end custom 780, with superior performance.


----------



## arterius2 (Oct 25, 2013)

sweet said:


> The first chart is WRONG, no way a single Titan can reach that high fps.
> About the games, it should be just the medium to compare card. Is there any point to use an old tech game to bench new released cards, which were designed to future proof for at least 2 years later?
> 
> 
> ...



OH OK, so now we are talking about performance!

I think there are enough benchmarks going around now to show that the 290X's performance is nothing to write home about. I mean, OK, it competes equally against the 780, which came out half a year ago, but it doesn't "obliterate, destroy, massacre" like people make it out to be.


----------



## SIGSEGV (Oct 25, 2013)

HumanSmoke said:


> BTW: I don't know if a pre-release leak of a 1920x1080 set of benches tells the whole story. OC3D, Linus, and Tweaktown have all posted OC v OC results and none look like those numbers as an average
> http://img.techpowerup.org/131025/ttoc.jpg



wow..  



arterius2 said:


> OH OK, so now we are talking about performance!
> 
> I think there are enough benchmarks going around now to warrant that the 290X's performance is nothing to write home about, I mean OK, it competes equally against the 780 which came out half a year ago



ok buddy, you are the winner..


----------



## arterius2 (Oct 25, 2013)

SIGSEGV said:


> ok buddy, you are the winner..



Not here to win anything, just to point out the obvious.

In all seriousness, the 290X is just an OC'd GTX 780 with a shitty cooler that came to the party late, at $75 cheaper.


----------



## SIGSEGV (Oct 25, 2013)

arterius2 said:


> Not here to win anything, just point out the obvious.
> 
> In all seriousness, the 290x is just a 780gtx with a shitty cooler that came late to the party at 75$ cheaper



What obvious? What would you say, then? 
I guess you would say, "oh c'mon guys, this card is utter crap, trash and junk, please avoid it at any cost." 

Yes, we already knew that this card is hot, loud, and draws moar power. So what?


----------



## sweet (Oct 25, 2013)

arterius2 said:


> OH OK, so now we are talking about performance!
> 
> I think there are enough benchmarks going around now to warrant that the 290X's performance is nothing to write home about, I mean OK, it competes equally against the 780 which came out half a year ago, but it doesn't "obliterate, destroy, massacre" like people make it out to be.




In a DX11-intensive game like Metro: LL, it's an obliteration, mate.






nVi guys managed to save face with old-tech games; it's really a shame. And I haven't even mentioned 4K yet.


----------



## Xzibit (Oct 25, 2013)

Damn that shitty card...






Looks like people are already overclocking stock cards high

R3quiem - Sapphire R9 290X Stock Cooler - Ubermode @ 1161 MHz / 1625 MHz







> *EK Water Blocks facebook*
> 1200/1600 easy peasy, no voltmods at all.


----------



## arterius2 (Oct 25, 2013)

sweet said:


> In a DX11 intensive game like Metro LL, it's a obliteration, mate
> 
> http://tpucdn.com/reviews/AMD/R9_290X/images/metro_lastlight_1920_1080.gif
> 
> nVi guys managed to save their faces with old tech games, it's really a shame. And I have not even mentioned 4K yet



Ummm dude, I can selectively pull benchmarks in my favor as well. That doesn't show the whole picture. Here:






(Btw, I love how you pulled the 1080p benchmarks to make your point while there are 4K benchmarks showing a ~2 fps difference between the two, so let's mention 4K.)


----------



## sweet (Oct 25, 2013)

arterius2 said:


> ummm dude, I can selectively pull benchmarks to my favor as well. doesn't show the whole picture, here:
> 
> http://tpucdn.com/reviews/AMD/R9_290X/images/sc_blacklist_5760_1080.gif



Oh, you brought out Splinter Cell, a nice candidate for my "old tech" statement. Do you know the age of Unreal Engine 2.5, which powers this game? Google it and you'll know what I'm implying with "old tech".
A quick hint for you from Unreal Engine's wiki:


> On March 24, 2011, Ubisoft Montreal revealed that UE2.5 was successfully running on the Nintendo 3DS



And here is the 4k massacre.


----------



## arterius2 (Oct 25, 2013)

sweet said:


> Oh you brought out Splinter Cell, a nice candidate for my "old tech" statement. Do you know the age of Unreal Engine 2.5 which runs this game? Google it and you'll know what I'm implying with "old tech"
> A quick hint for you from Unreal Engine's wiki
> 
> 
> ...



Once again, you are using the word massacre to describe a few fps / a ~10% difference.
And once again, you are selectively pulling benchmarks that AMD typically does well in (this time from Tom's, while rejecting the ones from TweakTown) to favor your argument.

I see no massacre here; all I'm seeing is exactly what a slightly OC'd 780 or Titan would do, and it would probably still run quieter and cooler than the 290x.


----------



## sweet (Oct 25, 2013)

arterius2 said:


> Once again, you are using the word massacre to describe a few fps / a ~10% difference.
> And once again, you are selectively pulling benchmarks that AMD typically does well in (this time from Tom's, while rejecting the ones from TweakTown) to favor your argument.
> 
> I see no massacre here; all I'm seeing is exactly what a slightly OC'd 780 or Titan would do, and it would probably still run quieter and cooler than the 290x.



Oh sorry, I just wanted to fit your word to this case. And grab a calculator, mate: it is not 10% overall. Good luck overclocking your 780/Titan.

Meanwhile, enthusiasts already put this card under LN2 and are busy breaking world records


----------



## HumanSmoke (Oct 25, 2013)

sweet said:


> And here is the 4k massacre.


Yes, quite some massacre. Always looks good when you leave out the benches that don't look so good. Let me guess, you left out the Skyrim bench by accident? I bet George Armstrong Custer is ruing the fact that he couldn't "rt click>save as" the Lakota Sioux he wanted to fight.
Seeing as you posted so many benches, I assume you were going for the completeness motif, so here's the TESV bench and the CFX/SLI 7680x1440 results:


----------



## N3M3515 (Oct 25, 2013)

arterius2 said:


> Once again, you are using the word massacre to describe a few fps / a ~10% difference.
> And once again, you are selectively pulling benchmarks that AMD typically does well in (this time from Tom's, while rejecting the ones from TweakTown) to favor your argument.
> 
> I see no massacre here; all I'm seeing is exactly what a slightly OC'd 780 or Titan would do, and it would probably still run quieter and cooler than the 290x.



Wow, man, those are 6 AAA games and it wins in all of them.
Don't forget that if you mention an overclocked GTX 780 or Titan, you have to wait and see what a non-reference 290X will do, and as Wizz said, the overclocking potential of this card is without precedent.

Non-reference boards won't cost more than $600-650 and will pull even further away from the Titan and 780, so even if the 780 Ti is faster than the 780, it will have a tough time catching up to non-reference 290X performance, even as a non-reference version itself.

So, the non-reference boards will take care of the noise and temp problems, and in the process boost performance by maybe another 12%, I guess, for $50-100 more.

More performance, less noise, lower temps. You have to admit those boards will be awesome.

For me the only problem was noise; with that gone, I don't care about anything else as long as bang for buck is assured.

Finally, be it with higher temps, noise, and power, in the end it took the performance crown, and I personally don't see the 780 Ti taking it back, at least not from non-reference 290X boards.

So what? Where does that criticism come from? The important stuff, getting those ridiculous prices down, was more important. WE, THE CONSUMERS, WIN. Isn't that more important than arguing which one is better in our SUBJECTIVE opinions?

Prices are down! GOOD FOR US NO MATTER WHO WINS!


----------



## H82LUZ73 (Oct 25, 2013)

Well okay, enough of the bickering please. Who cares about green vs. red? All I want from the 290/290X is to replace my crossfire setup and not worry for another 2 years. The 290X delivers. Just have one question for Wizz:

Q
When is the ETA for the 290 card reviews? And are you testing BIOS mods for unlocking to 290X specs? This is what the NV guys should be worried about the most, because the 290 will be in the sweet spot of the $450-$470 price range (based on the 280X being $350-$380 and the 290X $549-$579), and if it unlocks to 290X specs then it has an advantage over buying a 780/780 Ti.


----------



## sweet (Oct 25, 2013)

HumanSmoke said:


> Yes, quite some massacre. Always looks good when you leave out the benches that don't look so good. Let me guess, you left out the Skyrim bench by accident? I bet George Armstrong Custer is ruing the fact that he couldn't "rt click>save as" the Lakota Sioux he wanted to fight.
> Seeing as you posted so many benches, I assume you were going for the completeness motif, so here's the TESV bench and the CFX/SLI 7680x1440 results:
> http://img.techpowerup.org/131025/tomshw.jpg



If you read my post above, you will know that I don't want to include an old tech game like Skyrim (DirectX 9) in any comparisons.


----------



## the54thvoid (Oct 25, 2013)

This thread has now officially turned into a blind bias shitstorm.  Pretty obvious where loyalties lie.

Let's hope someone starts up a 290X owners club so we can get some *refreshing perspectives based on experiences* and not subjective quotes from selective sources for selective opinions.

Maybe I'll be in that 290X club..... Once I wait a month or so to see what happens when the dust settles.


----------



## jigar2speed (Oct 25, 2013)

arterius2 said:


> Once again, you are using the word massacre to describe a few fps / a ~10% difference.
> And once again, you are selectively pulling benchmarks that AMD typically does well in (this time from Tom's, while rejecting the ones from TweakTown) to favor your argument.
> 
> I see no massacre here; all I'm seeing is exactly what a slightly OC'd 780 or Titan would do, and it would probably still run quieter and cooler than the 290x.



At 4K you will notice this as a huge difference.


----------



## HumanSmoke (Oct 25, 2013)

the54thvoid said:


> Maybe I'll be in that 290X club..... Once I wait a month or so to see what happens when the dust settles.


I think I'd like to see what the 290 (non-X) brings. If the usual pricing structure holds and the second tier card sits $100-150 below the 290X, I think I'd be more inclined to go with it (AIB board design ofc)


----------



## SIGSEGV (Oct 25, 2013)

the54thvoid said:


> *This thread has now officially turned into a blind bias shitstorm.  Pretty obvious where loyalties lie.
> *
> Let's hope someone starts up a 290X owners club so we can get some *refreshing perspectives based on experiences* and not subjective quotes from selective sources for selective opinions.
> 
> Maybe I'll be in that 290X club..... Once I wait a month or so to see what happens when the dust settles.



+1 
couldn't agree more..



HumanSmoke said:


> Yes, quite some massacre. Always looks good when you leave out the benches that don't look so good. Let me guess, you left out the Skyrim bench by accident? I bet George Armstrong Custer is ruing the fact that he couldn't "rt click>save as" the Lakota Sioux he wanted to fight.
> Seeing as you posted so many benches, I assume you were going for the completeness motif, so here's the TESV bench and the CFX/SLI 7680x1440 results:
> http://img.techpowerup.org/131025/tomshw.jpg



SLI Titan at $2K vs. CFX 290X at $1K.
For me, that's quite impressive..


----------



## HumanSmoke (Oct 25, 2013)

SIGSEGV said:


> SLI Titan 2K vs CFX 290X 1K
> for me that's quite impressive..


Yes it is.
As I said, I added the benches for sake of completeness.


----------



## centaurius (Oct 25, 2013)

birdie said:


> It's the first time I'm truly disappointed with a TPU review - was it really necessary to test 290X at resolutions below 2560x1600?
> 
> This card is meant for 4K or dual monitors resolution, and it's a monster at them.
> 
> ...



Oh really?

Has it ever occurred to you that some players use a 120 Hz monitor? And game at 1920x1080? At least I am very interested in seeing whether I can get around 120 FPS in recent games to match a 120 Hz monitor. Not everyone aims for the 60 Hz mark!


----------



## HTC (Oct 25, 2013)

sweet said:


> *And here is the 4k massacre.*
> 
> http://media.bestofmicro.com/O/U/406542/original/arma-38-fr.png
> http://media.bestofmicro.com/P/6/406554/original/bf3-38-fr.png
> ...





arterius2 said:


> *Once again, you are using the word massacre to describe a few fps / a ~10% difference.*
> And once again, you are selectively pulling benchmarks that AMD typically does well in (this time from Tom's, while rejecting the ones from TweakTown) to favor your argument.
> 
> I see no massacre here; all I'm seeing is exactly what a slightly OC'd 780 or Titan would do, and it would probably still run quieter and cooler than the 290x.



If the price difference vs. the Titan weren't almost a factor of 2, I would agree this was no massacre, but achieving this performance at a bit over half the price? I call that a massacre, and I would still call it one even if the ~10% difference were 0%.


Still, I think AMD shot themselves in the foot with a cannonball here!

The way I see it, they are trying to sell more cards by shipping the reference models now with a piece-of-garbage cooler that only appeals to those who will either water-cool the card or don't care about noise (and considering the latter, there won't be many), and then allowing non-reference cards with a better cooler which, by itself, *should* increase performance without increasing any spec other than the cooler.

What I think they fail to realize is that, if they had a better cooler, and I'm talking about one that could make the card hover around the 1 GHz mark (the supposed default speed) as opposed to dropping into the 850 MHz range with even worse 650 MHz dips, as evidenced by the graph below (by hover, I mean it could drop 50-70 MHz, but not 350 MHz), *they would sell WAY more cards now* (which would more than compensate for the higher price of the cooler) and then, when the non-reference cards with even better coolers hit the market, sell a shitload more of them.
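The price/performance point being argued here can be put in rough numbers. A quick sketch, under assumptions of mine rather than any poster's exact figures: Titan normalized to 1.00, the 290X ~10% ahead in the 4K benches under dispute, and the $999 / $549 launch prices from this thread:

```python
def perf_per_dollar(relative_perf, price_usd):
    """Relative performance (arbitrary units) per dollar spent."""
    return relative_perf / price_usd

titan = perf_per_dollar(1.00, 999)   # Titan as the baseline
r290x = perf_per_dollar(1.10, 549)   # ~10% ahead in the cited 4K benches

print(f"{r290x / titan - 1:.0%}")    # ~100% more performance per dollar

# Even assuming dead-even performance, price alone keeps a big gap:
parity = perf_per_dollar(1.00, 549)
print(f"{parity / titan - 1:.0%}")   # ~82%
```

Whether that counts as a "massacre" is the subjective part; the arithmetic itself is not.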


----------



## BiggieShady (Oct 25, 2013)

HTC said:


> What I think they fail to realize is that, if they had a better cooler, and I'm talking about one that could make the card hover around the 1 GHz mark (the supposed default speed) as opposed to dropping into the 850 MHz range with even worse 650 MHz dips, as evidenced by the graph below (by hover, I mean it could drop 50-70 MHz, but not 350 MHz)



Yeah, the funny thing is, this card would perform much better in an ambient just 4°C cooler. But then again it would heat up the room pretty fast and throttle down soon after.
Better aftermarket coolers that would keep it under the 90°C mark are often the non-blow-out type, and those would raise case temperatures and speed up case fans.
It's hard to balance noise and temps when you deal with hot hardware on air. You often end up using headphones.
For this card on air, a triple-slot vapor-chamber blow-out-style cooler is the way to go IMO.


----------



## HTC (Oct 25, 2013)

BiggieShady said:


> Yeah, the funny thing is, this card would perform much better in an ambient just 4°C cooler. But then again it would heat up the room pretty fast and throttle down soon after.
> *Better aftermarket coolers that would keep it under the 90°C mark are often the non-blow-out type, and those would raise case temperatures and speed up case fans.
> It's hard to balance noise and temps when you deal with hot hardware on air.* You often end up using headphones.
> For this card on air, a triple-slot vapor-chamber blow-out-style cooler is the way to go IMO.



But the *stock cooler* wouldn't have to be top-of-the-line: just not bottom-of-the-line.

As I said, it could still dip 50-70 MHz or so below the advertised base speed: dipping 350 MHz is well over the top, IMO.

Alternatively, they could have paired a 900 MHz base clock with a better cooler, and I *think* it would still perform better in *quiet mode*, though not as well in uber mode; but that could be fixed easily by making the quiet-mode speed 900 MHz and the uber-mode speed 1000 MHz.

As it stands, even in uber mode at the default fan speed the card throttles down because of high temps, as seen in the graph below, reaching 600 MHz, and that's a throttle of *"just 40%"* ...






It's a shame W1zzard's graphs don't include a time scale: I would like to know how long it took for the card to throttle down *heavily* in both modes (not the beginning of the throttling, but the point where it starts being more severe).


A question for @W1zzard: is it possible to run some tests (not all of them; a few should suffice) the way you ran them, but with the base speed in both modes at, say, 900 MHz? I'm wondering if it performs better by not throttling constantly due to temps.
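The throttle percentages being thrown around follow directly from the clocks. A trivial sketch; the 1000/650/600 MHz figures are the ones quoted in this thread, and the 930 MHz case is my illustration of the 50-70 MHz dip deemed acceptable:

```python
def throttle_pct(base_mhz, observed_mhz):
    """Fraction of the advertised base clock lost to throttling."""
    return (base_mhz - observed_mhz) / base_mhz

# Against the 1000 MHz advertised speed:
print(f"{throttle_pct(1000, 600):.0%}")  # the "just 40%" uber-mode dip
print(f"{throttle_pct(1000, 650):.0%}")  # the worst quiet-mode dips: 35%
print(f"{throttle_pct(1000, 930):.0%}")  # a 70 MHz dip: 7%
```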


----------



## Xzibit (Oct 25, 2013)

.


> *Gibbo* - Overclockers UK
> 
> **UNDER-VOLTING**
> 
> ...


----------



## arterius2 (Oct 25, 2013)

HTC said:


> If the price difference vs. the Titan weren't almost a factor of 2, I would agree this was no massacre, but achieving this performance at a bit over half the price? I call that a massacre, and I would still call it one even if the ~10% difference were 0%.
> 
> 
> Still, I think AMD shot themselves in the foot with a cannonball here!
> ...



Why are you comparing this to the Titan? This is exactly the inconsistency/bias I talked about, which people use to further their argument. A better comparison would be against the GTX 780, which should be coming down in price soon.
I'll re-paste my previous post in case you have reading comprehension issues:



arterius2 said:


> First, they'll start off with the argument that the 290X is about ~$100 cheaper than the GTX 780 (and that the price killed the 780),
> 
> and then they'll say "Oh, who cares about temp, noise and power consumption?" But you know what? Most people do! Then after that, they'll say "Just throw a waterblock on it."
> 
> ...


----------



## HTC (Oct 25, 2013)

arterius2 said:


> *Why are you comparing this to the Titan?* This is exactly the inconsistency/bias I talked about, which people use to further their argument. A better comparison would be against the GTX 780, which should be coming down in price soon.
> I'll re-paste my previous post in case you have reading comprehension issues:



Doesn't this card trade blows with the Titan (mostly in uber mode)? It wins some and loses some. It barely beats the 780 in quiet mode, but it does beat it, and that's with a piss-poor cooler which, I'm sure you agree, constrains the performance of this card. And it DOES beat the Titan when in uber mode, according to W1zzard's graph below.






Since you're not the only one who can re-paste your previous posts ...



HTC said:


> If the price difference vs. the Titan weren't almost a factor of 2, I would agree this was no massacre, but achieving this performance at a bit over half the price? *I call that a massacre, and I would still call it one even if the ~10% difference were 0%.*



Do you see my point?


----------



## arterius2 (Oct 25, 2013)

HTC said:


> Doesn't this card trade blows with the Titan (mostly in uber mode)? It wins some and loses some. It barely beats the 780 in quiet mode, but it does beat it, and that's with a piss-poor cooler which, I'm sure you agree, constrains the performance of this card. And it DOES beat the Titan when in uber mode, according to W1zzard's graph below.
> 
> http://tpucdn.com/reviews/AMD/R9_290X/images/perfrel.gif
> 
> ...



Because the Titan is not in direct competition with the 290x. With a portion of Titan users buying the card for company/workstation use, the Titan can be seen as a "best of both worlds" card between a gaming card and a professional card (which costs significantly more than gaming cards).

Another factor is that the Titan, and to some extent the GTX 780, is much better in terms of noise, temperature, and efficiency. From a technical point of view, it is much more difficult to design a product that does well by every criterion than one focused on a single criterion. In this case, Nvidia had to design a graphics card that kept noise, temperature, and power consumption well under control while maximizing performance; do you understand that this is a lot more difficult to achieve than brute-force performance? I would imagine the R&D cost was much higher too, which is reflected in the premium. To give an example: I'm an architectural designer. If I were asked to design a very cheap building or a very efficient building, it would be relatively easy, but if I were asked to design a building that is at once cheap, elegant, and energy-efficient, I would probably charge a hell of a lot more to come up with the design. Do you get my drift?

I also stated that the 290x has just been released while the Titan/780 have been out for half a year, and Nvidia will be adjusting their pricing very soon; it's too early to be calling "massacres" and "obliterations" at this point.

What bugs me (or scares me) more than anything at this point is that, with the release of the 290x, many of the "Nvidia naysayers" are suddenly out of the woodwork creating this imaginary Titan/780 hybrid with the price tag of the Titan but the performance of the 780, along with another imaginary card with the performance of the 290x in uber mode and the noise/temps of quiet mode, and selectively pitting these two imaginary cards against each other to further their agenda. Don't get me wrong, I'm not taking any sides here, but I would like to get the facts straight and see some consistency in their arguments.


----------



## 1d10t (Oct 25, 2013)

SIGSEGV said:


> http://i1241.photobucket.com/albums/gg515/Elang_Mahameru/furmark_6mins.png
> even my GTX680 DC II T furmark'ed wouldn't touch 50*C
> so, is it safe to say that this card wouldn't hit 65*C under water ?



Trust me mate...that's why they called me Elders or opa in kaskus 



okidna said:


> Errrm... Thing is, I'm not asking about TDP (TDP != maximum power draw) and w1zzard didn't even say anything about TDP (he said power configuration), and what I'm asking is exactly that, power configuration.
> Why with the same power configuration (one 6pin + 8pin or 2 x 6pin) AMD cards has higher theoretical maximum power draw than NVIDIA? Is the difference coming from them (special power setup/setting from AMD)? Or is it coming from PCI-E 3.0 standard?



TDP is directly constrained by the VRM design.

Let me elaborate...









The Titan and GTX 780 have only 6 Foxconn-made chokes with a single-channel driver + MOSFET each, R22-rated at 35 A per phase at an OC (operational condition) of 90°C with a 10-15% tolerance. That translates to delivering roughly 200 W; add the 75 W of PCI-E slot power and you have a fair 275 W.

While R290X...




It has 6 Coiltronics-made chokes with a dual-channel driver + MOSFET each, R15-rated at 50 A per phase at an OC (operational condition) of 110°C with a 5-10% tolerance. That translates to delivering 300 W; add the 75 W of PCI-E slot power and you have 375 W.
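1d10t's back-of-the-envelope math can be sketched in code. One assumption here is mine, not the post's: a ~1.0 V core voltage to convert total phase current into watts (1d10t rounds the Titan's 210 W down to ~200 W, hence the 275 W figure above):

```python
PCIE_SLOT_W = 75  # power a card can draw through the PCI-E slot itself

def vrm_delivery_w(phases, amps_per_phase, vcore=1.0):
    """Rough VRM output capacity: phase count x rated current x core voltage."""
    return phases * amps_per_phase * vcore

# Titan/GTX 780: 6 phases rated 35 A each; 290X: 6 phases rated 50 A each.
titan_780 = vrm_delivery_w(6, 35) + PCIE_SLOT_W   # 210 W + 75 W
r290x     = vrm_delivery_w(6, 50) + PCIE_SLOT_W   # 300 W + 75 W

print(f"{titan_780:.0f} W vs {r290x:.0f} W")
```

The connector budgets in the PCI-E spec (75 W slot, 75 W per 6-pin, 150 W per 8-pin) cap what a card is *allowed* to draw; the point being made here is that the VRM caps what it can actually *deliver*.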



arterius2 said:


> I totally understand, but the amount of ignorance here is killing me, tho.
> First they throw out the price argument by saying the 290x is A LOT cheaper (which it is not; the money saved couldn't even pay for a waterblock)



It's just the nature of the internet.
You can't always mind them, though; basically they don't have valid ground to make such statements, let alone to spot which is which. It's hilarious to see anyone debating over an enthusiast card when he himself has never touched, tried, tested, or even owned one.
Just like debating that the Viper is ridiculously inefficient, the Beemer is made of utterly crap materials, and the big-cc Mercedes can't run faster than yo mama riding a wheelchair, while he himself only has a Prius.


----------



## wolf (Oct 25, 2013)

This thread just gets better and better 

All said and done, Nvidia clearly have the upper hand; they can now adjust their range of highly competitive cards (and introduce SKUs to fit) accordingly.

Let's disregard the Titan completely for argument's sake. Nvidia did it because they could, like Intel with their extreme chips; they can sell one for a grand because AMD can't, and AMD did similarly back in the day with the old FX's too. Anyway:

Yes, the 780 is $100 more than a 290X, but not for long, as we already know, and it hasn't even been 48 hours yet.

And that $100 more bought you this level of performance up to 6+ months ago. Remember that; AMD didn't have a card this fast then.

This card is good because it will alter how much performance you get for ~$500-$600, and that's about it. This level of performance was new 9 months ago, even longer if you look at a 690, which I'm sure people that have them are quite content with. Again, yes, it was $1000, but it was $1000 when it launched a year and a half ago.

Last but not least, all the talk about 4K: who on earth has a 4K monitor, pray tell? That's what I thought. The vast majority of us game at 1080p/1440p/1600p, where the cards are a heck of a lot closer, and let's face it, even one 780 or 290x is overkill for 1080p alone.

Let's keep things in perspective.

Today's 2 cents.


----------



## sweet (Oct 25, 2013)

1d10t said:


> Trust me mate...that's why they called me Elders or opa in kaskus
> 
> 
> 
> ...



Oh, you showed him this picture. My plan was to humiliate the Titan/780 later on for their poor VRM design. In short, flashy cover aside, the board and chip of the Titan/780 are far inferior to the 290X's counterparts. No surprise that the 290X has been breaking records under LN2; it is truly a beast.


----------



## GSG-9 (Oct 25, 2013)

wolf said:


> All the talk about 4K: who on earth has a 4K monitor, pray tell? That's what I thought. The vast majority of us game at 1080p/1440p/1600p, where the cards are a heck of a lot closer, and let's face it, even one 780 or 290x is overkill for 1080p alone.
> 
> Let's keep things in perspective.
> 
> todays 2 cents.



<-- There are lots of surround and 3D Vision gamers. Speaking as one of them, I don't think we could go wrong with a 690 or two 290Xs at this point.


----------



## MxPhenom 216 (Oct 25, 2013)

sweet said:


> Oh, you showed him this picture. My plan was to humiliate the Titan/780 later on for their poor VRM design. In short, flashy cover aside, the board and chip of the Titan/780 are far inferior to the 290X's counterparts. No surprise that the 290X has been breaking records under LN2; it is truly a beast.



I don't think Nvidia has ever had good reference designs. They save that for their board partners, who do nice non-reference designs.


----------



## Casecutter (Oct 25, 2013)

If some AIB said they would offer a three-slot version that was rear-exhaust and quieter, while letting it run at 1000 MHz more often, at $550, it might really pique my interest. Sure, perhaps not a CrossFire card... but not having all that heat floating inside the chassis would be nice.

Then duct it all outside with a vent/fan in summer, and inside for winter! Necessity is the mother of invention.

I still would like to see a thorough teardown of the 290X cooler/fan vs. how the stock Titan cooler was built. It would be interesting to mount the Titan cooler on the 290X and see if it cooled better.


----------



## centaurius (Oct 25, 2013)

MxPhenom 216 said:


> I don't think Nvidia has ever had good reference designs. They save that for their board partners, who do nice non-reference designs.



What? The reference cooler of the Titan and 780 is great. It's slick, silent, and does the job pretty well.

If you adjust the fan speed accordingly, you can even OC the reference model beyond the 902 MHz "limit". I have seen a 780 with a custom fan speed at 67°C with boost at 993 MHz, on a REFERENCE-cooled 780, and the noise is practically the same. After almost 3 hours, temps are about 69°C MAX, with a constant 993 MHz clock. It only throttles down when it reaches 80°C, as you may be aware if you read the 780 review; this I only saw with default fan settings, where it would drop to the reference 863 MHz but never below that, ofc.


----------



## johnnyfiive (Oct 25, 2013)

newtekie1 said:


> Relying on price as your sole saving grace isn't a good idea. That only works if nVidia can't lower prices. We already know both the Titan and 780 were very high-margin parts, so nVidia's only answer has to be a price reduction. Yeah, it is great that there is finally competition to drive down the insane prices, but this isn't a groundbreaking card any way you look at it.
> 
> 
> 
> No, they charge a premium for the performance. The Titan and the 780 were expensive because there was nothing that could compete with them; now that there is, we should see a price drop on the 780, as well as the 780 Ti being priced pretty close to the 290X.



So AMD will lower the price of the R290X; they'll play the price war just as much as NVidia MIGHT to ensure sales. This isn't anything new. I think people are missing the point here... AMD just released a $549 MSRP card that competes with NVidia's $999 MSRP card. It's nearly *half* the price.

Let's make a scenario and think of things a little differently. What if AMD had released the R290X when NVidia released the Titan, and it was $999? So, we're reversing the releases.

Would the $999 R290X sell as well as the Titan did? Yes.
*Why?* Because it would have been the most powerful card available. Hardcore enthusiasts would be the only ones buying them because it cost a thousand flipping dollars. No one would be complaining about temps/noise/power consumption when they just spent $1k on a card. *Why?* Because it beats anything else available with ease and is clearly the king of the GPU crop. They would be praising its unparalleled performance and gawking at its frame rates and whatever else enthusiasts do.

Now, continuing our reversal scenario, NVidia would release the Titan and it would cost $549. NVidia would now have the best bang for your buck along with the overall best card. Now AMD would look silly with their insanely priced R290X, and NVidia would be the _saving grace_ of the affordable top-tier card market.

Moral of my scenario: enthusiasts with 1,000 dollars to spend don't care about cost. They want the best card money can buy. They don't care about blowing money, and both AMD and NVidia don't mind charging that price. But here is what they DO care about... SALES and moving product.

If they make awesome cards that people aren't buying, then what is the point? Why bother making awesome cards if no one is going to buy them? Why waste R&D on something no one will buy? In other words, stop whining about the R290X when it's crystal clear that it IS the best bang for your buck ATM. How can people have anything to complain about when a $550 card is competing with a card that goes for 1,000 dollars?

So now people are saying, "well, I could spend an extra $50-100 for a GTX 780 and it will blow away an R290X if I get one that's non-reference." Well, *LIGHTBULB*, what about non-reference R290Xs? Like a Lightning from MSI, or something from Gigabyte or ASUS. C'mon... I haven't seen so many NVidia-biased responses to a video card review in a long time... this is silly business.


----------



## MxPhenom 216 (Oct 25, 2013)

centaurius said:


> What? Reference cooler of titan and 780 is great. Its slick, silent and does the job pretty well.
> 
> If you change the fan speed accordingly you can even OC the reference model to go beyond the 902 Mhz "limit". I have experienced a 780 with custom fan speed to 67ºC with the boost to 993 Mhz, on a REFERENCE cooled 780. And the noise is practically the same. And after almost 3 hours temps are about 69ºC MAX, with constant 993 Mhz clock. It only throttles down when it reaches the 80ºC as you may be aware of course if you read 780 review, this I only saw with default fan settings where it would go down to the reference 863 Mhz but never go below that ofc.



Im talking board designs, not the cooler.


----------



## GSG-9 (Oct 25, 2013)

johnnyfiive said:


> Would the $999 R290X sell as well as the Titan did? Yes.
> *Why?* Because it would have been the most powerful card available. Hardcore enthusiasts would be the only ones buying them because it cost a thousand flipping dollars. No one would be complaining about temps/noise/power consumption when they just spent $1k on a card. *Why?* Because it beats anything else available with ease and is clearly the king of the GPU crop. They would be praising its unparalleled performance and gawking at its frame rates and whatever else enthusiasts do.
> 
> Now, continuing our reversal scenario, NVidia would release the Titan and it would cost $549. NVidia would now have the best bang for your buck along with the overall best card, and now AMD would look silly with their insanely priced R290X, and NVidia would be the saving grace of the affordable top-tier card market.



As others said in this thread (it's a long thread at this point), I don't think the Titan would exist if this (the 290x) had been released then; there would have been a gaming card without the compute performance of the Titan, and maybe a more affordable version of the Tesla that cost less for professional use.


----------



## okidna (Oct 25, 2013)

1d10t said:


> TDP is straight value constraint with VRM design
> 
> Let me elaborate...
> 
> ...



Makes sense, I understand now, thank you. 
But I'm still curious why W1zzard didn't write about how this difference in VRM design can cause a different maximum power draw despite both cards using exactly the same power configuration. 
When I first saw this I thought it was a typo, but after reading all the R9 reviews (280X, 270X, and now 290X) I assume there must be something going on.

From the TPU picture I see that the 290X is using R23, not R15 (R15 is the 7970).
Is there any difference between R23 and R15?
All I can find is this datasheet : http://www.cooperindustries.com/con...-datasheets/Bus_Elx_DS_4341_FP1007_Series.pdf


----------



## johnnyfiive (Oct 25, 2013)

GSG-9 said:


> As others said in this thread (it's a long thread at this point), I don't think the Titan would exist if this (the 290x) had been released then; there would have been a gaming card without the compute performance of the Titan, and maybe a more affordable version of the Tesla that cost less for professional use.



Because they would magically change their entire R&D plans midway through making the Titan... yeah, ok. Well, what about the GTX 4xx series? They should have done it then, but they didn't. Moot.


----------



## GSG-9 (Oct 25, 2013)

johnnyfiive said:


> Because they would magically change their entire R&D plans midway through making the Titan..... yeah ok. Well what about the GTX 4xx series, they should have done it then but they didn't. Moot.



You were the one who postulated a before-the-Titan-was-released situation. You can granularize your logic after the fact to suit your needs as much as you want, but it really just means no one will want to talk to you, because you constantly rescope the debate. Enjoy talking to yourself.


----------



## JDG1980 (Oct 25, 2013)

BiggieShady said:


> Better aftermarket coolers that would keep it under 90 C mark are often non-blowout type and that would raise case temperature and speed up case fans.



But it's a lot easier to get good performance out of case fans while retaining low noise than it is with a small, cheaply-made blower-type fan. It's the same reason why closed-loop liquid coolers work well: you're not eliminating the need for ventilation, but moving it to a location where you can use larger (and multiple) fans which move more air at lower noise levels.


----------



## the54thvoid (Oct 25, 2013)

1d10t said:


> Trust me mate...that's why they called me Elders or opa in kaskus
> 
> 
> 
> ...



Excellent info on the chokes, thanks. Makes me more interested in it now,  knowing it has a solid voltage base.


----------



## radrok (Oct 25, 2013)

the54thvoid said:


> Excellent info on the chokes, thanks. Makes me more interested in it now,  knowing it has a solid voltage base.



AMD/ATI has always had the best reference designs.

I've been the most impressed by 6990s volterras PWM, they could resist more than 100C on stock cooling with 1.3v+

Do that to a Titan and you'll end up with a 1k $ paperweight.


----------



## Am* (Oct 25, 2013)

Fluffmeister said:


> The GK104 has been a massive success for Nvidia, and as you said yourself it was purely focused on gaming, and it has done and still does its job brilliantly. No one gives a toss about compute benchmarks they never run. Can't say I've missed out on anything running my little GTX 670; in fact it does me proud every day.
> 
> At least give them credit where it is due.
> 
> ...



Way to contradict yourself, bud. Attempts to argue that compute performance is worthless, proceeds to bring up compute cards as prime examples. Umm...kay.

I will give Nvidia credit where it's due: Nvidia have done well to milk their obscenely overpriced range of cards for this long, and a fully ungimped and functioning GK110 still has a lot of potential as a GeForce card against the R290X, if Nvidia are willing to release it soon and for a comparable price. If they still decide not to, Nvidia can prepare to delay Maxwell or scrap their costly GK110 salvage parts and/or sell them at give-away prices.



newtekie1 said:


> There are some things you seem to be forgetting:
> 
> These cards are being sold mostly to gamers, so no one cares how good the cards are at GPU Compute tasks.  You're making the same argument that people made to defend Fermi.  AMD should have learned from nVidia's mistake and learned that GPU Compute doesn't sell desktop GPUs.
> 
> ...



I guess you missed the part where I said I liked the flagship Fermi parts and would prefer a fully functioning/overheating Fermi-equivalent with un-gimped compute -- over a cut-down, gimped and self-throttling Kepler to skew benchmarks.

I never supported AMD's horrible management that cut some 30% of its engineering force. I was merely pointing out that AMD during its best days and Nvidia during their worst recent days cannot remotely compare on paper in R&D budgets or any other financial stats, seeing how AMD is competing (or at least attempting to compete) against both Intel AND Nvidia, and it's a miracle that AMD is actually managing to do it in the costly top-of-the-line GPU market instead of abandoning discrete GPUs entirely and becoming another APU/SoC-only company chasing Intel's most sought-after market (which may be a reality for AMD sooner rather than later). By all means feel free to show me info that says otherwise, but if you're going to somehow argue against them even attempting to compete, don't bother. Enjoy your brownie points from the green-favouring zealots on here and move along; I'm not going to even attempt to prove you (or them) wrong in this respect.



arterius2 said:


> ...so to sum it up:
> 1) they claim its cheaper
> 2) they ignore noise, temp and power consumption
> 3) then they say throw a water block on it ignoring the price of a water-cool kit (+$150 for the block and +$400 for the whole setup)
> ...



1) people claim it's cheaper because it is, get over it.
2) that's because 3rd party designs are already on the way, which are never going to happen for the Titan.
3) see above post. Even so, R290X + watercooling or any other 3rd party cooling kit you're going to try grasping straws over, comes out cheaper, regardless of whether it is going against a Titan or a 780.
4) again, see point 2. It will take MSI/Gigabyte 5 minutes to drop some silent triple/double fan coolers they already have on their 7000/700 series cards on this R290X, which will make your point moot. Reference cooler card reviews sometimes don't mean shit (go see the GTX 770 stock cooler reviews with the Titan cooler -- it is almost not sold anywhere and is therefore pointless).
5) that's because it's true. R290X wins against whatever you want to try compare it to. Is it cheaper? Yep. Does it perform better? Yup. At lower res (290X disadvantage -- little use for the huge 512bit bus/ROPs)? Yep. At higher res (huge 290X disadvantage vs Titan with its 2GB more VRAM)? Still yes. You cannot spin it in Nvidia's favour in any way, other than the facts that Nvidia did enjoy the early lead and lower power consumption/thermals. And this is all excluding the fact that the R290X is on very early drivers which WILL get better performance from AMD -- not so with Nvidia, as they have had a 9 month head start already. And with temperature throttling on that shitty stock cooler, which will get a big advantage once those Twin Frozr/Windforce-like designs drop.



HumanSmoke said:


> Yes quite some massacre. Always looks good when you leave out the benches that don't look so good- let me guess, you left out the Skyrim bench by accident?. I bet George Armstrong Custer is rueing the fact that he couldn't "rt click>save as" the Lakota Sioux he wanted to fight.
> Seeing as how you posted so many benches, I assume you were going for the completeness motif- so here's the TESV bench and the CFX/SLI 7680x1440 results:
> http://img.techpowerup.org/131025/tomshw.jpg



Can you even read what you're posting? R290X won 6/8 of the single card benchmarks you posted, (WITH A FREAKING 2GB VRAM deficiency, no less) so as a last desperate attempt you have to drag in Crossfire support of a 1-day old self-throttling card vs 9+ month old Nvidia cards in your pathetic attempt to grasp at your green-coloured straws -- which relies ENTIRELY on several-month old support of stable post-release drivers?

And DIRECTX 9?     

Thanks for the good laugh, ya crazy Nvidia zealot, but you invalidated your opinion the minute you brought up a shitty, old & horribly ported DX9 game in what is now AMD's 4th gen DX11 flagship card review.  
Please leave and take your fail with you, and while you're at it, bring in the old DX7/DX8 titles with Quake III, HL1 and Unreal Tournament in the mix for good measure, because if Nvidia aren't winning, you gotta keep digging for those pre-historic benchmarks nobody gives a flying shit about!

Please feel free to flame me with your predictable "AMD fanboy" comments though, despite the fact that all my current PCs run Intel/Nvidia GPUs -- I can always use a good laugh.  Nvidia zealots are getting too predictable these days.

P.S. 4K benchmarks DO matter, because they show exactly how future-proof the GPU is. How many of you were screaming "1080p benches are worthless" 10 years ago when we were still rolling on our 1280x1024 CRTs? 4K is on its way to being relevant over the next 2 years; the 7970 was AMD's flagship for nearly 2 years, so 4K benches sure as hell matter, to show progress in future GPUs if nothing else.


----------



## Frick (Oct 25, 2013)

So how about that rad i740 huh? I really like how that is turning out.


----------



## buildzoid (Oct 25, 2013)

1d10t said:


> Trust me mate...that's why they called me Elders or opa in kaskus
> 
> 
> 
> ...



Dude, I just checked the spec sheet for the coils and they're rated for 70A at 125°C; just search "coiltronics 1007r3" and the first Cooper Bussmann PDF contains the specs. Also, those MOSFETs exist in only two variants that I know of: one rated for 70A, and the ones the Vapor-X 7970/50/R9 280X uses, which are rated for 60A. So the VRM has nothing to do with TDP. TDP is the expected power draw based on in-lab tests.
Now if you look at the VRM closely, you can see it is in fact a 5+1+1 design capable of pushing 350A on the core voltage line without degradation (that happens only if you go over the source-drain voltage or current spec), and it can push another 40A on the VTT rail. So basically this VRM is one of the best designs on the market in terms of raw power output. It should also have low noise because it is a 1-PWM-to-1-phase design; what it does lack is efficiency, as the components run close to the maximum of their spec.
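The 350A figure is just the five core phases multiplied by the 70A per-phase rating. A throwaway sketch of that arithmetic (the names are made up for illustration, nothing from the spec sheet itself):

```python
# Per-phase current rating quoted for the coil/FET combo (70 A at 125 °C),
# and the 5+1+1 phase layout described above; purely illustrative arithmetic.
PHASE_RATING_A = 70
CORE_PHASES = 5

# Five phases feed the core voltage line in parallel, so their ratings add.
core_capacity_a = CORE_PHASES * PHASE_RATING_A
print(core_capacity_a)  # 350
```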


----------



## The Von Matrices (Oct 25, 2013)

The discussions of board power and VRM output are missing the OP's question. The OP (and I) wonder why W1zzard uses conflicting numbers. W1zzard says that the same configuration of power cables can supply different amounts of power (i.e. 375W on an AMD card versus 300W on an NVidia card for 6+8-pin PCIe power connectors). This has nothing to do with how much power the VRM can output, just how much power can be input to the VRM.

In these reviews W1zzard always quotes the amount of power that can be input without violating the PCIe specification. So the question is: since both cards conform to the same PCIe power specifications, why can AMD draw 75W more than NVidia without violating the specification?


----------



## W1zzard (Oct 25, 2013)

okidna said:


> Good review like always, but I'm curious about something.
> 
> From 290X and 280X review :
> 
> ...



I just fail at math.

PCIe slot = 75W
6-pin = 75W
8-pin = 150W
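Those per-connector budgets add up exactly as quoted; a quick throwaway sketch (the helper name is made up, not anything from the review):

```python
# In-spec power budget per source, in watts (slot plus auxiliary connectors)
BUDGETS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_limit(connectors):
    """Maximum in-spec board power: the PCIe slot plus each aux connector."""
    return BUDGETS["slot"] + sum(BUDGETS[c] for c in connectors)

# Both the R9 290X and the GTX 780/Titan use one 6-pin plus one 8-pin:
print(board_power_limit(["6-pin", "8-pin"]))  # 300
```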


----------



## buildzoid (Oct 25, 2013)

The Von Matrices said:


> The discussions of board power and VRM output are missing the OP's question. The OP (and I) wonder why W1zzard uses conflicting numbers. W1zzard says that the same configuration of power cables can supply different amounts of power (i.e. 375W on an AMD card versus 300W on an NVidia card for 6+8-pin PCIe power connectors). This has nothing to do with how much power the VRM can output, just how much power can be input to the VRM.
> 
> In these reviews W1zzard always quotes the amount of power that can be input without violating the PCIe specification. So the question is: since both cards conform to the same PCIe power specifications, why can AMD draw 75W more than NVidia without violating the specification?



I can't find where he states the TDP of the card, but it only goes out of spec by 15W, which I think doesn't matter too much. Also, the only difference between a 6-pin and an 8-pin is that the 8-pin has two extra ground wires, so it really doesn't matter that the card pulls around 1.25A more than permitted for the 6+8-pin configuration, because the limit exists to make sure your power cables don't melt or catch fire. Which they won't if the draw only goes beyond the limit for a second or two, though it may trigger OCP on some PSUs.
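For anyone checking the 1.25A number: it is the ~15W overshoot divided across the 12V rail. A quick illustrative sketch (values assumed from the post above, not measured):

```python
# ~15 W beyond the 300 W in-spec budget, carried on the 12 V supply rail
overspec_w = 15.0
rail_v = 12.0

# I = P / V gives the extra current the connectors have to carry
extra_current_a = overspec_w / rail_v
print(round(extra_current_a, 2))  # 1.25
```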


----------



## N3M3515 (Oct 25, 2013)

johnnyfiive said:


> So now people are saying "well, i could spend an extra $50-100 for a GTX 780 and it will blow away a R290X if I get one that's non reference" . Well, *LIGHTBULB*, what about non-reference R290X's? Like a lightning from MSI or something from Gigabyte or ASUS. C'mon... I've never seen so many NVidia-biased responses in a video card review in a long time... this is silly business.



+1
This is exactly what i've been saying all along, only people choose to ignore it.


----------



## 1d10t (Oct 25, 2013)

sweet said:


> Oh you showed him this picture. My plan was to humiliate the Titan/780 later on for their poor VRM design. In short, except the flashy cover, the board and the chip of Titan/780 are far more inferior than 290X's counterparts. No surprise that 290X has been broken records with LN2, it is truly the beast.



I didn't say the Titan/GTX 780 had a bad VRM; it's their design to suit their chip. Needless to say, you're correct... aside from that, let's finish this never-ending bashing, shall we? 



okidna said:


> Make sense, I understand now, thank you.
> But still curious why W1zzard didn't write about this difference in VRM design can cause difference maximum power draw despite both using exactly same power configuration.
> When I first saw this I thought it was a typo, but then after reading all R9 reviews (280X,270X, and now 290X) I assume that there must be something going on.
> 
> ...



You may have PM'ed W1zzard about that 

It's R15 (1007R3-R15) on the main phase; the R23 you saw is on a split plane used for other loads, such as the memory banks and memory controller. Basically there's no difference between R15 and R23, except R23 is more sensitive (with delta A) and feeds voltage more accurately, but it is also capable of doing an "instant spike / passthrough" if the chip needs it.



buildzoid said:


> Dude, I just checked the spec sheet for the coils and they're rated for 70A at 125°C; just search "coiltronics 1007r3" and the first Cooper Bussmann PDF contains the specs. Also, those MOSFETs exist in only two variants that I know of: one rated for 70A, and the ones the Vapor-X 7970/50/R9 280X uses, which are rated for 60A. So the VRM has nothing to do with TDP. TDP is the expected power draw based on in-lab tests.
> Now if you look at the VRM closely, you can see it is in fact a 5+1+1 design capable of pushing 350A on the core voltage line without degradation (that happens only if you go over the source-drain voltage or current spec), and it can push another 40A on the VTT rail. So basically this VRM is one of the best designs on the market in terms of raw power output. It should also have low noise because it is a 1-PWM-to-1-phase design; what it does lack is efficiency, as the components run close to the maximum of their spec.



About the spec sheet: there's no guarantee those ratings translate to raw power. Notice it's still dual channel (dual low-RDS(on) FETs in a single package plus one bridge between them). I don't know how they control it, but obviously board makers wouldn't let their chokes work at their maximum spec all the time. I've read from jonny that coil makers like Renesas, Foxconn Magic, and Sanyo (toroidal, composite, ferrite, duralumin coils) test a specific part only for a small amount of time, such as a sudden spike, so the spec sheet serves only as guidance, not their OC (operational condition).


----------



## The Von Matrices (Oct 25, 2013)

N3M3515 said:


> +1
> This is exactly what i've been saying all along, only people choose to ignore it.



We're not ignoring it because we're "NVidia zealots"; you're just not seeing the full picture because you assume that NVidia is doing nothing.  

First, I want to point out something W1zzard said:



W1zzard said:


> so far, everybody I talked to said that AMD doesn't allow custom designs for the 290X. This will probably change soon, maybe AMD just wants to sell more of their ref boards asap



There's no proof that vendors don't have custom designs ready today.  The evidence indicates that AMD is not letting vendors release custom designs even if they do have them.  When that restriction will lift is up for question, but _as of now AMD has released no time frame_.  So to say when these custom designs will be coming is pure speculation. 

What is known is that in three weeks NVidia will release the GTX 780 Ti; there's no debating that time frame.  It's also hard to argue that the 780 Ti won't change NVidia's pricing structure, driving down the price of the GTX 780.

*Here's a more realistic scenario for those championing custom cooled R9 290X's*

When custom cooled R9 290X's come about, there will have been a price cut for custom GTX 780's.  We will be in the same competitive situation all over again.  You will be able to get either the R9 290X or the GTX 780 for the same amount of money.  The comparison will be whether you want the more powerful R9 290X and are willing to put up with the extra heat and noise or if you want the slower GTX 780 with much less heat and noise.  

The R9 290X seems to be difficult to cool, so there might even be a situation where vendors want to spend extra money on the coolers of their R9 290X's to make them faster and they end up more expensive than the custom GTX 780's, which will make the situation even more confusing.

What I personally see happening is a segmentation of the R9 290X market.  There will be R9 290X's with about the same cooling performance as the reference cooler, and they will be priced to compete with GTX 780's.   Then there will be R9 290X's with extravagant heatsinks that can extract extra performance, and they will be priced to compete with the GTX 780 Ti.  This is sort of like what AMD did with the 7970 vs. the 7970 GHz Edition, since you could get GHz Edition performance from the regular 7970 just by improving its cooler and applying an overclock.


----------



## okidna (Oct 25, 2013)

W1zzard said:


> I just fail at math.
> 
> PCIe slot = 75W
> 6-pin = 75W
> 8-pin = 150W



 

Thanks for the clarification 



1d10t said:


> You may have PM'ed W1zzard about that



No need to, see above


----------



## arterius2 (Oct 25, 2013)

The Von Matrices said:


> We're not ignoring it because we're "NVidia zealots"; you're just not seeing the full picture because you assume that NVidia is doing nothing.
> 
> First, I want to point out something W1zzard said:
> 
> ...



You nailed it; exactly what I wanted to say as well. Why is it that people who are breathing logic and sense into this discussion are suddenly labeled "NVidia zealots", while the opposite side resorts to name calling and personal insults?


----------



## Crap Daddy (Oct 25, 2013)

The Von Matrices said:


> We're not ignoring it because we're "NVidia zealots"; you're just not seeing the full picture because you assume that NVidia is doing nothing.
> 
> First, I want to point out something W1zzard said:
> 
> ...



There will soon be another card in the mix, and it could be the most interesting: the 290. At a presumed $450 price point and 780 performance, it could cause some real pain for NVidia's lineup.


----------



## HumanSmoke (Oct 25, 2013)

Am* said:


> Can you even read what you're posting? R290X won 6/8 of the single card benchmarks you posted...._blah verbal blah diarrhea blah_....


More to the point, I'm reading what others are posting. My point was aimed at the hyperbole: MASSACRE, really?
Depends on your terms of reference, I suppose. My definition would be a substantial leap over the previous architecture. 9700 Pro over the GF4 Ti? Yes! 8800 GTX over X1950 XTX? Yes. 290X over GTX 780* by a few fps per game? Not really.  It would be a different matter if one card were limited to gameplay without AA while the other could use FSAA.

The fact that there isn't that much _real world _gameplay difference between the cards is a leading factor in why this thread is so long. 

* If a single digit performance lead qualifies as a MASSACRE, I'm quite surprised that a 20% performance lead for the GTX 780 over AMD's single GPU champ didn't warrant an even more hyperbolic response from you or your sidekick.


----------



## Casecutter (Oct 25, 2013)

The Von Matrices said:


> Then there will be R9 290X's with extravagant heatsinks that can get extra performance,


AKA EVGA Classified FTW, ROG, HOF that level...?


----------



## broken pixel (Oct 25, 2013)

*question*

I plan on getting 2x 290X for my 120Hz 1440p panel, but I'm not sure if there will be modded versions of the 290X in a decent time frame. I plan on H2O for both GPUs, so buy now, or wait for non-reference boards to hit the shelves?


----------



## N3M3515 (Oct 25, 2013)

The Von Matrices said:


> We're not ignoring it because we're "NVidia zealots"; you're just not seeing the full picture because you assume that NVidia is doing nothing.



I'm not assuming that; it is obvious NVidia has an answer. It's just that the answer is not going to change things from what they are now, meaning the 290X will still be the faster one.

It's my opinion, but I think it's clearly obvious that non-reference versions of the 290X are going to be fast enough to compete with anything NVidia gets out of the gates.

If I'm wrong about that, well, I'm not perfect, but I'm not going to start criticizing the shit out of NVidia for not being able to get past the 290X.

That still leaves one question left to answer: R9 290 performance, which will be equal to the 780, I guess.

Let AMD have their time of glory, guys. NVidia had it; it's now AMD's turn to be king of the hill, and they're lowering prices at the same time. How can that be bad??? lol.....



The Von Matrices said:


> Here's a more realistic scenario for those championing custom cooled R9 290X's
> 
> When custom cooled R9 290X's come about, there will have been a price cut for custom GTX 780's. We will be in the same competitive situation all over again. You will be able to get either the R9 290X or the GTX 780 for the same amount of money. The comparison will be whether you want the more powerful R9 290X and are willing to put up with the extra heat and noise or if you want the slower GTX 780 with much less heat and noise.
> 
> ...



That's also pure speculation.


----------



## The Von Matrices (Oct 25, 2013)

Crap Daddy said:


> There will soon be another card in the mix, and it could be the most interesting: the 290. At a presumed $450 price point and 780 performance, it could cause some real pain for NVidia's lineup.



I agree to a point, but AMD is filling in the slots in NVidia's lineup.  I doubt the GTX 770 will be able to compete with the R9 290, but then again it doesn't have to for $50 less.  I'm skeptical that the R9 290's performance will be that close to the GTX 780 at stock speeds.  I think it will be lower performance than the GTX 780 in stock form, because AMD will limit its power to 225W (2 x 6-pin PCIe) in order to appeal to a broader market.

However, the R9 290 will be a very good value for overclockers.  This is because the Hawaii chip is obviously power limited.  The R9 290 will presumably have the same cooler and board as the R9 290X but a lower heat output; therefore, if you just crank the R9 290's power limit up to that of the R9 290X, it should reach about the same performance as the R9 290X.


----------



## johnnyfiive (Oct 25, 2013)

GSG-9 said:


> You were the one that postulated a before-the-Titan-was-released situation. You can granularize your logic after the fact to suit your needs as much as you want, but it really just means no one will want to talk to you, because you constantly rescope the debate. *Enjoy talking to yourself*.



You looking for some cookies?


----------



## The Von Matrices (Oct 25, 2013)

N3M3515 said:


> It's my opinion, but I think it's clearly obvious that non-reference versions of the 290X are going to be fast enough to compete with anything NVidia gets out of the gates.



I don't disagree with you on that.  What I disagree with you on is the idea that the non-reference versions of the R9 290X that compete with the GTX 780 Ti will be any less expensive than the GTX 780 Ti.  If you want greater performance than the current R9 290X, you need a 375W card.  That is strictly custom card territory, as in EVGA Classified, Galaxy HOF, Gigabyte SOC, etc., and those cards carry premiums over reference cards.



N3M3515 said:


> Let AMD have their time of glory, guys. NVidia had it; it's now AMD's turn to be king of the hill, and they're lowering prices at the same time. How can that be bad??? lol.....



Lowering prices is great, but just because AMD's R9 290X forced NVidia into a position to lower prices doesn't mean that I'm now obligated to buy a R9 290X to thank AMD.

I have no manufacturer preference, assuming features are the same.  Furthermore, I don't believe in this whole "glory" or "halo card" thing, and I don't think many people in this forum do either.  I buy the product that fits my needs no matter who the manufacturer is.  This "glory" is all marketing and nothing else; just because someone has the best card in the world doesn't mean that every product in their entire lineup is good (and more importantly, well priced).


----------



## Am* (Oct 25, 2013)

HumanSmoke said:


> More to the point, I LUUUV Nvidia coz it costs moar -- it must be ghud, just like da Apples...brb, gotta knock one out over dat Titan...waah waaah I'm a condescending zealot in denial, please help meh...
> 
> My definition would be a substantial leap over the previous architecture. 9700 Pro over the GF4 Ti ? Yes!, 8800GTX over X1950XTX? Yes. 290X over GTX 780* by a few fps per game? Not really.  It would be a different matter if I could use some logic here...
> 
> ...



Nice oversight/comprehension fail yet again.


Nvidia GF4 to Radeon 9000 = different generations of GPUs + API change (DX7/8 to DX9)

ATI X1900 to 8000 = different generation of GPUs yet again + API change again (DX9 to DX10)

This gen (R9 200 series) is a direct response to Nvidia's 700 series re-brand, which launched first, almost a year and a half after its "competition", the 7970 (with a mediocre 20-30% improvement and absolutely retarded pricing). It is not a game-changing start of a new generation (unlike the GCN 7970, which, by the way, was only a few dozen to a hundred dollars more than the old GTX 580 it was replacing when it came out, and was much faster), not even a die shrink, and there's still no new API. 

It's bad enough that you're making stupid, nonsensical posts, blindly defending your favourite brand without understanding what you've posted, but for you to then be condescending to people explaining to you why you're wrong is...moronic, to say the least.


----------



## Ebo (Oct 25, 2013)

I don't really care about the heat and the power the new card uses; I just want performance for the money.
I live in Denmark, and here the new R9 290X can be bought for 3,999 DKK, which is just about 740 dollars incl. VAT and shipping. That's not really that bad. The GTX Titan costs 7,194 DKK/1,331 dollars, and the GTX 780 is at 4,917 DKK/910 dollars.

Does it get hot? Yeah. Does it make a lot of noise? Yeah. But again, who cares? I don't, because I'm just waiting for an aftermarket cooler from Arctic Cooling and some memory/MOSFET cooling from Enzotech in pure copper; I'll put that on the card and be a happy camper again. 
If that's not enough, I have a complete watercooling set just lying around, and all I have to invest in is a full block from EK; then I have no problems with heat or noise. So who cares about the warranty? If it breaks I will buy a new one; how hard is that?

I have had 2x 5850 in CrossFire since they came out, and they can still hold their own in the games I play (1920x1200) on a 26" Iiyama screen, and I haven't really seen anything until now that could convince me to upgrade.

I'm going to try out Eyefinity with the new card on 3 screens, so the resolution will be high, and that's where this card is going to shine and show all its power. My old cards will be put in my husband's computer and we will both be fine. He can play his small games while I take on BF4 (can't wait) and AC4 Black Flag and so on, and I will be happy gaming for the next 2 years.

I'm sorry if I haven't spelled correct English, but I do my best.


----------



## The Von Matrices (Oct 25, 2013)

Am* said:


> This gen (R200 series) is a direct response to Nvidia 700 series re-brand which launched first, almost a year and a half after its "competition" that is the 7970 (with a mediocre 20%-30% improvement and absolutely retarded pricing), not a game-changing start of a new generation (GCN 7970, which by the way, only was a few dozen to a hundred dollars more than the old GTX 580 it was replacing when it came out -- and was much faster) architecture/not even a die shrink, and with no new API still.



I'm not agreeing with the OP, but your memory of the GTX 580/7970 comparison (at launch) is incorrect.

GTX 580: $500
HD 7970: $550
Price increase: 10%
7970 Performance Advantage (TPU Link): 10% at 1920x1200.

If you go by price/performance, the 7970 did not improve on the GTX 580.  It performed better and was priced higher by an equal percentage.  It was not a great deal, and it certainly did not "change the game".

I think that everyone needs to realize that whether it is AMD or NVidia, both companies are opportunistic when launching new cards that are faster than anything their competitor can offer.  There is no "good" or "bad" company.  When AMD clearly had the high end with the 7970, it didn't price the card to be especially competitive with NVidia; similarly, when NVidia launched the 780, it had the high end and did not price the card to be competitive with AMD.  The R9 290X is not in this category; it couldn't cleanly beat Titan, so AMD priced it aggressively instead.


----------



## N3M3515 (Oct 26, 2013)

The Von Matrices said:


> I have no manufacturer preference assuming features are the same. Furthermore, I don't believe in this whole "glory" or "halo card" thing, and I don't think many people in this forum do either. I buy the product that fits my needs no matter who the manufacturer is. This "glory" is all marketing and nothing else; just because someone has the best card in the world doesn't mean that every product in their entire lineup is good (and more importantly, well priced).



Well, I've had more GeForces than Radeons, but what I hate NVidia for is that they seem to be the greedier company when it comes to pricing. Come on, $1000, really? $650?
When AMD had the chance to do that, they went with $550. But hey, that's not the point of the thread, and I'm not putting a revolver to your head and saying "buy a 290X". I myself don't like the reference card either, but I'm not going to criticize the shit out of it like it's some piece of garbage that couldn't have been out 9 months earlier..... everyone has been in that position, be it NVidia or AMD...


----------



## HTC (Oct 26, 2013)

arterius2 said:


> *because the titan is not in direct competition with 290x, with a portion of the titan users buying this card for company/workstation use, the Titan can be seen as a card that's "best of both worlds" between a gaming card and a professional card(which cost significantly more than gaming cards)*.
> 
> another factor is that the titan and to that extend the gtx780 is much better in terms of noise and temperature and efficiency. from a technical point of view, it would be much more difficult to design a product that does well in all criteria than if a product were just focused in one. in this case, Nvidia had to design a graphics card that had noise, temperature, power consumption well under control while trying to maximize performance, do you understand this is a lot more difficult to achieve than just brute force performance? I would imagine the R&D cost would be much higher too which is reflected in the premium. To give an example of this would be that I'm an architectural designer, if I were asked to design a very cheap building or a very efficient building it would be relatively easy, but if I were asked to design a building that is both cheap, elegant, and energy efficient then I would probably charge a hell of a lot more to come up with the design, do you get my drift?
> 
> ...



Really? Why would AMD's top-of-the-line single-GPU card not be competing against nVidia's top-of-the-line single GPU *if it trades blows evenly with it*?

I fail to see the logic in that, dude!

Your second point, however, does make a lot of sense (still in the part I highlighted): gaming cards are WAY cheaper than professional cards. Even so, this does NOT negate my above statement.


----------



## HumanSmoke (Oct 26, 2013)

Am* said:


> Nvidia GF4 to Radeon 9000 = different generations of GPUs + API change (DX7/8 to DX9)


Makes no difference to the market. The GF4 Ti was Nvidia's line of cards when the 9700 Pro debuted. You run what you brung.


Am* said:


> ATI X1900 to 8000 = different generation of GPUs yet again + API change again (DX9 to DX10)


Same argument... and DirectX 10? Yeah, that made all the difference :shadedshu. Number of DirectX 10 games at the 8800 GTX launch... TWO (Dungeons & Dragons Online, and FSX).


Am* said:


> This gen (R200 series) is a direct response to Nvidia 700 series re-brand


Ah! Didn't you just say that differences between GPUs are negated because of DirectX version? I seem to recall that one of these is DX11.2 compliant and one is DX11.0. Let me guess, your argument negates performance revisions within a DX version.



Am* said:


> It's bad enough that you're making stupid, nonsensical posts.


Hey, I've just read your devotion to the logical fallacy. We all have our cross to bear, so don't go full-emo just yet.


----------



## sweet (Oct 26, 2013)

The Von Matrices said:


> I'm not agreeing with the OP, but your memory of the GTX 580/7970 comparison (at launch) is incorrect
> 
> GTX 580: $500
> HD 7970: $550
> ...



The 7970 at launch only had a 10% lead due to driver problems, but now the gap is at least 40%. AMD cards usually take time to mature, and they are beasts when they reach their prime.


----------



## HTC (Oct 26, 2013)

sweet said:


> 7970 at launch only had 10% lead due to the driver problem, *but now the gap is at least 40%*. AMD cards usually take time to mature, and they are beasts when they reach their prime.



According to W1zzard's graph, and assuming I'm not screwing up my math, the difference went from 8.6% to 20.9%: a huge increase, yes, but not 40%.


----------



## mastershake575 (Oct 26, 2013)

N3M3515 said:


> +1
> This is exactly what i've been saying all along, only people choose to ignore it.


Well, it's because the quote won't hold weight once the price drops roll around. The 780 Ti is taking the $650 crown while the third-party 780 is dropping to $550-575, so why in the hell would anyone pay more money than they have to? (Logically they wouldn't, which is why no one is acknowledging the quote.)

Nobody (unless they're extremely uninformed) is saying "I'm going to pay $75 more than the 290X for the third-party 780" when both price drops and a better card are 3 weeks away (I'm an AMD card owner and even I know that's insane).


----------



## DannibusX (Oct 26, 2013)

hahaha, you people are ridiculous!

Wait, wait, wait.  Did they use the 9.3 Catalyst drivers for benching?????


----------



## HammerON (Oct 26, 2013)

I think it is time to give a warning to those posting in this thread. Please do not argue and put others down. Agree to disagree and move on.
Act civil and we will all get along


----------



## wolf (Oct 26, 2013)

*Like I said this thread just keeps getting better and better! *



Am* said:


> Thanks for the good laugh, ya crazy Nvidia zealot, but blah blah blah
> 
> Please feel free to flame me with your predictable "AMD fanboy" comments though, despite blah blah blah


----------



## qubit (Oct 26, 2013)

The Von Matrices said:


> I think that a lot of people see Titan and think that since R9 290X is half the price you can accept its flaws.  What I argue is that R9 290X is still a $550 card, which is a LOT of money.  You shouldn't have to put up with this when you spend that much money.  NVidia did AMD a favor, because if all that existed was the $650 GTX 780 and there was no comparison to the $1000 Titan, I think the conclusions would be much different.



+1, exactly. It should be damned near perfect at this price. One thing that really gets me is why they still use that noisy impeller first used on the 2900 six years ago. FFS AMD, can't you make a better one? NVIDIA has, and it's a very cheap part too, so what gives?

On most NVIDIA cards, you can actually run the fan at 100% and, while fairly loud, it's not overly objectionable. AMD cards, however, become objectionable at much lower speeds, which is ridiculous. And to anyone who tells me to just buy an aftermarket cooler: that's a non-argument, because someone shouldn't be expected to spend more money, void their warranty, potentially break their card and be inconvenienced just because AMD won't put a proper cooler on their card, i.e. compensating for AMD at the customer's own expense.



MxPhenom 216 said:


> I disagree. No piece of computer hardware should ever get a 10. There will always be some sort of flaw.



Agreed. Perhaps 9.8 or 9.9 should be the maximum for hardware that does literally _everything_ superlatively. For example, imagine a graphics card that was three times faster than a Titan, used 2/3 the power, made very little noise even when totally maxed out, used only one slot, had perfect drivers (including "perfect" multicard scaling) and only cost $200. Something as extreme as that would be worth such a score. Alas, we can only dream, lol.



mastershake575 said:


> As a current AMD card owner, even I agree
> 
> Don't get me wrong, the 290x is fast and the price is good but it isn't the 780 killer that most people hyped it as (hence why people are comparing it to the Titan instead of the 780).
> 
> ...



Agreed. It's beginning to look like we're hitting the limits of what 28nm can do within a commercially viable power and heat envelope isn't it?

I'd love to see what an ungimped GK110 AND 290x could do when put into a special board that can supply all the juice the GPUs need, overclocked as high as possible, along with high powered cooling to prevent throttling even under the likes of Furmark. I'll bet you might see a 60-80% performance improvement, perhaps even 100% if you're really lucky. Of course, a card like that would be for demo purposes only and not be commercially viable, alas. It must be quite frustrating for a GPU designer to know what their chip can really do, but be forced to constrain them in order to sell them.


----------



## Fourstaff (Oct 26, 2013)

qubit said:


> Agreed. Perhaps 9.8 or 9.9 should be the maximum for hardware that does literally _everything_ superlatively. For example, imagine a graphics card that was three times faster than a Titan, used 2/3 the power, made very little noise even when totally maxed out, used only one slot, had perfect drivers (including "perfect" multicard scaling) and only cost $200. Something as extreme as that would be worth such a score. Alas, we can only dream, lol.



In 10 years time, we should be seeing cards like this with current extrapolation.


----------



## qubit (Oct 26, 2013)

Fourstaff said:


> In 10 years time, we should be seeing cards like this with current extrapolation.



I'm gonna hold you to that.


----------



## TheMailMan78 (Oct 26, 2013)

It's funny, I see a lot of the old dogs on TPU who are not impressed with this GPU and the newer guys are all like "OMGBBQ MUST SELL MY TITAN!". Just relax guys. Sit back and really, REALLY look at the big picture.


----------



## qubit (Oct 26, 2013)

TheMailMan78 said:


> Its funny I see a lot of the old dogs on TPU who are not impressed with this GPU and the newer guys are all like "OMGBBQ MUST SELL MY TITAN!". Just relax guys. Sit back and really, REALLY look at the big picture.



Good point and one I think you should elaborate on. 

I reckon the old dogs like me are not impressed, because we have more experience with building PCs and the kind of hardware that should go into a good one and the expectations of what it should deliver.


----------



## TheoneandonlyMrK (Oct 26, 2013)

qubit said:


> Good point and one I think you should elaborate on.
> 
> I reckon the old dogs like me are not impressed, because we have more experience with building PCs and the kind of hardware that should go into a good one and the expectations of what it should deliver.



I think you get more realistic and generally knowledgeable after buying so many of these damned graphics cards. Ironically, most on here think I'm an AMD kinda guy, but I'm just a skint member... well, not skint, but like many I prioritise PC upgrades as a commodity and only buy what I think I need, then spend the spare on beer and women


----------



## GSG-9 (Oct 26, 2013)

theoneandonlymrk said:


> I think you get more realistic and generally knowledgeable after buying so many of these damned graphics cards. Ironically, most on here think I'm an AMD kinda guy, but I'm just a skint member... well, not skint, but like many I prioritise PC upgrades as a commodity and only buy what I think I need, then spend the spare on beer and women



This. Upgrade cycles man.


----------



## wolf (Oct 26, 2013)

TheMailMan78 said:


> Its funny I see a lot of the old dogs on TPU who are not impressed with this GPU and the newer guys are all like "OMGBBQ MUST SELL MY TITAN!". Just relax guys. Sit back and really, REALLY look at the big picture.



I was thinking something close to this the other day when for the first time in years I went to a 500+ people LAN, makes me feel old, which I guess in comparison I am. Priorities change, motorcycles get bought, women need jewellery..

I had almost forgotten how everyone on the internet seems to be a full bottle industry expert because they built a current PC which (for them) is the best.

I hear this argument about mac vs pc, iphone vs android, it's the same song, just on a different day.


----------



## TheMailMan78 (Oct 26, 2013)

qubit said:


> Good point and one I think you should elaborate on.
> 
> I reckon the old dogs like me are not impressed, because we have more experience with building PCs and the kind of hardware that should go into a good one and the expectations of what it should deliver.



It's simple. Most games today, with the exception of maybe Battlefield 3/4, Crysis and Metro, can get by with a 660 Ti for 230 bucks. You can get a 670 for $270, and that will run Battlefield well and Crysis and Metro decently on medium settings. Which to me makes the 780 and the 290 vastly overpriced.

Let's say you buy two 670s for $270 each new, which brings me to $540. Top that off with the duplicate games they will come with in bundles. Selling one game brings me to $500 easy. If I sold both games it would bring me well under $500, maybe around $460 at $40 bucks per game, for two 670s when it's all said and done. That's far better performance with an older-generation GPU for less money.

The 290 should really cost about $420 bucks, maybe $450. However, NVIDIA ran the market segment up so high, with the 780 being the top dog for so many months with the stupid $600 price tag, that $550 seems cheap now. It's not. It's stupidly overpriced.

This, folks, is how marketing and a non-informed public keep overpaying for stuff. Good for companies. Bad for consumers.


----------



## Fourstaff (Oct 26, 2013)

TheMailMan78 said:


> Most games today with the exception of maybe Battlefield 3/4, Crysis and Metro can get by with a 660ti for 230 bucks.



As a 660 Ti user I concur. That said, it does annoy me slightly because I have to tone down the settings for Crysis 3. I will need to tone down a lot of things once the next-gen console games come in in full force in about a year's time, but that will be a story for another day.


----------



## TheoneandonlyMrK (Oct 26, 2013)

TheMailMan78 said:


> Its simple. Most games today with the exception of maybe Battlefield 3/4, Crysis and Metro can get by with a 660ti for 230 bucks. You can get a 670 for $270 and that will run Battlefield well and Crysis and Metro decent on medium settings. Which to me makes the 780 and the 290 vastly over priced.
> 
> Lets say you buy two 670s for $270 each new which brings me to $540. Top that off with the duplicate games they will come with in bundles. Selling one game that will bring it me to $500 easy. If I sold both games it would bring me to well under $500. Maybe around $460 at $40 bucks per game for two 670's when its all said and done. That's far better performance with an older generation GPU for less money.
> 
> ...



I think both companies definitely price high on release to allow room for price drops later without hurting themselves, and many are blinded by the new-toy element. And I think, no, I actually do buy cards for what I think they're actually worth, even if I have to wait until it's a still-useful last-gen product to get my ultra-ish game fix.

The price-versus-my-opinion-of-worth mentality is the reason I have given NVIDIA some stick lately. I don't like Intel's pricing either; despite both making great products, I'm not buying until I must or until they change.


----------



## purecain (Oct 26, 2013)

I bought an R9 290X and I'm happy with the performance it gives me now @ 850-1000 MHz...

games feel more fluid than they did with the 7970...imo

AMD have a GPU here that scales like mad with core speed, so as far as benchmarking goes this is an exciting card... I can't wait to get my hands on a top-tier aftermarket cooler....

fitting an aftermarket cooler for your own high end gpu is hardware heaven...imo

so im actually looking forward to it but then im a hardware junky...


----------



## HTC (Oct 26, 2013)

purecain said:


> I bought an R9 290x and im happy with the performance it gives me now@850-1000mhz...
> 
> games feel more fluid than they did with the 7970...imo
> 
> ...



Congratulations on your purchase.

W1zzard's review shows the card throttling by as much as ~400 MHz.

If at all possible, could you try to underclock the base clock to... say... 900 MHz? The reason I ask is I'm wondering whether, with less throttling, the card performs better than at stock clocks using the same cooler, or whether the excessive throttling doesn't hamper performance in the slightest.

EDIT

Dunno if this is possible to test.


----------



## broken pixel (Oct 26, 2013)

TheMailMan78 said:


> Its simple. Most games today with the exception of maybe Battlefield 3/4, Crysis and Metro can get by with a 660ti for 230 bucks. You can get a 670 for $270 and that will run Battlefield well and Crysis and Metro decent on medium settings. Which to me makes the 780 and the 290 vastly over priced.
> 
> Lets say you buy two 670s for $270 each new which brings me to $540. Top that off with the duplicate games they will come with in bundles. Selling one game that will bring it me to $500 easy. If I sold both games it would bring me to well under $500. Maybe around $460 at $40 bucks per game for two 670's when its all said and done. That's far better performance with an older generation GPU for less money.
> 
> ...



1080p is so old-fashioned; try using a 660 Ti or 670 on a 1440p panel over 60 Hz with moderate eye-candy filtering.


----------



## Fourstaff (Oct 26, 2013)

broken pixel said:


> 1080p is so old fashion, try using a 660ti or 670 on a 1440p panel over 60Hz with moderate eye candy filtering.



Well 1440p is still very much enthusiast territory, your average gamer is still on 1080p.


----------



## Ralfies (Oct 26, 2013)

broken pixel said:


> 1080p is so old fashion, try using a 660ti or 670 on a 1440p panel over 60Hz with moderate eye candy filtering.



As someone who games at 1440p with a 670, 60Hz+ is very achievable on the vast majority of games by turning only a few settings down - usually just AA. That being said, I'm an enthusiast and of course need moar! I sure hope non-reference 290's come out soon.


----------



## Steevo (Oct 26, 2013)

qubit said:


> On most NVIDIA cards, you can actually run the fan at 100% and while fairly loud, it's not overly objectionable. However, AMD cards are overly objectionable at much lower speeds, which is ridiculous. And to anyone who tells me to just buy an aftermarket cooler, that's a non-argument, because someone shouldn't be expected to spend more money, void their warranty, potentially break their card and be inconvenienced just because AMD won't put a proper cooler on their card ie compensating for AMD at the customers' own expense.
> 
> 
> 
> ...



NVIDIA's GK110 graphics processor was first introduced as a Tesla-only product to power demanding GPU compute applications. NVIDIA has now released it as a GeForce GPU too. It uses 7.1 billion transistors on a die size that we measured to be 561 mm².

AMD's Hawaii graphics processor uses the GCN shader architecture. It is produced on a 28 nm process at TSMC Taiwan, with 4.31 billion transistors on a 438 mm² die.

GK110's die is ~28% larger than Hawaii's, with the same performance +/- 3%, but Hawaii requires 40W more power; the thermal consequences follow from that. There is only so much heat copper can dissipate, and a higher fan speed is going to be required because the die is smaller and the heat output more concentrated: the shaders are more efficiently utilized in the 290 than in the Titan. I don't know how else to explain it to you. I'm sure if they wanted to create a vapor chamber, alter the cooler and increase the price to $750, they would have, but they know who is buying the card: enthusiasts with liquid cooling, or overclockers who would rather get a cheaper card and spend the extra $100 on a custom cooler.
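The heat-concentration point can be put in rough numbers. This is only a sketch: the 250 W GPU power figure below is an assumption chosen for illustration (treating both chips as dissipating comparable total power), not a measured value from the review.

```python
# Rough heat-flux comparison between Hawaii and GK110 dies.
# Assumption: both GPUs dissipate roughly the same total power (250 W here,
# an illustrative figure, not a measurement).

def watts_per_mm2(power_w: float, die_mm2: float) -> float:
    """Average heat flux across the die surface."""
    return power_w / die_mm2

hawaii = watts_per_mm2(250, 438)   # Hawaii: 438 mm^2 die
gk110 = watts_per_mm2(250, 561)    # GK110: 561 mm^2 die

# At equal power, the smaller die concentrates ~28% more heat per mm^2,
# which is why it needs more airflow (or a better cooler) at the same TDP.
print(f"Hawaii: {hawaii:.3f} W/mm^2, GK110: {gk110:.3f} W/mm^2")
print(f"ratio: {hawaii / gk110:.3f}")
```

Same total heat, smaller surface: the cooler has to move heat off a denser hotspot, which is the crux of Steevo's argument.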


If you actually look at the cooler, it's the exact same design as NVIDIA uses, except the Titan has a vapor chamber.


So would you rather buy the cheaper card and be able to customize it, or buy the more expensive card like everyone else?


----------



## broken pixel (Oct 26, 2013)

Buy newer, cheaper and modify. ^


----------



## Fluffmeister (Oct 26, 2013)

Frick said:


> Guys. It is loud and hot. Whats to discuss?



This cracked me up.


----------



## HTC (Oct 26, 2013)

Fluffmeister said:


> This cracked me up.



I think i saw a bit of throttling there ...


----------



## purecain (Oct 26, 2013)

HTC said:


> Congratulations on your purchase.
> 
> W1zzard's review shows the card throttling as much as ~400 MHz
> 
> ...



as it happens, all I need to do with my GPU is set the fan to 70 percent (it will then run up to, but not above, 70%, without sticking there).

with everything else stock, the card runs Valley windowed with GPU-Z open and never throttles... but that's only at 1000 MHz.

when I run 1100 MHz+ with the fan at max 70%, you can see the card throttling, but it never goes lower than 1000 MHz when overclocked... or at least not that I've noticed. when left at 1000 MHz it did throttle down to 850 MHz at one point...

I also set the highest temp for the GPU to run at 80°C, not 95°C...

I don't like anything in my machine running at 95°C...


----------



## qubit (Oct 26, 2013)

Fluffmeister said:


> This cracked me up.



Dammit that's funny!

Nice bit of CGI work there with the graphics card, too. Notice how they inverted it, so it's going left to right instead.



Steevo said:


> NVIDIA's GK110 graphics processor was first introduced as a Tesla-only product to power demanding GPU compute applications. NVIDIA has now released it as a GeForce GPU too. It uses 7.1 billion transistors on a die size that we measured to be 561 mm².
> 
> AMD's Hawaii graphics processor uses the GCN shader architecture. It is produced on a 28 nm process at TSMC Taiwan, with 4.31 billion transistors on a 438 mm² die.
> 
> ...



Well, the GPU is actually 6.2b, not 4b transistors, according to the review. That's fewer than GK110 on the same 28nm process, yet it uses more power. I don't get it. Is the onboard power regulator that inefficient, perhaps? I don't know.

Also, there's no excuse for putting a crap cooler on there. The GTX 770, a lower-end card than the 290X, uses the exact same vapour chamber cooler as the Titan, yet doesn't cost the earth.

In short, it looks like AMD designed a GPU with bags of performance (I'm thinking of W1zzard's comments about scaling) and then gimped it. Why the hell they'd do this I don't know. They'd make more money if they did it right. Did the beancounters strike, perhaps?

Check out the start of this O3D review and you'll see what I mean about the cooler.

[yt]-lZ3Z6Niir4[/yt]


----------



## HumanSmoke (Oct 26, 2013)

qubit said:


> Well, the GPU is actually 6b not 4b transistors according to the review. It's less than GK110 on the same 28nm process, yet uses more power. I don't get it. Is the onboard power regulator that inefficient, perhaps? I don't know


Might be a case of transistor density. Hawaii at 6.2bn transistors and 438mm² is ~14.2 million/mm² against GK110's 7.08bn and 561mm² (~12.6 million/mm²), and that comparison might actually be worse factoring out the lower-power-demand uncore parts of the chips (I/O, memory controllers etc.) given the difference in bus sizes and GDDR5 controllers.
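The density figures work out from the transistor counts and die sizes cited above; a quick arithmetic sketch:

```python
# Transistor density = transistors / die area, using the figures cited:
# Hawaii: 6.2bn transistors on 438 mm^2; GK110: 7.08bn on 561 mm^2.
hawaii = {"transistors_bn": 6.2, "die_mm2": 438}
gk110 = {"transistors_bn": 7.08, "die_mm2": 561}

def density_m_per_mm2(chip: dict) -> float:
    """Millions of transistors per mm^2."""
    return chip["transistors_bn"] * 1000 / chip["die_mm2"]

print(f"Hawaii: {density_m_per_mm2(hawaii):.2f} M/mm^2")  # ~14.16
print(f"GK110:  {density_m_per_mm2(gk110):.2f} M/mm^2")   # ~12.62
```

So Hawaii packs roughly 12% more transistors into each mm² than GK110 on the same 28 nm process, which supports the point about concentrated power density.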


qubit said:


> Also, there's no excuse for putting a crap cooler on there.


None whatsoever. AMD would have had a slam dunk on their hands if not for reservations over the power dissipation from review sites. As it is, they've basically engineered a fault into a product that otherwise had no downside. The GPU looks power hungry, and you're still limited to a 2-slot design for OEM contracts, but I'm pretty certain a cooling design house could have come up with a more elegant and effective option. Even if it raised the price by $20-30, it would have paid for itself in more positive reviews.


----------



## radrok (Oct 27, 2013)

HumanSmoke said:


> Might be a case of transistor density. Hawaii at 6.2bn trans and 438mm² is 14.55 million/mm² against GK110's 7.08bn and 561mm²  (12.62 million/mm²), and that comparison might actually be worse factoring out the lower power demand uncore parts of the chips ( I/O, memory controllers etc.) given the difference in bus sizes and GDDR5 controllers.
> 
> None whatsoever. AMD would have had a slam dunk on their hands if not for reservations over the power dissipation from review sites. As it is they've basically engineered a fault into a product that had no downside. The GPU looks power hungry, and you're still limited to a 2-slot design for OEM contracts, but I'm pretty certain a cooling design house could have come up with a more elegant and effective option. Even if it raised the price by $20-30, it still would have paid for itself in more positive reviews.



Consider that, usually, custom AIB-cooled graphics cards on reference boards cost like $10 or $20 more, so AMD/ATI really has no excuse for continuing with these crappy coolers.


----------



## sweet (Oct 27, 2013)

radrok said:


> Consider that, usually, custom AIB cooled graphics cards on reference boards cost like 10$ or 20$ more so AMD/ATI really has no excuses continuing with these crappy coolers.



How about the crappy VRM circuit of Titan/780


----------



## radrok (Oct 27, 2013)

sweet said:


> How about the crappy VRM circuit of Titan/780




Completely unrelated argument you bring up. 

Titan and 780 power delivery circuitry is perfectly fine at stock and for moderate overclocks. 

We are talking about a stock card that is not reaching its own stock clocks.


----------



## broken pixel (Oct 27, 2013)

If I had a ref 290x I would remove the blower casing and mount fans all over the heatsink, maybe that would help out?


----------



## manofthem (Oct 27, 2013)

broken pixel said:


> If I had a ref 290x I would remove the blower casing and mount fans all over the heatsink, maybe that would help out?



If I had a ref 290x, I would remove the blower casing and mount a waterblock over that beyotch and love life


----------



## broken pixel (Oct 27, 2013)

manofthem said:


> If I had a ref 290x, I would remove the blower casing and mount a waterblock over that beyotch and love life



Well it seems EK only has a few 290x back plates in stock & they are all out of 290x water blocks.


----------



## manofthem (Oct 27, 2013)

broken pixel said:


> Well it seems EK only has a few 290x back plates in stock & they are all out of 290x water blocks.



That makes me haz a sad  for now. I'll have to wait before purchasing for $ reason anyway


----------



## Xzibit (Oct 27, 2013)

radrok said:


> Completely unrelated argument you bring up.
> 
> Titan and 780 power delivery circuitry is perfectly fine at stock and for moderate overclocks.
> 
> *We are talking about a stock card that is not reaching its own stock clocks*.



Might want to read up on how the clocks work.

AMD has a ceiling of up to 1 GHz.


----------



## radrok (Oct 27, 2013)

Xzibit said:


> Might want to read up on how the clocks work
> 
> AMD has a ceiling of up-to 1ghz
> 
> http://cdn.pcper.com/files/imagecache/article_max_width/review/2013-10-23/specs2.jpg



Grasping at straws much?


----------



## broken pixel (Oct 27, 2013)

On the chart above, under the memory type / data rate section, it appears the 280X & 270X do up to 6-5.6 Gbps compared to the 290X, which is up to 5 Gbps?


----------



## Xzibit (Oct 27, 2013)

radrok said:


> Grasping at straws much?



Not at all.

It seemed you were not able to comprehend what W1zzard's graphs were depicting in his PowerTune analysis.

I just provided an AMD reference for you, since you seem to be under the impression AMD advertises a set clock.


----------



## g101 (Oct 27, 2013)

The Von Matrices said:


> *NVidia is not going to reduce the price on Titan.  Titan is a compute card that AMD can't match in DP floating point performance, and the people who can use compute capabilities are the only people who should have been buying it.*  It was a fluke that Titan ended up being a high end gaming card as well.  That compute niche is restored with high end gaming cards like the R9 290X and will further be reinforced with the GTX 780Ti.  There still is no DP compute card that can compete with Titan for the price.
> QUOTE]
> 
> Although someone probably already called you out on this: that's 100% BS. Please be silent, little boy, you know nothing of compute. Not only did the 7970 come incredibly close to the titan's double precision at 925mhz core, its single precision was higher, even the 7950 was nearly as powerful for both precisions. It's already known that the r290x broke the 5 tf barrier, I guess you don't grasp how this almost certainly equates to the double precision performance... Please, stop wasting all your time on forums pretending like you have the slightest clue about GPGPU implementations or architectures. When implemented in real instances, GCN already showed greater real world floating performance than titan...Completely out of your depth here, kid.
> ...


----------



## sweet (Oct 27, 2013)

^ I'm still wondering why people believe that Titan is a compute card :rofl:
Face the truth: only Quadro/Tesla cards benefit from their specific drivers, not the GeForce. Titan has better compute power than the 680, but it will never reach the level of its brothers in the Quadro/Tesla lineup. In fact, when it comes to raw compute power, no single-GPU card can beat the 290X now.



broken pixel said:


> On the chart above under the section memory type, data rate it appears the 280x & 270x do up to 6 - 5.6 Gbps com paired to the 290x which is up to 5 Gbps?



It is the data rate of the VRAM: the 290X's default memory clock is 1250 MHz, which translates to 5 Gbps effective. The memory bus of the 290X, however, is 512-bit instead of the 280X's 384-bit, therefore the memory bandwidth of the 290X is 320 GB/s.
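The arithmetic behind those numbers, as a quick sketch (GDDR5 transfers four bits per pin per memory-clock cycle, so a 1250 MHz clock yields a 5 Gbps effective data rate):

```python
# R9 290X memory bandwidth from the figures above.
mem_clock_mhz = 1250                          # default GDDR5 memory clock
data_rate_gbps = mem_clock_mhz * 4 / 1000     # GDDR5: 4 bits/pin/clock -> 5 Gbps
bus_width_bits = 512                          # 290X memory bus

# Total bandwidth: per-pin rate * bus width, divided by 8 bits per byte.
bandwidth_gbs = data_rate_gbps * bus_width_bits / 8
print(f"{bandwidth_gbs:.0f} GB/s")  # 320 GB/s
```

For comparison, the same formula with the 280X's 6 Gbps on a 384-bit bus gives 288 GB/s, which is why the 290X comes out ahead despite the lower per-pin rate.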


----------



## HumanSmoke (Oct 27, 2013)

radrok said:


> Completely unrelated argument you bring up.


But it does serve to break up a reasoned discussion by rebooting the Green vs Red trolling 


g101 said:


> The Von Matrices said:
> 
> 
> > *NVidia is not going to reduce the price on Titan.  Titan is a compute card that AMD can't match in DP floating point performance, and the people who can use compute capabilities are the only people who should have been buying it.*  It was a fluke that Titan ended up being a high end gaming card as well.  That compute niche is restored with high end gaming cards like the R9 290X and will further be reinforced with the GTX 780Ti.  There still is no DP compute card that can compete with Titan for the price.
> ...


1. The 7970 isn't the R9-290X
2. You need to add an "["


g101 said:


> It's already known that the r290x broke the 5 tf barrier


Single precision. I believe the statement you're ranting over specifically mentions double precision


g101 said:


> , I guess you don't grasp how this almost certainly equates to the double precision performance... Please, stop wasting all your time on forums pretending like you have the slightest clue about GPGPU implementations or architectures.


I'm guessing the OP knows a great deal more than you do - quelle surprise. The R9 290X has its FP64 rate capped at 1:8 of single precision - probably due to the power demand of FP64 calculation.





The R9-290X has a lower FP64 value than the HD 7970 (0.7 TFlops vs 1.18 for the 7970)
Maybe you can lay off the insults. They don't further the discussion, and aren't particularly relevant given the actual facts.


----------



## Solaris17 (Oct 27, 2013)

HumanSmoke said:


> But it does serve to break up a reasoned discussion by rebooting the Green vs Red trolling
> 
> 1. The 7970 isn't the R9-290X
> 2. You need to add an "["
> ...



I'm impressed that you kept it together; I'm more impressed that you actually backed it with statistical facts instead of just politely shrugging him off. I think this is a good example of why I've stayed here so long.


----------



## HammerON (Oct 27, 2013)

Here is another warning to not insult others and to be civil.
Infractions handed out.


----------



## Steevo (Oct 27, 2013)

HumanSmoke said:


> But it does serve to break up a reasoned discussion by rebooting the Green vs Red trolling
> 
> 1. The 7970 isn't the R9-290X
> 2. You need to add an "["
> ...



It's probably a BIOS lock that could be cracked. FP64 is very demanding and could very well push the card past the thermal capacity of most coolers.

I want to see what happens with water or extreme cooling.


----------



## HumanSmoke (Oct 27, 2013)

Steevo said:


> Its probably a BIOS lock that could be cracked. It is very demanding and could very well push it past the thermal capacity of most coolers.


The FirePro version features a 1:4 FP64 rate by all accounts, so it could be possible from that angle, although the FirePro will clock lower to accommodate the overhead.
FWIW, GPU input power on these reference cards is limited to 208 watts by PowerTune (300W total with VRAM and regulator efficiency taken into account).
Couldn't find an English-language site with the PowerTune parameters offhand, but here are PCGH's:



> Powertune behaves like this and in this order. The first point is mandatory.
> 1. Do not supply the GPU with more than 208 watts input
> 
> 2. try to hit 1 GHz, but
> ...
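Read loosely, the first two rules quoted above amount to something like the sketch below. All names are invented and the proportional scaling is a guess for illustration; the real PowerTune algorithm lives in firmware and the quote is truncated, so only the power cap and the 1 GHz target are represented.

```python
# Loose sketch of the first two quoted PowerTune rules (invented names):
# 1. never supply the GPU with more than 208 W input (mandatory)
# 2. try to hit 1 GHz
def powertune_clock(requested_mhz: float, predicted_gpu_watts: float,
                    max_gpu_watts: float = 208, target_mhz: float = 1000) -> float:
    """Return the clock the sketch would allow for one control step."""
    # Rule 2: aim for the 1 GHz target, never above it.
    clock = min(requested_mhz, target_mhz)
    # Rule 1 (mandatory): if the predicted power exceeds the cap, scale the
    # clock down proportionally until the cap is respected.
    if predicted_gpu_watts > max_gpu_watts:
        clock = clock * max_gpu_watts / predicted_gpu_watts
    return clock
```

This also matches W1zzard's PowerTune analysis: the "up to 1 GHz" figure is a ceiling, and under a heavy load the power (and temperature) rules pull the clock below it.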


----------



## Tatty_One (Oct 27, 2013)

This thread is turning into epicness and may well yet sit in the TPU hall of fame; the similarities between this and the old 2900 XT versus 8800 GTS comparison are uncanny. I think many of you need to stop and pause for a minute, because some seem to have an inability to argue or disagree with any form of maturity. Try to get some perspective on this: it's a GPU, not a life support system. If I were a guest coming here to get some opinion of a new GPU offering, I would at the very least be bemused at some of the reaction, and at the very worst think I had accidentally stumbled into a kindergarten squabble!

In reality.... only time will tell with the 290X. In its present reference form you could argue that it is far from perfection; let's check back in 3 months and see how opinion is divided then.


----------



## qubit (Oct 27, 2013)

Tatty_One said:


> This thread is turning into Epicness and may well yet sit in the TPU hall of fame, the similarities between this and an old 2900xt versus 8800GTS comparison is uncanny, I think many of you need to stop and pause for a minute because some seem to have an inability to argue or disagree with any form of maturity, try and get some perspective on this, it's a GPU not a life support system, if I was a guest coming here to get some opinion of a new GPU offering I would at the very least be bemused at some of the reaction and at the very worst think I had accidentally stumbled over a kindergarten squabble!
> 
> In reality.... only time will tell with the 290X, in it's present reference form you could argue that it is far from perfection, lets check back in 3 months and see how opinion is divided then.



A well reasoned post. No, that's not right. :shadedshu

For some odd reason, comparisons with new CPUs and GPUs always lead to situations like this. I suspect at least part of the reason is denialism by people defending the underdog, when really it's the underdog because it's simply not as good. Accept it, don't buy the inferior item, and move on. Simple. I did that with my gear and I'm very happy with it as a result.


----------



## Aquinus (Oct 27, 2013)

qubit said:


> Accept it, don't buy the inferior item and move on. Simple. I did that with my gear and I'm very happy with it as a result.



I would hardly call either product inferior...

The sooner everyone realizes that the GTX 780, Titan, and R9 290X are not bad GPUs, the better off everyone will be. The real question is how much you are willing to invest for that experience, not which card is better than the other.


----------



## qubit (Oct 27, 2013)

Aquinus said:


> I would hardly call either product inferior...
> 
> As soon as everyone realizes that the GTX 780, Titan, and R9 290X are not bad GPUs, the better off everyone will be. The real question is how much are you willing to invest for that experience, not which card is better than the other.



The only real letdown with the 290X seems to be that awful cooler; even CrossFire works properly now. Did you check out that overclock3d video I posted above? Tom Logan is absolutely gutted about it, going on ad nauseam, and he's much more expert with these things than me, plus he has tested it too. He really wanted it to be an NVIDIA killer.

It's an expensive product and should work properly out of the box. Expecting someone to mod the cooler to fix AMD's shortcomings is completely unreasonable. Logan ended up recommending the MSI GTX 780 Twin Frozr OC Gaming with the custom cooler, because it ran quieter, faster and overclocked like a banshee, while being only £30 more expensive (his words).

I dunno why AMD design an inherently good product (that GPU really does have a lot to give) along with a decent board, but then hamstring it like this. It's very frustrating. I want to see AMD kick NVIDIA in the nuts with an _all round_ excellent product and sell it at a good price. Then we'd get proper competition, better prices, and better products from both sides.


----------



## newconroer (Oct 27, 2013)

Price /performance vs Titan, I'm impressed.
Price /performance vs a couple of used 680s(or 670s) in SLI? Not impressed at all.

I said some time ago that Fermi's power/voltage architecture and its boost system make it an extremely friendly and easy product to manage as well as tweak.

Combined with the 98% scalability of SLI, among other amenities, there's just no good reason to fork over even more money for a new single card that's slower and produces no extenuating bonuses.

I'm glad that AMD has come to market with something competitive, but they took too long to produce something that's going to be trumped in half a year.


----------



## Frick (Oct 27, 2013)

newconroer said:


> Price /performance vs Titan, I'm impressed.
> Price /performance vs a couple of used 680s(or 670s) in SLI? Not impressed at all.
> 
> I said some time ago that Fermi's power/voltage architecture and it's boost system, make it an extremely friendly and easy product to manage as well as tweak.
> ...



So all in all, a normal release.


----------



## newconroer (Oct 27, 2013)

Frick said:


> So all in all, a normal release.



It deserves more than "normal" because it has accomplished the (I assume) goal of taking the dominant single-card crown away from the Titan at an impressive price difference.

The real argument is that, like qubit mentioned, the 290X is not a well-rounded product. What it gains in price/performance it loses in heat and power consumption, along with shocking release drivers and nothing miscellaneously beneficial such as CrossFire working in windowed mode or a particularly easy clocking interface. Effectively it's guilty of what older GTX cards were.


The problem AMD have created for themselves is that by chasing the crown, they've forgotten that the 290X is largely irrelevant to the market for anything but 1600p resolution or higher. 1080p and down can be handled by a single Fermi or AMD 7-series card without hassle.
Which is why I harped on about Fermi (and Radeon 7XXX cards to some degree). As an overall product it is significantly better and less costly. Put two together and you've just eradicated the need for a 290X.


----------



## broken pixel (Oct 27, 2013)

Go check out the R9 290X owners' club forum over @ OCN for a more civil discussion from people who actually own this GPU.


----------



## Fluffmeister (Oct 27, 2013)

broken pixel said:


> Go check out the R290x owners club forum over @ OCN for a more civil discussion from people who actually own this GPU.



Read this, it is spookily similar:

http://www.techpowerup.com/forums/showthread.php?t=117929


----------



## Solaris17 (Oct 27, 2013)

I don't really see the massive problem with the cooler either. I mean, I will finally jump NVIDIA's ship and grab a 290X when I eventually have the money. The problem seems to be that everyone is trying to push Moore's law on thermal output: you can't have more powerful and less hot. You can argue that you can, I suppose, but I will dismiss it. This happened with the 2xx series (remember dual-monitor temps idling at 60C?) and the high-end 4xx series. This is a completely new architecture. This is not a revision. This is a completely new card. I assume next generation, or maybe even the next few revisions, be it AMD mandate or board partner, will use different components to bring the thermals down (maybe). But for a card of such power I really do not see the problem with the temperatures. It may heat up your room, it might get loud, it may even scare you a little bit, but this is not new for a release flagship. (GX2 anyone?)


----------



## HTC (Oct 27, 2013)

purecain said:


> as it happens, all I need to do with my gpu is set the fan to 70percent(it will then run up to but not above, not stick at 70%).
> 
> *with everything else stock, the card runs valley windowed with gpu-z open and never throttles... but that's only at 1000mhz.*
> 
> ...



That's not what I was interested in knowing: probably difficult if not impossible to test.


I'll try another approach to see if i can explain it better:

1 - take any benchmark you like that's able to push the card so that it throttles a lot @ stock settings (everything, fan included) and try to get the *average speed* of it throughout the test
2 - set the default speed of the card to the value you discovered in point #1
3 - the card will now throttle way less and, in theory, it should produce the same result as with everything @ stock

If throttling lots of times makes it slower than throttling a few times in the above scenario, then the fact that it throttles too much due to the shoddy cooler is actually hampering performance: that's what I want to know.


I know what I want to say but have difficulty putting it into words, sometimes.
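HTC's experiment can be sketched in a few lines. This is a hypothetical illustration, assuming you've logged the core clock once per second during the benchmark run (GPU-Z's sensor log can produce such data; the sample numbers below are made up):

```python
# Sketch of the throttling experiment described above (hypothetical data):
# 1) average the core clock over a throttling stock run,
# 2) set that average as a new fixed clock,
# 3) rerun and compare scores; if the fixed-clock run scores higher,
#    the act of throttling itself is costing performance.

def average_clock(clock_log_mhz):
    """Average core clock over a benchmark run (one sample per second)."""
    return sum(clock_log_mhz) / len(clock_log_mhz)

# Example: a run that bounces between 1000 MHz and the mid-800s
stock_run = [1000, 1000, 860, 850, 1000, 870, 1000, 850, 1000, 870]
target = average_clock(stock_run)
print(f"Set fixed clock to ~{target:.0f} MHz and rerun the benchmark")
```

The interesting comparison is then the benchmark score at that fixed clock versus the throttling stock run, since the average clock is the same in both cases.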


----------



## TheoneandonlyMrK (Oct 27, 2013)

Solaris17 said:


> I dont really see the massive problem with the cooler either. I mean i will finally jump nvidias ship and grab a 290X when I eventually have the money. The problem seems to be that everyone is trying to push moors law thermal output you cant have more powerful and less hot you can argue that you can I suppose but I will dismiss it. This happened with the 2xx series (remember dual monitor temps idle at 60C?) and the highend 4xx series. This is a completely new architecture. This is not a revision. This is a completely new card. I assume next generation or maybe even the next few revisions be it AMD mandate or board partner will use diffirent components to bring the thermals down. (maybe) but for a card of such power I really do not see the problem with the temperatures it may heat up your room it might get loud it may even scare you a little bit. but this is not new for a release flag ship. (GX2 anyone?)



I think it's quite clear that AMD have a very good chip here, like GK110 for NVIDIA, that would scale very well with a node drop. I think, like Tahiti, this is going to run and run, and they will have quite a beast for a bit.
I'm very glad NVIDIA released GK110; it's spurred some effort.


----------



## erocker (Oct 27, 2013)

I'm wondering if, and if so when, there will be AIB designs. I wouldn't mind something with upgraded PCB/components, and preferably something from EK like they did for the DirectCU II series.


----------



## HumanSmoke (Oct 27, 2013)

erocker said:


> I'm wondering if and if so, when there will be AIB designs? I wouldn't mind something with an "upgraded PCB/components" and preferably something from EK like they did for the Cu II series.



From a SweClockers article a couple of days ago (Sorry, this is a direct Google Translate so excuse the grammatical spaghetti)


> Sources SweClockers now state that AMD has not yet begun deliveries of Hawaii XT graphics processor to partner manufacturers. This means that it will take at least six weeks before the first bespoke models show up in the trade. This is also confirmed by AMD, which expects that these arrive by the end of the fourth quarter.


-----------------------------



Tatty_One said:


> In reality.... only time will tell with the 290X, in it's present reference form you could argue that it is far from perfection, lets check back in 3 months and see how opinion is divided then.


Three months??? In three months I predict the prams will be restocked with toys ready to be thrown out again in the Maxwell/Pirate Islands speculation threads. If NVIDIA bring out a GTX 780 Ti in the next few weeks (and let's face it, with the holiday season in sight, and AMD leaving the door open for NVIDIA to go nuts with clocks/power and still compare favourably with the 290X, it is a definite possibility) we're more or less back to the HD 7970 (non-GE)/GTX 680 days. Relative parity might be great for the consumer, but it certainly dampens the ardour of forum warriors.


----------



## MetalRacer (Oct 27, 2013)

erocker said:


> I'm wondering if and if so, when there will be AIB designs? I wouldn't mind something with an "upgraded PCB/components" and preferably something from EK like they did for the Cu II series.



Asus's product manager confirmed in this interview that they are working on a model with the new DirectCU cooler.

http://us.hardware.info/reviews/492...n-tour-2013-asus-new-amd-cards-and-the-future


----------



## Xzibit (Oct 27, 2013)

broken pixel said:


> Go check out the R290x owners club forum over @ OCN for a more civil discussion from people who actually own this GPU.



It's interesting.

Aside from a GPU-Z issue where it pushes the GPU to 100% load and causes problems for some:

R9 290X @ 1150 limit
GTX 780 @ 1300 boost
GTX TITAN @ 1200 boost
all seem to have similar benchmark numbers.

It will be interesting once they get better cooling or put them under water.


----------



## sweet (Oct 28, 2013)

Is there any Titan/780 that can touch this score?







http://kingpincooling.com/forum/showthread.php?t=2473&page=3


----------



## MxPhenom 216 (Oct 28, 2013)

sweet said:


> Is there any Titan/780 that can touch this score?
> 
> http://www.freeimagehosting.net/newuploads/hb92h.jpg
> 
> http://kingpincooling.com/forum/showthread.php?t=2473&page=3



Yep. KingPin's Titan.


----------



## N3M3515 (Oct 28, 2013)

MxPhenom 216 said:


> Yep. KingPins Titan
> 
> http://hwbot.org/image/1027068.jpg



Wow.........1800 core to score 50 points higher. Talk about clock for clock.

Edit: 1800 core on EVGA Precision and 1230 on GPU-Z?


----------



## sweet (Oct 28, 2013)

N3M3515 said:


> Wow.........1800 core to score 50 points higher. Talk about clock for clock
> 
> Edit: 1800 core on evga precision and 1230 on gpuz?



1230 on GPU-Z shows the base clock; it's not the real-time clock. The 1800 core on Precision shows the boost clock. The card may boost higher through Kepler boost, but maybe not in this case.


----------



## wolf (Oct 28, 2013)

sweet said:


> Is there any Titan/780 that can touch this score?





MxPhenom 216 said:


> Yep. KingPins Titan



Must admit I lol'd

I really just want the next generation of NVIDIA or AMD GPUs to come out asap so that this performance level goes down in price.

Because of NVIDIA's Intel-Extreme-Edition-like pricing, people think $550 is good value for a high-end card, and it's not!

$400-450 (€350) is already a lot to pay for a card and should bring with it a whole boatload of performance.


----------



## Frick (Oct 28, 2013)

wolf said:


> Must admit I lol'd
> 
> I really just want the next generation of Nvidia OR AMD GPU's to come out asap so that this performance level goes down in price.
> 
> ...



It's not a price hike. There have been cards in the past that launched at this price. $1 generally equals €1.


----------



## wolf (Oct 28, 2013)

When the competition doesn't have an answer for that level of performance, the company generally charges around the $1000 mark for it (Intel and NVIDIA spring to mind). But personally I find $550 to be too much for me to spend on a single card these days.

I just spent 3 years living in France and I definitely paid fewer euros than USD for my tech. I think the biggest reason that it evens out (depending on the items, how new they are, etc.) is that the same item is generally more expensive in France (tax/shipping?) than it is in the States.

Well, that was my experience anyway.


----------



## Frick (Oct 28, 2013)

wolf said:


> When the competition don't have an answer for that level of performance, the company generally charges around the $1000 mark for it (Intel and Nvidia spring to mind), But personally I find $550 to be too much for me to spend on a single card these days.
> 
> I just spent 3 years living in France and I definitely payed less euro's than USD for my tech. I think the biggest reason that it evens out (depending on the items, how new they are etc) is that the same item is generally more expensive in France (tax/shipping?) than it is in the states.
> 
> Well that was my experience anyway



Might not be exactly $1 = €1, but it's closer to that than to the actual exchange rate.

And as has been said again and again and again, AMD and Intel have both had $1000 parts. The slower GPUs are in a different price bracket (the X1950 Pro was €200 for instance; now it would have been ... €150 mayhap?) but the high end has at most times been about where we are now. There have been exceptions though.

You might not be willing to pay this, but that is a different topic. Me, I wouldn't spend more than €100 on a GPU these days, but that's me.


----------



## The Von Matrices (Oct 28, 2013)

NVIDIA just announced price cuts for the GTX 780 to $499, effective tomorrow. I'm sure there'll be an official press release later and it will be posted in TPU news.

This basically kills any reason to buy an R9 290X.


----------



## Fluffmeister (Oct 28, 2013)

The Von Matrices said:


> NVidia just announced price cuts for GTX 780 to $499 effective tomorrow.  I'm sure they'll be an official press release later and it will be posted in TPU news.
> 
> This basically kills any reason to buy a R9 290X.



The price drop is already in effect at Overclockers.co.uk, with some real nice 780s for around £400. Scan.co.uk too.


----------



## Mathragh (Oct 28, 2013)

Damn, NVIDIA must also be thinking the AMD R9 290X is the superior card if they price it under that one. Even more when you factor in the "superiority factor" that NVIDIA normally uses to ask a premium for a similar product.


----------



## manofthem (Oct 28, 2013)

The Von Matrices said:


> NVidia just announced price cuts for GTX 780 to $499 effective tomorrow.  I'm sure they'll be an official press release later and it will be posted in TPU news.
> 
> This basically kills any reason to buy a R9 290X.



That does sound exciting... If new 780s drop to $500, I could probably find a used one for $400, and that would be very enticing. A drop of $130+ shows how inflated the price really was.

Still, I can see the prices of the 290X settling to a nice deal too, a couple of months after launch.


----------



## wolf (Oct 28, 2013)

So basically, with a cooler, quieter card that offers the same or even slightly better price/performance, they will keep a lot of customers.


----------



## Tatty_One (Oct 28, 2013)

The Von Matrices said:


> NVidia just announced price cuts for GTX 780 to $499 effective tomorrow.  I'm sure they'll be an official press release later and it will be posted in TPU news.
> 
> This basically kills any reason to buy a R9 290X.



But it certainly does not kill the reason to buy the 290 non-"X", which in my opinion was always going to be the key player at an even lower price.


----------



## crazyeyesreaper (Oct 28, 2013)

Here's the kicker:

Almost every single GTX 780 with an aftermarket cooler is faster than Titan, so right now NVIDIA and their partners offer a cheaper, equally performing GPU. AMD only has the loud, obnoxious reference design, which will eventually shine once aftermarket versions with a game bundle arrive.

Here's the problem though: would you wait a couple more months to get a solid, overclockable, aftermarket-cooled 290X with a possible game bundle, or take a GTX 780 with exceptional cooling right now at $50 less?

Money talks, bullshit walks, and right now at $499 the GTX 780 is the more attractive option. The 290X with aftermarket cooling and a Never Settle bundle will eventually prove to be a better deal by far, however that's not the case right now.

The GTX 780 is available right now, it's got a price drop, and there is no wait. NVIDIA will capitalize on that to gain some quick sales and thus boost market share.


----------



## N3M3515 (Oct 28, 2013)

Tatty_One said:


> But certainly does not kill the reason to buy the 290 non "x" which in my opinion was always going to be the key player at the even lower price.



lol, I was going to say that.
Besides, that doesn't stop AMD from lowering the 290X to $500 (still, I think the 780 is a better choice at the same price) and the 290 vanilla to $400, and that would be a better buy than the GTX 780 imho (equal performance for $100 less).

Speculation: I see non-reference 290X cards hitting $550-600 while being some 10-15% faster than stock.

The GTX 780 Ti will cost $700.
So, once non-reference 290X cards arrive at $550-600, they slash a hundred bucks? Seems predictable.


----------



## MxPhenom 216 (Oct 28, 2013)

And in comes Nvidia with big price drops to stomp on the parade.....


----------



## The Von Matrices (Oct 28, 2013)

Tatty_One said:


> But certainly does not kill the reason to buy the 290 non "x" which in my opinion was always going to be the key player at the even lower price.



That announcement also showed that NVIDIA is cutting the GTX 770 to $329, which now leaves a huge gap between the $329 GTX 770 and the $499 GTX 780. The R9 290 would fit nicely in that gap.

The lack of supply of the R9 290X isn't giving me hope for the R9 290. It might be a great card, but if AMD can't provide enough supply, the card won't matter. The biggest advantage NVIDIA has at the moment is a huge supply of their cards compared to AMD.

*NVIDIA is killing AMD with the game bundles too. If you buy a GTX 780 (or 770 or 780 Ti) you get Assassin's Creed IV, Batman: Arkham Origins, and Splinter Cell: Blacklist at no extra cost.* That is far better than paying $30 extra to get a BF4 bundle with the R9 290X.


----------



## sweet (Oct 28, 2013)

The Von Matrices said:


> NVidia just announced price cuts for GTX 780 to $499 effective tomorrow.  I'm sure they'll be an official press release later and it will be posted in TPU news.
> 
> This basically kills any reason to buy a R9 290X.



The most important reason to buy the R9 290X is that it holds the performance crown. Guys with LN2 have been proving that fact with world records set by that Hawaii chip.

The price is just the icing on the cake. The 780 is brought down to $500? There's still the 290 at $450 that can match its performance. The price of the 290 may drop further thanks to NVIDIA's attack.


----------



## crazyeyesreaper (Oct 28, 2013)

So the R9 290 is your counterpoint? A GPU that is around $450 and again has no bundle?

How about just selling the 3 games that come with the 780, which, again, in your own chart mostly performs at the level of the 780 HOF.

Sell those games and the 780's effective cost drops by around $70-80, down to about $420. So it's no game bundle at $450 versus a game bundle dropping the cost to $420-ish.

Seems like the GTX 780 is still the better deal.


----------



## Tatty_One (Oct 28, 2013)

The Von Matrices said:


> That announcement also showed that NVidia is cutting the GTX 770 to $329, which now leave a huge gap between the $329 GTX 770 and $499 GTX 780.  The R9 290 would fit nicely in that gap.
> 
> The lack of supply of R9 290X isn't giving me hope for the R9 290.  It might be a great card, but if AMD can't provide enough supply the card won't matter.  The biggest advantage NVidia has at the moment is a huge supply of their cards compared to AMD.
> 
> *NVidia is killing AMD with the game bundles too.  If you buy a GTX 780 (or 770 or 780Ti) you get Assassin's Creed IV, Batman: Arkham Origins, and Splinter Cell: Blacklist at no extra cost.*  That is far better than paying $30 extra to get a BF4 bundle with the R9 290X.



Availability here is quite good, although I am more interested in the availability of the straight 290 when it arrives; that card will take the biggest part of sales at a guess. And if it really does hit the 780's performance, give or take 5%, and comes with non-reference cooling, then I think everyone wins: those in the green camp get much cheaper 780s, those in the red camp get near-780-busting performance for less.


----------



## The Von Matrices (Oct 28, 2013)

sweet said:


> The most important reason to buy R9 290X is: it is holding the performance crown. Guys with LN2 have been proving that fact with world records set by that Hawaii chip.
> 
> The price is just the cream on the crop. 780 is brought down to $500? There are still 290 at $450 can match its performance. The price of 290 may be less thanks to nVi's attack :toast



You're cherry picking a benchmark by using Firestrike (although granted there's no better data until the R9 290 is released).  It's clear that the R9 290X and by extrapolation the R9 290 are the kings of Firestrike by a large margin over NVidia, but that huge advantage evaporates in nearly everything else and is definitely a best case scenario for AMD.  Firestrike only matters to competitive benchmarkers, who are but a very tiny fraction of the market.  The overall performance charts, like the ones at the ends of TPU reviews, are the best comparison.  To state that the R9 290X clearly holds the performance crown based on one benchmark is ridiculous; it's sharing it with Titan at the moment.


----------



## erocker (Oct 28, 2013)

The Von Matrices said:


> NVidia just announced price cuts for GTX 780 to $499 effective tomorrow.  I'm sure they'll be an official press release later and it will be posted in TPU news.
> 
> This basically kills any reason to buy a R9 290X.



Evga GTX 780 cards on Newegg are now $499-$550 

$699 tag on the Ti version is pretty awful though.


----------



## sweet (Oct 28, 2013)

crazyeyesreaper said:


> SO R9 290 is your counter point a GPU that is around $450 and again has no bundle?
> 
> How about just sell the 3 games that come with the 780 which again in your own chart most of them perform at the level of the 780 HOF.
> 
> ...



I'm not sure whether the Never Settle bundle will support Hawaii cards or not, but AMD knows how to react with a sales record. And, oh wait, now NVIDIA's fans are talking about bundles. A while ago they still ruled the game bundle out of the 7xxx/6xx comparison.

I will just stick with AMD. The next-gen consoles belong to them, and the ported games will be boosted on AMD cards with Mantle running behind. No reason to go for the green camp in the PS4/XBO era.



The Von Matrices said:


> You're cherry picking a benchmark by using Firestrike (although granted there's no better data until the R9 290 is released).  It's clear that the R9 290X and by extrapolation the R9 290 are the kings of Firestrike by a large margin over NVidia, but that huge advantage evaporates in nearly everything else and is definitely a best case scenario for AMD.  Firestrike only matters to competitive benchmarkers, who are but a very tiny fraction of the market.  The overall performance charts, like the ones at the ends of TPU reviews, are the best comparison.  To state that the R9 290X clearly holds the performance crown based on one benchmark is ridiculous; it's sharing it with Titan at the moment.


Thanks, but for gaming I don't play games like StarCraft 2, World of Warcraft, etc., a.k.a. DX9 legacy. When you exclude those games, the 780/Titan looks even more miserable. In modern DX11 games, NVIDIA truly lags behind AMD. They told us their cards fully support DX 11.1, but Microsoft didn't think so, and the truth was pointed out through a large number of games.

And as pointed out above, when playing Mantle games next year, AMD cards will lead even further. Just stick to the red and be safe.


----------



## MxPhenom 216 (Oct 28, 2013)

erocker said:


> Evga GTX 780 cards on Newegg are now $499-$550
> 
> $699 tag on the Ti version is pretty awful though.



Do want another for SLI!


----------



## crazyeyesreaper (Oct 28, 2013)

How does NVIDIA lag behind?

780 cards available now all beat Titan.

Right now, in quiet mode, AMD is 1% faster compared to a stock 780, and in uber mode it's 3-5% faster than Titan; not to mention 3% tends to be accepted as the nominal margin of error.

As such, the 290X is pretty much on par with the non-reference 780s in the majority of games; the 780 has a game bundle and is now priced lower. I fail to see AMD's upside.

Mantle has support in 1 title, that's it, and it's not even available for testing until December. Till then Mantle is an unknown, and one I don't expect to be a game changer either.

You can believe what you want, Sweet, but the truth is:

NVIDIA and AMD are on par with each other this time. Prices are in relative parity; currently NVIDIA is the more attractive option, AMD will be more so come December; overall it doesn't matter. People can drink the red or green Kool-Aid.

I go where performance is TODAY, not 3 years from now when the 290X is just another previous-generation product. Mantle, if it takes off, won't be an instant amazing game changer. I love how people are hailing this product like it's amazing when in another year it will be just a footnote as a newer, faster product takes its place.

IF the card had launched with a better cooling and noise profile I might sing a different tune, but right now there are too many unknowns and drawbacks to offset its 1-5% performance lead, which, let's face it, is in truth non-existent when compared with the GTX 780s on offer right now.

The GTX 780 HOF semi-proves that it's no better or worse than other pre-OCed 780s, and yet it matches the OCed 290X, so again it comes down to picking and choosing benchmarks. In the end they are roughly even in performance and price. I, for one, just can't stand the noise: testing coolers here at TPU, 50 dBA+ is loud, and I test 10 cm away; the fact this card does that at nearly 3 friggin' feet is ridiculous.


----------



## Xzibit (Oct 28, 2013)

crazyeyesreaper said:


> As such the 290X is pretty much on par with the non reference 780s in the majority of games. has a game bundle is now priced lower. I fail to see AMD's upside



The upside for AMD is OC potential: the 290X on its reference cooler = a non-reference-cooled 780. As for the bundle, it ends Nov 26, and NVIDIA hasn't said if it will continue it, so the window for purchases is narrow.



crazyeyesreaper said:


> the GTX 780 HOF semi proves that its no better or worse then other pre OCed 780s and yet it matches the OCed 290X so again it comes down to picking and choosing benchmarks.



You can go look at a few OC forums and see how effortlessly these cards are overclocking with no mods yet.

Which will change once the 780 Ti gets here



crazyeyesreaper said:


> I for one just can't stand the noise Testing coolers here at TPU 50 dBA + is loud and I test 10cm away the fact the card does that at nearly 3 friggin feet is ridiculous.



Might want to tell W1zzard that, since he posted the review with 50 dBA at 100 cm.


----------



## erocker (Oct 28, 2013)

Crazy, change your avatar to a fish, cuz you're takin' the bait bro!

Hahaha. Okay seriously though, things in video card land seem pretty competitive now. Good right? 

Though, I would love it if people rejected the price of the Ti. It's a bad precedent that NVIDIA keeps running with.


----------



## crazyeyesreaper (Oct 28, 2013)

Xzibit said:


> Might want to tell W1zzard that since he posted the review with 100cm = 50dBa



I test at 10 cm for coolers.

W1zz tests at 100 cm.

100 cm = 3.3 feet; today's math lesson is now over.
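The 10 cm versus 100 cm distinction matters more than the feet conversion: for an idealized point source in free field, SPL falls by about 20·log10(d2/d1) dB, roughly 6 dB per doubling of distance, so readings at the two distances are not directly comparable. A rough sketch (idealized model; real rooms add reflections, and the numbers here are hypothetical):

```python
import math

def spl_at_distance(spl_ref_db, d_ref_cm, d_new_cm):
    """Estimate SPL at a new distance for an idealized point source in
    free field: level drops 20*log10(d_new/d_ref) dB with distance."""
    return spl_ref_db - 20 * math.log10(d_new_cm / d_ref_cm)

# A hypothetical 50 dBA reading at 100 cm would register roughly
# 20 dB higher if measured at a 10 cm cooler-test distance.
print(spl_at_distance(50.0, 100, 10))
```

Which is why a card hitting 50 dBA at a full metre is so much worse than a cooler hitting similar numbers at 10 cm.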

@Erocker 

They are competitive, and it's nice to see. I just like arguing, and since I am currently sick I have nothing better to do.

If the 290X were quieter and cooler I would buy one. Hopefully, with both cards actually being close, a price war will start.


----------



## PatoRodrigues (Oct 28, 2013)

This card actually looks more awesome now.

It made NVIDIA cut the 780's price by $150.

The best of it: the customers leave with the advantage. Except for the dudes that bought two 780s at $650 each. Maybe get a third one (and a new PSU?)


----------



## erocker (Oct 28, 2013)

crazyeyesreaper said:


> If the 290X was quieter and cooler I would buy one Hopefully with both cards actually being close a price war will start



It looks like "late November", according to sites around the web this morning, for non-reference stuff. I'm interested in seeing some better PCB/power-delivery designs.


----------



## MxPhenom 216 (Oct 28, 2013)

erocker said:


> It looks like "late November" according sites around the web this morning for non-reference stuff. I'm interested in seeing some better PCB/power delivery designs.



Like a DirectCU card with an EK DirectCU block!


----------



## N3M3515 (Oct 28, 2013)

MxPhenom 216 said:


> Like a DirectCU card with a EK DirectCU block!



Or an MSI Lightning! Military Class 5


----------



## Xzibit (Oct 28, 2013)

This might have factored into the price dip.

R9 290X @ 1100mhz Limit





i7-3770k @ 4.3ghz
780 @ 1254mhz Boost (No Tessellation)





The 780 has a lot more dips with less of a workload.


----------



## MxPhenom 216 (Oct 28, 2013)

Xzibit said:


> This might of factored into the price dip.
> 
> R9 290X @ 1100mhz Limit
> http://cdn.overclock.net/c/cc/cc8685ef_MetroLL290x.png
> ...



The price drop had to happen regardless. Though the dips on the 780 could be a driver thing.


----------



## Steevo (Oct 28, 2013)

No tessellation on either?


----------



## Xzibit (Oct 28, 2013)

Steevo said:


> No tessellation on either?



The R9 290X has it set to Very High

The 780 says Not supported


----------



## crazyeyesreaper (Oct 28, 2013)

That benchmark looks like BS,

considering NVIDIA's GPUs are still miles ahead in tessellation performance, at least when looking at the tessellation bench from Microsoft, so in pure tessellation NVIDIA still has more grunt.

http://www.hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review/8#.Um7fKlzCRzw

Here, same settings, there's 2 FPS between a 780 and a 290X. Ooooooo, 2 FPS.


----------



## Steevo (Oct 28, 2013)

Just wanted to be sure that was pointed out.


----------



## HumanSmoke (Oct 28, 2013)

Xzibit said:


> This might have factored into the price dip.


Extremely unlikely :shadedshu ....Extremely unlikely even if it was a new game....like Total War: Rome II






(Both cards clocked similarly. GTX 780 @ 980 base/1033 boost)


----------



## Xzibit (Oct 28, 2013)

crazyeyesreaper said:


> That benchmark looks like BS,
> 
> considering Nvidia's GPUs are still miles ahead in tessellation performance. When looking at Tess bench from Microsoft anyway so pure tessellation Nvidia still has more grunt.
> 
> ...



Might want to take some more medicine.

Hardocp doesn't have SSAA:ON

Get well soon.


----------



## Steevo (Oct 28, 2013)

crazyeyesreaper said:


> That benchmark looks like BS,
> 
> considering Nvidia's GPUs are still miles ahead in tessellation performance. When looking at Tess bench from Microsoft anyway so pure tessellation Nvidia still has more grunt.
> 
> ...



Perhaps that graph looks different to me, but it shows the 290X is 5 FPS faster than Titan, and 10 faster than the 780....


----------



## crazyeyesreaper (Oct 28, 2013)

PhysX on vs the 290 being off: scroll down to Apples vs Apples, it's 2 fucking FPS at the same resolution and settings.

As for no SSAA: yes, because everyone plays games that run like shit with a minimum that regularly drops to unplayable levels. On that graph posted I can count at least 12 occasions where the 290X drops to an unplayable rate, many times back to back, thus unplayable. Fun to look at, but worthless as a comparison, because I don't know anyone that plays games at unplayable settings just for lulz.


2560x1600, max settings, SSAA off, PhysX off = 38 vs 37 on the minimum and 54 vs 52 on the average vs a stock GTX 780

Honestly it doesn't matter what benchmarks anyone pulls; depending on the review and the reviewer, the cards trade blows. They are equals and it shows.


----------



## Xzibit (Oct 28, 2013)

crazyeyesreaper said:


> Honestly it doesn't matter what benchmarks anyone pulls; depending on the review and the reviewer, the cards trade blows. They are equals and it shows.



And that's why I didn't post any of that since they are all over the place.

Those screens are from e-peen-measuring overclocking forums.
To call them B.S., well, it's no different than what TPU does community-wise. Care to call B.S. on that while you're at it?

I highly doubt the guy with the 780 wants to be beaten by the guy with the 290X.

Much like some in these threads.



You did say you didn't see the upside to a 290X vs a 780. You just mentioned it; it depends on the reviewer's and the consumer's perspective.

Something you knew already, I'm sure, but like you said


crazyeyesreaper said:


> I just like arguing, and since I am currently sick I have nothing better to do


----------



## the54thvoid (Oct 28, 2013)

Is it not time this thread was shut?  There's a multitude of other threads opened now with even more fighting going on 

Facts are....

Stock versus stock, the R9 290X is the better card in terms of raw performance versus the GTX 780.
Problem is, the stock 290X is a bit poo, as its cooler is shittingly loud and therefore not pleasant.
Custom-cooled 290X - different kettle of fish.  I'd love to see the stats on one of those.
Problem is, the GTX 780 Ti might have some surprises.  But Nvidia are being arseholes with their pricing.

In summary.... If you like AMD you're happy, because your camp's card is now up there toe to toe with a ludicrously priced Nvidia card (Titan).  Unless you like noise, though, you might want to wait for some custom cooling or get some water going.
If you like Nvidia - hey - it just got cheaper.  Your very overclockable 780 matches the hot and loud 290X.  And your hideously expensive Titan, well, nothing to upgrade over yet.

Everybody gains.

But if you want a true winner we need a meta-analysis of performance per clock versus price versus power efficiency versus noise.  At the moment AMD wins on the pricing metric but loses on the power metric; Nvidia loses on pricing but wins on power.  They tie on performance.

Story ends until the 7th November.


----------



## The Von Matrices (Oct 28, 2013)

the54thvoid said:


> Is it not time this thread was shut?  There's a multitude of other threads opened now with even more fighting going on
> 
> Facts are....
> 
> ...



Your conclusion was correct at the time of the R9 290X launch, but with the GTX 780 at $499 I don't see any reason for the mass market (people who don't modify cards, just insert and turn on) to get a reference 290X anymore.  Without modification of the card NVidia beats it on almost all levels.  This may change when custom 290X's hit the market, but for now the only version of the 290X available has limited appeal due to its $549 price and the reference cooler.

AMD has some time to modify its product to make it more competitive (bundles, lower price, etc.), but if the company does nothing before the 780Ti launch I suspect all those reviews will be pretty harsh on AMD (after all, even if it doesn't directly compete on price, it is the product closest in price to the 780Ti).


----------



## mastershake575 (Oct 28, 2013)

Damn, the Nvidia price drops came quick. 

I just checked and the Evga 780 with the ACX cooler (pretty nice cooler) is currently $500. That's a great freaking price (the sapphire 290x with stock cooler is $580).


----------



## Fluffmeister (Oct 29, 2013)

mastershake575 said:


> Damn, the nvida price drops came quick.
> 
> I just checked and the Evga 780 with the ACX cooler (pretty nice cooler) is currently $500. That's a great freaking price (the sapphire 290x with stock cooler is $580).



The beauty of some real competition, rather than a 7970 with 27 million games bundled with it.


----------



## radrok (Oct 29, 2013)

Yep gotta love competition 

I'd seriously love to get some more titans at reasonable prices, they make for awesome cuda computing parts, I'll probably switch to 290x on gaming rig to move over the ones I have to workplace.

Let's hope this new 780 ti is just a rebranded titan


----------



## HumanSmoke (Oct 29, 2013)

mastershake575 said:


> Damn, the nvida price drops came quick.
> 
> I just checked and the Evga 780 with the ACX cooler (pretty nice cooler) is currently $500. That's a great freaking price (the sapphire 290x with stock cooler is $580).


That $580 price for the R9-290X looks all the more shaky when you can now get an MSI 780 Lightning (w/ 2 more games) for $30 less.


----------



## qubit (Oct 29, 2013)

the54thvoid said:


> Is it not time this thread was shut?



News and reviews threads don't get shut no matter how much naughtiness goes on. They just get moderated (censored) if the mods think things have gone too far with possible infractions all round.

My little summary for the 290X: a fine card with a great GPU badly let down by a crap cooler.


----------



## radrok (Oct 29, 2013)

HumanSmoke said:


> That $580 price for the R9-290X looks all the more shaky when you can now get an MSI 780 Lightning (w/ 2 more games) for $30 less.




It's funny to say it but AMD should drop prices 

Anyway let's hope they start a price war.


----------



## manofthem (Oct 29, 2013)

radrok said:


> It's funny to say it but AMD should drop prices
> 
> Anyway let's hope they start a price war.



Perhaps all along AMD knew that Nvidia would drop their 780, and that they would need to end up dropping their prices on the 290/X.  After all, the 270X is $200, the 280X is $300, and maybe they were shooting for the 290/X to settle at ~$400 (the 7970 went through a similar trend).

Might be wishful thinking, but I would certainly fancy a 290/x at that price.


----------



## HTC (Oct 29, 2013)

the54thvoid said:


> Is it not time this thread was shut?  There's a multitude of other threads opened now with even more fighting going on
> 
> Facts are....
> 
> ...



Nicely put.


I highly doubt it *just yet*, but I hope AMD also has some price cuts coming.

If they keep lowering their prices, *everybody wins*, regardless of which one ends up being the better card.

Gotta love competition 


Would just loooooove it if AMD managed to put this kind of pressure on Intel too


----------



## TheoneandonlyMrK (Oct 29, 2013)

qubit said:


> News and reviews threads don't get shut no matter how much naughtiness goes on. They just get moderated (censored) if the mods think things have gone too far with possible infractions all round.
> 
> My little summary for the 290X: a fine card with a great GPU badly let down by a crap cooler.



your not wrong but should know better. 
Who has more options right now Amd or Nvidia. 
To the trolls inc bored crazeyeyes ,do you just like arguing.
Sounding a bit purile now and the arguments getting boring since this is just the fitst salvo from this chip in a war for sales get your minds out your asses about the cooler , only tards and nvidia release their top dog with most of its balls out.
Now is that put simple enough for you all.


----------



## Fluffmeister (Oct 29, 2013)

theoneandonlymrk said:


> your not wrong but should know better.
> Who has more options right now Amd or Nvidia.
> To the trolls inc bored crazeyeyes ,do you just like arguing.
> Sounding a bit purile now and the arguments getting boring since this is just the fitst salvo from this chip in a war for sales get your minds out your asses about the cooler , only tards and nvidia release their top dog with most of its balls out.
> Now is that put simple enough for you all.



Not really, I still have no idea what you said.


----------



## HumanSmoke (Oct 29, 2013)

theoneandonlymrk said:


> your not wrong but should know better......To the trolls...do you just like arguing.....Sounding a bit purile now........*only tards and nvidia*.


Wow, you were doing so well then had to go and spoil it. Pu*e*rile indeed


----------



## TheoneandonlyMrK (Oct 29, 2013)

HumanSmoke said:


> Wow, you were doing so well then had to go and spoil it. Pu*e*rile indeed





Fluffmeister said:


> Not really, I still have no idea what you said.



Boverd 《not


----------



## crazyeyesreaper (Oct 29, 2013)

at least I am honest about my trolling  HURRAY R9 290X what an amazing GPU its so damn amazing I will name my unborn child AMD.


----------



## radrok (Oct 29, 2013)

crazyeyesreaper said:


> at least I am honest about my trolling  HURRAY R9 290X what an amazing GPU its so damn amazing I will name my unborn child AMD.



You should name your next dog Titan


----------



## N3M3515 (Oct 29, 2013)

The only fact:

290X: Amazing GPU.
290X Cooler: poo. 

I highly doubt nvidia will keep that hideous $700 price on the GTX 780 Ti once non-reference 290Xs arrive.


----------



## Fluffmeister (Oct 29, 2013)

N3M3515 said:


> The only fact:
> 
> 290X: Amazing GPU.
> 290X Cooler: poo.
> ...



Speaking of hideous, you ordered your 290X yet?

Just messing, I love your "facts".


----------



## xorbe (Oct 29, 2013)

N3M3515 said:


> I highly doubt nvidia will keep that hideous $700 price on the GTX 780 Ti once non-reference 290Xs arrive.



I think they will keep it.  Titan is effectively dead to the masses, so $699 is the new halo bracket.  With a little overclocking, there's probably hardly any difference between a 780 and 780Ti, unless they pull a rabbit out of the hat, or start gimping 780 cards with lesser vrm / cooling / fans like they did to the 570 cards.


----------



## N3M3515 (Oct 29, 2013)

fluffmeister said:


> speaking of hideous, you ordered your 290x yet?
> 
> Just messing, i love your "facts".



EDIT: I totally misunderstood at first 

I think a 290 non x would be a better buy, and non reference of course. Depending on my bracket it's that one or a 280X matrix.



xorbe said:


> I think they will keep it. Titan is effectively dead to the masses, so $699 is the new halo bracket. With a little overclocking, there's probably hardly any difference between a 780 and 780Ti, unless they pull a rabbit out of the hat, or start gimping 780 cards with lesser vrm / cooling / fans like they did to the 570 cards.



The reason I see them lowering the $700 is the non-reference 290X: according to reviews (OC'd 290X), I assume they will be on par, and with Nvidia having lowered the 780 to $500, of course AMD is lowering the 290X stock price to $500 (just like the 680 vs 7970), if not less.  So a non-reference 290X would have an MSRP of $550 (speculation), be quiet, have good temps, and be $150 cheaper than the 780 Ti.
And all of that is not taking into account that the 290 vanilla presumably has the exact same performance as the GTX 780 and will cost between $400 ~ $450 for stock versions (again, a non-reference could even overtake a stock 290X). That's a lot of speculation, but not far from reality after seeing what the 290X can do.


----------



## sweet (Oct 29, 2013)

crazyeyesreaper said:


> Physx On vs 290 being off scroll down to Apples vs Apples its 2 fucking FPS at the same resolution and settings.
> 
> As for no SSAA yes because everyone plays games that run like shit with a minimum that regularly drops to unplayable levels On that graph posted i can count at least 12 occassions where the 290X drops to unplayable rate many times it happens back to back thus unplayable.  Fun to look at but worthless as a comparison because I don't know anyone that plays games at unplayable settings just for Lulz.
> 
> ...



It seems that you missed this part on Hardocp



> For our comparison we are comparing the R9 290X to the GTX TITAN, GTX 780 and Radeon HD 7970 GHz Edition, all stock reference video cards. We are using the default "*Quiet Mode*" setting on the R9 290X for our normal highest playable settings and apples-to-apples testing.


----------



## Tatty_One (Oct 29, 2013)

The Von Matrices said:


> Your conclusion was correct at the time of the R9 290X launch, but with the GTX 780 at $499 I don't see any reason for the* mass market *(people who don't modify cards, just insert and turn on) to get a reference 290X anymore.



Thing is.... most mass-market people don't buy top-end cards, which is exactly why, in relative terms, these cards make little impact on the overall sales picture.


----------



## crazyeyesreaper (Oct 29, 2013)

sweet said:


> It seems that u missed this part on Hardocp



Well I can't see using Uber mode; you can say what you want, but not a damn person on here is running a GPU that loud when gaming. 
At this point you would have to go water cooling or buy an aftermarket cooler, thus far more expensive, to get a decent gaming experience. For those that water cool everything this isn't a big deal. However, for those that just want to plug and play, this cooler fucking blows; no matter what you say, that won't change, and it's still over a month until custom cards arrive. Even then, good luck managing to actually GET ONE before stock runs out. 

Setting the fan to 100% so the card stays pegged at the proper clock speed has the noise levels climbing to 70 dBA


----------



## Tatty_One (Oct 29, 2013)

I just counted the blade rotations and did some math; I reckon that's at 75% fan speed. Either that or the guy's mom was vacuuming in the background!


----------



## crazyeyesreaper (Oct 29, 2013)

Like I said above, the video is at 55% = 50 dBA; at 100% fan speed the cooler hits 70+ dBA sound levels.  It requires a fan speed of around 70-80% to keep the card pegged at a 1000 MHz core clock in all games, benchmarks, etc.

Which means getting the maximum possible performance from the 290X requires dealing with noise levels in the 60+ dBA range.


----------



## erocker (Oct 29, 2013)

Wear headphones.


----------



## Xzibit (Oct 29, 2013)

crazyeyesreaper said:


> Like i said above video is 55% = 50 dBA at 100% fan speed the Cooler hits 70+ dBA sound levels.  It requires the fan speed to be around 70-80% to keep the card pegged at 1000 MHz core clock in all games benchmarks etc.
> 
> Which means to get the maximum possible performance from the 290x requires dealing with noise levels in the 60+ dBA range.



Just like in W1zzard's performance summary and many others, like the Hardocp link.

It doesn't need to leave quiet mode to best a 780.


----------



## de.das.dude (Oct 29, 2013)

70 dBA? That sounds like too much, considering loudspeakers are 80 dBA :/


----------



## radrok (Oct 29, 2013)

erocker said:


> Wear headphones.



I do, but it would still kill me as I use open back headphones.

Stock cooler is just unbearable imo, I can still remember my 6990s lol.

Sometimes people would ask me if I was playing with my hairdryer.


----------



## crazyeyesreaper (Oct 29, 2013)

Xzibit said:


> Just like in W1zzards performance summary and many others like the Hardocp link.
> 
> It doesn't require it to be out of silent mode to best a 780.



Hmm okay.. let's just pull up a previous post here. 

http://www.techpowerup.com/forums/showpost.php?p=3003131&postcount=227










Let's see where a custom 780 card comes in out of the box








I fail to see, on a broad set of games, how the 290X is that amazing. With clock scaling being what it is, if the card stayed pegged at 1000 MHz during gaming it would be on average another 5-7% faster, giving it a greater lead, and one that can be felt, not just seen, in benchmarks on the first run. 

As it stands, knowing performance drops slightly over time, in reality Quiet mode = 780 and Uber mode = Titan. It's not so much faster as on par, and everyone acts like beating Titan is some insane feat? Every GTX 780 reviewed on TPU other than the reference card beats Titan. 


So from what I see here, the difference between a GTX 780 you can buy now that runs quiet and the 290X in Uber mode is pretty much a crap shoot.

R9 290X = 7-14% faster than a reference GTX 780
GTX 780 custom = 9-11% faster than a reference GTX 780

That gives the 290X a -2% to +3% spread. Not exactly amazing performance, but people can keep deluding themselves; and again, that's if we don't take the performance drop into consideration. A few FPS slower means its % lead evaporates a little bit more. 

Today, right now, you can buy an aftermarket-cooled GTX 780 for $20-30 less than an R9 290X and get a bunch of games, a quieter cooler, and better temps, all with no dicking around. The debate rages on.  Best part: that's not even the fastest 780 you can buy; it is, however, a fairly solid card, the Jetstream, in that the ASUS / Gigabyte / MSI Gaming series etc. perform similarly.
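The spread arithmetic above can be sanity-checked with a few lines, using only the post's own figures and normalizing everything to a reference GTX 780 = 1.00. The pairing of low estimate with low estimate (and high with high) is an interpretation of how the post arrives at its -2% to +3% range, not something stated explicitly:

```python
# Sanity check of the quoted spread, normalized to a reference GTX 780 = 1.00.
r9_290x_low, r9_290x_high = 1.07, 1.14        # "7-14% faster than a reference GTX 780"
custom_780_low, custom_780_high = 1.09, 1.11  # "9-11% faster than a reference GTX 780"

# Pair low with low and high with high (assumed pairing, matching the post's range).
spread_low = r9_290x_low / custom_780_low - 1    # 290X worst case vs custom 780
spread_high = r9_290x_high / custom_780_high - 1 # 290X best case vs custom 780

print(f"290X vs custom 780: {spread_low:+.1%} to {spread_high:+.1%}")
# prints: 290X vs custom 780: -1.8% to +2.7%
```

Rounded to whole percents, that reproduces the -2% to +3% figure in the post.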


----------



## Tatty_One (Oct 29, 2013)

Not sure why we are going over old ground, you can go back to about page 3 to determine that most people think the GPU is good, the price is good, the reference design leaves something to be desired...... or have I missed something?  Of course since then we see the NVidia price drops.... all good, the 290 and 290X will adjust no doubt and if the 780Ti is really the fastest single GPU in a few weeks time, then it will stay pricier.

Those that don't care about the noise (and I do) will buy it anyway, those that do care will wait for non reference cooler solutions that will be more efficient and quieter, by which time prices will have come down most probably, NVidia will probably re-adjust theirs and everyone wins again.


----------



## crazyeyesreaper (Oct 29, 2013)

Can't help myself Tatty just can't help myself the truth is there but so many are blind. I must show them the light  the truth is in the banana pudding at the salad bar seriously just trust me.  

That said i do agree with you Tatty but sometimes you just have to drill home the facts.


----------



## Tatty_One (Oct 29, 2013)

crazyeyesreaper said:


> Can't help myself Tatty just can't help myself the truth is there but so many are blind. I must show them the light  the truth is in the banana pudding at the salad bar seriously just trust me.
> 
> That said i do agree with you Tatty but sometimes you just have to drill home the facts.



Ahhhh, but he who leads the blind man ends up carrying the white stick and feeding the guide dog


----------



## crazyeyesreaper (Oct 29, 2013)

haha indeed good sir indeed.


----------



## Crap Daddy (Oct 29, 2013)

It seems you can't actually buy a 290X right now. Hmmm. I totally agree with crazyeyes. The Galaxy 780 HOF is $540 with three games and it spanks Titan out of the box. Furthermore, overclocking the card towards 1300 MHz seems achievable. At those speeds it should be 10% faster than a stock Titan.


----------



## HTC (Oct 30, 2013)

That's why I said AMD shot themselves in the foot with a cannonball: with a price adjustment, nVidia is now the better buy.

If they had bothered to ship a better cooler (no need for a good one: *just not a super crappy one*), the current prices nVidia is offering would not favor them as much, and I would consider it more of a tie, IMO.

I mean, why would anyone buy a very loud card if they can get roughly the same performance for around the same price, but way more silent and more energy efficient too?


----------



## The Von Matrices (Oct 30, 2013)

With the custom coolers on 290X, how would power consumption change?  I would assume lower temperatures would produce lower leakage and result in the card using less power.  However, AMD has made a big deal about reducing leakage in the Hawaii silicon and justifies 95°C because of that, so I don't know if in this case the lower temperature would reduce power draw all that much.  The card would definitely boost higher though, so would consume more power.  Plus you have axial fans instead of a blower.  What would the net result be?


----------



## HTC (Oct 30, 2013)

The Von Matrices said:


> With the custom coolers on 290X, how would power consumption change?  I would assume lower temperatures would produce lower leakage and result in the card using less power.  However, AMD has made a big deal about reducing leakage in the Hawaii silicon and justifies 95°C because of that, so I don't know if in this case the lower temperature would reduce power draw all that much.  The card would definitely boost higher though, so would consume more power.  Plus you have axial fans instead of a blower.  What would the net result be?



What is the bigger drawback: noise or power consumption?

If card A and card B have similar performance (and price) but one uses a bit more "juice", the decision could *probably* swing to the one that overclocks better, BUT if one of these cards is noisy as hell, *odds are* it's the other one that will be chosen.


----------



## Fourstaff (Oct 30, 2013)

The Von Matrices said:


> With the custom coolers on 290X, how would power consumption change?



Don't think it will decrease, because everyone will be clocking higher.


----------



## The Von Matrices (Oct 30, 2013)

I guess there still is the same powertune limit, now that I think of it.  Even if it boosts higher the powertune limit would keep it from consuming more power.  I guess I just answered my own question.

I asked because I have an air conditioner that has a low, fixed cooling rate, so heat generation and power consumption matters more to me than temperature.  If I put power guzzling components in my PC then the room becomes way too hot; I made that mistake in the past.


----------



## sweet (Oct 30, 2013)

Some guys from overclock.net only need 70% fan speed (set by Afterburner) to maintain 1100/1300 MHz at 83 °C. The GPU had been working at full load in BF4.






http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/1540

And for your information, W1zz's review failed to address the new CCC OverDrive, which lets you lower that notorious 95 °C target with a simple tweak.


----------



## The Von Matrices (Oct 30, 2013)

sweet said:


> Some guys from overclock.net only need 70% fan speed (set by Afterburner) to maintain 1100/1300 Mhz at 83 Celcius degree. GPU had been working fullload in BF4.



Did they use the stock cooler?  If so, I hope they have hearing protection.  Okay, I'm exaggerating, but at that fan speed the reports show around 70 dBA.  You may be playing BF4, but you certainly won't hear very much of the game.


----------



## crazyeyesreaper (Oct 30, 2013)

70 dBA is at 100%,

so at 70% the dBA level would be around 60-ish dBA.

Still, that's ridiculously high.
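For what it's worth, that estimate can be eyeballed from the two data points already in the thread (55% fan ≈ 50 dBA, 100% fan ≈ 70 dBA). A naive linear interpolation is only a rough sketch, and likely a lower bound, since fan noise in dBA tends to climb faster than linearly near full speed:

```python
# Rough interpolation between the two data points quoted in the thread:
# 55% fan ≈ 50 dBA, 100% fan ≈ 70 dBA. Linear in dBA is an assumption;
# real fan noise usually rises faster near full speed, so treat this as a floor.
def dba_at(fan_pct, p1=(55, 50.0), p2=(100, 70.0)):
    """Linearly interpolate noise level (dBA) from fan speed (%)."""
    (x1, y1), (x2, y2) = p1, p2
    return y1 + (fan_pct - x1) * (y2 - y1) / (x2 - x1)

print(f"{dba_at(70):.1f} dBA")
# prints: 56.7 dBA
```

So ~57 dBA on a straight line, which lands in the same ballpark as the "around 60-ish" figure once the non-linearity is accounted for.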


----------



## sweet (Oct 30, 2013)

The Von Matrices said:


> Did they use the stock cooler?  If so, I hope they have hearing protection.  Okay, I'm exaggerating, but at that fan speed the reports shows around 70dBA.  You may be playing BF4, but you certainly won't hear very much of the game.



As if you owned the card, lol.

70% fan in a closed case is just around 55 dBA at your seat. That guy said that the card is noticeable, but it is not a big deal when he wears his headphones. Of course he is using the stock cooler.

For your information, a normal conversation is 60 dBA, and when wearing headphones you barely notice it.


----------



## Fourstaff (Oct 30, 2013)

sweet said:


> For your information, a normal conversation is 60 dba, and when wearing headphones you barely notice it.



It's less about how loud it is and more about how intrusive it can be, and from my experience the blower sound can be very annoying. I wouldn't accept a cooling solution that is much more than 40 dB, let alone 55. The same can't be said of others who are more tolerant (or slightly deaf), of course.


----------



## The Von Matrices (Oct 30, 2013)

sweet said:


> As if you owned the card, lol.
> 
> 70% fan in a close case is just around 55 dba at your seat. That guy said that the card is noticeable, but it is not a big deal when he wears his headphones. Of course he is using the stock coooler.
> 
> For your information, a normal conversation is 60 dba, and when wearing headphones you barely notice it.



I have used the reference 7970 cooler, which is nearly identical to the R9 290X's cooler in that it uses the same motor and fan, and it is very annoying if it gets above 45%.



erocker said:


> Wear headphones.



I use speakers and not headphones, so I can see how headphones would make a difference, but any additional noise is going to affect perceived audio quality.  If you have a loud computer you have to reduce dynamic range to still hear the quieter sounds while not blowing out your eardrums when something loud happens in game.  There's also no point in getting a high-end sound card with a high SNR if you're going to make the noise floor really loud.

Sure, you can put on headphones, but if you're playing an online multiplayer game you still can't stop the fan noise from obscuring your voice in voice chat, especially if the card's noise is just as loud as your voice.  In every online game I've played there's always a few people who have good microphones but really loud computers that compete with their voices and make understanding what they're saying difficult or impossible.
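The "fan as loud as your voice" point has a numeric corollary: incoherent sound sources add in power, not in decibels, so two equally loud sources together are only about 3 dB louder than either alone. A small sketch (the 60 dBA levels are assumed purely for illustration):

```python
import math

# Incoherent sound sources add in power, not in dB: convert each level to
# linear power, sum, and convert back. Two equal sources gain ~3 dB.
def combine_dba(*levels):
    """Combined level (dBA) of independent sources given in dBA."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels))

voice, fan = 60.0, 60.0  # hypothetical: voice and GPU fan equally loud at the mic
print(f"{combine_dba(voice, fan):.1f} dBA")
# prints: 63.0 dBA
```

Which is why a fan at the same level as a voice doesn't just sit "behind" it: the microphone picks up a noticeably hotter signal with the speech buried at 0 dB signal-to-noise.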


----------



## radrok (Oct 30, 2013)

The funny thing is that after price cuts the 780 costs 100 eur less compared to the 290x here in Italy.

It's the same for some other european countries according to geizhals, cheapest 290x is like 490 eur and cheapest 780 is 405-410 eur.


No brainer imo; AMD kinda has to drop prices (and it has to do it fast) if it wants to sell its cards here.


----------



## HTC (Oct 30, 2013)

radrok said:


> The funny thing is that after price cuts the 780 costs 100 eur less compared to the 290x here in Italy.
> 
> It's the same for some other european countries according to geizhals, cheapest 290x is like 490 eur and cheapest 780 is 405-410 eur.
> 
> ...



Don't think that's AMD: probably both distributors and vendors trying to cash in.


----------



## Tatty_One (Oct 30, 2013)

It is indeed the vendors playing the game; if you look at some UK prices they are way lower than those Radrok has quoted, so these are not AMD RRPs.........

http://www.aria.co.uk/Products/Comp...R5+PCI-Express+Graphics+Card+?productId=57980


----------



## radrok (Oct 30, 2013)

Someone is ruining AMD's sales, either AMD or the vendors, because let's face it, would you rather buy that 290X or this?

http://www.aria.co.uk/Products/Comp...Graphics+Card+++3+FREE+GAMES+?productId=57506

It even has three games bundled lol


----------



## Solaris17 (Oct 30, 2013)

the 290x


----------



## crazyeyesreaper (Oct 30, 2013)

I'll take the 780; my card was free though, hurray, early Christmas lol


----------



## MxPhenom 216 (Oct 30, 2013)

radrok said:


> The funny thing is that after price cuts the 780 costs 100 eur less compared to the 290x here in Italy.
> 
> It's the same for some other european countries according to geizhals, cheapest 290x is like 490 eur and cheapest 780 is 405-410 eur.
> 
> ...



Not to mention the Lightning GTX 780 here is $539.99, which is already a bit faster than the Titan, which will be faster than the 290X, for a bit cheaper.

And it comes with free games, so you could come out spending ~$500 or less.


----------



## broken pixel (Oct 30, 2013)

290X? A 780 aftermarket variant? The 780 Ti, hell no. I will wait for an aftermarket 290X, or even the R10 Zeus to be released, hahahaha!


----------



## radrok (Oct 30, 2013)

MxPhenom 216 said:


> Not to mention the Lightning GTX780 here is $539.99 which is already a bit faster then the Titan which will be faster than the 290x for a bit cheaper.
> 
> And it comes with free games so you could come out spending ~$500 or less.



Talking about the Lightning, let's hope the AIBs get wild with the 780 Ti and 290X


----------



## Tatty_One (Oct 30, 2013)

personally, I am waiting for the straight 290 version, hopefully with a very decent aftermarket cooler, if it don't look nice, sound nice and perform nice, i will be getting me one of them cheaper 780's, I can take a miss on the games bundle, maybe they could replace that for a troop of half naked dancing girls!


----------



## radrok (Oct 30, 2013)

Tatty_One said:


> personally, I am waiting for the straight 290 version, hopefully with a very decent aftermarket cooler, if it don't look nice, sound nice and perform nice, i will be getting me one of them cheaper 780's, I can take a miss on the games bundle, maybe they could replace that for a troop of half naked dancing girls!



I was expecting massive boobs from Sapphire's boxes but they failed to deliver


----------



## HumanSmoke (Oct 30, 2013)

radrok said:


> I was expecting massive boobs from Sapphire's boxes but they failed to deliver


Given the way Roy Taylor loves to self-publicize, maybe he can strike a deal to appear on the custom cards' box art, thereby fulfilling both his and your wishes


----------



## eidairaman1 (Oct 30, 2013)

Good Review


----------



## qubit (Oct 30, 2013)

The Von Matrices said:


> In every online game I've played there's always a few people who have good microphones but really loud computers that compete with their voices and make understanding what they're saying difficult or impossible.



So that one player who's got the loud computer effectively imposes the inconvenience on his fellow players who also use headsets since that microphone's output is transmitted to everyone. Nice. :shadedshu


----------



## Suka (Oct 31, 2013)

Great card and a great review. Wish I could afford one of these suckers and a decent machine; I would be in dreamland


----------



## N3M3515 (Nov 1, 2013)

For anyone looking for a direct comparison between titan/780 SLI and 290x CF:
SLI vs CF review


----------



## sweet (Nov 1, 2013)

I think W1zz should have included the new OverDrive feature in CCC for the 290X


----------



## HTC (Nov 1, 2013)

N3M3515 said:


> For anyone looking for a direct comparison between titan/780 SLI and 290x CF:
> SLI vs CF review



WOW: I knew the R9 290X was faster but, in crossfire, I didn't expect it to be this much faster!

Wish W1zzard could make one of these reviews (minus the 4K display part).


Still, what surprised me the most was that the crossfire frame pacing problems that were plaguing AMD are totally gone with this card, so much so that now it's nVidia that "seems" to have frame pacing issues, and just recently they were being praised for not having them: I find that amazing!!!!


----------



## radrok (Nov 1, 2013)

Yep, the only thing that's left to fix is DX9, which can wait imo.

This review is much better than the hardocp one (which I never liked as a site, always kinda biased towards Nvidia), http://www.pcper.com/reviews/Graphi...deon-R9-290X-CrossFire-and-4K-Preview-Testing.


----------



## HTC (Nov 1, 2013)

radrok said:


> Yep, the only thing that's left to fix is DX9, which can wait imo.
> 
> This review is much better than the hardocp one (which I never liked as a site, always kinda biased towards Nvidia), http://www.pcper.com/reviews/Graphi...deon-R9-290X-CrossFire-and-4K-Preview-Testing.



So this frame pacing fix seems to be DX10 and DX11 only? Though, with only 1 DX9 game tested, I'm not sure of this.

Seems the previous review made the R9 290X look better than it was, frame pacing wise, but it still shows massive improvements over previous cards: hopefully, they can learn from this and come up with some driver optimizations for the other cards already out there, unless this is a hardware issue, in which case other cards' owners are pretty much screwed


----------



## crazyeyesreaper (Nov 1, 2013)

Still doesn't tell all: that's a very small subset of games, and if you haven't noticed, AMD's fix is per game, so some games will never get fixed. You will also notice they stick to fixing the most popular titles. Thus, overall, SLI may not be scaling as well, but it still works across a greater number of games and tends to work without issue across all DirectX versions.

I enjoy new eye candy as much as the next person, but there is still a lot of work to be done by AMD before I would consider Xfire again.


----------



## HTC (Nov 1, 2013)

crazyeyesreaper said:


> Still doesn't tell all: that's a very small subset of games, and if you haven't noticed, AMD's fix is per game, so some games will never get fixed. You will also notice they stick to fixing the most popular titles. Thus, overall, SLI may not be scaling as well, but it still works across a greater number of games and tends to work without issue across all DirectX versions.
> 
> I enjoy new eye candy as much as the next person, but *there is still a lot of work to be done by AMD* before I would consider Xfire again.



I'm sure you agree: this is a step in the right direction.

All that's needed now is enough steps to reach "goal"


----------



## crazyeyesreaper (Nov 1, 2013)

Correct, it is a step in the right direction; the problem is AMD tends to take a few steps, then just kinda stops to shoot the breeze for a while, lol. Then they seem to remember, oh shit, I was supposed to be working on this, I better get back to it.


----------



## mastershake575 (Nov 2, 2013)

Damn, I checked Newegg again and the EVGA GTX 780 with the ACX cooler is $495 after promo code and $485 after rebate (it also includes three recent games).

AMD for damn sure has to lower their prices (the Sapphire 290X with the stock cooler is $100 more).

I wouldn't be surprised if AMD dropped the stock cooler editions to $500 and released the third-party cooler editions for $550


----------



## The Von Matrices (Nov 2, 2013)

mastershake575 said:


> Damn, I checked Newegg again and the EVGA GTX 780 with the ACX cooler is $495 after promo code and $485 after rebate (it also includes three recent games).
> 
> AMD for damn sure has to lower their prices (the Sapphire 290X with the stock cooler is $100 more).
> 
> I wouldn't be surprised if AMD dropped the stock cooler editions to $500 and released the third-party cooler editions for $550



There's no reason to lower prices when Newegg can't keep them in stock.  There must be a lot of people who want the card even if it isn't the best value.  As I said earlier in this thread, the card is worthless if you can't find one to purchase.  If AMD can't fix their supply in the next week or two then the company will miss the lucrative holiday season.



crazyeyesreaper said:


> Correct, it is a step in the right direction; the problem is AMD tends to take a few steps, then just kinda stops to shoot the breeze for a while, lol. Then they seem to remember, oh shit, I was supposed to be working on this, I better get back to it.



That's a comical way of putting it, but it is true. It's almost as if the driver team is so short-handed that the managers have to tell the entire driver team to stop everything non-critical (i.e., not a BSOD/crash fix) for two months before a product release to ensure that the upcoming product has good release drivers. But in the meantime they neglect all the rest of their product line for that same period.


----------



## Tatty_One (Nov 2, 2013)

I think everyone needs to lower their prices...... still not sure (yet) why AMD in particular. NVIDIA seemed to keep their prices high on the Titan even when you could get the 780, with only a minor performance hit, for so much cheaper, and some still bought Titans. As some have already said, first-release prices, when availability is limited, are rarely the prices seen after a few weeks when there are more on the shelves.

If anything, the release of these new AMD cards has made me reconsider buying a 780; the knock-on effect of this release has been very favorable for the consumer, and for that alone AMD should be congratulated.


----------



## N3M3515 (Nov 2, 2013)

Tatty_One said:


> I think everyone needs to lower their prices...... still not sure (yet) why AMD in particular. NVIDIA seemed to keep their prices high on the Titan even when you could get the 780, with only a minor performance hit, for so much cheaper, and some still bought Titans. As some have already said, first-release prices, when availability is limited, are rarely the prices seen after a few weeks when there are more on the shelves.
> 
> If anything, the release of these new AMD cards has made me reconsider buying a 780; the knock-on effect of this release has been very favorable for the consumer, and for that alone AMD should be congratulated.



+1
My exact thoughts.
I'm even considering a GTX 670 OC (1046 MHz) I saw at $250


----------



## qurotro (Nov 2, 2013)

The Von Matrices said:


> Whatever NVidia does with pricing someone is going to complain.  NVidia made the mistake of touting Titan as a high end gaming card when they really shouldn't have.  If they drop the price of Titan (I don't think they should) then they will have complaints from current owners.  If they don't drop the price, then the enthusiast community will continue to complain about it.  Titan is a niche card.
> 
> My predictions:
> 
> ...


Nice one! Seems that NVIDIA is gonna cut down the GTX 780 Ti's compute ability... it sounds sad that a complete GK110 loses its compute ability


----------



## Armagg3don (Nov 2, 2013)

Guys, what do you think about these results?


> _Temperature and fan speed stabilize after another while, but clocks only run at a fraction of the 1000 MHz. The card was in our case seen to run as low as 570 MHz instead, which reduced performance greatly_




Don't you think this performance is irregular? Caused by excessive power consumption and temperatures, it leaves me surprised and disappointed with the final performance of the card.

Any opinion on the matter?
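For a rough sense of what those clock drops cost, here is a back-of-envelope sketch that assumes (as a crude simplification) performance scales linearly with core clock. The clock trace is hypothetical, loosely inspired by the review's observation of drops from the nominal 1000 MHz to 570 MHz:

```python
# Back-of-envelope look at what sustained throttling costs, assuming
# performance scales linearly with core clock (a simplification).
# The trace below is invented for illustration.

NOMINAL_MHZ = 1000.0

def effective_clock(samples_mhz):
    """Average sampled clock and its ratio to the nominal boost clock."""
    avg = sum(samples_mhz) / len(samples_mhz)
    return avg, avg / NOMINAL_MHZ

# Hypothetical one-minute trace: brief warm-up at full clock, then throttling.
trace = [1000] * 10 + [850] * 20 + [570] * 30
avg, ratio = effective_clock(trace)
print(f"effective clock ~{avg:.0f} MHz, ~{ratio:.0%} of nominal")
```

On this made-up trace the card spends most of its time throttled, so the effective clock lands well under the advertised 1000 MHz, which is exactly the irregular performance being complained about.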


----------



## TheoneandonlyMrK (Nov 2, 2013)

Armagg3don said:


> Guys, what do you think about these results?
> 
> http://tpucdn.com/reviews/AMD/R9_290X/images/analysis_quiet.gif
> 
> ...



A great deal of opinion has already been spouted in this very thread; read up.


----------



## Solaris17 (Dec 7, 2013)

I just wanted to post this because I actually found it lacking in the review. It seems simple enough, sure, but I can also say I was honestly a little confused about the BIOS switch positions, since left and right are completely relative, and I mean, who knows, right? That said, I'm falling apart watching my 290X make its way across the country. Hurry up, FedEx!



“*Quiet Mode*” – BIOS position one. The switch is in the position closest to where you plug in your displays (toward the DVI connectors). This mode is designed to suit a gamer who wants to keep a tight lid on acoustics. If you do not play with headphones, do not have a high-end gaming chassis, or your room's ambient noise level is extremely low, this may be the mode for you.
“*Uber Mode*” – BIOS position two. The switch is in the position furthest from where you plug in your displays (toward the POWER connectors)


----------



## BIGARC (Jan 26, 2014)

I went from two GTX 670's on water @ 1346 core / 7456 mem effective, and my XFX R9 290X @ 1131 core / 1390 mem is pretty much on par, but that is going from two epic overclocking 670's! I'm ridiculously happy with my new AMD card on water; can't wait to be able to adjust the volts!! Oh, and my 670's beat my mate's 780 Ti overclocked to 1300 MHz by a small margin


----------



## eidairaman1 (Jan 27, 2014)

Uber mode, prefer Performance Mode


----------



## Domokun (Feb 2, 2014)

W1zzard said:


> i'm looking at doing a PCIe scaling article with 290X in the next weeks.


Hi W1zzard,

Sorry to nag, as I know you're probably extremely busy, but I'm just wondering if you ever got around to writing the above mentioned article?


----------



## W1zzard (Feb 2, 2014)

Domokun said:


> Hi W1zzard,
> 
> Sorry to nag, as I know you're probably extremely busy, but I'm just wondering if you ever got around to writing the above mentioned article?


Nope, no time.


----------



## sweet (Feb 2, 2014)

Domokun said:


> Hi W1zzard,
> 
> Sorry to nag, as I know you're probably extremely busy, but I'm just wondering if you ever got around to writing the above mentioned article?


Ask him to review a 760 from nVidia, a GPU released 7 months ago, and he will eagerly provide one: http://www.techpowerup.com/reviews/MSI/GTX_760_Mini_ITX_Gaming/
Ask him to review a custom-cooler 290x, and he will give you the worst variant: the Asus DC II, which has 1 fully contacting heatpipe, 2 partly contacting, and 2 others just for show.

In short, don't ask :lol:


----------



## HumanSmoke (Feb 2, 2014)

sweet said:


> Ask him to review a 760 from nVidia, a GPU released 7 months ago, he will eagerly provide you. In short, don't ask :lol:


I'm sure W1zzard would be happy to review any card he gets his hands on. Since the IHV's don't seem overly interested in handing out review samples, maybe you could buy a few and send them to him...while you're at it, buy a few extras and send them to the other sites as well since only a small number of sites seem to have any reviews of vendor designs apart from the Asus DC2.
I'm hoping your bank balance matches the size of your sense of entitlement.


----------



## Tatty_One (Feb 2, 2014)

I like this one.... the card, not the review necessarily......


http://www.madshrimps.be/articles/article/1000548/#axzz2sC6sJGnt


----------



## sweet (Feb 3, 2014)

HumanSmoke said:


> I'm sure W1zzard would be happy to review any card he gets his hands on. Since the IHV's don't seem overly interested in handing out review samples, maybe you could buy a few and send them to him...while you're at it, buy a few extras and send them to the other sites as well since only a small number of sites seem to have any reviews of vendor designs apart from the Asus DC2.
> I'm hoping your bank balance matches the size of your sense of entitlement.


LOL, it's his JOB to review cards, not mine.
By the way, I stated facts, and what you gave was just an assumption. In any case, if he is happy to review a 760 just because he has it in hand, I will be happy to send him a 260; maybe he will review it


----------



## HumanSmoke (Feb 3, 2014)

sweet said:


> In any case, if he is happy to review a 760 just because he has it in hand, I will be happy to send him a 260; maybe he will review it


Send him a few 290X's... after all, weren't you bleating on about TPU only reviewing the DC2? I guess W1zzard might be happy to review some warmed-over-Bonaire budget-oriented card, although I'm guessing you won't be sending anything - whether it be PM or board.


sweet said:


> LOL, it's his JOB to review cards, not mine.


And for that I am eternally thankful


----------



## Tatty_One (Feb 3, 2014)

sweet said:


> LOL, *it's his JOB to review cards, not mine. *
> By the way, I stated facts, and what you gave was just an assumption. For any sense, if he is happy to review a 760 just only because he has it in hand, I will happy to send him a 260, maybe he will review it


 
^^^ That in itself is an assumption


----------

