# Radeon R9 290X Pitted Against GeForce GTX TITAN in Early Review



## btarunr (Oct 16, 2013)

Here are results from the first formal review of the Radeon R9 290X, AMD's next-generation flagship single-GPU graphics card. Posted by Chinese publication PCOnline.com.cn, it pits the R9 290X against the GeForce GTX TITAN and GeForce GTX 780. An out-of-place fourth member of the comparison is the $299 Radeon R9 280X. The tests present some extremely interesting results. Overall, the Radeon R9 290X is faster than the GeForce GTX 780, and trades blows with, or in some cases surpasses, the GeForce GTX TITAN. The R9 290X performs extremely well in 3DMark: Fire Strike, and beats both NVIDIA cards at Metro: Last Light. In other tests, it's halfway between the GTX 780 and GTX TITAN, leaning closer to the latter in some. Power consumption, on the other hand, could either dampen the deal or be a downright dealbreaker. We'll leave you with the results.




More results follow.








*View at TechPowerUp Main Site*


----------



## 1d10t (Oct 16, 2013)

has NDA been lifted?


----------



## dom99 (Oct 16, 2013)

1d10t said:


> has NDA been lifted?



No, I've seen these before, and I think it's some bloke in China who claims to have a card and benched it. It's not an official review, but all the same I believe these results will reflect what the card will be capable of. The NDA lifts on the 25th, I think; I saw something on the AMD website last night that hinted that way.

http://sites.amd.com/us/Documents/AMD_Radeon_290X_Battlefield4_Limited_Ed.pdf

A little concerned about the FurMark temp though.


----------



## dj-electric (Oct 16, 2013)

1d10t said:


> has NDA been lifted?



Not yet, no.

BTW, I feel the test results will be more in favor of the R9 290X at 2560x1440; that's where it belongs.


----------



## Outback Bronze (Oct 16, 2013)

Hmmm, not that impressed ATM. Let's see how they overclock. Hope they price 'em well!


----------



## btarunr (Oct 16, 2013)

Added power graph. Ouchies are in order. 

Then again, they used Furmark, which GeForce drivers have throttle triggers for.


----------



## EzioAs (Oct 16, 2013)

So we're seeing an overall 30-37% improvement over the HD 7970 (280X), except in Metro: LL where it's around 60%. That's not bad, but I don't like the $600 SRP much (not sure if it's official or mere rumor).

Most enthusiasts are probably okay with it, but I see $450 as a more reasonable price, or maybe $500 at most. Yes, compared to the GTX 780 and Titan, with this kind of performance, $600 is probably "cheap", but compared with the 280X (or last gen's flagship) it still seems like a bad purchase, especially since it's not that much of an improvement (a 30% improvement across architectures is common).


----------



## DeadSkull (Oct 16, 2013)

Now the price tag makes some sense.

I wonder what the overclocks are going to be like for this card


----------



## harry90 (Oct 16, 2013)

These are preliminary tests which weren't done by a reliable source. Also, there are no drivers that support the R9 290X yet, so none of the benchmark results can be trusted. Drivers are key: I've got 6950s in CrossFire, and with the latest beta drivers I get 51 fps average in Crysis 3 at Very High with FXAA; with the regular 13.9 drivers I get around 42 fps. With AMD, drivers matter a lot.


----------



## RCoon (Oct 16, 2013)

So much for those claiming Titan killer - Power charts are simply retarded. I sure hope that reference hoover cooler can keep up...
I was going to preorder one of these 2 months ago. Each week, the 290X just looks worse and worse. Should have known AMD are master PR weavers.


----------



## buggalugs (Oct 16, 2013)

Exactly as I said all along, faster than the 780, slower than titan.....with a few game specific scenarios where the 290X is faster than both.


----------



## LAN_deRf_HA (Oct 16, 2013)

Power isn't surprising. They were already bad on that front and now they're trying to get more performance on the same process so power must go up. It occurs to me that price cuts from nvidia might be much smaller than expected. Truthfully they could drop the 780 to $600 and do whatever with the Titan. It's a mixed bag so people from both camps can claim to be on top, which usually means you won't get a big price response from nvidia.


----------



## Vlada011 (Oct 16, 2013)

No more than 10-15% stronger than Titan, for $500-550.
In the end, if people can buy an R9 290X TwinFrozr for the same price as a GTX 780 Classified, they can be happy.
And AMD's optimization, kicking like a horse in games, is famous.
One thing must be admitted: AMD can make a good GPU, as ATI did, if they try hard; that's good.
AMD has ASUS ROG; NVIDIA has EVGA Classified and EVBot.
ASUS somehow immediately offered a Radeon under the ROG badge. They are angry; probably in that way they push NVIDIA to give them better chips. Their games are the dirtiest on the market.

That's it: the first stories had the task of stalling the already-slow sales of the GTX 780 and pushing people into confusion.
But only a crazy man can be happy and wish bankruptcy, someone's family left without money, and people without jobs.
In the end, it's 5% more or less.
On the other side, the Titan price is not fair at all.
$800 sounds better; with the ACX cooler, the Titan could again be the best option.


----------



## 15th Warlock (Oct 16, 2013)

If these results are real (and that's a big if), it's crucial for AMD to price this card in such a way that it's a much better deal than the 780; forget Titan, there are factory-OC'd 780s out there that are already faster than Titan out of the box.

If the $729.99 price is real, then AMD is in real trouble. Let's assume vanilla 290Xs sell close to $699 or even $649; that pits this card directly against the similarly priced 780, and right in the same ballpark as heavily factory-OC'd models.

Here's hoping AMD can pull a fast one and price this card to kill, as it seems performance is going to be very similar to the 18-month-old GK110, if these numbers are to be believed...


----------



## buggalugs (Oct 16, 2013)

I wouldn't read much into it using furmark. It could be a good sign. If the 290X is not throttling and Nvidia is throttling it doesn't really say much. With games it could be very different. I would rather have a card that doesn't throttle. Should be a better overclocker.


----------



## Novulux (Oct 16, 2013)

http://videocardz.com/46785/amd-radeon-r9-290x-performance-charts-emerge
And another graph that has questionable results? R9-290 might be an awesome card if this is true...


----------



## the54thvoid (Oct 16, 2013)

Novulux said:


> http://videocardz.com/images/2013/10/AMD-R9-290X-Performance.jpg
> 
> http://videocardz.com/46785/amd-radeon-r9-290x-performance-charts-emerge
> And another graph that has questionable results? R9-290 might be an awesome card if this is true...



That graph is very standalone (untrustworthy), even following the source link.  It's also two resolutions and three games aggregated, seriously?  Plus they've got both the 290 and 290X benched.  I'm less inclined to believe that than the PCOnline info.  I genuinely think the PCOnline info is not fake.  Benches could be off from drivers, but it seems people have got retail cards and boxes.


----------



## Xzibit (Oct 16, 2013)

Splinter Cell: Blacklist comparison

I don't think anyone is going to be buying these cards for 1920x1080 gaming.

How much is the 780? $650.

AMD has a price window of $350-$650 to play with. It's there for them to make it good or screw it up like the TITAN.


----------



## Assimilator (Oct 16, 2013)

WTF is the point of measuring power consumption in FurMark? Give us the power consumption in the tested applications/games alongside the frame rates so we can draw a useful conclusion about the power vs FPS numbers.

I'm also skeptical about Hawaii's performance in general. It still seems to be slower than the GK110 clock-for-clock, so there's nothing stopping nVIDIA from releasing a "780 Ultra" or somesuch with 1GHz core clock, which will then blow R290/X out of the water.


----------



## the54thvoid (Oct 16, 2013)

Xzibit said:


> Splinter Cell: Blacklist comparison
> 
> http://www.techpowerup.com/img/13-10-16/116f.jpg
> 
> ...



That 2560x1600 Blacklist result for the 780 looks wrong.  It's over 20% slower than the Titan.  I ran it at 1440p and it didn't use anywhere near 3GB of VRAM.  Wonder how the 780 scored so low.

As for pricing... you're totally right, they could screw it up, but Titan pricing isn't relevant.  Unless you need 6GB of memory or DP, the 780 made Titan redundant.  If AMD price the 290X around the 780, I can see it doing little for market share.  AMD guys will buy AMD; Nvidia guys will stick with a 780 (even if it performs worse).  
But if AMD price the 290X at £400-450 (the GTX 780 is £500-550), then I can see Nvidia fans switching to AMD.

And maybe, when time allows, the custom cards will shine through (again, as long as they don't cock it up like Titan with shitty boosts and low power limits).



Assimilator said:


> I'm also skeptical about Hawaii's performance in general. It still seems to be slower than the GK110 clock-for-clock, so there's nothing stopping nVIDIA from releasing a "780 Ultra" or somesuch with 1GHz core clock, which will then blow R290/X out of the water.



The base clocks for Titan are irrelevant.  A stock Titan will generally boost to 993MHz. It can drop as low as 876MHz though if temps get past 80 degrees, so the clocks could be anywhere from 876 to 993MHz in those tests.
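To put rough numbers on that, here's a toy Python sketch of the temperature-driven boost behaviour described above. Only the 876/993MHz clocks and the 80-degree target come from this thread; the 13MHz bin size and the one-bin-per-degree rule are assumptions for illustration, not NVIDIA's actual Boost 2.0 algorithm:

```python
# Toy model of temperature-driven boost: hold max boost up to the
# temperature target, then shed boost bins, never below the floor.

BASE_MHZ = 837        # GTX TITAN base clock
MAX_BOOST_MHZ = 993   # typical observed boost clock
FLOOR_MHZ = 876       # lowest boost clock mentioned above
TEMP_TARGET_C = 80    # Boost 2.0 temperature target
STEP_MHZ = 13         # assumed size of one boost bin

def boost_clock(temp_c: int) -> int:
    """Pick an effective clock for a given GPU temperature."""
    if temp_c <= TEMP_TARGET_C:
        return MAX_BOOST_MHZ
    # Assumption: drop one boost bin per degree over target,
    # clamped at the floor clock.
    bins_dropped = temp_c - TEMP_TARGET_C
    return max(FLOOR_MHZ, MAX_BOOST_MHZ - bins_dropped * STEP_MHZ)

print(boost_clock(70))  # 993
print(boost_clock(85))  # 928
print(boost_clock(95))  # 876
```

So in a long benchmark run, the reported "clock" is really a moving target between the floor and max boost, which is why clock-for-clock comparisons against Titan are so slippery.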


----------



## thematrix606 (Oct 16, 2013)

Xzibit said:


> Splinter Cell: Blacklist comparison
> 
> http://www.techpowerup.com/img/13-10-16/116f.jpg
> 
> ...



Most gamers use 1080p; of course these will be used for 1080p. Why on earth would you think otherwise?

Have you never played beyond 60Hz on a monitor? These cards barely even reach 60FPS in Crysis 3 and other games!


----------



## bogami (Oct 16, 2013)

It is wonderful to see such a competitive product with great potential, one that should also be much cheaper than NVIDIA's products; NVIDIA behaves toward its customers like a pig with her young. If the price lands around $500-550, I'm seriously considering buying at least three or four R9 290Xs.

Because the R9 290X has a reasonable price and very good performance, it will be another highly desirable blow between NVIDIA's legs for their manipulation of prices and products.
NVIDIA did not develop anything new; they sell old technology under a new label.
If customers are paying such a high price, you would expect at least a far better product the next year, given that it is supposed to pay for the development of new technology. We all know the GK104 did not get the improvements that would normally be expected, and it sold throughout the year as Kepler's best for $500, while NVIDIA held the GK110 in reserve with a bunch of excuses. And when a year had passed, they presented the TITAN at an even more abnormal price. Unfortunately, I only have two kidneys, and they would not cover an SLI (2x TITAN) price!
I hope a reasonable price and good performance gives NVIDIA a good slap in the face too.


----------



## HammerON (Oct 16, 2013)

Still waiting for W1zzard's review...
Hopefully soon. Until then I will try not to bite on all of this speculation


----------



## Deleted member 24505 (Oct 16, 2013)

I love the smell of new hardware in the morning.

whether these are better than Nvidia cards or not, I still love it when new GPU's come out, all the speculation and "discussion" is interesting and sometimes fun to read.

Personally for me, a 280x or a 7970.


----------



## RCoon (Oct 16, 2013)

tigger said:


> I love the smell of new hardware in the morning.
> 
> whether these are better than Nvidia cards or not, I still love it when new GPU's come out, all the speculation and "discussion" is interesting and sometimes fun to read.
> 
> Personally for me, a 280x or a 7970.



7970 prices are to die for right now. If I didn't have a 780, I would have been all over these 7970's. It's not like they're even slow either, 1080p muncher without a doubt.


----------



## Lionheart (Oct 16, 2013)

tigger said:


> I love the smell of new hardware in the morning.
> 
> whether these are better than Nvidia cards or not, I still love it when new GPU's come out, all the speculation and "discussion" is interesting and sometimes fun to read.
> 
> Personally for me, a 280x or a 7970.



I love that smell as well, speaking of new PC hardware smell I just recently bought a new Seasonic PSU, installed & everything... 5mins later my bedroom smelt like new PC hardware lolz


----------



## Deleted member 24505 (Oct 16, 2013)

Lionheart said:


> I love that smell as well, speaking of new PC hardware smell I just recently bought a new Seasonic PSU, installed & everything... 5mins later my bedroom smelt like new PC hardware lolz



Few years ago, my mate bought a new corsair PSU, fitted it, and all it did was fill his room with the magic smoke naughty electronic stuff makes.


----------



## Fatal (Oct 16, 2013)

I will wait for Wizz to test it. All the other testers I couldn't care less about. Even if it's between the Titan and the 780, it will still help lower prices. That's a win for all.


----------



## sweet (Oct 16, 2013)

Fatal said:


> I will wait for Wizz to test it. All the other testers I can care less about. Even if its between the Titan and the 780 it still will help with lower prices. That's a win for all



There is no doubt that this card is above Titan clock for clock. However, this beast will consume a bunch of wattage, and that tiny fan is not really reliable. The card is capable at the top, but it is not perfect. Hopefully the custom versions will be available soon.


----------



## Lionheart (Oct 16, 2013)

tigger said:


> Few years ago, my mate bought a new corsair PSU, fitted it, and all it did was fill his room with the magic smoke naughty electronic stuff makes.


----------



## Footman (Oct 16, 2013)

I think I'll wait for more reliable reviews on or after the 25th before deciding.


----------



## Frick (Oct 16, 2013)

Lionheart said:


> http://img.techpowerup.org/131016/Ouch.jpg



You know tpu.org has a resize feature right?


----------



## BiggieShady (Oct 16, 2013)

If anything, the FurMark test shows that the card can survive it without throttling, which means it will probably overclock generously with a voltage bump.



> To begin with, the GPU core is clocked at 1050 MHz. There is no dynamic-overclocking feature, but the chip can lower its clocks, taking load and temperatures into account.



Load and temperature are mentioned for dynamic clocks, not power, so if there is enough cooling there should be no throttling.
We should know once anyone has one of these under water clocked to 1.3 GHz, to see how much voltage is needed and whether clocks get lowered under load at all... it will be interesting for sure.


----------



## Ghost (Oct 16, 2013)

sweet said:


> There is no doubt that this card is above Titan clock to clock. However, this beast will consume bunch of wattage and that tiny fan is not really reliable. The card is capable at the top, but it is not perfect. Hope that the custom versions will be available soon.



If Titan @ 837 MHz is faster than 290X @ 1000 MHz, would 290X @ 1000 MHz be faster than Titan @ 1000 MHz?


----------



## Lionheart (Oct 16, 2013)

Frick said:


> You know tpu.org has a resize feature right?








Yes, I know, I just get lazy sometimes...

On another note, when are the actual reviews supposed to come out? Is it today, tomorrow, or later this month?


----------



## RCoon (Oct 16, 2013)

Lionheart said:


> http://img.techpowerup.org/131016/e78.jpg
> 
> Yes I know I just get lazy sometimes...
> 
> On another note, when are the actual reviews suppose to come out, is it today, tomorrow or later this month?



When the NDA lifts, which isn't until the end of October, but we don't know which date specifically. *NOBODY DOES*


----------



## buildzoid (Oct 16, 2013)

Ghost said:


> If Titan @ 837 MHz is faster than 290X @ 1000 MHz, would 290X @ 1000 MHz be faster than Titan @ 1000 MHz?



All Titan benchmarks are run at 993 to 1006MHz because of Nvidia's Boost 2.0, so it's basically a clock-for-clock comparison.


----------



## Blín D'ñero (Oct 16, 2013)

Lionheart said:


> http://img.techpowerup.org/131016/e78.jpg
> 
> Yes I know I just get lazy sometimes...
> 
> On another note, *when are the actual reviews suppose to come out, is it today, tomorrow or later this month? *




Well, the *24th of October*, as I posted last night here and here (and nobody reacted): that is what the Chinese reviewer said.


----------



## sweet (Oct 16, 2013)

Ghost said:


> If Titan @ 837 MHz is faster than 290X @ 1000 MHz, would 290X @ 1000 MHz be faster than Titan @ 1000 MHz?



Another victim of the scheme pulled by nVidia's dynamic boost


----------



## Prima.Vera (Oct 16, 2013)

Blín D'ñero said:


> Well, the *24th of October*, as i posted last night here and here (and nobody reacted ): that is what the Chinese reviewer said.



I never understand why they take so long...


----------



## Aithos (Oct 16, 2013)

bogami said:


> It is wonderful to see such a competitive product has great potential that I'd also will be much cheaper than NVIDIA products wich behaves like a pig with the young to thecustomers.        If the price variable $ 500-550 ˘ seriously considering buying at least three or four R9-290X .
> 
> Because R9-290X has reasonable price and wery good preformance it will bee other highly desirable blow between nVidia legs for manipulation with the price and products.
> nVidia did not develop anything new and sale of old technology with a new label .
> ...



You do realize that almost the entire lineup of new AMD cards is rebadges of old cards, right?  This card will also not retail for $500-550; I'm betting it will be $650-700 instead.  Then Nvidia will drop their 780 price to $550-600 and claim the same performance for less money (which is their usual MO).  Keep in mind these are reference clocks, the Nvidia card is an aggressive overclocker, AND the 800 series, the real competition for this card, will be out in Q1 2014.  The 780 has been out for quite some time already and is barely slower in any of these benchmarks.

Even if the 290X ends up being a bigger performance jump than shown here, it is unlikely anyone with a 780 will "upgrade", and if the 780 drops in price most people will go for the better performance-per-dollar ratio.  AMD is looking rather lackluster if you ask me.


----------



## Blín D'ñero (Oct 16, 2013)

btarunr said:


> [...] An out-of-place fourth member of the comparison is the $299 Radeon R9 280X. [...]



It's not out of place at all, because it shows how strong even this card (the R9 280X, the old 7970 GE) is for relatively little money (300 bucks or less). 
I have been crossfiring 7970s (@1100/1575) for the past 1¾ years, getting performance far exceeding the results posted in these charts by the R9 290X, Titan, and 780; for instance, Crysis 3 at Ultra runs a steady 60 fps with vsync on my 2560x1600 screen. There isn't a game that doesn't run smooth as butter with these.
To those with a 7970 and a CrossFire-capable motherboard and no 600-650 bucks to upgrade, I'd suggest pairing it with a second one, or with a 280X, and being done with this whole "new cards" rage. The best result for the least cash.


----------



## Aithos (Oct 16, 2013)

sweet said:


> Another victim of the scheme pulled by nVidia's dynamic boost



Another person who doesn't understand how overclocking with current nVidia cards works.  When you overclock the base clock, you also overclock the dynamic boost.  So if it runs at, say, 900MHz (1000MHz boost), when you overclock to 1000MHz you end up with, say, a 1050MHz boost.  Both numbers go up.  If the 290X can't beat the reference Titan, it would place even lower against an overclocked Titan.

The same goes for the 780, which can easily match an OC'd Titan.  If the power numbers end up being an accurate reflection, then even if the 290X is a serious OCing card, it won't be worth the minor performance gain.  If it is subpar for overclocking, the 780 will beat it flat out, and AMD will be in trouble if nVidia drops their prices as expected.

Keep in mind nVidia also has another generation of cards coming out in Q1 2014; if AMD can't beat the current cards effectively, that's not a good sign.


----------



## Am* (Oct 16, 2013)

Seriously impressive results, especially for a card with early beta drivers beating Titan in its own game (Metro LL). If we take future driver updates into account as well as Mantle, this card has some serious potential. 

Power consumption figures don't mean jackshit on Nvidia's side and even if they were fair (which they aren't), I'll happily take a card that's cheaper and performs at its full potential that takes a few more watts at load than a gimped piece of crap like the Titan which is throttled & broken out of the factory and with almost no OC potential thanks to shitty power delivery circuitry and no custom PCB designs. Nvidia better halve the price of the Titan & drop the rest of their cards' prices (GTX 780 & 770 by at least 30%-40%) and come up with a $600 successor consisting of a fully enabled GK110 core, fast, or they can piss off out of the GPU race permanently this year and prepare for lots more doom and gloom analyst predictions and shitty Q4 results (and we can prepare for lots of butthurt statements from Nvidia execs for why they failed -- "because we didn't even try"..."we let AMD have the market this year"..."wait for Maxwell"...etc etc).


----------



## RCoon (Oct 16, 2013)

Am* said:


> Seriously impressive results, especially for a card with early beta drivers. If we take future driver updates into account as well as Mantle, this card has some serious potential. Power consumption figures don't mean jackshit on Nvidia's side and even if they were fair (which they aren't), I'll happily take a card that's cheaper and performs at its full potential than a gimped piece of crap like the Titan which is throttled & broken out of the factory and with almost no OC potential thanks to shitty power delivery circuitry. Nvidia better halve the price of the Titan and come up with a $600 successor with a fully enabled GK110 core fast, or they can piss off and prepare for lots of doom and gloom analyst predictions and shitty Q4 results.



I'm not sure you're experienced in anything to do with the Titan, or how it overclocks, so that conjecture you're making is a little, well, wrong. I want Titan to be thrashed as much as you do, because the price is retarded, but the Titan does OC, and it OC's very well in the right hands. Also the fact that in the steam survey, NVidia cards took the highest share of video cards in systems, should tell you enough about how their Q4 results will turn out.
This is yet another dubious leak, so it should be taken with a pile of salt and then some.
Also Mantle means nothing at this point. Better FPS in BF4 and that's it. No other games are confirmed to use Mantle, so for now Mantle can be completely ignored when it comes to possible benchmarking figures. Until somebody confirms they're using Mantle to port a game to PC, anyone claiming Mantle will make AMD dominate PC game benchmarks is entirely misdirected to the Nth degree.


----------



## Am* (Oct 16, 2013)

RCoon said:


> I'm not sure you're experienced in anything to do with the Titan, or how it overclocks, so that conjecture you're making is a little, well, wrong. I want Titan to be thrashed as much as you do, because the price is retarded, but the Titan does OC, and it OC's very well in the right hands. Also the fact that in the steam survey, NVidia cards took the highest share of video cards in systems, should tell you enough about how their Q4 results will turn out.
> This is yet another dubious leak, so it should be taken with a pile of salt and then some.
> Also Mantle means nothing at this point. Better FPS in BF4 and that's it. No other games are confirmed to use Mantle, so for now Mantle can be completely ignored when it comes to possible benchmarking figures. Until somebody confirms they're using Mantle to port a game to PC, anyone claiming Mantle will make AMD dominate PC game benchmarks is entirely misdirected to the Nth degree.



I don't need "experience" of any sort to know that a GPU is broken and gimped out of the factory -- I've read plenty of complaints, both here and on GPU vendor forums, to know that it has serious throttling issues and needed to be BIOS-flashed in order to even work properly or ALLOW the GPU to sustain a decent overclock, and that was from people who spent $3,000+ on their GPU setups and water cooling alone. And I have a friend who owned one and returned it for the exact same reason. Prior to doing my fair bit of research, I had planned to move up to the Titan from my GTX 660, just before reviews for it came out. I ended up disregarding the Titan and returning my GTX 660 for similar reasons (throttling/unstable clocks, and a flat-out nonsensical TDP, which was exceeded in several games under normal gameplay load), not to mention the stupendous $1,000 price tag of the Titan, which I was ready to fork out because I even had the cash set aside, had it been as good as promised by Nvidia.

And Mantle has direct ties with AMD's consoles, all of which run on AMD GCN GPUs -- regardless of how effective it will be, it is of more value to the average user than all of Nvidia's bullshit gimmicky features combined, including PhysX and the rest of the proprietary garbage they run. Since Mantle promises to make porting to the PC easier, it has a far brighter future ahead of it than anything Nvidia has to date. 

And BTW, the GPU market share on Steam includes a crapton of old GPU users like me, who are running Nvidia's golden age cards like Fermi and G92, who will at some point be looking to upgrade. Fact is AMD is taking more market share from Nvidia and faster than it ever has before and lots of people with money to spare, like me, will switch to AMD without a second thought if the performance is there. The only reason I am holding off pre-ordering a R290X right now is because A. I have a 3D Vision 2 monitor, which I wish AMD supported as well as Nvidia, but they don't, and B. My nearly 4 year old, VRAM starved GTX 460 still manages to run BF4 beta at a half decent framerate. The minute it dies or I find something tempting enough to upgrade to, I will. 

Oh and BF4, just like BF3 and BFBC2 before it, will be the main, if not the only reason I will upgrade, so I don't need to "wait and see" to know how Mantle will turn out. BF4 will be the deciding factor in my purchase, so if Dice say "Mantle makes a 290X mop the floor with the Titan" or "gives it an advantage over equivalent or overpriced Nvidia cards", it is already of more worth to me than anything Nvidia can promise.


----------



## Recus (Oct 16, 2013)

Am* said:


> I don't need "experience" of any sort to know that a GPU is broken and gimped out of the factory -- I've read plenty of complaints both here and on GPU vendor forums to know that it has serious throttling issues and needed to be BIOS flashed in order to even work properly or ALLOW the GPU to sustain a decent overclock and that was from people who spent $3000+ grand on their GPU setups and water cooling alone. And I have a friend that owned one and returned it for the exact same reason. Prior to doing my fair bit of research, I had plans to move up to the Titan from my GTX 660 just before reviews for it came out. I ended up disregarding the Titan and returning my GTX 660 for similar reasons (throttling/unstable clocks, and a flatout nonsensical TDP, which was exceeded in several games under normal gameplay load), except the stupendous $1000 price tag of the Titan, which I was ready to fork out because I even had the cash set aside, had it been as good as promised by Nvidia.
> 
> And Mantle has direct ties with AMD's consoles, all of which run on AMD GCN GPUs -- regardless of how effective it will be, it is of more value to the average user than all of Nvidia's bullshit gimmicky features combined, including PhysX and the rest of the proprietary garbage they run. Since Mantle promises to make porting to the PC easier, it has a far brighter future ahead of it than anything Nvidia has to date.
> 
> ...



Depression caused by huge power consumption and heat of R9 290.


----------



## ensabrenoir (Oct 16, 2013)

*Here we go!*

...amd would never deceive us...

....wow how late did this post show up..


----------



## dom99 (Oct 16, 2013)

Blín D'ñero said:


> Well, the *24th of October*, as i posted last night here and here (and nobody reacted ): that is what the Chinese reviewer said.



I think its the 25th...

Source http://sites.amd.com/us/Documents/AMD_Radeon_290X_Battlefield4_Limited_Ed.pdf


----------



## the54thvoid (Oct 16, 2013)

Am* said:


> I don't need "experience" of any sort to know that a GPU is broken and gimped out of the factory -- I've read plenty of complaints both here and on GPU vendor forums to know that it has serious throttling issues and needed to be BIOS flashed in order to even work properly or ALLOW the GPU to sustain a decent overclock and that was from people who spent $3000+ grand on their GPU setups and water cooling alone.



I have read (up to this morning at least) every page of the GTX Titan owners thread at OCN.  Everybody accepts that the Titan is hamstrung by BIOS limits but even without that, if temps are controlled it hits about 1097-1137MHz before throttling.  FTR, a Titan at 1137MHz is f*cking fast.



Am* said:


> so if Dice say "Mantle makes a 290X mop the floor with the Titan" or "gives it an advantage over equivalent or overpriced Nvidia cards", it is already of more worth to me than anything Nvidia can promise.



And I couldn't agree more with you.  If you want massive frame rates on BF4 at 1440p res or higher, it looks like 290X IS the way forward.  But bear in mind my 1136MHz Titan averages 60fps at Ultra settings (at 1440p).  I'm quite sure it will get blown away come Mantle time but let's not dismiss what is a sound card and has been since February.


----------



## Tatty_One (Oct 16, 2013)

Blín D'ñero said:


> Well, the *24th of October*, as i posted last night here and here (and nobody reacted ): that is what the Chinese reviewer said.



Dates quoted for availability for sale in the UK on a couple of sites say 25th October so that would tie in with the 24th NDA lift prediction.


----------



## Ghost (Oct 16, 2013)

sweet said:


> Another victim of the scheme pulled by nVidia's dynamic boost



Oh right lol. Titan @ 836~900 MHz. Still should be faster clock-to-clock.


----------



## Am* (Oct 16, 2013)

the54thvoid said:


> I have read (up to this morning at least) every page of the GTX Titan owners thread at OCN.  Everybody accepts that the Titan is hamstrung by BIOS limits but even without that, if temps are controlled it hits about 1097-1137MHz before throttling.  FTR, a Titan at 1137MHz is f*cking fast.



It makes me wonder how good GK110 really could have been with all 2880 cores enabled and with beefier VRMs from, say Gigabyte or MSI. Nvidia really shot themselves in the foot by not allowing custom designs -- the problem is Titan hits its board power limits long before it gets anywhere near 'unsafe' temperatures. Maybe Nvidia will release the full GK110 shortly after the R290X? Will be interesting to see what it will be capable of if that is indeed the case.



the54thvoid said:


> And I couldn't agree more with you.  If you want massive frame rates on BF4 at 1440p res or higher, it looks like 290X IS the way forward.  But bear in mind my 1136MHz Titan averages 60fps at Ultra settings (at 1440p).



Thanks for this. With my GTX 460 OC'd to hell and back and on its last legs, crawling through BF4 at medium-ish settings at 40-60FPS @ 1080p, your post really makes me look forward to my next upgrade. I hope to finally use my 120Hz monitor to its full extent soon.


----------



## sweet (Oct 16, 2013)

Ghost said:


> Oh right lol. Titan @ 836~900 MHz. Still should be faster clock-to-clock.



The scheme of nVidia is much more sophisticated. Even though the boost clock says 900 MHz, every stock Titan will run at 993~1016 MHz because of the Kepler boost. nVidia just knows how to cheat with benchmarks.


----------



## Casecutter (Oct 16, 2013)

First no one buys this level of card for 1920x1080p. These numbers could also be explained by not using the actual/final release drivers sent to R9 290X reviewers. So most of those numbers are... not worth speculating over.

Don't put too much into the Furmark consumption test; unless there's a score stating the amount of "work" that produced it, it's hardly significant or earth-shattering. The one thing it might indicate is that the cooler has some headroom.

But these numbers do hold to what I've been saying: it can soundly beat a 780, while it will spar with Titan depending on the title. Metro was one Nvidia had owned, but not so much anymore.

If AMD holds to what they've indicated and done with re-badge prices, they'll have a win!
We wait...


----------



## 1d10t (Oct 16, 2013)

dom99 said:


> No, Ive seen these before and I think its some bloke in China who claims to have a card and benched it.



thanks for pointing out, lad 



Dj-ElectriC said:


> BTW, i feel that test results will be more in favor of the R9 290X at 2560X1440, that's where he belongs



AMD should make this statement for their card...TO THE EYEFINITY AND BEYOND 



Novulux said:


> http://videocardz.com/images/2013/10/AMD-R9-290X-Performance.jpg
> 
> http://videocardz.com/46785/amd-radeon-r9-290x-performance-charts-emerge
> And another graph that has questionable results? R9-290 might be an awesome card if this is true...



Jeez... those graphs look very tempting. Bummer, I don't know whom I should trust now. I need salvation...




thematrix606 said:


> Most of the gamers use 1080, of course they will be used for 1080. Why in hell would you think otherwise?
> Have you never played beyond 60Hz on a monitor? And they barely even reach 60FPS in Crysis 3 and other games!



Most of the gamers will choose a mainstream card. Why in hell would you think otherwise?
Only enthusiasts and score-bitching worshipers will look otherwise.



RCoon said:


> ...Also the fact that in the steam survey, NVidia cards took the highest share of video cards in systems, should tell you enough about how their Q4 results will turn out...



Most Steam users are pre-built PC owners, average Joes and regular Janes who bought a marketing-induced product. They only know three things:
- anything that costs more is better.
- any product that hoards the market is always faster.
- and the worst... buying a product that had a commercial opening scene in a game will make your game stable.
Let me guess: Intel's Havok and nVidia's old motto "The Way It's Meant To Be Played" are the definite winners here. Heck, some of my colleagues believe AMD processors and graphics don't do gaming. They even boast that their i3+650Ti will decimate my FX8350+CF 7970


----------



## erocker (Oct 16, 2013)

sweet said:


> The scheme of nVidia is much more sophisticated. Even though the boost clock says 900 MHz, every stock Titan will run at 993~1016 MHz because of the Kepler boost. nVidia just knows how to cheat with benchmarks.



Doesn't a Titan run those boost clocks with games too? I'm having difficulty understanding where the cheating comes into play.


----------



## the54thvoid (Oct 16, 2013)

erocker said:


> Doesn't a Titan run those boost clocks with games too? I'm having difficulty understanding where the cheating comes into play.



Was about to post same. Boost applies in all scenarios. There is no cheating at all, just misguided sentiment.


----------



## Am* (Oct 16, 2013)

Casecutter said:


> First no one buys this level of card for 1920x1080p.



How do you work that out when current games like Far Cry 3, Crysis 3, Metro LL etc under the highest settings, cannot exceed 60FPS @1080p average on the most expensive single-GPU cards? Or how about the fact that next gen consoles will struggle to run launch titles natively @1080p (BF4 will run @720p)? You'll be surprised to know that the vast majority buying these cards will be gaming @1080p, whether it is across 1 or 3 panels. I am using a 27" 2560x1440 IPS panel right alongside my 1080p panel -- guess what, I still game on the TN panel due to the higher framerate, better response time and less input lag. It definitely makes more sense to game @1080p, at least competitively, since IPS benefits mostly do not apply much to gaming (viewing angles don't matter since I sit right in front of the monitor, and neither does the superior colour accuracy, since most games these days tend to have a really limited colour palette anyway -- BF3 has a blue tint to everything, BF4 seems to have a grey tint etc).


----------



## ManosHandsOfFate (Oct 16, 2013)

sweet said:


> There is no doubt that this card is above Titan clock to clock. However, this beast will consume a bunch of wattage and that tiny fan is not really reliable. The card is capable at the top, but it is not perfect. Hope that the custom versions will be available soon.



What are you talking about?   There is 100% doubt as to whether the 290X is faster than the Titan, and that's what EVERYONE is trying to figure out. So for you to say there is none just sets the whole conversation back.


----------



## Rei86 (Oct 16, 2013)

Damn, that power graph. I'm a power user so I don't really care about how much power it'll suck up, but it still makes me rethink the PSU I'm running. 

Still starving for a review of these things.


----------



## TheoneandonlyMrK (Oct 16, 2013)

ManosHandsOfFate said:


> What are you talking about?   There is 100% doubt as to whether the 290X is faster than the Titan, and that's what EVERYONE is trying to figure out. So for you to say there is none just sets the whole conversation back.



Obviously there are plenty who doubt the R9 290X is faster; this is yet another example of something you can throw Steam stats at (not me), since most have Nvidia, apparently.

Even when it's out, and Mantle's out, you're still going to be able to find a fair few who would knock it, even if it were 50% faster.

Stop trying to figure out something that's mathematically provable and will be proven. It's about "application" anyway, and more importantly W1zzard's application of it (the R9 290X) that really matters.


----------



## HumanSmoke (Oct 16, 2013)

Am* said:


> It makes me wonder how good GK110 really could have been with all 2880 cores enabled and with beefier VRMs from, say Gigabyte or MSI. Nvidia really shot themselves in the foot by not allowing custom designs


Firstly, Titan (and the identical power delivery reference GTX 780) seems to be a reasonable seller for Nvidia...So by your reckoning Nvidia shot themselves in the foot by selling a load of high-revenue GPUs in a package that keeps warranty failure rates low for eight months...

Yup, that's some foot shooting right there. 

Secondly, Nvidia have had two salvage parts collecting revenue and dominating the review benchmarks for the same length of time. Bonus point question: When was the last time Nvidia released a Quadro or Tesla card that didn't have a GeForce analogue of equal or higher shader count?*

Quadro K6000 (2880 core) released four days ago
Tesla K40 (2880 core) imminent

* Answer: Never


----------



## TheHunter (Oct 16, 2013)

Assimilator said:


> WTF is the point of measuring power consumption in FurMark? Give us the power consumption in the tested applications/games alongside the frame rates so we can draw a useful conclusion about the power vs FPS numbers.
> 
> I'm also skeptical about Hawaii's performance in general. It still seems to be slower than the GK110 clock-for-clock, so there's nothing stopping nVIDIA from releasing a "780 Ultra" or somesuch with 1GHz core clock, which will then blow R290/X out of the water.



Yeah, I bet they tested an ES, probably with improper driver/BIOS TDP protection for such apps (OCCT, Afterburner, Furmark)...

I mean, if you removed that on GK110, it would be the same 400-450 W for sure.


----------



## arbiter (Oct 16, 2013)

So it's only as fast as a stock-clocked Titan in most things? What would the results be if it were tried against a Titan OC'ed to 1000-1100 MHz, which seems to be where a lot of people are able to take their Titans with little to no problem?


----------



## Blín D'ñero (Oct 16, 2013)

arbiter said:


> [...] What would the results be if it were tried against a Titan OC'ed to 1000-1100 MHz, which seems to be where a lot of people are able to take their Titans with little to no problem?


The result: Titan would be way OV'priced... 
Titan $ 999,-  (newegg), whereas  R9 290X $600 ~650 _(expected)_


----------



## ManosHandsOfFate (Oct 16, 2013)

Blín D'ñero said:


> The result: Titan would be way OV'priced...
> Titan $ 999,-  (newegg), whereas  R9 290X $600 ~650 _(expected)_




Probably closer to $700, if not more...


----------



## Casecutter (Oct 16, 2013)

Am* said:


> It definitely makes more sense to game @1080p, at least competitively


Competitively okay that's a point.



ManosHandsOfFate said:


> Probably closer to $700, if not more...


There's no way AMD can think $700; they have to move enough units to get an ROI on a new part. It's not like these are geldings just needing a home. AMD needs to ensure they can recoup engineering and set-up costs while splitting that over enough wafers. I don't see this as a boutique product; I think they see it as a full-production offering, no different than Tahiti was two years ago. 

I'm wanting a $550 MSRP.


----------



## SIGSEGV (Oct 17, 2013)

1d10t said:


> Only enthusiasts and score-bitching worshipers...
> 
> Heck, some of my colleagues believe AMD processors and graphics don't do gaming. They even boast that their i3+650Ti will decimate my FX8350+CF 7970



LOL 
so true...

Even my father would prefer to choose intel celeron (i'm not sure it's dual cores) and its gpu rather than amd A8-4 series (4 cores) and its igpu. 

@casecutter : count me in, i would be happy to get them in cfx and put it under water.


----------



## xorbe (Oct 17, 2013)

Tell me about power while playing a game, not Furmark ... who knows which is throttling more.


----------



## thematrix606 (Oct 17, 2013)

Casecutter said:


> First no one buys this level of card for 1920x1080p.



And yet again, another 60Hz monitor owner. Please go back to your cave. 



1d10t said:


> thanks for pointing out, lad
> Most of the gamers will choose a mainstream card. Why in hell would you think otherwise?
> Only enthusiasts and score-bitching worshipers will look otherwise.



So a gamer with enough money to spend on nice hardware (anyone with a job really) is an enthusiast? Wow.


----------



## HammerON (Oct 17, 2013)

thematrix606 said:


> And yet again, another 60Hz monitor owner. Please go back to your cave.
> 
> 
> 
> So a gamer with enough money to spend on nice hardware (anyone with a job really) is an enthusiast? Wow.



And what monitor(s) do you own?
I will stick with my "lackluster" 60Hz 30" Dell and wait for the 4K monitors to come down in price. Does this mean I am in a cave as well? I absolutely love my monitor and see no reason at this point to go to 120Hz.


----------



## 1d10t (Oct 17, 2013)

SIGSEGV said:


> LOL
> so true...
> Even my father would prefer to choose intel celeron (i'm not sure it's dual cores) and its gpu rather than amd A8-4 series (4 cores) and its igpu.



I know you won't...even you show up in AMD Lounge...hahaha 
Not even a year and I already miss everyone from the last gathering @ Bandung 



thematrix606 said:


> So a gamer with enough money to spend on nice hardware (anyone with a job really) is an enthusiast? Wow.



Why would someone spend $600 or $1000 on a stupid card that cannot even attract a woman with nice boobs? Yup... that's an enthusiast.



HammerON said:


> And what monitor(s) do you own?
> I will stick with my "lackluster" 60Hz 30" Dell and wait for the 4K monitors to come down in price. Does this mean I am in a cave as well? I absolutely love my monitor and see no reason at this point to go to 120Hz.



Might have something to add: no matter how fast your monitor, Windows only sees it at 60Hz. It's in the panel, not in the OS or the graphics card.
FYI, I have a 240Hz panel and yet still have severe judder :shadedshu


----------



## the54thvoid (Oct 17, 2013)

1d10t said:


> Why would someone spend $600 or $1000 for stupid card that cannot even attract a woman with nice boobs?Yup...that's enthusiast.



Ah, the naivety of youth.  I enjoy having both.


----------



## Ahhzz (Oct 17, 2013)

tigger said:


> I love the smell of new hardware in the morning.
> 
> whether these are better than Nvidia cards or not, I still love it when new GPU's come out, all the speculation and "discussion" is interesting and sometimes fun to read.
> 
> Personally for me, a 280x or a 7970.



I'm with you on that. A 280x is looking more and more like the bee's knees.


----------



## 1d10t (Oct 17, 2013)

the54thvoid said:


> Ah, the naivety of youth.  I enjoy having both.



It's nice to know you had a better life than mine sir,may God always bless your family and guide  you in your hardest time 
Although in my early 30's i'm still at shitty job with minimum wages and barely make a living,I still believe God would have pity on me so i could date someone and make my own family someday


----------



## Am* (Oct 17, 2013)

HumanSmoke said:


> Firstly, Titan (and the identical power delivery reference GTX 780) seems to be a reasonable seller for Nvidia...So by your reckoning Nvidia shot themselves in the foot by selling a load of high-revenue GPUs in a package that keeps warranty failure rates low for eight months...



Let's get a few things straight: first of all, Titan was a terrible seller for Nvidia as a GeForce card, and anyone that told you otherwise is delusional to say the least. If you read that from Nvidia, "reasonable" means a piss-poor seller where it mattered, because had it sold well, Nvidia would be screaming it from the rooftops, and AMD wouldn't think twice about doing the same thing if they could. This joke of a card had a small surge of pre-order sales in the first month from several pro-market customers who wanted to test the waters with lots of cheap Double Precision/CUDA cards, where Titan was the cheapest option (and since GK104 is a complete joke in this area), which Nvidia could've milked for FAR more money than they did. Secondly, the Titan has been an expensive dust collector for all the biggest e-tailers in our country since the first few weeks of release, and I can count on one hand the number of people in the UK that I know or have seen on forums and gaming communities who bought one (even where everyone is most likely to brag about their setups, I can literally point out most of the Titan owners). What all the Titan owners/ex-owners complained about was the underpowered stock design and the lack of 3rd-party alternatives to fix the problems. So yeah, they lost plenty of potential sales, whether you like it or not.



HumanSmoke said:


> Secondly, Nvidia have had two salvage parts collecting revenue and dominating the review benchmarks for the same length of time. Bonus point question: When was the last time Nvidia released a Quadro or Tesla card that didn't have a GeForce analogue of equal or higher shader count?
> 
> * Answer: Never



Congrats on stating the obvious, Sherlock. Did you see me mention anywhere that it won't be released? No... the question is how late it will be, and whether anybody will be left who wants it by the time it actually comes out.



1d10t said:


> Might have something to add,no matter how fast your monitor,windows only sees them in 60Hz.It's in the panel,not in OS'es or graphic card.
> FYI,I have 240Hz panel and yet still have a severe judder :shadedshu



I think you're getting confused here. The fastest consumer panel you can get is 144Hz and Windows detects every last Hz perfectly. Any panels above that (especially the TVs) are just pure marketing bullshit -- they are 60Hz, low quality CCFL panels with frame repeating image controllers that "smudge" the image from one frame to the next, which is why you have severe "juddering". People really need to avoid falling for the marketing bullshit from manufacturers -- you cannot get a true 240Hz display the same way any TN monitor cannot display more than 256K colours, even though most manufacturers will flat out lie and say that their TN panels can display 16.7M colours with more image trickery (dithering).


----------



## Casecutter (Oct 17, 2013)

Casecutter said:


> First no one buys this level of card for 1920x1080p.


Man, I didn't mean to ruffle so many feathers, although I shouldn't have used the words "no one".  

First, I didn't say multi-panel, and sure, there will be instances with specific types of panels, and for competitive play you'd need some... Über. While even with Far Cry 3 and Crysis 3, if you juggle settings a little, you can make an R9-280X work. And those are perhaps the two exceptions, because even Metro LL on an R9-280X is 60FpS at 1920x1080p. If those are what you intend to play, there are always exceptions/concessions for some, but the large bulk of enthusiasts aren't looking at special circumstances, just a normal 1920x1080p. Perhaps the 27" they got a year or two back, hoping to hold out another generation for 4K to become what some might see as "affordable". Till then, I see that many have nestled in with what they have. 

You could spend $650-1000 on the graphics card and then what... multi-panels on the cheap perhaps, because thinking "it" will give you a path to 4K later isn't a good avenue. The smart move for the average person (for that money) is to step up from an older 24" 1080p TN and last-generation cards (6950/560Ti): get a decent 2560x1440 monitor and, say, $310 for an R9-280X, and have a fairly enjoyable enthusiast experience. At least not all that much different from those who dropped $650-1000 on just the graphics card.


----------



## HumanSmoke (Oct 17, 2013)

Am* said:


> HumanSmoke said:
> 
> 
> > Firstly, Titan (*and the identical power delivery reference GTX 780*) seems to be a reasonable seller for Nvidia...So by your reckoning Nvidia shot themselves in the foot by selling a load of high-revenue GPUs in a package that keeps warranty failure rates low for eight months...
> ...


Comprehension fail or trolling?
I believe we were talking about GK110, that is, Titan *and the 780*... but of course, your argument stands up fairly well if you're ignoring the largest-selling parts 

You also seem to be making a fundamental error about what the Titan in particular was supposed to be. For the consumer, the card was supposed to represent the (fleeting) pinnacle of single-GPU performance, with the slight incentive of Nvidia not diluting FP64 performance. For Nvidia it represented PR. Every GPU review since the Titan's launch (whether of an Nvidia or an AMD card) has featured it at or near the top of every performance metric. Eight months of constant PR and advertising that hasn't cost Nvidia an additional penny.

If sales of Titan (or the 780, for that matter) were paramount, then you can bet Nvidia wouldn't have priced it as they have, in exactly the same way that AMD priced a supply-constrained HD 7990 at $999. That hypothesis is all the more credible when the main revenue stream for GK110, the Tesla K20, is known to be supply-constrained itself.

You're living in cloud cuckoo land if you believe that taking the muzzle off the AIBs for vendor specials and lowering prices would have any significant impact on the overall balance sheet. The market for $500+ graphics cards is negligible in the greater scheme of things. Now subtract the percentage of people who, if presented with a $500 card, wouldn't also pay for a $650 (or more) board. Now subtract the percentage of people who would spend the same amount of cash on two lower-specced cards offering better overall performance.

Basically you're putting the gaming aspect under the microscope and not really looking at the big picture... the other alternative is that some random internet poster knows more about strategic marketing than the company with the sixteenth-highest semiconductor revenue in the world.


----------



## Blín D'ñero (Oct 17, 2013)

*Benchmarks @ 3840×2160 – AMD Radeon R9 290X Versus NVIDIA GeForce GTX 780:* 

LegitReviews


> In Bioshock: Infinite with the Ultra preset the AMD Radeon R9 290X ran at an average of 44.22 FPS and the NVIDIA GeForce GTX 780 was at 39.63. This shows a significant 11.6% performance advantage for the new AMD Radeon R9 290X with the Hawaii GPU.
> 
> Tomb Raider showed the AMD Radeon R9 290X averaging 43.0 FPS and the NVIDIA GeForce GTX 780 was at 40.8 FPS. This would make the AMD Radeon R9 290X roughly 5.4% faster than the NVIDIA GeForce GTX 780 in Tomb Raider.
> 
> ...





> AMD is also letting sites publish their own 4K benchmarks from Bioshock Infinite and Tomb Raider today, so expect to see some sites having numbers from their own unique test setup. We are here in Montreal and far from our test system, so expect to see our benchmark results when the full review goes live.



Source: LegitReviews
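For what it's worth, the quoted percentages follow directly from the frame rates; a quick sanity check (numbers taken from the quote above):

```python
# Sanity-check the performance deltas quoted by LegitReviews.
# All frame rates come from the quote above.
r290x_bioshock, gtx780_bioshock = 44.22, 39.63
r290x_tomb, gtx780_tomb = 43.0, 40.8

# Percentage advantage = (faster / slower - 1) * 100
bioshock_gain = (r290x_bioshock / gtx780_bioshock - 1) * 100
tomb_gain = (r290x_tomb / gtx780_tomb - 1) * 100

print(f"Bioshock Infinite: +{bioshock_gain:.1f}%")  # +11.6%
print(f"Tomb Raider: +{tomb_gain:.1f}%")            # +5.4%
```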

TomsHardware: First Official AMD Radeon R9 290X Benchmarks



> [...]
> There's not much more I can say at this point, except that I have several cards tested at 3840x2160, and R9 290X doesn’t just do well against the GeForce GTX 780…
> [...]



Hmmm... *"R9 290X Quiet Mode"...* Promising!


----------



## HumanSmoke (Oct 17, 2013)

Anand too:


----------



## Blín D'ñero (Oct 17, 2013)

Who forgets to mention the "Quiet Mode"...


----------



## springs113 (Oct 17, 2013)

blín d'ñero said:


> who forgets to mention the "quiet mode"...



lmfao


----------



## HumanSmoke (Oct 17, 2013)

Blín D'ñero said:


> Who forgets to mention the "Quiet Mode"...


Bwahahahahahaha.
Let me guess....you'll be petitioning W1zzard to measure sound and power consumption using "Quiet Mode", and to measure gaming benchmarks using "Noisy As Fuck Mode".


----------



## Casecutter (Oct 17, 2013)

HumanSmoke said:


> If sales of Titan (or the 780 for that matter) were paramount then you bet that Nvidia wouldn't have priced it as they have


Not arguing just stating the obvious marketing employed. 

A movie theater sells a small popcorn for $3 and a large for $7. People will actually more often buy the $3 size, having trouble rationalizing the higher price.

Then the theater introduces a medium that's somewhat close in size to the large; folks rationalize that the large is just 50 cents more and they get more. The theater actually starts selling more larges, while mediums and smalls aren't nearly as popular.

More choice provokes more thought and changes how the brain rationalizes things. In this case it's the opposite: Nvidia releases Titan, and sure, everyone salivates over what they'd like, but for a large part of the market it's hard to justify; then add a $650 part and... set the hook.


----------



## Blín D'ñero (Oct 17, 2013)

HumanSmoke said:


> Bwahahahahahaha.
> Let me guess....you'll be petitioning W1zzard to measure sound and power consumption using "Quiet Mode", and to measure gaming benchmarks using "Noisy As Fuck Mode".



 No. The AnandTech preview makes no mention of Quiet Mode (whether it's used or not), nor of the test system. Still you're eating that picture up??? I guess you're just happy the 780 doesn't look too bad in it.


----------



## HumanSmoke (Oct 18, 2013)

Blín D'ñero said:


> No. AnandTech preview makes no mention of Quiet Mode (is it used/not used) nor the test system. Still you're eating that picture??? I guess you're just happy the 780 doesn't look too bad in it.


Y'know, if I wanted "Quiet Mode", I'd just set a fan profile in AB or Precision and allow the card to throttle rather than increase fan noise. I guess some people need to be spoon fed this advanced thinking as a "feature".


Casecutter said:


> Not arguing just stating the obvious marketing employed.


Bingo. Elementary sales technique. I doubt Nvidia wanted to sell the Titan in quantity, especially not at Wal-Mart prices when the Tesla variety brings in the revenue. A second-tier salvage part (GTX 780) would have limited appeal as a pro part, since the loss of shaders isn't going to be mitigated by any meaningful lowering of power consumption.


----------



## 1d10t (Oct 18, 2013)

Am* said:


> I think you're getting confused here. The fastest consumer panel you can get is 144Hz and Windows detects every last Hz perfectly. Any panels above that (especially the TVs) are just pure marketing bullshit -- they are 60Hz, low quality CCFL panels with frame repeating image controllers that "smudge" the image from one frame to the next, which is why you have severe "juddering". People really need to avoid falling for the marketing bullshit from manufacturers -- you cannot get a true 240Hz display the same way any TN monitor cannot display more than 256K colours, even though most manufacturers will flat out lie and say that their TN panels can display 16.7M colours with more image trickery (dithering).



Later I learned that 240Hz only quadruples a single 60Hz frame with MEMC methods while dimming the backlight at lightning speed; it made me outraged. Sure, my panel had all the goodies (IPS panel, 2 ms GTG, 5 ms MPRT, 240Hz, 3D MMR) put nicely across a 42" diagonal while sitting at the ridiculous one-grand mark. But to learn that Windows only sees it at 60Hz even after I did some firmware flashing, defined manual structured DDC and a custom distributed EDID... I even fried my BCM3556 board. All that pain and sacrifice for the glory of 3D, in vain thanks to 4K monitors :shadedshu


----------



## okidna (Oct 18, 2013)

1d10t said:


> They even boast that their i3+650Ti will decimate my FX8350+CF 7970





1d10t said:


> FYI,I have 240Hz panel and yet still have a severe judder :shadedshu





1d10t said:


> It's nice to know you had a better life than mine sir,may God always bless your family and guide  you in your hardest time
> Although in my early 30's i'm still at shitty job with minimum wages and *barely make a living*,I still believe God would have pity on me so i could date someone and make my own family someday



Coming from the same country as you, I wouldn't say having an "FX8350+CF 7970", a "240Hz panel", and all that fancy water-cooling stuff is an indicator of "barely making a living"... unless you're some kind of robot who doesn't eat at all and uses all of your paycheck/money to buy PC hardware.

Seriously, be thankful. There are a lot of people out there in our country who aren't as lucky as you.

Anyway, from Montreal : http://imgur.com/a/MEXNo


----------



## the54thvoid (Oct 18, 2013)

okidna said:


> Anyway, from Montreal : http://imgur.com/a/MEXNo



Non troll question. Thats an x79 mobo. High chance the 780 is running at pci-e 2 .  If the 290x is running at pci-e 3 would the 4k resolution have an effect on fps?


----------



## BiggieShady (Oct 18, 2013)

the54thvoid said:


> Non troll question. Thats an x79 mobo. High chance the 780 is running at pci-e 2 .  If the 290x is running at pci-e 3 would the 4k resolution have an effect on fps?



Valid concern. PCIe 2 is barely enough; 4K requires just a little less than 16 Gbps: http://web.forret.com/tools/video_fps.asp?width=3840&height=2160&fps=60&space=rgba&depth=8
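The figure from that calculator is easy to reproduce; a minimal sketch, assuming an uncompressed 8-bit RGBA stream at 4K60:

```python
# Uncompressed bandwidth for a 3840x2160 @ 60 Hz RGBA (8 bits/channel) stream,
# matching the calculator linked above. Rough figures only.
width, height, fps = 3840, 2160, 60
bytes_per_pixel = 4  # R, G, B, A at 8 bits each

bytes_per_second = width * height * bytes_per_pixel * fps
gbps = bytes_per_second * 8 / 1e9  # gigabits per second

print(f"{bytes_per_second / 1e9:.2f} GB/s")  # 1.99 GB/s
print(f"{gbps:.1f} Gb/s")                    # 15.9 Gb/s
```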


----------



## SIGSEGV (Oct 18, 2013)

okidna said:


> Coming from the same country as you, I wouldn't say having "FX8350+CF 7970", "240Hz panel", and those fancy water cooling stuff as an indicator of "barely make a living".... unless you're some kind of robot who doesn't eat at all and use all of your paycheck/money to buy PC hardware .
> 
> Seriously, be thankful. There's a lot of people out there in our country who's not as lucky as you.
> 
> Anyway, from Montreal : http://imgur.com/a/MEXNo



who knows.
I believe he took 90% of his wages for months, or maybe years, to buy shiny hardware... because the same shit also happened to me.. 



bencrutz said:


> cmon man, he's exaggerating. he's a manager on a large scale IT company  he wouldn't have much time posting here otherwise :shadedshu



wow, that's great.. 

-----





according to videocardz dot com.. http://videocardz.com/46929/official-amd-radeon-r9-290x-2160p-performance-17-games


----------



## bencrutz (Oct 18, 2013)

okidna said:


> Coming from the same country as you, I wouldn't say having "FX8350+CF 7970", "240Hz panel", and those fancy water cooling stuff as an indicator of "barely make a living".... unless you're some kind of robot who doesn't eat at all and use all of your paycheck/money to buy PC hardware .
> 
> Seriously, be thankful. There's a lot of people out there in our country who's not as lucky as you.
> 
> Anyway, from Montreal : http://imgur.com/a/MEXNo



cmon man, he's exaggerating. he's a manager on a large scale IT company  he wouldn't have much time posting here otherwise :shadedshu




the54thvoid said:


> Non troll question. Thats an x79 mobo. High chance the 780 is running at pci-e 2 .  If the 290x is running at pci-e 3 would the 4k resolution have an effect on fps?



errr, except from the picture it's obvious that the 780 & 290X were benched using the very same mobo


----------



## okidna (Oct 18, 2013)

the54thvoid said:


> Non troll question. Thats an x79 mobo. High chance the 780 is running at pci-e 2 .  If the 290x is running at pci-e 3 would the 4k resolution have an effect on fps?



That's a valid concern.

I read this thread @ EVGA forum : http://forums.evga.com/tm.aspx?m=2032492
It seems that 331.40 BETA provides PCI-E 3.0 support for Titan and 780 under X79 platform (but with Ivy Bridge-E processor, *don't know about SB-E*).


----------



## Xzibit (Oct 18, 2013)

the54thvoid said:


> Non troll question. Thats an x79 mobo. High chance the 780 is running at pci-e 2 .  If the 290x is running at pci-e 3 would the 4k resolution have an effect on fps?



How do you account for reviewer setups from Tom's Hardware, AnandTech & eTeknix that used their own hardware?


----------



## the54thvoid (Oct 18, 2013)

okidna said:


> That's a valid concern.
> 
> I read this thread @ EVGA forum : http://forums.evga.com/tm.aspx?m=2032492
> It seems that 331.40 BETA provides PCI-E 3.0 support for Titan and 780 under X79 platform (but with Ivy Bridge-E processor, *don't know about SB-E*).



Yeah but i'm using those drivers and i'm not showing pci-e 3 unfortunately. 

As for Xzibit 's point, yes, that's valid. If an Ivybridge board is used it negates any issues  

Like I said, not trolling but AMD's set up isn't 'potentially' using equal specs on both cards. If other reviews use Ivy, then all's cool (if the lane bandwidth is even a problem in the first place!)


----------



## TheHunter (Oct 18, 2013)

What's with these crappy 4k benchmarks, I dont care for that reso, even less for that lousy fps. I mean I would never play at 24-50fps..


Give us some 1920x1200 reso benchmarks.


----------



## okidna (Oct 18, 2013)

the54thvoid said:


> Yeah but i'm using those drivers and i'm not showing pci-e 3 unfortunately.
> 
> As for Xzibit 's point, yes, that's valid. If an Ivybridge board is used it negates any issues
> 
> Like I said, not trolling but AMD's set up isn't 'potentially' using equal specs on both cards. If other reviews use Ivy, then all's cool (if the lane bandwidth is even a problem in the first place!)



Ah, my bad, didn't realize you have SB-E


----------



## 1d10t (Oct 18, 2013)

okidna said:


> Seriously, be thankful. There's a lot of people out there in our country who's not as lucky as you.
> 
> Anyway, from Montreal : http://imgur.com/a/MEXNo



Not that i'm complaining 



BiggieShady said:


> Valid concern. PCIE 2 is barely enough. 4k requires just a little less than 16 GBps : http://web.forret.com/tools/video_fps.asp?width=3840&height=2160&fps=60&space=rgba&depth=8
> 
> http://www.rtcmagazine.com/files/images/3097/RTC02-TCTW-PCISIG-Table1_large.jpg



This.
The 290X implements a "new" CrossFire method over the PCIe bus link. Although it looks good on paper, my major concern is the 38 PCIe 2.0 lanes on my 990FX board. Two of them could consume 32 lanes, leaving 6. It's still unclear to me whether AMD will do full duplex or need another lane for crossfiring. On a side note, AMD could utilize the IOMMU, which is available across all 900-series boards, perhaps by creating virtual sideband addressing. 
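The figures in the quoted links can be sanity-checked with a bit of arithmetic. A minimal sketch, assuming uncompressed 8-bit RGBA frames and the commonly cited *effective* per-lane rates of ~500 MB/s for PCIe 2.0 and ~985 MB/s for PCIe 3.0 (after 8b/10b and 128b/130b encoding overhead, respectively):

```python
# Rough bandwidth sanity check for the 4K number quoted above.
# Per-lane throughput values are the commonly cited effective rates,
# not measured figures.

def framebuffer_gbit_per_s(width, height, fps, bytes_per_pixel=4):
    """Raw bandwidth needed to stream uncompressed frames, in Gbit/s."""
    return width * height * bytes_per_pixel * fps * 8 / 1e9

four_k    = framebuffer_gbit_per_s(3840, 2160, 60)  # ~15.9 Gbit/s
pcie2_x16 = 16 * 500e6 * 8 / 1e9                    # ~64 Gbit/s effective
pcie3_x16 = 16 * 985e6 * 8 / 1e9                    # ~126 Gbit/s effective

print(f"4K@60 RGBA stream: {four_k:.1f} Gbit/s")
print(f"PCIe 2.0 x16:      {pcie2_x16:.0f} Gbit/s")
print(f"PCIe 3.0 x16:      {pcie3_x16:.0f} Gbit/s")
```

So the "just under 16" figure for 4K is gigabits per second, which a PCIe 2.0 x16 slot can carry with headroom; the open question is how much extra headroom the bridgeless CrossFire traffic eats on top of that.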



SIGSEGV said:


> who knows.
> i believe he had took 90% of his wages for months or years maybe to buy shinny hardware... It because the same shit also happened to me..



we're in the same boat here... the key is 2S: saving and starving 



bencrutz said:


> cmon man, he's exaggerating. he's a manager on a large scale IT company  he wouldn't have much time posting here otherwise :shadedshu



How dare you spread FUD 
Btw, how are you, old friend? Has your shoulder recovered?


----------



## bencrutz (Oct 19, 2013)

1d10t said:


> How dare you spreading FUD
> Btw how are you old friend?is your shoulder been recovered?





Not 100% but okay, better be, coz I spent pretty much 3 Titans on it :shadedshu and now I'm broke


----------



## NeoXF (Oct 19, 2013)

TheHunter said:


> What's with these crappy 4k benchmarks, I dont care for that reso, even less for that lousy fps. I mean I would never play at 24-50fps..
> 
> 
> Give us some 1920x1200 reso benchmarks.



LOLWUT 


Anyway, here's a compiled comparison from another NDA-breaking run...


----------



## 15th Warlock (Oct 19, 2013)

the54thvoid said:


> Yeah but i'm using those drivers and i'm not showing pci-e 3 unfortunately.
> 
> As for Xzibit 's point, yes, that's valid. If an Ivybridge board is used it negates any issues
> 
> Like I said, not trolling but AMD's set up isn't 'potentially' using equal specs on both cards. If other reviews use Ivy, then all's cool (if the lane bandwidth is even a problem in the first place!)



I don't show PCIe 3.0 either unless I force enable it using the file on the following link, regardless of what driver I have installed, you might wanna give it a try David:

http://nvidia.custhelp.com/app/answers/detail/a_id/3135/

Dunno if it would have made much of a difference on a single-card setup though. I can tell you this from personal experience: jumping from PCIe 3.0 in my Haswell rig to PCIe 2.0 on my SB-E setup didn't make that much of a difference. Yes, I'm comparing rendering a little over 6MP per frame on my setup vs over 10MP on a 4K monitor, but what I'm trying to say is that even at such high resolutions, it doesn't seem like dual Titans are constrained by PCIe 2.0.

And yes, IB-E natively supports PCIe 3.0 on X79. Dunno if the hardware sites that published these benchmarks were using SB-E or IB-E; not much information has been made public by them...


----------



## NeoXF (Oct 19, 2013)

^
Well, sometimes full 16x per card in a dual setup does wonders (I've just recently seen an R9 280X CrossFire review w/ 8x and 16x results and... it's there). But I do agree that on single-card setups it means next to squat. And I also know that most people claiming it would make a difference are sore GTX 780 users...

BTW guys:
http://wccftech.com/amd-radeon-r9-290x-hawaii-xt-uber-mode-crossfirex-performance-leaked/

And:




Wadda ya think?


----------



## Ahhzz (Oct 20, 2013)

So, if I understand the graphic correctly, running 290Xs in CrossFire gets you between 1.8 and 2 times the performance of a single card. Is that right? And did they test anything like Skyrim, with its huge VRAM requirements, to see if CrossFire handles well past the 3GB point?


----------



## HumanSmoke (Oct 20, 2013)

Ahhzz said:


> And did they do anything with something like Skyrim with tons of VRAM requirement to see if XFire handles well over a 3G point?


You mean a unified memory pool between cards? If so, then no - AFR doesn't support it at present. The little-used SuperTiling implementation, I think, did support unified memory - but since SuperTiling has its own issues (geometry, overdraw overhead, no OGL support) I think it may have gone the way of the dodo.


----------



## NeoXF (Oct 20, 2013)

Ahhzz said:


> So, if I understand the graphic correctly, running a XFire 290x gets you between 1.8 times and 2 times the performance of running a single. Is that right? And did they do anything with something like Skyrim with tons of VRAM requirement to see if XFire handles well over a 3G point?



Like it's been said... we're still far off from unified VRAM on multi-GPU setups.
But on the bright side, I think AMD mentioned something about 6GB versions of R9 290X.
Which begs the question: have they "fixed" the uneven VRAM placement on their cards? Look at the GTX 660 Ti - NVIDIA uses 2GB as default, and that doesn't exactly fit a 192-bit bus (there's also a 3GB version). AMD never did (or could do) this before; it was either double or nothing.
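For reference, the 660 Ti case is usually explained as an asymmetric split across the three 64-bit controllers: the first chunk is striped across all three at the full 192-bit width, and the leftover sits on the deeper controller at 64-bit. A rough sketch of that commonly described layout (the 512/512/1024 MB per-controller split is the configuration usually reported for the 2GB cards, assumed here, not something either vendor documents in this thread):

```python
# Sketch of the asymmetric-VRAM math commonly given for 2 GB cards
# on a 192-bit bus (three 64-bit controllers). The 512/512/1024 MB
# split is an assumed, commonly reported configuration.

def asymmetric_regions(mb_per_controller, bits_per_controller=64):
    """Return (fast_MB, fast_width_bits, slow_MB, slow_width_bits)."""
    n = len(mb_per_controller)
    striped = min(mb_per_controller) * n          # interleaved across all controllers
    leftover = sum(mb_per_controller) - striped   # lives on the deeper controller
    return striped, n * bits_per_controller, leftover, bits_per_controller

fast_mb, fast_bits, slow_mb, slow_bits = asymmetric_regions([512, 512, 1024])
print(f"{fast_mb} MB at {fast_bits}-bit, then {slow_mb} MB at {slow_bits}-bit")
# 1536 MB at the full 192-bit width, with the last 512 MB limited to 64-bit
```

Which is why "2GB on 192-bit" isn't quite the same 2GB throughout: the last chunk runs at a third of the bandwidth once the striped region fills up.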


----------



## SIGSEGV (Oct 21, 2013)

> http://www.chiphell.com/thread-881612-1-1.html



Detailed information about the XFX R9 290X  



			
Videocardz.com said:
			
		

> Furmark — 94 °C
> 3DMark — 52 °C
> 3DMark + OC — 55 °C
> Metro 2033 — 70 °C




With the reference cooler? Not bad..


----------



## Xzibit (Oct 21, 2013)

SIGSEGV said:


> Detail information about XFX R9-290x
> 
> with reference cooler ? not bad..



The final runs were done at 100% fan speed. The Furmark and 3DMark 11 runs are fine.

He's trying to find the OC limit in 3DMark 11 it seems, and those runs are fine too.

It looks like the core will overclock similarly to Tahiti on the reference cooler, but then again, if you look at the settings, you want someone better at running benchmarks to release numbers and screenshot their settings.


----------



## radrok (Oct 21, 2013)

Some other leaked review scores

source (higher resolution benches on there)
http://forums.anandtech.com/showpost.php?p=35630412&postcount=13

It's looking very, very strong.

R9 290X Lightning, anyone?


----------



## Kovoet (Oct 21, 2013)

I'll wait for a first-hand review from Wizzard before I make any more comments


----------



## radrok (Oct 21, 2013)

Kovoet said:


> I'll wait for first hand review from Wizzard before i make anymore comments



Like 98% of this forum's users  Me included.


----------



## EarthDog (Oct 21, 2013)

radrok said:


> R9 290X Lightning, anyone?


No thanks... if it's anything like the GTX 780 Lightning, anyway.


----------



## the54thvoid (Oct 21, 2013)

radrok said:


> Some other leaked review scores
> 
> http://i.imgur.com/s9xQyNY.jpg
> http://i.imgur.com/WSL1mrQ.jpg
> http://i.imgur.com/hxm7Qzr.jpg
> ...



Hmm... those charts show it weaker than the Titan in two of the three. No comment until I see official reviews....

Apart from saying I would have no interest in a 290X Lightning, as MSI have been doing sub-par work on that front: the 7970 Lightning with poor connectivity (read W1zz's review), and the 780 Lightning with shit voltage locks to keep NV happy and no workaround. I think the Matrix cards look like better bets these days.


----------



## EarthDog (Oct 21, 2013)

W1zz's review, with respect, missed out on A LOT for that card simply due to the canned nature of his review process. It honestly isn't made for these types of cards, so it misses those angles.

I don't think he even tried the LN2 BIOS, which ships with the same power limit as the stock BIOS, rendering it useless. The 300% BIOS MSI released has power-use problems: when cranked to 300%, it shows ~250% power use out of the gate (though power meters show the same consumption). That said, without a third-party BIOS it's less useful than the Classified or HOF.

Here is our review on it. We are one of only two sites that actually mentioned the power-use issue (and the other one blamed it on MSI AB, not the BIOS), and quite frankly the only site that gave it the rating it deserved.

I have been working with MSI for almost 8 weeks now trying to get a proper BIOS out, but nada. For that card, look for a third-party BIOS for sure...


----------



## Prima.Vera (Oct 21, 2013)

Kovoet said:


> I'll wait for first hand review from Wizzard before i make anymore comments



Friday or next Monday?


----------



## radrok (Oct 21, 2013)

EarthDog said:


> Wizz's review, with respect, missed out on A LOT for that card simply due to the canned nature of his review process. It is honestly not made for these types of cards so it misses out on those angles.
> 
> The LN2 bios I don't think he even tried to use and ships with the same power limit as the stock bios rendering it useless. When you use the 300% bios MSI released, there are power use problems that, when cranked to 300% show ~250% power use out of the gate (though power meters show same consumption). That said without a third part bios, its less useful than the Classified or HOF.
> 
> ...



As you said, we have to look for third-party BIOSes. I reckon Skyn3t's BIOS gives full voltage and power-limit control, right?

He's good at that; I'm using his 400%+ power limit BIOS for my GPUs.


----------



## EarthDog (Oct 21, 2013)

Skyn3t's are hit and miss. I went through some testing with the guy...

The last one I tried with him had the 300% limit (= 640W), but the voltage only allowed it to go to 1.21v. Does he have newer ones out that allow at least 1.25v (what you can sort of use from the factory)?


----------



## the54thvoid (Oct 21, 2013)

EarthDog said:


> SKynet's are hit and miss. I went through some testing with the guy...
> 
> The last one I tried with him had the 300% limit (= 640W) but the voltage only allowed it to go to 1.21v. Does he have newer ones out that allow you to use at least 1.25v (what you can sort of use from the factory)?



I don't know much about the actual physical components, but from all I've seen so far (mostly over at OCN), the 1.21v limit has to be addressed via software that reads from the NCP4206 thingy.
The actual drivers are the problem. The Afterburner soft mod bypasses the driver and reads directly from the NCP4206 magic box. It's the direct reading from there that allows the voltage change - that's why it doesn't work in EVGA's Precision X (apparently). Although Precision is based on AB, it is not the same.
I don't think a BIOS can change it past 1.21v unless the drivers are modded too? But I could be totally wrong, as I technically know nothing about electronics


----------



## radrok (Oct 21, 2013)

All I've found is up to 1.212v. I thought for a moment he had it fully unlocked, but the post specified unlocked "up to 1.212v".

Have you tried asking MSI for the usual unlocked (non-public) version of Afterburner?

TBH, part of the blame for this big fail goes to NVIDIA and its useless limits; they should just stick to limiting reference cards, the annoying pricks.


----------



## xorbe (Oct 21, 2013)

I think there's an MSI AB command that allows 1.325v, check OCN forum
http://www.overclock.net/t/1421221/gtx780-titan-any-ncp4206-card-vdroop-fix-solid-1-325v
(Er, if that was about 780 / Titan.)


----------



## the54thvoid (Oct 21, 2013)

xorbe said:


> I think there's an MSI AB command that allows 1.325v, check OCN forum
> http://www.overclock.net/t/1421221/gtx780-titan-any-ncp4206-card-vdroop-fix-solid-1-325v
> (Er, if that was about 780 / Titan.)



Yes, that's what I was talking about above.  If you have the NCP4206 chip (which can be found through a DOS command), you can insert two lines into the AB profile under settings. But that allows up to 1.3v afaik.  An LLC mod allows an extra 0.025v.

But we're hijacking a thread here - sorry folks.

Here's hoping no such hijinks are required to get the best out of the 290X, but my gut says Uber mode means they have some TDP shit going down. They all seem to brag these days about being power efficient - irrelevant for enthusiast cards tbh.


----------



## EarthDog (Oct 21, 2013)

Right, which does not work with the Lightning? I recall trying this on the reference 780 and it didn't work; not sure about the 780 Lightning though. To be honest, I'm not too worried about it, as I have a version of MSI AB that goes up a lot higher than that (@radrok). 

My goal is to get the 300% BIOS working with the +100mv that MSI gives you out of the gate. I don't care if it comes from MSI (though it damn well better!!!) or a third party, but I haven't seen anything over 1.21v myself from Skyn3t without adding those registry entries.

It's not the driver, it seems, as I can, with the right BIOS and MSI AB, run WELL past 1.35v.



> But we're hijacking a thread here - sorry folks.


+1 oops.


----------

