# Radeon R9 290X Clock Speeds Surface, Benchmarked



## btarunr (Oct 1, 2013)

Radeon R9 290X is looking increasingly good on paper. Most of its rumored specifications and SEP pricing were reported late last week, but the one detail that eluded us was clock speeds. A source who goes by the name Grant Kim, with access to a Radeon R9 290X sample, disclosed its clock speeds and ran a few tests for us. To begin with, the GPU core is clocked at 1050 MHz. There is no dynamic-overclocking feature, but the chip can lower its clocks, taking load and temperature into account. The memory is clocked at 1125 MHz (4.50 GHz GDDR5-effective). At that speed, the chip churns out 288 GB/s of memory bandwidth over its 512-bit wide memory interface. Those clock speeds were reported to us by the GPU-Z client, so we give them the benefit of the doubt, even though they go against AMD's ">300 GB/s memory bandwidth" bullet point in its presentation.
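
The 288 GB/s figure follows directly from those clocks. A quick sketch of the arithmetic (the 1250 MHz case is hypothetical, showing what the ">300 GB/s" claim would require):

```python
# GDDR5 is quad-pumped: it moves 4 bits per pin per memory-clock cycle,
# which is why 1125 MHz is quoted as "4.50 GHz effective".

def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s for a GDDR5 interface."""
    effective_mhz = mem_clock_mhz * 4            # quad data rate
    bits_per_s = effective_mhz * 1e6 * bus_width_bits
    return bits_per_s / 8 / 1e9                  # bits -> bytes -> GB

print(gddr5_bandwidth_gbs(1125, 512))  # 288.0 GB/s, as reported
print(gddr5_bandwidth_gbs(1250, 512))  # 320.0 GB/s, what ">300 GB/s" implies
```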

The tests run on the card include frame-rates and frame latency for Aliens vs. Predator, Battlefield 3, Crysis 3, GRID 2, Tomb Raider (2013), RAGE, and TESV: Skyrim, in no-antialiasing, FXAA, and MSAA modes, at 5760 x 1080 resolution. An NVIDIA GeForce GTX TITAN running the latest WHQL driver was pitted against it. We must remind you that at that resolution, AMD and NVIDIA GPUs tend to behave a little differently due to the way they handle multi-display, so it may be an apples-to-coconuts comparison. In Tomb Raider (2013), the R9 290X romps ahead of the GTX TITAN, with higher average, maximum, and minimum frame rates in most tests. 






*RAGE*
The OpenGL-based RAGE is a different beast. With AA turned off, the R9 290X puts out overall lower frame-rates and higher frame latency (lower is better). It gets even more inconsistent with AA cranked up to 4x MSAA. Without AA, frame latencies of both chips remain under 30 ms, with the GTX TITAN looking more consistent, and lower. At 4x MSAA, the R9 290X is all over the place with frame latency.
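
For readers comparing the two kinds of graphs: frame latency is just the time each frame takes, so a steady 30 ms corresponds to about 33 fps, and a single slow frame can spoil perceived smoothness without moving the average much. A small illustration with made-up frame times:

```python
def fps_to_frame_time_ms(fps):
    """Average frame time in milliseconds at a given frame rate."""
    return 1000.0 / fps

# Hypothetical frame times (ms): mostly ~60 fps with one 48 ms spike.
frame_times = [16.7, 16.9, 16.5, 48.0, 16.8]
avg_fps = 1000.0 * len(frame_times) / sum(frame_times)
print(f"avg: {avg_fps:.1f} fps, worst frame: {max(frame_times):.1f} ms")
# The average still looks healthy; the 48 ms frame is the visible stutter.
```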



 

 

 

 

*TESV: Skyrim*
The tester somehow got the game to work at 5760 x 1080. With no AA, both chips put out similar frame-rates, with the GTX TITAN having a higher mean, and the R9 290X spiking more often. In the frame-latency graph, the R9 290X has a bigger skyline than the GTX TITAN, which is not something to be proud of. As an added bonus, the VRAM usage of the game was plotted throughout the test run. 



 

 



*GRID 2*
GRID 2 is a surprise package for the R9 290X. The chip puts out significantly and consistently higher frame-rates than the GTX TITAN with no AA, and offers lower frame latencies. Even with MSAA cranked all the way up to 8x, the R9 290X holds up pretty well on the frame-rate front, but not on frame latency.



 

 

 

 

*Crysis 3*
This Cryengine 3-based game offers MSAA and FXAA anti-aliasing methods, and so it wasn't tested without either enabled. With 4x MSAA, both chips offer similar levels of frame-rates and frame-latencies. With FXAA enabled, the R9 290X offers higher frame-rates on average, and lower latencies.



 

 

 

 

*Battlefield 3*
That leaves us with Battlefield 3, which like Crysis 3, supports MSAA and FXAA. At 4x MSAA, the R9 290X offers higher frame-rates on average, and lower frame-latencies. It gets better for AMD's chip with FXAA on both fronts.



 

 

 

 

Overall, at 1050 MHz (core) and 4.50 GHz (memory), it's advantage AMD, looking at these graphs. Then again, we must remind you that this is 5760 x 1080 we're talking about. Many thanks to Grant Kim.

*View at TechPowerUp Main Site*


----------



## jigar2speed (Oct 1, 2013)

At 800 MHz (core) and 4.60 GHz (memory) there is a significant advantage to AMD - if there's any heat from NVIDIA, AMD can increase the speed of the chip, plus the RAM speed can be upped to ridiculous speeds. 

Looks like AMD's card partners are going to enjoy their time tweaking this chip.


----------



## Slacker (Oct 1, 2013)

I can't wait for this card's NDA to be officially lifted so it gets benched. It looks really promising, and maybe a great upgrade from my 6970


----------



## HumanSmoke (Oct 1, 2013)

> The memory is clocked at 1125 MHz (4.20 GHz GDDR5-effective).


Should read 4.50GHz


----------



## boogerlad (Oct 1, 2013)

I hope this means high overclockability... Core clocks seem low, and memory clocks are low probably because of bus instability due to so many traces.


----------



## Nordic (Oct 1, 2013)

So Dynamic underclocking? Not the clear winner over titan I had hoped for.


----------



## btarunr (Oct 1, 2013)

james888 said:


> So Dynamic underclocking? Not the clear winner over titan I had hoped for.



At $599 (compared to TITAN's $999), I'm sure even you can't complain.


----------



## Nordic (Oct 1, 2013)

btarunr said:


> At $599 (compared to TITAN's $999), I'm sure even you can't complain.


Compared to the Titan it is a great deal for some great hardware. That is not my point though. I am not interested in upgrading from my 7970. I still hoped it would do more, for very superficial reasons that involve popcorn. I just want to see AMD beat NVIDIA without question.

I also did not mean dynamic underclocking as a negative. That is just how the article described it, without the term I used. I was wondering if such a term would be correct.


----------



## Steevo (Oct 1, 2013)

I am guessing the review sample is an ES spin, so the final could have higher clocks, as well as increased memory clocks, as AMD have mentioned.


Considering this is almost hand in hand with Titan at 800 MHz, 1 GHz would provide somewhere around 20% more performance, depending on achievable memory speeds.

A 27% increase in SP count, but W1zz's benchmarks of Titan vs. the 7970 show the AMD only 12% slower at that resolution, so either the card is crippled and someone got it for other testing, or they have failed to reach their target.

An efficiency improvement over the 7970, for higher core speeds and better memory, would have been a better investment for them if this were true. I doubt this is true; instead it's most likely a crippled ES card.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/22.html







Or it's meant to show the validity of the core architecture at the same speed as Titan, while consuming less power, and possibly having more features.


----------



## GoldenTiger (Oct 1, 2013)

btarunr said:


> At $599 (compared to TITAN's $999), I'm sure even you can't complain.



And against a $650 or less GTX 780 like the Superclocked ACX or similar, which are faster than Titans right out of the box? It gets murkier there. Personally I've sold my GTX 780 and am looking to spend $500-800 on a card or pair of cards that run faster. Right now it's not looking like this one's going to be anything particularly better, unfortunately.


----------



## hilpi (Oct 1, 2013)

Very fishy Tomb Raider benchmark: min FPS higher than or equal to the avg FPS...


----------



## buggalugs (Oct 1, 2013)

I don't think they are the final clock speeds, and I don't think AMD will stop using dynamic overclocking.


----------



## 1d10t (Oct 1, 2013)

With the same core clock, fewer TMUs and ROPs, but a higher bus width, AMD can perform on par with or beat Titan. Clearly they mock NVIDIA by pricing this card at only $599. Sure, NVIDIA can release a Titan Ultra anytime, but when it does, AMD just bumps the clock and renames it to R9 290XGE²  



Steevo said:


> Or it's meant to show the validity of the core architecture at the same speed as Titan, while consuming less power, and possibly having more features.



I second that, mate


----------



## SIGSEGV (Oct 1, 2013)

1d10t said:


> With the same core clock, fewer TMUs and ROPs, but a higher bus width, AMD can perform on par with or beat Titan. Clearly they mock NVIDIA by pricing this card at only $599. Sure, NVIDIA can release a Titan Ultra anytime, but when it does, AMD just bumps the clock and renames it to R9 290XGE²



Ultra enthusiast market vs Enthusiast market.. 
They (AMD) redefined it well bro...


----------



## the54thvoid (Oct 1, 2013)

It's the same old story.  NDA is 14 days away, by most accounts.  Hopefully they're wrong.  Until then all the benchmarks we see are dubious.

I'm thinking the Titan benches are dubious due to the huge latency spikes. It's one of the most consistent cards out there.  Unless of course it's not that good at handling triple screen resolutions?  

Either way, those clocks look far too low on the core for that same manufacturing process.  You'd think AMD would learn from low clock speeds (the original 7970).  I call 'meh' until it's official.


----------



## TRWOV (Oct 1, 2013)

Maybe drivers aren't there yet? That would explain why the dynamic overclocking isn't working. Steevo's suggestion that this isn't final silicon could be true too.


----------



## RCoon (Oct 1, 2013)

All I see is this




These benchmarks mean nothing, and I highly doubt the core or memory clock is correct. Like, way off. I certainly don't see these benchmarks as being very open and clear.


----------



## esrever (Oct 1, 2013)

At 800 MHz GPU and 4500 MHz VRAM, the GPU does not hit AMD's specs, which say 4 billion triangles per second and over 300 GB/s of VRAM bandwidth. You need a 1 GHz GPU and 5000 MHz VRAM to hit those specs. This leak is either bullshit or an ES that won't reflect final performance; more likely bullshit.


----------



## HumanSmoke (Oct 1, 2013)

the54thvoid said:


> I'm thinking the Titan benches are dubious due to the huge latency spikes. It's one of the most consistent cards out there.  Unless of course it's not that good at handling triple screen resolutions?


Without any horizontal axis (time) in the graphs, there's only so much you can infer, I suppose. As for the Titan and 5760x1080, PCPer and others have tested it, and the graphs look a bit smoother (horizontal axis notwithstanding)







the54thvoid said:


> Either way, those clocks look far too low on the core for that same manufacturing process.  You'd think AMD would learn from low clock speeds (the original 7970).  I call 'meh' until it's official.


Me too. 800 MHz seems absurdly low for a GPU that is ~30% smaller than GK110, which is conservatively clocked 10% higher.


----------



## net2007 (Oct 1, 2013)

I think you guys are missing the point here. If these benchmarks are real, the 290X hangs with a stock Titan. It's obvious the clocks are low.


----------



## sunweb (Oct 1, 2013)

Holy shader O_O Is this true? Cores at 800 MHz + RAM at 4.5 GHz + no overclocking, and it still stomps Titan in some games while going toe to toe in the others? Damn. I need more benchmarks; if it's true then wow.


----------



## unholythree (Oct 1, 2013)

RCoon said:


> All I see is this
> http://emotibot.net/pix/6233.gif
> These benchmarks mean nothing, and I highly doubt the core or memory clock is correct. Like, way off. I certainly don't see these benchmarks as being very open and clear.



Agreed, I'm pretty close to an AMD fanboy; I want them to succeed badly but hype is a dangerous thing. I know why I'm still running a 1090T.


----------



## Hayder_Master (Oct 1, 2013)

It doesn't beat Titan; it's just the same or 2 FPS more, and maybe less in some games


----------



## Recus (Oct 1, 2013)

From slightly faster than GTX 780 to faster than Titan. Seems legit


----------



## Aquinus (Oct 1, 2013)

I think I'll continue to sit tight so our own W1zzard can give us numbers that we can trust. 

Either way, if the benchmarks are legit, it's a great price point for the kind of performance it will put out. I can't see how people are complaining except for that it's a benchmark before the end of the NDA.


----------



## HumanSmoke (Oct 1, 2013)

Recus said:


> From slightly faster than GTX 780 to faster than Titan. Seems legit


Ah, but it's only the Titan Paradox Edition... it's a special one-off card that has an average framerate 6 frames per second lower than its minimum framerate. No doubt when you overclock it, the card travels backwards in time and becomes its own inventor


----------



## HTC (Oct 1, 2013)

Recus said:


> From slightly faster than GTX 780 to faster than Titan. Seems legit



Remember: this is 5760 x 1080 pixels resolution.

A wider interface + higher memory should mean better performance @ higher resolutions but not necessarily so @ lower ones, no?


----------



## librin.so.1 (Oct 1, 2013)

inb4 "lolnub Y U run RAGE with vsync for benchmarks? lolwutanub!".


----------



## dj-electric (Oct 1, 2013)

Those specs are rather weird, yet understandable.
Heck, if they are true, then in conjunction with those tests there is one massive amount of overclocking potential waiting, in both core and memory segments.

BUT, it is not unreasonable to expect 1250 MHz memory chips on the 290X to get that >300 GB/s they were talking about; otherwise why would they present such a bold lie?

I don't know. I'm just waiting a couple of days to get my hands on one, and there will be no more doubts, for me.


----------



## Pedro Lisboa (Oct 1, 2013)

*Fake benchmarks*

These are fake benchmarks :shadedshu.
Wait for some official tests, and after that make a serious comparison between the Titan/GTX 780 and R9 290X.


----------



## Recus (Oct 1, 2013)

HTC said:


> Remember: this is 5760 x 1080 pixels resolution.
> 
> A wider interface + higher memory should mean better performance @ higher resolutions but not necessarily so @ lower ones, no?



B.. but Titan has 6 GB memory for high res.


----------



## EarthDog (Oct 1, 2013)

Steevo said:


> I am guessing the review sample is a ES spin, so final could have higher clocks, and as well increased memory clocks as AMD have mentioned.
> 
> 
> Considering this is almost hand in hand with titan at 800Mhz, 1Ghz would would provide somewhere around 20% more performance depending on achievable memory speeds.
> ...


Interesting speculation, but you can't take a clock speed and map it 1:1 to performance. Same with SP/shader count...


----------



## springs113 (Oct 1, 2013)

Wizz got the card!!! He's lying low, for that matter.


----------



## dj-electric (Oct 1, 2013)

springs113 said:


> Wizz got the card!!! He's laying low for that matter.



W1zz gets cards so early that he is probably sitting for hours in front of the screen, just working on his evil laugh and munching on popcorn while reading all the speculation and arguments.

http://i.imgur.com/BWUVqyi.png


----------



## HTC (Oct 1, 2013)

Recus said:


> B.. but Titan has 6 GB memory for high res.



You're right ... DUH ...

Maybe the difference is the interface? Dunno


----------



## jigar2speed (Oct 1, 2013)

I need your address, W1zzard, I got some beer I would love to share -  




Sheesh - me plans to rob W1zzard while he is drunk and score that shiny R9 290X


----------



## springs113 (Oct 1, 2013)

Lol Wizz come roll with me lol, I can get you free flights to anywhere in the world. lmfao... I think after this review it's bye bye Hydro Copper (at least from my main rig). Even though I have the Creative ZXR, I always preferred AMD's version of audio to NVIDIA's. Plus AMD's color (visuals) has always appeared to be more precise/crisp.


----------



## Filiprino (Oct 1, 2013)

So, AMD still sucks on OpenGL. Nice.


----------



## Crap Daddy (Oct 1, 2013)

Dj-ElectriC said:


> I don't know. I'm just waiting a couple of days to get my hands on one, and there will be no more doubts, for me.



Couple of days?


----------



## EarthDog (Oct 1, 2013)

A "couple" seems to vary significantly from person to person, LOL! Couple WEEKS, sure.


----------



## Slomo4shO (Oct 1, 2013)

btarunr said:


> At $599 (compared to TITAN's $999), I'm sure even you can't complain.



The Titan has been available since February. In addition, there are already 780 models that outperform the Titan within $100 of the launch price of the R9 290X... Releasing a product with equivalent performance 8 months later isn't impressive by any means, and I am sure price cuts can easily make this product lineup irrelevant if NVIDIA decides to release the rumored Titan Ultra this year. Also, Maxwell is less than a year away. This launch becomes more and more disappointing as the days progress.


----------



## W1zzard (Oct 1, 2013)

Corrected the GPU clock to 1050 MHz. The ES card the benches were run on has two BIOSes, one with 800/1125 and the other with 1050/1125. The benches were run at 1050/1125


----------



## springs113 (Oct 1, 2013)

W1zzard said:


> Corrected the GPU clock to 1050 MHz. The ES card the benches were run on has two BIOSes, one with 800/1125 and the other with 1050/1125. The benches were run at 1050/1125



I know you have the card W1zz


----------



## Vario (Oct 1, 2013)

unholythree said:


> Agreed, I'm pretty close to an AMD fanboy; I want them to succeed badly but hype is a dangerous thing. I know why I'm still running a 1090T.



Because the 1090t is the last seriously badass thing AMD has produced on the cpu side of things.  







6 cores baby ... not modules, cores.


----------



## RCoon (Oct 1, 2013)

W1zzard said:


> Corrected the GPU clock to 1050 MHz. The ES card the benches were run on has two BIOSes, one with 800/1125 and the other with 1050/1125. The benches were run at 1050/1125



Interesting. 1050 MHz core clock on the 290X vs 876 MHz core on the Titan. Leads me to wonder what overclocking headroom there is left on the 290X, if much at all, when compared to how far a 780 or Titan overclocks (1200 MHz core without breaking a sweat). And all this nets a measly 0-8 FPS in some cases?



Vario said:


> Because the 1090t is the last seriously badass thing AMD has produced on the cpu side of things.
> 
> http://i.imgur.com/82i6YGm.jpg
> 
> 6 cores baby ... not modules, cores.



1100T would like a word with you.
That being said, my 1055T @ 4Ghz chirps gleefully at games.


----------



## Vario (Oct 1, 2013)

RCoon said:


> Interesting. 1050 MHz core clock on the 290X vs 876 MHz core on the Titan. Leads me to wonder what overclocking headroom there is left on the 290X, if much at all, when compared to how far a 780 or Titan overclocks (1200 MHz core without breaking a sweat). And all this nets a measly 0-8 FPS in some cases?
> 
> 
> 
> ...



Yeah, you're right.


----------



## EarthDog (Oct 1, 2013)

LOL, so does my Intel CPU, but who really gives a crap? LOL! This is R9 290X stuff, peeps!


----------



## Vario (Oct 1, 2013)

EarthDog said:


> LOL, so does my Intel CPU, but who really gives a crap? LOL! This is r9 290X stuff peeps!



Yeah man, I think I'll upgrade my 7970 once these have been out for six months to a year. Get that price drop.


----------



## Slomo4shO (Oct 1, 2013)

W1zzard said:


> Corrected the GPU clock to 1050 MHz. The ES card the benches were run on has two BIOSes, one with 800/1125 and the other with 1050/1125. The benches were run at 1050/1125



You missed this in your edit:



> Over all, at *800 MHz* (core) and 4.50 GHz (memory), it's advantage-AMD, looking at these graphs. Then again, we must remind you that this is 5760 x 1080 we're talking about. Many Thanks to Grant Kim.


----------



## sweet (Oct 1, 2013)

RCoon said:


> Interesting. 1050 MHz core clock on the 290X vs 876 MHz core on the Titan. Leads me to wonder what overclocking headroom there is left on the 290X, if much at all, when compared to how far a 780 or Titan overclocks (1200 MHz core without breaking a sweat). And all this nets a measly 0-8 FPS in some cases?



You forgot the fact that the Titan constantly boosts to 1 GHz when running at stock settings. nVidia's dynamic boost was created for this benchmark cheat.


----------



## EarthDog (Oct 1, 2013)

sweet said:


> nVidia's dynamic boost was created for this benchmark cheat.


----------



## Steevo (Oct 1, 2013)

EarthDog said:


> Interesting speculation, but you can't take a clock speed and map it 1:1 to performance. Same with SP/shader count...



If you did that math and looked at the scaling in SP count and core speed, the correlation in performance is dependent on the memory and GPU core speed. It is usually within a few percent on all cards except those artificially crippled.

And we are not talking about exponential increases, but percentage of raw output, standardized. 

So in the 7970 tests the increase was directly tied to core and memory speed: a 10% increase on both resulted in a 10% increase in performance. Same with Titan. 


I am aware of the boost speeds Titan uses, and so it does make it an apples-to-coconuts comparison. If this test card is stuck at 800 MHz and is still hand in hand with a Titan that is boosting to 1 GHz speeds, what happens when the memory is ramped up and the core is capable of reaching the same clocks as the 7970?
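
Steevo's linear-scaling argument above can be written down as a back-of-envelope model. This is only the naive assumption (performance proportional to core clock, memory permitting), with made-up numbers:

```python
def scaled_fps(fps_at_base, base_mhz, target_mhz):
    """Naive linear estimate: performance scales 1:1 with core clock.
    Real scaling also depends on memory bandwidth and the workload."""
    return fps_at_base * target_mhz / base_mhz

# Hypothetical: a card doing 40 fps at 800 MHz, projected to 1 GHz.
print(scaled_fps(40.0, 800, 1000))  # 50.0 -> ~25%, the region Steevo estimates
```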


----------



## RCoon (Oct 1, 2013)

Steevo said:


> I am aware of the boost speeds Titan uses, and so it does make it an apples-to-coconuts comparison. If this test card is stuck at 800 MHz and is still hand in hand with a Titan that is boosting to 1 GHz speeds, what happens when the memory is ramped up and the core is capable of reaching the same clocks as the 7970?



The card was benched at 1050, edited in main article, so unless somebody clarifies further:

290X @ 1050mhz vs Titan boosted to 1000mhz


----------



## EarthDog (Oct 1, 2013)

WHo knows Steevo. 

Let's wait for real benchmarks instead of trying to continue to speculate with values that really cannot be used to achieve any greater accuracy than guessing in the first place.


----------



## m1dg3t (Oct 1, 2013)

How I LOOOOOOOOOOVE speculation! At least this thread hasn't turned into an nVidia thread 

W1zz: I got some spare coke and an extra hooker, you interested?


----------



## RCoon (Oct 1, 2013)

m1dg3t said:


> How I LOOOOOOOOOOVE speculation! At least this thread hasn't turned into an nVidia thread
> 
> W1zz: I got some spare coke and an extra hooker, you interested?



Extra? You have two?


----------



## Steevo (Oct 1, 2013)

You can't handle two? Coke got to your willy son?

I heard they are going to fix that problem with some magic blue powder, that's right, add crushed Viagra to the coke for a better pecker picker upper.


----------



## Casecutter (Oct 1, 2013)

Honestly, it's more a leak of what the "Hawaii part" can do than actual R9 290X performance. Those clocks could be more representative of the Hawaii LE part.  
This crap is going to bubble up... and there's less inference to draw from it because we don't know how old it is, or what the drivers were. 
So keep the seat belt on; it's going to get bumpy while in this holding pattern.


----------



## m1dg3t (Oct 1, 2013)

RCoon said:


> Extra? You have two?



An "extra" would imply more than 1, nothing more. Nothing less. Use your imagination 



Steevo said:


> You can't handle two? Coke got to your willy son?
> 
> I heard they are going to fix that problem with some magic blue powder, that's right, add crushed Viagra to the coke for a better pecker picker upper.



Heineken hard-on strikes again!  



Casecutter said:


> Honestly, it's more a leak of what the "Hawaii part" can do than actual R9 290X performance. Those clocks could be more representative of the Hawaii LE part.
> This crap is going to bubble up... and there's less inference to draw from it because we don't know how old it is, or what the drivers were.
> So keep the seat belt on; it's going to get bumpy while in this holding pattern.



AMD/ATi are trolling nVidia and the mainstream, that is all. Same thing nVidia has been doing from time... Looks like the red team finally put the shoe on the other foot


----------



## btarunr (Oct 1, 2013)

Graphs corrected for 1050/4500 MHz.


----------



## FrustratedGarrett (Oct 1, 2013)

Those are decent numbers. When not overclocked, the 290X is around the same performance as Titan, except for Crysis 3 and RAGE. It seems that AMD cards don't do well in those two games. Even when overclocked, the 290X still loses to the Titan in those games. 

Of course this is a multi-display setup, and things might look different on a 1440p monitor. 

All of this shouldn't matter at this point. We need to see how efficient Mantle is. Mantle delivering on the 9x promise is quite likely, given that console developers have managed to squeeze far more performance out of those console chips using low-level coding that interfaces semi-directly with the hardware.  

Truly what matters at this point to me as a PC gamer is getting next-gen PC games; a revolution in PC games. We need games with better mechanics, graphics, and physics. 
All these old games just don't cut it for PC gamers like myself. The crappy game mechanics and physics especially are no longer bearable.


----------



## Xero717 (Oct 1, 2013)

I can't help but think that if AMD truly had a Titan killer they would have bragged about having the world's fastest single GPU at GPU'14. Their marketing is beyond over the top, and they would have had a field day with that.  

TrueAudio is neat, don't get me wrong, but this was a GPU conference, and they spent more time talking about that than touting their new flagship. Maybe they learned from Bulldozer, but I doubt it.

As a side note, I have no doubt they will beat nVidia's offerings at price/performance.


----------



## tacosRcool (Oct 1, 2013)

I must be losing my mind but I thought there was another article on the R9 290X with dual bios, with the other one being overclocked?


----------



## Recus (Oct 1, 2013)

RCoon said:


> The card was benched at 1050, edited in main article, so unless somebody clarifies further:
> 
> 290X @ 1050mhz vs Titan boosted to 1000mhz



You must OC a Titan if you want it to boost to 1000 MHz, so the R9 290X was also OC'd, or this is BS again.


----------



## EarthDog (Oct 1, 2013)

Recus said:


> You must OC a Titan if you want it to boost to 1000 MHz, so the R9 290X was also OC'd, or this is BS again.


Reference Titan boosts to 1000MHz...

http://www.overclockers.com/nvidia-gtx-titan-video-card-review


----------



## the54thvoid (Oct 1, 2013)

tacosRcool said:


> I must be losing my mind but I thought there was another article on the R9 290X with dual bios, with the other one being overclocked?



No, I saw that too.  I figure it got pulled before it evoked the wrath of AMD.


----------



## sweet (Oct 1, 2013)

Recus said:


> You must OC a Titan if you want it to boost to 1000 MHz, so the R9 290X was also OC'd, or this is BS again.



You are a typical victim of the false assumption created by nVidia's dynamic boost. Cards with this technology always boost themselves to the highest stable clock, assuming the power target (6xx) or temperature target (Titan, 7xx) is satisfied. In the case of the Titan, at stock settings it runs games at more than 1 GHz. The base clock is mostly just for show.


----------



## Casecutter (Oct 1, 2013)

EarthDog said:


> Reference Titan boosts to 1000MHz...
> 
> http://www.overclockers.com/nvidia-gtx-titan-video-card-review



I think you missed where they said, "and the temp target was set at 90°C"... that isn't stock, correct?

W1zzard only found slightly below 1000 MHz in his reference review, but it ran there a lot, as it's a dark diamond.
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/34.html


----------



## EarthDog (Oct 1, 2013)

Not sure I missed anything...? 

Here is a paragraph from the article talking about what you mentioned, though...



> As mentioned in the initial article, the 876 MHz stock boost clock was quite conservative. When run for short periods and a moderate distance from the temp target (at stock that’s 80 °C), it was boosting to 1006 MHz consistently. If you adjusted the fan profile to keep it away from 80 °C, it would probably stay there indefinitely.



Is adjusting the fan profile overclocking? Overclocking is adjusting clock speeds, in my head. He adjusted the fan profile in the second graph, which was then overclocked and hit 1100+ MHz. The first is stock. 

So, with temps in order and not using the overly 'quiet' stock fan settings, this card will boost to 1006 MHz all day long. I guess we are splitting hairs? lOl!


----------



## sweet (Oct 1, 2013)

Casecutter said:


> I think you missed where they said, "and the temp target was set at 90°C"... that isn't stock, correct?
> 
> W1zzard only found slightly below 1000 MHz in his reference review, but it ran there a lot, as it's a dark diamond.
> http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/34.html



I can see the 993 MHz - 1006 MHz gap is small enough, right? So this bench is between a 1050 MHz R9 290X and a 993 MHz Titan. Personally I think it's a fair comparison, not to mention the mem clock of the R9 290X is far lower than Titan's.


----------



## Patriot (Oct 1, 2013)

hilpi said:


> Very fishy Tomb Raider benchmark: min FPS higher than or equal to the avg FPS...



Either those 2 columns got switched by accident or they are fake... who knows...

A min of 29 would make more sense with a 35 avg... it would fall about the same % behind AMD...


----------



## EarthDog (Oct 1, 2013)

sweet said:


> not to mention the mem clock of the R9 290X is far lower than Titan's.


You are forgetting about the 512bit bus on the 290X...


----------



## Ahhzz (Oct 1, 2013)

Slacker said:


> I can't wait for this card's NDA to be officially lifted so it gets benched. It looks really promising, and maybe a great upgrade from my 6970



Absolutely. I've got about a year to look for solid replacements for my dual 6950's, and I'd love to see ASUS put out a rock-steady OC pair for a good price...


----------



## Casecutter (Oct 1, 2013)

EarthDog said:


> I guess we are splitting hairs? lOl!


True enough not a worry.


----------



## Patriot (Oct 1, 2013)

*Folding performance*

I want to see how this will fold... 
or in the very least...
FAHbench
http://proteneer.com/blog/?page_id=1671


----------



## TRWOV (Oct 1, 2013)

Too bad there aren't any GPU WUs available at WCG. I bet this would net about 200K PPD


----------



## Patriot (Oct 1, 2013)

TRWOV said:


> Too bad there aren't any GPU WUs available at WCG. I bet this would net about 200K PPD



Fellow DCer.
There are other DC projects that make good use of GPUs, both BOINC and F@H.


----------



## EpicShweetness (Oct 2, 2013)

I'm just gonna stand back, wait for the release, and let the flame war burn itself out. If it's a $600 card it's gonna be amazing regardless. If it ends up being less than $600, we have an absolute game changer on our hands.


----------



## Fluffmeister (Oct 2, 2013)

Fairly interesting results, if unsurprising in places (GRID 2 is just a continuation of the DiRT Showdown results). Looking at things from a purely technological-progress perspective, I can't help but think it's still being compared to what amounts to cut-down nV technology that has already been on the market in one form or another for a good year now. 

I appreciate the excitement of course, and no doubt it will thankfully bring prices into more reasonable realms for all us mere mortals.


----------



## FR@NK (Oct 2, 2013)

I'm hoping for the $499-$549 range. I might do $599 for an ASUS custom PCB.


----------



## sweet (Oct 2, 2013)

FR@NK said:


> I'm hoping for the $499-$549 range. I might do $599 for an ASUS custom PCB.



FYI, there will be no custom R9 290X, just like Titan. Maybe the only custom cards are those 8,000 R9 290X Battlefield 4 Edition cards.
We will have to wait for the R9 290, a 780 counterpart, for custom PCBs.


----------



## Serpent of Darkness (Oct 2, 2013)

*Epic Fails @ Math...*

@ the Tomb Raider Benchmark Graph:

-FXAA-
GTX Titan, Min = 35.0 fps; Max = 40.9 fps; avg stated = 29.1 fps; actual avg =  37.95 fps.
error off = 24.62%.

This implies that the GTX Titan's avg is smaller than it's minimum...


RX9-990, Min = 32.0 fps; Max = 48.6 fps; avg stated = 38.6 fps; actual avg = 40.30 fps.
error off = 4.22%.

Looking at the huge error of the GTX Titan's performance, this graph comes into question, and also, the author's ability comes into question.

-MSAA 4x-
GTX Titan, Min = 15.1 fps; Max = 19.4 fps; avg stated = 15.1 fps; actual avg = 17.25 fps.
error off = 12.46%

Here, the author states that the minimum fps is equal to the average... again, another error...

R9 290X, Min = 14.7 fps; Max = 22.0 fps; avg stated = 18.2 fps; actual avg = 18.35 fps.
error off = 0.82%


----------



## RCoon (Oct 2, 2013)

sweet said:


> FYI, there will be no custom cards for the R9 290X, just like the Titan. Maybe the only custom cards are those 8,000 R9 290X Battlefield 4 Edition units.
> We will have to wait for the R9 290, the 780's counterpart, for custom PCBs.



Like Titan (and arguably the 780) it will be made for water!


----------



## haswrong (Oct 2, 2013)

sweet said:


> FYI, there will be no custom cards for the R9 290X, just like the Titan. Maybe the only custom cards are those 8,000 R9 290X Battlefield 4 Edition units.
> We will have to wait for the R9 290, the 780's counterpart, for custom PCBs.



AMD gone green -> AMviDia? They copy just about everything, except the performance..
I expected much, much more consistent and higher performance results.. this card changes nothing, and will be easily dominated by the EVGA GTX 780 Classified for just a slightly higher price.

Basically, it's no wonder AMD was hiding these cards as long as humanly possible.. if they had a game changer, it'd have been available in the summer already. Yup, it's the bitter reality..


----------



## uuuaaaaaa (Oct 2, 2013)

Serpent of Darkness said:


> @ the Tomb Raider Benchmark Graph:
> 
> -FXAA-
> GTX Titan, Min = 35.0 fps; Max = 40.9 fps; avg stated = 29.1 fps; actual avg =  37.95 fps.
> ...



The first one that you pointed out is correct: the avg cannot be below the min fps.

The other ones are perfectly possible. The average is not (Max_fps + Min_fps)/2... these are weighted averages. Imagine a situation where your game runs at 55 fps 99% of the time, and 1% where it runs at 300 fps. Is the average fps the midpoint, 177.5, or will it be closer to 55?
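A tiny numeric sketch of that hypothetical scenario (all numbers made up purely for illustration) shows why the time-weighted average hugs the common frame rate rather than the midpoint:

```python
# Hypothetical scenario: a game spends 99% of its play time at 55 fps
# and the remaining 1% spiking to 300 fps.
t_slow, t_fast = 0.99, 0.01      # fractions of total play time
fps_slow, fps_fast = 55.0, 300.0

# Naive midpoint of min and max
midpoint = (fps_slow + fps_fast) / 2               # 177.5

# Time-weighted average: frames rendered per unit of total time
weighted = t_slow * fps_slow + t_fast * fps_fast   # 57.45

print(midpoint, weighted)
```

The weighted figure lands just above 55, nowhere near 177.5, which is why a benchmark's stated average can legitimately sit far from (min+max)/2.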


----------



## Xzibit (Oct 2, 2013)

I don't think it's a mystery to anyone who's owned the Tomb Raider reboot that the benchmark has been screwy since launch.


----------



## H82LUZ73 (Oct 2, 2013)

Slacker said:


> I can't wait for this card to have its NDA officially lifted and get benched. It looks really promising, and maybe a great upgrade from my 6970



That makes two of us. I will just go with one card this time; CrossFire is a pain to maintain sometimes. I am also interested in the Blu-ray/audio playback of this 290X card. Say, Wizz, that is a hint


----------



## Recus (Oct 2, 2013)

sweet said:


> You are a typical victim of the false assumption created by NVIDIA's dynamic boost. Cards with this technology always boost themselves to the highest stable clock, assuming the power target (6xx) or temperature target (Titan, 7xx) is satisfied. In the case of the Titan, at stock settings it runs games at more than 1 GHz. The base clock is mostly just for show.



I know how it works. I'm talking about this


----------



## Patriot (Oct 2, 2013)

uuuaaaaaa said:


> The first one that you pointed out is correct: the avg cannot be below the min fps.
> 
> The other ones are perfectly possible. The average is not (Max_fps + Min_fps)/2... these are weighted averages. Imagine a situation where your game runs at 55 fps 99% of the time, and 1% where it runs at 300 fps. Is the average fps the midpoint, 177.5, or will it be closer to 55?



His post title was aptly named "Fails @ Math"...

Yes, I think many of us have noted your min can't be above your average.
Looks like the two rows got swapped, tbh...


----------



## sweet (Oct 2, 2013)

Recus said:


> I know how it works. I'm talking about this http://s15.postimg.org/lmwsuf5rf/Capture.jpg



For Kepler cards like the Titan, even though it says the boost clock is 876 MHz, it still boosts to ~1000 MHz when running at stock settings.
Try googling the term "Kepler boost" and you will know what I meant by "cheating".


----------



## HumanSmoke (Oct 2, 2013)

sweet said:


> For Kepler cards like the Titan, even though it says the boost clock is 876 MHz, it still boosts to ~1000 MHz when running at stock settings.
> Try googling the term "Kepler boost" and you will know what I meant by "cheating".


If reasonably intelligent dynamic core tuning is your idea of cheating, then pretty much everyone is doing it. I'll take a wild guess and say you don't have an issue with Intel's or AMD's implementations (both CPU and GPU) of the same technology, considering both of those companies predate Nvidia's use of it.

Nice bait though.


----------



## Ahhzz (Oct 2, 2013)

I was pretty sure both camps were guilty of tailoring their cards to their "preferred brand" of benchmark... kinda sounds like Intel, way back.


----------



## HumanSmoke (Oct 2, 2013)

Ahhzz said:


> I was pretty sure both camps were guilty of tailoring their cards to their "preferred brand" of benchmark... kinda sounds like Intel, way back.


Pretty much. Nvidia had the whole 3DMark2003 issue, and ATI pretty much invented benchmark tuning for GPUs when it released a top-end card (Rage Pro Turbo) whose only difference from the card it replaced was a driver fine-tuned for synthetic benchmarks.
The graphics landscape is littered with "optimizations" from just about everyone who has ever released hardware.


----------



## buildzoid (Oct 3, 2013)

HumanSmoke said:


> If reasonably intelligent dynamic core tuning is your idea of cheating, then pretty much everyone is doing it. I'll take a wild guess and say you don't have an issue with Intel's or AMD's implementations (both CPU and GPU) of the same technology, considering both of those companies predate Nvidia's use of it.
> 
> Nice bait though.



AMD doesn't have dynamic OC on their GPUs, and Intel and AMD CPUs only turbo to the specified turbo frequency: if the box says the max turbo is 4.2 GHz, the CPU will run up to 4.2 GHz and never above it. If Nvidia specified the boost speed as 1 GHz it wouldn't be cheating, but when you say 867 MHz and run at 1 GHz, it's cheating.


----------



## HumanSmoke (Oct 3, 2013)

buildzoid said:


> AMD doesn't have dynamic OC on their GPUs, and Intel and AMD CPUs only turbo to the specified turbo frequency: if the box says the max turbo is 4.2 GHz, the CPU will run up to 4.2 GHz and never above it. If Nvidia specified the boost speed as 1 GHz it wouldn't be cheating, but when you say 867 MHz and run at 1 GHz, it's cheating.


It's actually 876 MHz... and that figure is the "minimum guaranteed" boost (the maximum guaranteed sustainable clock), not the maximum achievable, which is fairly common knowledge to be 992 MHz at default voltage (9 speed bins of 13 MHz / 0.013 V steppings). It's not so much cheating as being ignorant of how the boost algorithm actually works.

As an amusing aside, why do a number of people howl about the max sustainable boost over the minimum guaranteed boost, yet always quote the lower specification numbers associated with the base frequency? For example, every man and his AMD-loving dog seems to put Titan's FP32 throughput at 4500 GFLOPS... yet if you were taking the boost into consideration, i.e. just as in gaming benchmarks, the number is actually 5333 GFLOPS. Weird, huh?
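Those throughput figures check out on the back of an envelope (a sketch assuming Titan's 2688 CUDA cores and 2 FP32 FLOPs per core per clock, with the base and boost clocks discussed above):

```python
# Back-of-the-envelope FP32 throughput for GTX Titan.
# Assumes 2688 CUDA cores and 2 FLOPs (one FMA) per core per clock.
cores = 2688
base_mhz = 837    # reference base clock
boost_mhz = 992   # max default-voltage boost discussed above

# cores * 2 FLOPs/clock * MHz gives MFLOPS; divide by 1000 for GFLOPS
gflops_base = cores * 2 * base_mhz / 1000     # ~4500 GFLOPS
gflops_boost = cores * 2 * boost_mhz / 1000   # ~5333 GFLOPS

print(round(gflops_base), round(gflops_boost))
```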

Anyhow, we seem to be getting off the subject.


----------



## SIGSEGV (Oct 3, 2013)

According to wccftech:


----------



## sweet (Oct 3, 2013)

HumanSmoke said:


> It's actually 876 MHz... and that figure is the "minimum guaranteed" boost (the maximum guaranteed sustainable clock), not the maximum achievable, which is fairly common knowledge to be 992 MHz at default voltage (9 speed bins of 13 MHz / 0.013 V steppings). It's not so much cheating as being ignorant of how the boost algorithm actually works.
> 
> As an amusing aside, why do a number of people howl about the max sustainable boost over the minimum guaranteed boost, yet always quote the lower specification numbers associated with the base frequency? For example, every man and his AMD-loving dog seems to put Titan's FP32 throughput at 4500 GFLOPS... yet if you were taking the boost into consideration, i.e. just as in gaming benchmarks, the number is actually 5333 GFLOPS. Weird, huh?
> 
> Anyhow, we seem to be getting off the subject.


I understand your point of view, but you seem to be new to the term "Kepler boost". The difference between 876 MHz and 993 MHz (876 + 9 × 13 MHz boost steps) is significant. And in any case, running the card at a higher clock than specified in benchmarks is an act of cheating. Dynamic boost is a nice utility for users, though.

On topic, the final specs:


> http://www.webhallen.com/se-sv/hard...bundle_limited_edition__battlefield_4_premium
> 
> GPU Codename – Hawaii
> GPU Process – 28 nm
> ...



Personally, I'm disappointed with the VRM settings. The core clock will be hard to deal with; the memory, however, could clock much higher if Samsung/Hynix chips were used.


----------



## Blín D'ñero (Oct 3, 2013)

sweet said:


> [...]
> 
> On the topic, the final specs
> 
> ...


What "final". That page states:


> Vi kommer endast att få in ett fåtal av dessa så passa på att boka nu! Observera att priset på produkten och nedan specifikationer är preliminära och kan komma att ändras (framförallt klockfrekvenserna är inte 100 % bekräftade ännu).
> Bokningar är inte bindande och om vi justerar ner priset får ni givetvis tillbaka mellanskillnaden, precis som vanligt på Webhallen med andra ord.


In Google English:





> We will only get a few of these so make sure you book now! *Please note that* the price of the product and *the below specifications are provisional and subject to change (especially the frequencies are not 100% confirmed yet).*
> Reservations are not binding, and if we adjust the price down you will of course get the difference back, just as usual at Webhallen, in other words.



7299 kr (Swedish kronor) is €845. If that's true, then...

Hmm, well, they sell the Titan for 8490 kr (= €983), which is normal.

[EDIT:
Naaa... the price is already known to be $599, and their description translates:





> Price, images, product description and release date are preliminary and subject to change.
> 
> [...]


/EDIT]


----------



## sweet (Oct 3, 2013)

Blín D'ñero said:


> 7299 kr (Swedish kronor) is €845. If that's true, then...
> 
> Hmm, well, they sell the Titan for 8490 kr (= €983), which is normal.



There are only 8,000 of this version. It is bundled with BF4 Premium (~$140 on Origin), and carries a premium price for the first owners.


----------



## Blín D'ñero (Oct 3, 2013)

The description says


> Pris, bild, produktbeskrivning och releasedatum är preliminära och kan komma att ändras.
> ASUS Radeon R9-290X 4GB med Battlefield 4 på köpet i extremt begränsad utgåva!
> 
> Exklusivt för Webhallen får du dessutom battlefield 4 Premium på köpet. Premium innehåller bland annat:
> ...



in google english:


> Price, images, product description and release date are preliminary and subject to change.
> ASUS Radeon R9-290X 4GB with Battlefield 4 included, in an extremely limited edition!
> 
> Exclusive to Webhallen, you also get Battlefield 4 Premium *for free*. Premium includes:
> ...



*For free.*
That makes the card € 845.


----------



## erocker (Oct 7, 2013)

Thread cleaned up. Any trolling, flaming or posting in an uncivilized manner may result in the loss of posting privileges. 

Please behave appropriately.

Thank you.


----------

