# Final Radeon R9 290 Series Specifications Leaked



## btarunr (Oct 7, 2013)

Disappointed at the $729.99 Newegg.com pricing of the Radeon R9 290X? No worries. AMD's second SKU based on the "Hawaii" silicon could be lighter on the wallet. Japanese retailers leaked the specification sheets of both the upcoming R9 290X and its lighter sibling, the R9 290 (non-X). Specifications of the R9 290X match rumors. The chip features 2,816 stream processors, a GPU clock of up to 1000 MHz, single-precision floating-point performance of 5.16 TFLOP/s, and 4 GB of GDDR5 memory across a 512-bit wide memory interface, clocked at 5.00 GHz, yielding 320 GB/s of memory bandwidth. The R9 290, on the other hand, features 2,560 stream processors, GPU clocks of up to 948 MHz, 4.9 TFLOP/s single-precision floating-point performance, and the same memory subsystem as the R9 290X. Both cards feature an identical combination of power connectors, one 8-pin and one 6-pin PCIe, and both feature hardware support for DirectX 11.2, OpenGL 4.3, and Mantle.
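The leaked numbers can be sanity-checked with quick arithmetic. A minimal sketch, assuming GCN's usual one fused multiply-add (2 FLOPs) per stream processor per clock - a standard property of the architecture, not something stated in the leak:

```python
# Back-of-the-envelope check of the leaked R9 290 series figures.

def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Bandwidth = bus width in bytes x effective data rate (GT/s)."""
    return bus_width_bits / 8 * data_rate_gtps

def fp32_tflops(stream_processors: int, clock_mhz: float) -> float:
    """GCN issues one FMA (2 FLOPs) per stream processor per clock."""
    return stream_processors * 2 * clock_mhz * 1e6 / 1e12

print(mem_bandwidth_gbs(512, 5.0))       # 320.0 GB/s - matches the leak
print(round(fp32_tflops(2560, 948), 2))  # 4.85 TFLOP/s - the R9 290's quoted 4.9
print(round(fp32_tflops(2816, 1000), 2)) # 5.63 TFLOP/s at the full 1000 MHz
```

Interestingly, the 290X's quoted 5.16 TFLOP/s works out to roughly 916 MHz rather than the full 1000 MHz, so the sheet may be quoting throughput at a lower base clock.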





*View at TechPowerUp Main Site*


----------



## Bjorn_Of_Iceland (Oct 7, 2013)

Looks like this would smoke the Titan in price and performance.


----------



## MikeGR7 (Oct 7, 2013)

I sure hope we'll see the glory days of 6950 again and 290 unlocks to 290X!


----------



## jigar2speed (Oct 7, 2013)

^ +1. Also, AMD seems to be holding these cards back on specs - the GTX 770 has 7 GHz RAM; the same applied to the R9 would boost its memory bandwidth heavily.


----------



## dom99 (Oct 7, 2013)

If the R9 290 overclocks to the same performance as the 290X, at a significantly lower price, that's my next card!

I would also love to see those shaders unlocked but time will tell.


----------



## Andrei23 (Oct 7, 2013)

'Looks like this would smoke the Titan in price and performance.'

Let's hope it does; the GPU market could sure use some competition. I would really like to get a second GTX 780, but I'll be damned if I'm going to pay another £540 to get one.


----------



## MikeGR7 (Oct 7, 2013)

But let's see the 290X's performance first!
If the leaked price of $700 is correct, then it had better beat the crap out of the 780!


----------



## SIGSEGV (Oct 7, 2013)

There is a huge gap between the R9 290 and R9 280X in stream processor count.


----------



## HumanSmoke (Oct 7, 2013)

SIGSEGV said:


> there is a huge gap between R9-290 and R9-280X on its stream processors count


25%. Still closer than the competition's 50% (GTX 780 vs GTX 770).


----------



## the54thvoid (Oct 7, 2013)

Bjorn_Of_Iceland said:


> Looks like this would smoke the Titan in price and performance.



In price, yes; performance? Unknown. We need the reviews out for that assumption. I would like to see it beat Titan, but I am concerned about the use of boost clocks (or Turbo). The Kepler series has a massive flaw, and that is the power limits. Boost 2.0 is a TDP throttler, and for enthusiasts that is a bad thing.

If they release a flagship card that follows in the footsteps of Nvidia it will be really sad.  With GK110 Nvidia stopped enthusiasts from playing too much with their cards (read the reviews, Boost 2.0 is not user friendly).  

What we need is a kick ass AMD product which R9 290X will undoubtedly be, that doesn't have over protective software or driver limits.  

As for smoking Titan, all we have so far is fog.  Roll on the 15th.



HumanSmoke said:


> 25%. Still closer than the competitions 50% ( GTX 780 and GTX 770)



Yeah, but that's GK110 to GK104. At least the 290 and 290X are the same chip.


----------



## buggalugs (Oct 7, 2013)

Hurry up and release, I want one... the 290X, that is. The price isn't cheap, but it's not outrageous either.


Going by the price, it will be better than the 780, equivalent to Titan at stock, and better than Titan with overclocking.


----------



## the54thvoid (Oct 7, 2013)

buggalugs said:


> *Going by the price* it will be better than the 780, equivalent to titan at stock and better than titan with overclocking.



You mean, of course, better value? At the moment a GTX 780 is better value (in price/performance) than a GTX Titan*. Fingers crossed AMD wants to rock the boat with pricing, in a good way.

*In fact, almost anything is 

But you don't buy really expensive things like that based on whether they're good value or not - you buy them because you want them.


----------



## SIGSEGV (Oct 7, 2013)


Credit: videocardz.com

October 5th?


----------



## HumanSmoke (Oct 7, 2013)

the54thvoid said:


> Yeah but that's GK110 to GK104.  At least the 290 and 290X is the same chip.


I believe SIGSEGV's comparison was between the 280X (Tahiti XT) and the 290 (Hawaii Pro).


----------



## the54thvoid (Oct 7, 2013)

HumanSmoke said:


> I believe the SIGSEGV's comparison was between the 280X (Tahiti XT) and 290 (Hawaii Pro).



Ah, cool.


----------



## haswrong (Oct 7, 2013)

How about OpenGL 4.4? One would expect some native support here..


----------



## bogami (Oct 7, 2013)

512-bit and 4 GB of RAM! I'm a little concerned whether that's enough! The last time ATI went 512-bit, with the Radeon HD 2900 XT, it didn't deliver the expected performance and the GeForce 8800 GTX won the day!
I seriously hope the R9 290X will be better and cheaper than the Titan and GTX 780.
Given that NVIDIA seriously overprices even its best products, only a competitive manufacturer can bring prices down!
We don't just want €500 price tags on the market, we also want much more capable GPUs!
The first visible test results show a better product than the competition, though the price is a little too high. Can't wait for the first real tests; the 15th is coming, and if all goes well it will be time to buy an AMD GPU!


----------



## 1d10t (Oct 7, 2013)

Turns out the earlier speculation was right... I hope the launch price will be lower 



MikeGR7 said:


> I sure hope we'll see the glory days of 6950 again and 290 unlocks to 290X!



The 290 and 290X share the same PCB, layout, and power requirements, so the probability could be higher than it was for the 7950 to 7970 



SIGSEGV said:


> there is a huge gap between R9-290 and R9-280X on its stream processors count



two R9 270X ? 



haswrong said:


> how about opengl 4.4? one would expect some native support here..



for greater purpose like creating your own Steam Box?


----------



## haswrong (Oct 7, 2013)

1d10t said:


> for greater purpose like creating your own Steam Box?



a megatextured one


----------



## Xaser04 (Oct 7, 2013)

The standard 290 looks quite interesting, especially if the small ~5% clock-for-clock performance delta you get between the 7970 and 7950 holds true for the 290X and 290 (both pairs have a 256 SP deficit).

Hopefully the 290 (non-X) won't be gimped to prevent it from matching the "full X" model.


----------



## HopelesslyFaithful (Oct 7, 2013)

I find the 280X more interesting. You get way more performance with two cards at 600 bucks compared to a single 290X :/


----------



## Ghiltanas (Oct 7, 2013)

Go AMD 

No ROP spec there, but I think it will be a good number.


----------



## jigar2speed (Oct 7, 2013)

HopelesslyFaithful said:


> I find the 280x more interesting. You get way more performance with 2 cards at 600 bucks compared to the 290x single :/



But but but Crossfire is broken- AMD driver Sucks  /flame suits on.


----------



## TheinsanegamerN (Oct 7, 2013)

HopelesslyFaithful said:


> I find the 280x more interesting. You get way more performance with 2 cards at 600 bucks compared to the 290x single :/



Yeah, but you have to deal with Crossfire instability, and you immediately lose that performance edge when the drivers don't want to work right, or when the game does not work properly with Crossfire (the Batman games come to mind, and Rage is the perfect example of Crossfire driver/game incompatibility). None of that is an issue with single-GPU setups, which is why these things sell so well.


----------



## a_ump (Oct 7, 2013)

HumanSmoke said:


> 25%. Still closer than the competitions 50% ( GTX 780 and GTX 770)



Eh, 2560 to 2816 is only a ~9.1% difference, not 25% - just thought I'd clarify. I suppose if you take the lower clocks into account I could see it, maybe, but they have the same memory and a 50 MHz downclock isn't really that much IMO. Plus, you know vendors are going to release custom PCBs and overclocked-out-of-the-box cards. We'll just have to wait, but I think performance between those two is going to be very close.

EDIT: Also, how does the dual-GPU card fit into AMD's new naming scheme? R9 295X? 290XT(X)?
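For what it's worth, both percentages being thrown around in this thread are correct - they are just different comparisons, in different directions, over the leaked shader counts:

```python
# Shader-count gaps from the leaked specs; the percentage depends on
# which pair you compare and which card you use as the baseline.
r9_290x, r9_290, r9_280x = 2816, 2560, 2048

print(round((r9_290x - r9_290) / r9_290x * 100, 1))  # 9.1  - the 290's deficit vs the 290X
print(round((r9_290 - r9_280x) / r9_280x * 100, 1))  # 25.0 - the 290's lead over the 280X
```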


----------



## Xaser04 (Oct 7, 2013)

a_ump said:


> eh, 2560 to 2816 that's only a ~9.1% difference, not 25%. just thought i'd clarify. I suppose if you take into account lower clocks i could see that tho. maybe, but they have the same memory and 50mhz down clock isn't really that much imo, plus you know vendors are going to release custom PCB and OC'd out of the box. We'll just have to wait, but i think performance between those two is going to be very close.
> 
> EDIT: also how does the dual-gpu card fit into AMD's new naming-scheme? R9 295X? 290XT(X)?



I think he means the difference between the 290 (non-X), at 2560 SP, and the R9 280X, at 2048 SP.


----------



## TheinsanegamerN (Oct 7, 2013)

a_ump said:


> eh, 2560 to 2816 that's only a ~9.1% difference, not 25%. just thought i'd clarify. I suppose if you take into account lower clocks i could see that tho. maybe, but they have the same memory and 50mhz down clock isn't really that much imo, plus you know vendors are going to release custom PCB and OC'd out of the box. We'll just have to wait, but i think performance between those two is going to be very close.
> 
> EDIT: also how does the dual-gpu card fit into AMD's new naming-scheme? R9 295X? 290XT(X)?



R299XTreme Space Heater Edition?


----------



## Yeoman (Oct 7, 2013)

a_ump said:


> eh, 2560 to 2816 that's only a ~9.1% difference, not 25%. just thought i'd clarify. I suppose if you take into account lower clocks i could see that tho. maybe, but they have the same memory and 50mhz down clock isn't really that much imo, plus you know vendors are going to release custom PCB and OC'd out of the box. We'll just have to wait, but i think performance between those two is going to be very close.
> 
> EDIT: also how does the dual-gpu card fit into AMD's new naming-scheme? R9 295X? 290XT(X)?



An R9-290X2 might be a basic solution; after all, the X is already present. But it will be interesting to see what they go with. I'm a little surprised they didn't leave the "x90" spot for a dual-GPU option.


----------



## jihadjoe (Oct 7, 2013)

So 290X vs Titan and 290 vs GTX780?


----------



## hardcore_gamer (Oct 7, 2013)

Yeoman said:


> R9-290x2 might be a basic solution, afterall the x is already present. But it will be interesting to see what they go with. I'm a little surprised they didn't leave the "x90" spot for a dual gpu option.



Maybe they haven't planned a dual-GPU card because the power consumption of a single card is already too high.


----------



## Yeoman (Oct 7, 2013)

hardcore_gamer said:


> May be they haven't planned a dual GPU card because the power consumption of a single card is already too high.



Could be. Though I suppose only time will tell. 

A 7990 is already a pretty power-hungry beast. Judging from the relatively low clock speeds of the 290/290X (about 800 MHz sans any turbo function, I believe), I don't know if some kind of twin 290/290X is out of the question entirely. 

I suppose '90' was never a clear indicator that a card was dual-GPU or not; after all, there is a 7790 (obviously not dual 7750s or whatever). So I think there are a number of ways open to them to mark something as a dual-GPU flagship card... I guess it's not something they will sweat right now. It will be a while before we see one, I suppose, although hopefully not as long as it took for a reference 7990 to materialise.


----------



## HisDivineOrder (Oct 7, 2013)

Looks like we may FINALLY get a GK110 part that's less than $500 and it will be because AMD FINALLY decided to actually compete.  This is why them sitting the GPU race out for two years was such a horrible thing for all of us.

This is why relying on bundles for two years is a horrible move. This is why them starting the current generation (7970) out with a $50 increase for a marginal 10-20% gain in performance was such a horrible thing for us all. That allowed nVidia to rely on a mid-range part at high-end pricing that was only barely less expensive.

Now AMD shows up two years later after six months of nVidia owning the high end with their true high end part.  And they're raising the price... again on the high end.

At least they're also releasing a lower part at a reasonable price, which will force nVidia to do the same.  But man it sure took them forever to do it.


----------



## Brusfantomet (Oct 7, 2013)

bogami said:


> 512 bit and 4 gb ram ! I m litle concerned if it is enough ! The lest time ATI made R-2900x on 512 bit -512 pipe it was not on expected preformance and GTX8800 win the day !
> I seriously hope R9 290X will bee better and cheaper than TITAN and GTX 780.
> Given that NVIDIA seriously overestimates the price only the best products and competitive manufacturer can splinter prices !
> We do not want just € 500  price on the market  also much more capable GPU' s !
> First visible test results show a better product than the competition and price is litle too high Can not wait for the first real test, 15's is coming and if all go well ,it will be by the time too buy AMD GPU !



It was a 512-bit internal bus; the memory interface to the RAM was 256-bit. The 8800 GTX had 384-bit.

Might be time to retire my two 6950 cards.


----------



## NeoXF (Oct 7, 2013)

Brusfantomet said:


> it was 512 bit internal bus, the memory interface to the ram was 256 bit, GTX 8800 had 384 bit.
> 
> Might be time to retire my to 6950 card.



The GTX 280/285 was 512-bit...


----------



## Xzibit (Oct 7, 2013)

jigar2speed said:


> But but but Crossfire is broken- AMD driver Sucks  /flame suits on.



Unless AMD is going the Nvidia route and only supporting DirectX .? in software calls (only the 780/Titan support DX 11.1 natively), there is a *small possibility* they improved the new line-up. Don't hold your breath, though.

Sapphire's specs have all chips supporting DirectX 11.2, PowerTune, and ZeroCore from the R7 240 on up. So we're looking at new/improved chips across the line. 

If not, Raj mentioned a phase 2 driver to be released in autumn.


----------



## TheinsanegamerN (Oct 7, 2013)

These cards look wonderful. Now, whether to get an R9 290/290X, or spend only $300 and grab a 7970 while I can... the 550 Ti is getting a little slow for 1080p gaming.


----------



## corsaro (Oct 7, 2013)

TheinsanegamerN said:


> These cards look wonderful, now, whether to get a r290/290x, or spend only $300 and grap a 7970 while i can....the 550ti is getting a little slow for 1080p gaming.



That's like "I think my 7790 will have a problem with 4K games!"


----------



## Hilux SSRG (Oct 7, 2013)

Still trying to wrap my head around AMD's new gfx naming scheme.  Does anyone know what the die size will be on this monster? And if it's 28nm?


----------



## W1zzard (Oct 7, 2013)

Hilux SSRG said:


> Still trying to wrap my head around AMD's new gfx naming scheme.  Does anyone know what the die size will be on this monster? And if it's 28nm?



It's 28 nm, because there is no better process available at this time at TSMC, where all NV and AMD GPUs are produced.


----------



## Intel God (Oct 7, 2013)

Has everyone seen this terrible 290X Firestrike score?







Compared to stock-clocked 780s.


----------



## v12dock (Oct 7, 2013)

Hmm, the FP32 performance is ~1.6 TFLOP/s higher than the 2,880-shader Kepler chip's. The R9 290X should be a compute beast.


----------



## jihadjoe (Oct 7, 2013)

jigar2speed said:


> ^ +1, Also AMD seems to be holding this cards on specs - GTX 770 has 7GHZ of RAM, same applied on R9 will boost the memory bandwidth heavily.



I imagine 7 GHz RAM is now in short supply. If you check out the recent batch of reviews, some of the new cards barely make it past 6 GHz.


----------



## Ahhzz (Oct 7, 2013)

HopelesslyFaithful said:


> I find the 280x more interesting. You get way more performance with 2 cards at 600 bucks compared to the 290x single :/



I love your logic, and almost agree. But I can't. 
Too many "high-end" games don't do well with Crossfire setups, and, IIRC, the (double) RAM is not used in a CrossFire setup. Plus, the 280X is just a rebranded 7970 with a little boost, which doesn't put me in a comforted mood... I think I'll just hope for a good price point in a year or so on the 290.




Brusfantomet said:


> it was 512 bit internal bus, the memory interface to the ram was 256 bit, GTX 8800 had 384 bit.
> 
> Might be time to retire my to 6950 card.



I'll be making that statement this time next year   Little early to adopt it, plus I'm not budgeted for it yet, but next year, I'll be scrounging some pennies!


----------



## HumanSmoke (Oct 7, 2013)

v12dock said:


> Hmm the FP32 performance ~1.6TF higher than the 2880 kepler chip. The R9 290X should be a computing beast


Will depend on driver support and application. Case in point: the FirePro W8000 should annihilate the Quadro K5000 (3,225 GFLOPS vs 2,169, a 48% advantage), but due to the vagaries of workload and priorities in driver writing...





Hardware is only one part of the equation... as usual.


----------



## Blín D'ñero (Oct 7, 2013)

Ahhzz said:


> I love your logic, and almost agree. But I can't.
> Too many "high-end" games don't do well with the Crossfire setups, and, iirc, the (double) RAM is not used in a XFire setup. Plus, the 280x is just a [...]



List those "too many"? You can't, because the reality is that Crossfire runs great on most games and scales very well. Which "(double) RAM" is not used?? You don't know what you're talking about. Educate yourself about AFR.


----------



## HopelesslyFaithful (Oct 7, 2013)

Ahhzz said:


> I love your logic, and almost agree. But I can't.
> Too many "high-end" games don't do well with the Crossfire setups, and, iirc, the (double) RAM is not used in a XFire setup. Plus, the 280x is just a rebranded 7970 with a little boost, which doesn't put me in a comforted mood... I think I'll just hope for a good price point in a year or so on the 290.
> 
> 
> ...



I thought I included saying the 7970/280X would be more interesting. Granted, I am a big F@H fan, so that's why I would go with the CrossFire setup personally. But I get the interest in the 290/X.


----------



## TheinsanegamerN (Oct 7, 2013)

Blín D'ñero said:


> List those "too many"? You can't, because reality is that crossfire runs great on most games and scales very well. Which "(double) RAM" is not used?? You don't know what you're talking about. Educate yourself about AFR.



He is referring to the fact that some people believe you get double the framebuffer with two cards - i.e., two 3 GB 7970s would give you 6 GB of framebuffer, where in fact you only have 3 GB.


----------



## TheinsanegamerN (Oct 7, 2013)

corsaro said:


> that's like '' i think my 7790 will have a problem with  4k games!''



Sorry, meant 550 Tis (SLI). Forgot the s on the end.


----------



## TheoneandonlyMrK (Oct 8, 2013)

Well well, the day has come. Now come on, Wizzard, release a review - it's Oct 8, and some of us are still up 'n' all.


----------



## eidairaman1 (Oct 8, 2013)

MikeGR7 said:


> I sure hope we'll see the glory days of 6950 again and 290 unlocks to 290X!



You must have come along after the R300 series.


----------



## Ahhzz (Oct 8, 2013)

Blín D'ñero said:


> List those "too many"? You can't, because reality is that crossfire runs great on most games and scales very well. Which "(double) RAM" is not used?? You don't know what you're talking about. Educate yourself about AFR.



Let's see... Skyrim had issues with Crossfire, Fallout 3 and New Vegas too, I heard that Serious Sam had some problems (never played it myself), COD Black Ops 2, BC2, and I heard about BF3, didn't play it... Yeah, they come out with their CAPs for them sometime down the road, but there are still issues that you shouldn't have to expect to deal with. I'm ignoring the statements of "growing pains" and the like. 



Here. You may educate yourself. 
http://www.tomshardware.com/forum/245454-33-crossfire-faqs

"The data is mirrored by both cards so two 1GB cards will still result in 1GB of VRAM being available."

So, in games like Skyrim, where VRAM is at a premium, loading large-resolution texture packs can hit your VRAM limit very quickly. A dual 2 GB card setup doesn't provide you with 4 GB of VRAM; you're limited to 2 GB, since the data is mirrored on both cards.

Nice tone by the way.


----------



## Ahhzz (Oct 8, 2013)

HopelesslyFaithful said:


> i thought i included saying the 7970/280x would be more interesting. Granted i am a big F@H fan so thats why i would go with the xfire personally. But i get the interest in the 290/x



I can understand that. I guess the "Ooooh, shiny and new!!!!" gets to me on the 290 as opposed to the "Rebranded??" I did miss that the 280X is running cheaper than comparable 7970s, so I guess I may have to re-evaluate that one... it's probably going to end up more in my range than the 290X. And all my irritation at miserable CrossFire support out of the box aside, I'll be looking hard to try to grab a pair of whatever I get next year. *cheers* 

*edit*
Did a quick search and found this article; this is the TL;DR version:



Price when rated:	$299 (as opposed to the almost $400 tag on the comparable 7970)

Pros
Very fast performance with AAA games.
EyeFinity can now use HDMI and DVI, as well as DisplayPort

Cons
Not new architecture: This is a tweaked Radeon HD 7970 GHz Edition
Memory runs at reference-design spec

I especially like the HDMI/DVI EyeFinity change, as I just got my third monitor, and lacking DisplayPort, I'm doing the adapter dance, and it's erratic as crap...


----------



## wiak (Oct 8, 2013)

SIGSEGV said:


> there is a huge gap between R9-290 and R9-280X on its stream processors count




The 280X is basically an HD 7970 card, so...
The 290 is the new power card.
The 290X is so hardcore it's out of this world.


----------

