# Radeon R9 and Radeon R7 Graphics Cards Pictured Some More



## btarunr (Sep 26, 2013)

Here's a quick recap of AMD's updated product stack, spread between the R9 and R7 series. This article should help you make sense of the new nomenclature. AMD's lineup begins with the Radeon R7 250 and Radeon R7 260X. The two are based on the 28 nm "Curacao" silicon, a variation of the "Pitcairn" silicon the previous-generation Radeon HD 7870 was based on. The R7 250 is expected to be priced around US $89, with 1 GB of RAM and performance rated at over 2,000 points in the 3DMark Firestrike benchmark. The R7 260X features double the memory at 2 GB, higher clock speeds, possibly more number-crunching resources, a Firestrike score of over 3,700 points, and pricing around $139. This card should turn up the heat against the likes of the GeForce GTX 650 Ti Boost. 

Moving on, there's the $199 Radeon R9 270X. Based on a chip not unlike "Tahiti LE," it features 2 GB of memory and a 3DMark Firestrike score of over 5,500 points. Then there's the Radeon R9 280X. This card, priced attractively at $299, is practically a rebrand of the Radeon HD 7970 GHz Edition. It features 3 GB of RAM and scores over 6,800 points on 3DMark Firestrike. Then there are the R9 290 and R9 290X. AMD flew dozens of scribes thousands of miles over to Hawaii, and left them without an official announcement on the specifications of the two. From what AMD told us, the two feature 4 GB of memory, over 5,000 TFLOP/s compute power, and over 300 GB/s memory bandwidth. The cards we mentioned are pictured in that order below.






More pictures follow.



*Radeon R7 250*



*Radeon R7 260X*



*Radeon R9 270X*



*Radeon R9 280X*



*Radeon R9 290*



*Radeon R9 290X*







----------



## RCoon (Sep 26, 2013)

I thought the 290 was the 2500 SP variant and the 290X was the 2800 SP variant.


----------



## springs113 (Sep 26, 2013)

All the talk I hear about the Firestrike score is about the 290 version.  I do believe the X variant is well into the 8k mark.  Like I have stated on numerous threads, if it's neck and neck with my 780 I am getting it no matter what the missus says.


----------



## btarunr (Sep 26, 2013)

springs113 said:


> All the talk I hear about the Firestrike score is about the 290 version.  I do believe the X variant is well into the 8k mark.  Like I have stated on numerous threads, if it's neck and neck with my 780 I am getting it no matter what the missus says.



Unless AMD's PowerPoint skills suck, R9 290X Firestrike is under 8000.







Notice that the top block (which ends at 8000) fades out at the top. GTX TITAN's Firestrike score ranges between 8,200 and 8,700.


----------



## springs113 (Sep 26, 2013)

Theoretically speaking, the R9 290 and GTX 780 should trade blows, which means I'm going to take a chance and purchase it.  I will also be upgrading to a larger SSD, from a 128 GB 830 to a 256 GB 840 Pro.


----------



## 1d10t (Sep 26, 2013)

Dual SL-DVI? Does this mean DisplayPort is obsolete when running Eyefinity?


----------



## the54thvoid (Sep 26, 2013)

I for one was utterly unimpressed with last night's fanfare.  But I'm only a hardware whore, so it's my common leanings that made it thus.

But... to have an event, mention your top two cards on the super-dooper new architecture, and say nothing really about them is a bit weird.  If this were an Nvidia event you'd be saying, hmm... manufacturing issues, then.

But we know they'll be taking pre-orders as of the 4th of October, so the cards are very real (or soon to be).  

We need prices, performance reviews and, more importantly - will they stop using shitty whining chokes on their reference PCBs?  All 3 of my 7970s had coil whine.  I RMA'd my first because of it - the replacement was the same.  Another vendor had the same issue.

Btw - my Titan also has whine, but it is much less (still unforgivable for a prestige card).


----------



## RCoon (Sep 26, 2013)

the54thvoid said:


> I for one was utterly unimpressed with last night's fanfare.  But I'm only a hardware whore, so it's my common leanings that made it thus.
> 
> But... to have an event, mention your top two cards on the super-dooper new architecture, and say nothing really about them is a bit weird.  If this were an Nvidia event you'd be saying, hmm... manufacturing issues, then.
> 
> ...



I ordered takeaway, waited an additional hour, and watched a poor-quality live stream. What did I get? A bitter taste in my mouth, and the distinct impression that the 290X will merely be a contender to my existing 780s (neither of which has coil whine on reference models). Props to AMD for getting the hype, but after the first two presentations, I disconnected and went to bed, knowing I wouldn't be buying a new GPU for 2 years.


----------



## The Quim Reaper (Sep 26, 2013)

..so the R7 260X is basically a 7870 with a $50 price cut?

EDIT:

It's most definitely NOT a 7870-class card with that Firestrike score, more like a 7790 

WTF!!!


----------



## dj-electric (Sep 26, 2013)

The Quim Reaper said:


> ..so the R7 260X is basically a 7870 with a $50 price cut?
> 
> EDIT:
> 
> ...



At $139, it will probably be on par with the HD 7850, maybe a bit less.


----------



## sergionography (Sep 26, 2013)

RCoon said:


> I ordered takeaway, waited an additional hour, and watched a poor-quality live stream. What did I get? A bitter taste in my mouth, and the distinct impression that the 290X will merely be a contender to my existing 780s (neither of which has coil whine on reference models). Props to AMD for getting the hype, but after the first two presentations, I disconnected and went to bed, knowing I wouldn't be buying a new GPU for 2 years.



Obviously? What's the point of buying a $650 card if it's going to last you less than 2 years?


----------



## NeoXF (Sep 26, 2013)

The Quim Reaper said:


> ..so the R7 260X is basically a 7870 with a $50 price cut?
> 
> EDIT:
> 
> ...



The R7-260X is Bonaire XTX... a slightly faster HD 7790 with 2 GB of VRAM as standard (the default HD 7790 had 1 GB).

It is in no way, shape or form "basically a 7870". And about the price: the 7790 launched at $150 (1 GB version)... so I don't know what you're smoking; the price sounds just about right.

As for the HD 7870 replacement, that's the R7-270 or R7-270X (not sure which one); no clue about the codename (probably Curacao Pro), but it's probably a beefed-up Tahiti LE (HD 7870 XT), as it's reported to be ~10% slower than a GTX 760.

Anyway, what is it with all this outrage? Aside from the stream drops and a lack of stage presence from most of the people who hosted it, it's all great and/or interesting news.


----------



## The Quim Reaper (Sep 26, 2013)

NeoXF said:


> It is in no way, shape or form "basically a 7870". And about the price: the 7790 launched at $150 (1 GB version)... so I don't know what you're smoking; the price sounds just about right.



Sigh... I remember the days when a new generation of cards meant that the previous generation's mid-range performance would now be available at, and priced at, the entry level.

Now all we seem to get is rebrands with lower power usage.


----------



## brandonwh64 (Sep 26, 2013)

I bet after seeing these rebrands, people will be keeping their 7970s.

The only two of the bunch worth getting would be the 290/X.


----------



## NeoXF (Sep 26, 2013)

The Quim Reaper said:


> Sigh... I remember the days when a new generation of cards meant that the previous generation's mid-range performance would now be available at, and priced at, the entry level.
> 
> Now all we seem to get is rebrands with lower power usage.



Again... no clue what you're talking about. Radeon HD 7000s are selling for dirt cheap, even in my crummy-priced country. And the new cards are faster than their predecessors while launching at prices lower than either the current or at least the launch prices of the cards they replace. The R9-290X is a slightly different story, but top-tier cards have always been this way.
I mean, FFS, the R9-280X would be 299 bucks in the States... that's CLOSE TO 3x the performance per dollar you got in early 2012 from a launch-priced HD 7970.


----------



## EarthDog (Sep 26, 2013)

NeoXF said:


> Again... no clue what you're talking about. Radeon HD 7000s are selling for dirt cheap, even in my crummy-priced country. And the new cards are faster than their predecessors while launching at prices lower than either the current or at least the launch prices of the cards they replace. The R9-290X is a slightly different story, but top-tier cards have always been this way.
> I mean, FFS, the R9-280X would be 299 bucks in the States... that's CLOSE TO 3x the performance per dollar you got in early 2012 from a launch-priced HD 7970.


Sure, but... it's not the launch period, and the price for a 7970 now is $300. 

I see his point. It's clear. But I'm not sure we can know that information yet. We do not know where these will perform. The real problem is that the rebrands are clouding the picture.

Would I jump from a 7970 to an R9 280X... Nope, not unless I fold or mine, due to the lower power consumption. 

The rebrands, from BOTH companies, are getting annoying though...


----------



## PHaS3 (Sep 26, 2013)

Anyone else notice that the R9 290X has no CrossFire connectors on the PCB in any of the images AMD has supplied? 

Odd. I know it may just be an image that doesn't resemble the final product, but the other higher-end cards have them in their images... Interesting.


----------



## fullinfusion (Sep 26, 2013)

I'm getting real tired of all these new cards' pictures.
How about the spec sheet, for Pete's sake!

Now, on to the looks of the GPU's cover: I'd almost call her a double-bagger... looks like it's made from Legos. 
Give me the reference 7970's sleek look and a beautiful back plate like the 6 series had,  
and that would make me happy.

I noticed the following: 

Radeon R9 270X: no BIOS switch and only one gold-finger CrossFire link, 
and AMD had better mind their quality control - you have a loose screw, btw. 

Radeon R9 280X: BIOS switch and two gold-finger CrossFire links. 
R9 290: no BIOS switch and no gold fingers. 
R9 290X: no BIOS switch and no gold fingers.


----------



## TheoneandonlyMrK (Sep 26, 2013)

PHaS3 said:


> Anyone else notice that the R9 290X has no CrossFire connectors on the PCB in any of the images AMD has supplied?
> 
> Odd. I know it may just be an image that doesn't resemble the final product, but the other higher-end cards have them in their images... Interesting.



It's been noted in one of the myriad AMD new-GPU threads that the CrossFire bridge link bandwidth is not up to 4K+, so it CrossFires through the PCIe bus; that's due to the 2.5 GB/s bandwidth on that link not being enough for page transfers at UHD resolutions.


----------



## fullinfusion (Sep 26, 2013)

btarunr said:


> Unless AMD's PowerPoint skills suck, R9 290X Firestrike is under 8000.
> 
> http://www.techpowerup.com/img/13-09-26/264a.jpg
> 
> Notice the top block (which ends at 8000) is fading at the top. GTX TITAN's Firestrike score is ranging between 8,200 and 8,700.


So is the Firestrike run at normal or Extreme settings? Because at stock, one of my 7970s scores just over 7,100 points, and that's at all-stock CPU and GPU clocks.


----------



## EarthDog (Sep 26, 2013)

The reference Titan scored 9.2k with 320.49 drivers and a 4770K @ 4 GHz...

EDIT: In the review, it scored 8.7k... on a 4 GHz 3770K with older drivers.

EDIT2: Those scores are from normal settings.


----------



## fullinfusion (Sep 26, 2013)

EarthDog said:


> The reference Titan scored 9.2k in our review...


Obviously not an Extreme run, right?


----------



## EarthDog (Sep 26, 2013)

Obviously not. 

Edited my previous post...

Also, the reference 780 scored 8.4k with a 4 GHz 4770K on 320.18 drivers.

LOL, holy crap... the 780 Classified scored 9.4k with a 4 GHz 4770K and 320.49.

Makes me wonder if they used an AMD CPU-based system to test these out, as those scores seem low for a Titan competitor... and though Firestrike loves cores, a 4770K will easily compete there.


----------



## fullinfusion (Sep 26, 2013)

EarthDog said:


> Obviously not.
> 
> Edited my previous post...
> 
> ...


That's what I was thinking... So when are we getting a review?


----------



## EarthDog (Sep 26, 2013)

On the 290X? No clue yet from us... For some odd reason we were passed over for the trip to Hawaii... other than that, I could tell you, but would then have to off you... in many pieces. 

I am sure W1zz and the others will get them up in due time too.


----------



## springs113 (Sep 26, 2013)

According to my sources the NDA lifts sometime next week.


----------



## Hilux SSRG (Sep 26, 2013)

Thanks AMD, you pulled an Nvidia.  Everything below the flagship appears to be a rehash of a rehash.


----------



## HumanSmoke (Sep 26, 2013)

btarunr said:


> Unless AMD's PowerPoint skills suck, R9 290X Firestrike is under 8000.
> Notice the top block (which ends at 8000) is fading at the top. GTX TITAN's Firestrike score is ranging between 8,200 and 8,700.


A more relevant comparison may be between the two cards in the slide (same source, same µarch). The 280X is for all intents and purposes an HD 7970 GE (with a 20 MHz core bump and slightly faster memory, if the voices in the walls are to be believed).
If the 290X scores ~8000 and the 280X rebadge scores 6800 (according to the two AMD slides), wouldn't it follow that AMD's own calculation is that the 290X is 17-18% faster than a warmed-over 7970 GE?
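The 17-18% figure follows directly from the two slide numbers. A quick sanity check, treating AMD's ~8,000 and 6,800 Firestrike scores as given:

```python
# Back-of-the-envelope check of the claimed 17-18% gap, using the two
# Firestrike scores read off AMD's slides.
r9_290x_score = 8000  # upper bound implied by the faded bar
r9_280x_score = 6800  # the rebadged HD 7970 GHz Edition

gain_pct = (r9_290x_score / r9_280x_score - 1) * 100
print(f"R9 290X over R9 280X: {gain_pct:.1f}%")  # → 17.6%
```

Since 8,000 is only an upper bound, rounding down a touch gives the quoted 17-18% band.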



theoneandonlymrk said:


> It's been noted in one of the myriad AMD new-GPU threads that the CrossFire bridge link bandwidth is not up to 4K+, so it CrossFires through the PCIe bus; that's due to the 2.5 GB/s bandwidth on that link not being enough for page transfers at UHD resolutions.


The CrossFire bridge is good for 2.5 GT/s (gigatransfers per second), which is 250 MB/s, not 2.5 GB/s.
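For what it's worth, that GT/s-to-MB/s conversion matches a single PCIe 1.x-style lane. The 8b/10b line coding is my assumption (it isn't stated in the post), but it is what turns 2.5 GT/s into 250 MB/s of payload bandwidth:

```python
# Raw signalling rate of the CrossFire bridge link.
transfers_per_sec = 2.5e9  # 2.5 GT/s (gigatransfers per second)

# Assumption: 8b/10b line coding, i.e. 8 payload bits per 10 transferred
# bits, as used by PCIe 1.x links running at the same 2.5 GT/s rate.
payload_bits_per_transfer = 8 / 10

bytes_per_sec = transfers_per_sec * payload_bits_per_transfer / 8
print(f"{bytes_per_sec / 1e6:.0f} MB/s")  # → 250 MB/s
```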


----------



## Casecutter (Sep 26, 2013)

NeoXF said:


> R7-260X is Bonaire XTX... a slightly faster HD 7790 with 2GB of VRAM as standard (default HD 7790 had 1GB).
> 
> It is in no way or shape "basically a 7870". And about the price, 7790 launched at 150$ (1GB version)... so I don't know what you're smoking, the price sounds just about right.
> 
> ...


IDK... Even I have to say it all left me underwhelmed.  Is there any date for when the NDAs are lifted?  I seem to have all the same questions I had before. 

Like Bonaire XTX, aka R7 260X: it supports this new TrueAudio. Would this mean the original Bonaire always had it but it wasn't enabled? Or is this XTX a new spin of the original? If the projected Fire Strike numbers are factual, I'm not overwhelmed: 3500-3600 is what most claim the 7790 offers, and AMD's slide says 3700 - that's like 5%? What's so XTX about it? Then that price of $140, even giving it 2 GB, isn't great; I believe a GTX 650 Ti Boost scores around 3800.

The R9 270X at $200 has me feeling it's a 7870 without any extra performance, given the 5500 score seems blasé.  An R9 280X (7970 GHz) at $300 seems to come out alright - the supposed recent MSRP of a 7970 GHz was like $380, though it hasn't been hard to find one at $275 working rebates and codes.

They might all be okay once we see some real gaming benchmarks, but I want(ed) more than a bump in clocks.  I hope AMD did something more than rename Pitcairn as Curacao; they had to do some minor tweaks, or this is almost more underwhelming than what Nvidia got away with.  



HumanSmoke said:


> If the 290X scores ~8000 and the 280X rebadge scores 6800 (according to the two AMD slides), wouldn't it follow that AMD's own calculation is that the 290X is 17-18% faster than a warmed-over 7970 GE?


That's a very reasonable use of the data. Good call!  And a little unnerving...


----------



## HumanSmoke (Sep 26, 2013)

Casecutter said:


> Is there any date for when NDA are lifted?



3 October by all accounts. The same day pre-orders are available.


----------



## TheoneandonlyMrK (Sep 26, 2013)

HumanSmoke said:


> A more relevant comparison may be between the two cards in the slide (same source, same µarch). The 280X is for all intents and purposes an HD 7970 GE (with a 20 MHz core bump and slightly faster memory, if the voices in the walls are to be believed).
> If the 290X scores ~8000 and the 280X rebadge scores 6800 (according to the two AMD slides), wouldn't it follow that AMD's own calculation is that the 290X is 17-18% faster than a warmed-over 7970 GE?
> 
> 
> The CrossFire bridge is good for 2.5 GT/s (gigatransfers per second), which is 250 MB/s, not 2.5 GB/s.


Thanks for correcting my vague memory.

While it's nice to wax lyrical about benches, it's premature to state any numbers as fact, or one card as beating the other.
All will be revealed when the NDA clears and W1zzard's review arrives.


----------



## The Von Matrices (Sep 26, 2013)

I've been discussing the lineup with some other people on a few forums, and aggregating the data, the best determination we've come up with based on the available specifications (memory capacity and Firestrike scores) is:

R9 290X = Hawaii XT (Full Chip - 2816 SP, 4GB w/512-bit, ~$600)
R9 290 = Hawaii Pro (Binned Chip - 2304 or 2560 SP, 3GB w/384-bit, ~$400)
R9 280X = Tahiti XT2 (7970 Ghz - 2048 SP, 3GB w/384-bit, ~$300)
R9 270X = Tahiti LE (7870 XT - 1536 SP, 2GB w/256-bit, ~$200)
R7 260X = Bonaire XT (7790 - 896 SP, 2GB w/128-bit, ~$139)
R7 250 = Cape Verde XT (7770 - 640 SP, 1GB w/128-bit, <$89)

The R7 250 is the unique card here: while all the other rebrands have equal or higher clock speeds than their predecessors, the R7 250 is a 7770 with lower clock speeds, to fit into the 75 W PCIe power specification.  It would surpass the 7750 in performance due to the extra shaders, but it would not surpass a 7770, due to the lower clock speeds.

It would seem that Pitcairn is dead.  This also makes a lot of sense considering Tahiti is a relatively big chip: if the lineup only contained full Tahitis and full Pitcairns, you would have nowhere for partially defective dies to be used.  The reason the R7 250 is a Cape Verde chip and not a cut-down Bonaire is that it doesn't feature the audio processing hardware.


----------



## HumanSmoke (Sep 26, 2013)

theoneandonlymrk said:


> While it's nice to wax lyrical about benches, it's premature to state any numbers as fact, or one card as beating the other.
> All will be revealed when the NDA clears and W1zzard's review arrives.


Hence my use of the words "may" and "if". Nor did I state any of the numbers as "fact" - hence the qualifications I made; that is your (mis)interpretation of my post.

I'm pretty sure my analysis stands up to scrutiny a whole lot better than some of the emotive arguments flying around the forums.

If you aren't sold on AMD's own comparative benchmarks of their own products, what does that say about the level of confidence you (or anyone else) have in the company? If their own internal benchmarking is flawed or spurious, what about the company's other claims? Or is this more a case of picking and choosing depending upon the feel-good factor?


The Von Matrices said:


> I've been discussing the lineup with some other people on a few forums, and aggregating the data, the best determination we've come up with based on the available specifications...


Seems spot on...I had reached much the same conclusion on another site, so in the interests of self-interest I'd say your guesstimates are excellent!


----------



## Casecutter (Sep 26, 2013)

The best I could find as a Fire Strike score for the 7870 Tahiti LE was around 6,000.  If we say AMD's slide is using an FX-9590, that might account for a 5,550 score.  But if it's a Tahiti LE, they have to do something about power consumption.  If they think the LE is a good $200 offering, I absolutely disagree.


----------



## NeoXF (Sep 26, 2013)

HumanSmoke said:


> Seems spot on...I had reached much the same conclusion on another site, so in the interests of self-interest I'd say your guesstimates are excellent!



Sorry, but I call bollocks on that discussion, and on the one in this thread as of late as well.

Another slide from that site, showing a more updated chart, has a 1050-1100 MHz / 6000 MHz Radeon HD 7970 (the same speculated specs as the R9 280X) getting more than 7,800 points. How exactly is that the same thing as 6,800...

On that same test system, with the 4.5 GHz i7-3960X, quad-channel DDR3-1866 CL9 and an SSD, I'd wager the R9 290X would score over 9,000 marks (no joke intended).


----------



## Zaxx420 (Sep 26, 2013)

Next question is, of course... how well will they OC? Hoping the 260X will allow a decent OC, cuz that's right in my 'Best Bang for the Buck' price range of $120 to $150.


----------



## HumanSmoke (Sep 27, 2013)

NeoXF said:


> Sorry, but I call bollocks on that discussion, and on the one in this thread as of late as well.
> 
> Another slide from that site, showing a more updated chart, has a 1050-1100 MHz / 6000 MHz Radeon HD 7970 (the same speculated specs as the R9 280X) getting more than 7,800 points. How exactly is that the same thing as 6,800...


Probably because the 7,800 score is obviously from a factory-overclocked 7970 GHz?



NeoXF said:


> On that same test system, with the 4.5 GHz i7-3960X, quad-channel DDR3-1866 CL9 and an SSD, I'd wager the R9 290X would score over 9,000 marks (no joke intended).


Maybe, maybe not. That still doesn't explain why my hypothesis based on AMD's own information is "bollocks".
Do AMD use different system specifications when doing comparative testing?
Are AMD's technical staff too ignorant to interpret a benchmark?
Are AMD deliberately lying?


----------



## Vario (Sep 27, 2013)

Reminds me of the 8800 GT to 9800 GT nonexistent-innovation stunt from Nvidia in the mid-2000s, LOL.


----------



## Fluffmeister (Sep 27, 2013)

Vario said:


> Reminds me of the 8800 GT to 9800 GT nonexistent-innovation stunt from Nvidia in the mid-2000s, LOL.



Constantly pushing their own technology, entire rebrands for OEMs and new product generations... not to mention endless games with AMD logos at the start.

You have to give them credit; they are finally learning to take a hint.

Hell, keep this up and they may eventually be considered the bad guys.


----------



## The Von Matrices (Sep 27, 2013)

HumanSmoke said:


> Do AMD use different system specifications when doing comparative testing?
> Are AMD's technical staff too ignorant to interpret a benchmark?
> Are AMD deliberately lying?



You have to give them credit - if they wanted to obfuscate performance, there are few better ways than using Firestrike scores.


----------



## Fluffmeister (Sep 27, 2013)

The Von Matrices said:


> You have to give them credit - if they wanted to obfuscate performance, there are few better ways than using Firestrike scores.



Only AMD can get away with hosting 4 hours of PR nonsense, revealing barely anything at all about the product most people were interested in... and still getting credit for it.


----------



## TheoneandonlyMrK (Sep 27, 2013)

HumanSmoke said:


> Probably because the 7,800 score is obviously from a factory-overclocked 7970 GHz?
> 
> 
> Maybe, maybe not. That still doesn't explain why my hypothesis based on AMD's own information is "bollocks".
> ...



Are you so neg on Amd you will jump on any possible neg stat , , sorry poorly (possibly on purpose) stated stat in a chart im not assed if the card blows a fart but reading your never ending vitreous is tiring which is nice its bedtime after all.
I would not mind buying this card but im still waiting on wizards review as yours is poor and largely unfounded opinion. 

Based on possibly a rumour tickling graph.

Let me put a widely sensible supposition to you Say Amd release a top spec card and at the same time they Know there competition can release a similar spec card specialy cooled if needed to piss on the fire slightly by present day benches in pr terms not good even if it were a paper launch. 

They are and have kept details secret on And with purpose . THAT is a fact.


----------



## Vario (Sep 27, 2013)

Well the 7970 is a beast, no doubt about it, if they can keep churning out rebranded ghz editions on the cheap then I don't see a downside.  Sort of like the 8800gtx/ultra->9800gtx/ultra as I said earlier.  I'd like to see what the 90x can do!


----------



## 1d10t (Sep 27, 2013)

theoneandonlymrk said:


> Its been noted in one of the mirriad of Amd new gpu threads that the crossfire bridge link bandwidth is not upto 4k + and it crossfires through the pciex buss, thats due to the 2.5gb/s bandwidth on that link not being enough for page transfers at Uhd resolutions.





The Von Matrices said:


> I've been discussing the lineup with some other people on a few forums, and aggregating the data, the best determination we've come up with based on the available specifications (memory capacity and Firestrike scores) is:
> 
> R9 290X = Hawaii XT (Full Chip - 2816 SP, 4GB w/512-bit, ~$600)
> R9 290 = Hawaii Pro (Binned Chip - 2304 or 2560 SP, 3GB w/384-bit, ~$400)
> ...



So the best bet for the R9 290 is 2560 SP, 4 GB of VRAM, a 512-bit bus, link-bus CrossFire, a $499 launch price, and a seat 5-10% slower than the GTX 780?


----------



## The Von Matrices (Sep 27, 2013)

1d10t said:


> So the best bet for the R9 290 is 2560 SP, 4 GB of VRAM, a 512-bit bus, link-bus CrossFire, a $499 launch price, and a seat 5-10% slower than the GTX 780?



I just noticed that the TPU GPU database has the R9 270X powered by a Curacao XT (aka Pitcairn XT).  While it might be true, it confuses me from a manufacturing standpoint:

This would mean that, except for the R9 290 (non-X), AMD would be using only fully-enabled dies.  This seems to make no business sense, since chips with defects can't be sold.  Granted, 28 nm is a mature process, but usually manufacturers have two tiers of product: a fully enabled one, and a feature-cut one to use the defective dies.  Especially for Tahiti - a 365 mm² chip has to have a significant number of defective dies, and I doubt AMD wants to put a disabled Tahiti into their mobile lineup due to power concerns.  I guess they have OEM products to use the defective parts?

In that case the lineup would look like this

R9 290X = Hawaii XT (Full Chip - 2816 SP, 4GB w/512-bit, ~$600)
R9 290 = Hawaii Pro (Binned Chip - 2560 SP, 4GB w/512-bit, ~$400-450)
R9 280X = Tahiti XT2 (7970 Ghz - 2048 SP, 3GB w/384-bit, ~$300)
R9 270X = Curacao XT (7870 - 1280 SP, 2GB w/256-bit, ~$200)
R7 260X = Bonaire XT (7790 - 896 SP, 2GB w/128-bit, ~$139)
R7 250 = Cape Verde XT (7770 - 640 SP, 1GB w/128-bit, <$89)



Fluffmeister said:


> Only AMD can get away with hosting 4 hours of PR nonsense, revealing barely anything at all about the product most people were interested in... and still getting credit for it.



I disagree that it's specific to AMD; that's what most electronics-related press conferences are like nowadays.  Think of the 2013 Microsoft and Sony press conferences at E3; they were basically identical in structure to the AMD press conference and revealed very little about the hardware itself.


----------



## HumanSmoke (Sep 27, 2013)

theoneandonlymrk said:


> Are you so neg on Amd you will jump on any possible neg stat , , sorry poorly (possibly on purpose) stated stat in a chart im not assed if the card blows a fart but reading your never ending vitreous is tiring which is nice its bedtime after all.
> I would not mind buying this card but im still waiting on wizards review as yours is poor and largely unfounded opinion.
> 
> Based on possibly a rumour tickling graph.
> ...


Wow. That was a chore to read. Ever heard of punctuation and grammar, or is English your second language? 
Please let me simplify:
Either the charts AMD posted are correct - in which case information can be extrapolated from them - OR they are incorrect, and AMD is either lying or has some serious QC and/or internal communications problems.
It is either one or the other.

For someone who isn't "assed if the card blows a fart", you seem to be posting an awful lot. Personally, I am interested in the new releases, both from an enthusiast standpoint and as someone who advises and builds systems for others. If you don't like what is said then please, by all means, use the ignore function in your CP.

YW


----------



## wolf (Sep 27, 2013)

Well, for a new-generation card this seems severely underwhelming. What is it, maybe ~20% faster, give or take, than a 7970 GHz?

I expected more, especially with 3840x2160 firmly in the sights of consumers over the next couple of years.

I can only hope it drives prices down, and that Nvidia's answer will be better than this. I hope this for the sake of GPUs getting faster overall.

Also, WTF is with their new naming scheme :/



Hilux SSRG said:


> Thanks AMD, you pulled an Nvidia.  Everything below the flagship appears to be a rehash of a rehash.



Thanks Hilux SSRG, you pulled a fanboy.


----------



## kn00tcn (Sep 27, 2013)

You expected more on the same 28 nm? What about the 6970 on the same 40 nm as the 5870?

There's talk that Maxwell will also be 28 nm in 2014, then a 'big Maxwell' in late 2014 or even early 2015 at 20 nm.

I don't expect miracles without massive architecture changes or massive size changes.

I'd rather have 2 separate chips, gaming & compute, but that's expensive to do (we can see the results of such a split: the 680 has much better gaming even though it has worse compute than the 580).

The other area to boost is software, which is what Mantle is doing; we'll see what BF4 will be like in December.


----------



## Lionheart (Sep 27, 2013)

Here's a video from Linus Tech Tips at AMD's event; just skip to around 11:38 in the video to get a glance at AMD's new flagship GPU: 

http://youtu.be/3MU-DIKvY3U?t=11m38s


----------



## fullinfusion (Sep 27, 2013)

Lionheart said:


> Here's a video from Linus Tech Tips at AMD's event; just skip to around 11:38 in the video to get a glance at AMD's new flagship GPU:
> 
> http://youtu.be/3MU-DIKvY3U?t=11m38s


And to add to my excitement, I see a BIOS switch!


----------



## The Von Matrices (Sep 27, 2013)

btarunr said:


> From what AMD told us, the two feature 4 GB of memory, *over 5,000 TFLOP/s compute power*, and over 300 GB/s memory bandwidth.



With these specs you need only 6 of these cards to beat the Titan supercomputer.
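The arithmetic behind the quip, taking the article's figure at face value (it is almost certainly a typo for ~5 TFLOP/s) against the Oak Ridge Titan supercomputer's roughly 27 PFLOP/s theoretical peak:

```python
# Six cards at the article's quoted 5,000 TFLOP/s apiece vs. the Titan
# supercomputer's ~27 PFLOP/s theoretical peak (Oak Ridge figure).
card_tflops = 5000     # the article's number; surely meant as ~5 TFLOP/s
titan_peak_pflops = 27

six_cards_pflops = 6 * card_tflops / 1000  # TFLOP/s -> PFLOP/s
print(six_cards_pflops, six_cards_pflops > titan_peak_pflops)  # → 30.0 True
```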


----------



## Casecutter (Sep 27, 2013)

The Von Matrices said:


> This seems to make no business sense since any chips with defects can't be sold.


There's nothing definite that these are the only models they intend to release, or just a sprinkling of some spots.  That said, it's pretty hard to fill in when they have such tight price steps... IDK.  Just saying these might not be the only ones to release; or, as Nvidia did, do you hold back binned parts and release them 6 months from now as R8s?  Nvidia has shown that two derivative models per wafer is no longer the rule, and that dies can be appropriated to completely different segments.  Not saying it's a bad thing, but the common way of looking at this may well be no more.



HumanSmoke said:


> Either the charts AMD posted are correct - in which case information can be extrapolated from them - OR they are incorrect.


Or AMD just made that R9 290X bar fade out... After you used that chart to arrive at the 17-18% comment, I noticed that the other bars all seem to end cleanly, while that one fades... IMO it could be AMD trickery; that slide is not a clear measurement of data indicating performance.  If it were, AMD would have to stipulate the system specs and other parameters to make it relevant engineering data... It's a PR slide, and as such not something we should take as gospel.


----------



## The Von Matrices (Sep 27, 2013)

Casecutter said:


> There's nothing definite that these are the only models they intend to release, or just a sprinkling of some spots.  That said, it's pretty hard to fill in when they have such tight price steps... IDK.  Just saying these might not be the only ones to release; or, as Nvidia did, do you hold back binned parts and release them 6 months from now as R8s?  Nvidia has shown that two derivative models per wafer is no longer the rule, and that dies can be appropriated to completely different segments.  Not saying it's a bad thing, but the common way of looking at this may well be no more.



Nvidia makes a mind-boggling number of OEM card variations to efficiently allocate defective chips.  Only the best chips get used for retail cards.  This results in a lot of complaints from enthusiasts, because an Nvidia OEM card with the same model number as a retail card might have a completely different core, clock speeds, and memory configuration from the retail card, with the only similarity between the two being approximate performance.  Maybe AMD is going this same route with its defective chips?


----------



## TheoneandonlyMrK (Sep 27, 2013)

HumanSmoke said:


> Wow. That was a chore to read. Ever heard of punctuation and grammar, or is English your second language?
> Please let me simplify:
> Either the charts AMD posted are correct- in which case information can be extrapolated from them OR they are incorrect and AMD is either lying or has some serious QC and/or internal communications problems.
> It is either one or the other.
> ...



The reason I reply is because you post so much drivel... :shadedshu
You are misleading others with your myopic view of the world, based on what most agree is a scant PR slide meant to churn the rumour mill.

But with you it's always essentially:

AMD are lying shit nobs, OR they are lying shit nobs (cross out one or two from each side each post)... yeah, your English makes it sound like more than that, but it isn't.


----------



## Hilux SSRG (Sep 27, 2013)

wolf said:


> Thanks Hilux SSRG, you pulled a fanboy.



Not a fanboy comment.  Just stating that both camps have put out rehashes, not new chips, for the low to mid range for the last two generations.  It's a shame that both companies are keeping prices artificially high on old product.  And I bet Nvidia will pull a GTX 800 series rehash of the 600 series, no doubt, for the low to mid range.


----------



## The Von Matrices (Sep 27, 2013)

Hilux SSRG said:


> Not a fanboy comment.  Just stating that both camps have put out rehashes, not new chips, for the low to mid range for the last two generations.  It's a shame that both companies are keeping prices artificially high on old product.  And I bet Nvidia will pull a GTX 800 series rehash of the 600 series, no doubt, for the low to mid range.



It all comes down to TSMC.  They killed half-node processes and now are taking longer than 2 years for the full-node processes as well.  Nvidia and AMD can only do so much without a process advancement.  Without a process advancement, they can't increase performance without increasing die size and manufacturing cost; hence, no more performance in cards at the same price point.

Edit: This simplifies things a bit, because they could reduce margins or design a new chip; but reducing margins would bring them back to the HD 4000/GTX 200 series price wars, which are undesirable for them, and a redesigned low-end chip would only be viable for a short time before 20 nm came in and made it obsolete.  They would never recoup their investment.


----------



## Xzibit (Sep 27, 2013)

Casecutter said:


> Or AMD just made that R9 290X bar fade out... After you used that chart to arrive at the 17-18% comment above, I noticed that the others all seem to end cleanly, while that one fades... IMO.  It could perhaps be AMD trickery; that slide is not a clear measurement of performance data.  If it were, AMD would have to stipulate the system specs and other parameters to make it relevant engineering data... It's a PR slide, and as such not something we should treat as gospel.



Holy ****

Someone with a brain.


----------



## Casecutter (Sep 27, 2013)

The Von Matrices said:


> Maybe AMD is going this same route with its defective chips?


Good point, and it might well be a reason for the model matrix; let it rain confusion.


----------



## Xzibit (Sep 27, 2013)

Casecutter said:


> Good point, and it might well be a reason for the model matrix; let it rain confusion.



More confusion...

If you go back and look at the presentation, when Raja talks about "Over 5 TFLOPS Compute", he's talking about the R9 290 series, not specifically the R9 290X.

GK110: from the 780 to the Titan there is a 517 difference.

Tahiti: from the 7950 to the 7970 GHz there is a 1229 difference.

Add that to the speculation...


----------



## Hilux SSRG (Sep 27, 2013)

The Von Matrices said:


> It all comes down to TSMC. They killed half-node processes and now are taking longer than 2 years for the full-node processes as well. Nvidia and AMD can only do so much without a process advancement. Without a process advancement, they can't increase performance without increasing die size and manufacturing cost; hence, no more performance in cards at the same price point.



True, but I've got to wonder if, in the process of designing chips, both AMD and Nvidia have traded performance for low power.  I don't know enough about die shrinks, but anyone can see Intel has quit on performance and only seeks low power for its CPUs.  I wonder if GPUs have gone that direction as well.


----------



## Casecutter (Sep 27, 2013)

Xzibit said:


> More confusion...


I just meant that the new naming convention can provide new ways to perplex and obscure.


----------



## The Von Matrices (Sep 27, 2013)

Hilux SSRG said:


> True, but I've got to wonder if, in the process of designing chips, both AMD and Nvidia have traded performance for low power.  I don't know enough about die shrinks, but anyone can see Intel has quit on performance and only seeks low power for its CPUs.  I wonder if GPUs have gone that direction as well.



I'm not sure that low power is AMD and NVidia's primary concern, particularly in the present.  Mainstream laptops have already gotten rid of the discrete GPU, and the remaining mobile GPUs aren't particularly optimized for low power consumption.  Both manufacturers have realized that they can't beat integrated graphics power consumption no matter how well they design a GPU.  The low-power consumption graphics segment is now solely filled by integrated graphics.  Think of NVidia's Optimus - they've essentially admitted that it's simply not possible to beat the power consumption of integrated graphics even by downclocking or shutting off most of the shaders of a discrete GPU.  The discrete GPU has been relegated to situations where performance is more of a concern than power consumption.  I don't see that relationship between discrete and integrated graphics changing anytime soon.


----------



## Xzibit (Sep 27, 2013)

btarunr said:


> Unless AMD's PowerPoint skills suck, R9 290X Firestrike is under 8000.
> 
> 
> 
> ...



They still suck at it, because the R9 280X doesn't match up to the 7970 GHz.

7970 @ stock (925/1375)
Fire Strike
Score: 6622
Graphics score: 7402

7970 @ GHz/Boost clocks (1050/1500)
Fire Strike
Score: 7210
Graphics score: 8235


----------



## HumanSmoke (Sep 27, 2013)

Casecutter said:


> Or AMD just made that R9 290X bar fade out... After you used that chart to arrive at the 17-18% comment above, I noticed that the others all seem to end cleanly, while that one fades...


Yes, quite possible. Maybe the 18% is invalid... so the only other metric AMD released was the "over 5 TFLOPS" FP32 figure. Presumably if the number were closer to 6, then that is what they would have used, no? Assuming a 5.1-5.5 range conveniently gives you roughly an 18% increase over Tahiti's 4.313 TFLOPS figure, up to a maximum of ~27%.
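As a quick sanity check on that arithmetic (the 4.313 TFLOPS Tahiti figure and the 5.1-5.5 range are both taken from the post above; none of these are confirmed specs):

```python
# Percentage gain of an assumed Hawaii FP32 range over Tahiti's figure.
tahiti_tflops = 4.313  # HD 7970 GHz Edition peak FP32, per the post above

for hawaii_tflops in (5.1, 5.5):
    gain = hawaii_tflops / tahiti_tflops - 1
    print(f"{hawaii_tflops} TFLOPS -> {gain:.1%} over Tahiti")
```

That lands at roughly 18% on the low end and 27% on the high end, matching the range quoted.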


Casecutter said:


> It could perhaps be AMD trickery; that slide is not a clear measurement of performance data.  If it were, AMD would have to stipulate the system specs and other parameters to make it relevant engineering data... It's a PR slide, and as such not something we should treat as gospel.


Again, quite possible. It's not as though AMD hasn't got a prior record in this regard. BUT in the absence of any other data, what would *YOU* suggest be used as a comparison and speculation point? (This is not a rhetorical question.)

Or is it verboten to speculate? I seem to recall that many people here have speculated and hypothesised on much less information than that. Allow me to refresh your memory on your speculation for what is now called the 290X:


Casecutter said:


> We're not going to see anything like 39% from just a re-spin on 28 nm. I figure 25% in hardware; what they find in a new release driver, who knows.  *If they gave us 7990 performance, which is 22% more than a Titan @ 2560x, while holding to the alleged 250 W... in a $550 card.*


As far as the performance I speculated upon, I worked from the given information and came up with ~18% over the HD 7970 GHz Edition, based on the Firestrike slide (with preliminary driver support) and the FP32 numbers. The fact that the GTX 780 scores around the same:




Led me to conclude that the cards were likely evenly matched....as other sources have intimated:


> R290 X unfortunately I cannot release pricing or specification info yet, but price wise its similar to GTX 780 but slightly faster, right now, of course AMD could change the launch price at any time


The other speculation I indulged in was price, and which GPU sits under the various cards' skirts.
None of the speculation has been shown to be inordinately outside the likely expected parameters so far.


----------



## The Von Matrices (Sep 28, 2013)

I want to make sure that everyone knows that Firestrike is both a GPU and a CPU benchmark.  You can sway the scores by +/-1000 with the same GPU just by changing the CPU.  These AMD scores are worthless for comparing to competing cards unless both were run on the same platform.  But since AMD won't divulge this information, we just have to accept that any graphics card scoring within about 20% could be equal in performance to AMD's new cards.  It could swing either way.
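To see how much the CPU alone can move the total, here's a minimal sketch of a 3DMark-style combined score. The weighted-harmonic-mean shape follows the published 3DMark scoring approach, but the 0.75/0.15/0.10 weights and the sub-scores are illustrative assumptions, not Futuremark's exact Fire Strike constants:

```python
def fire_strike_overall(graphics, physics, combined,
                        w_g=0.75, w_p=0.15, w_c=0.10):
    """Weighted harmonic mean of the three sub-scores (assumed weights)."""
    return (w_g + w_p + w_c) / (w_g / graphics + w_p / physics + w_c / combined)

# Same GPU result, two hypothetical CPUs (only the physics score changes):
fast_cpu = fire_strike_overall(graphics=8000, physics=12000, combined=4000)
slow_cpu = fire_strike_overall(graphics=8000, physics=6000, combined=4000)
print(round(fast_cpu), round(slow_cpu))
```

With these made-up numbers the identical GPU swings by several hundred points on the CPU alone, which is exactly why platform details matter when comparing vendor-supplied scores.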


----------



## Serpent of Darkness (Sep 29, 2013)

From the 6990 to the 7990, and including the 7970, they all average around a 25% to 35% increase in core frequency when overclocked.  Let's assume this is true.

2816 streaming processors x 2 FLOPs/clock @ 1000 MHz boost = 5.632 TFLOPS.

If you consider the OC potential of the previous generations, the average is around 30%:

A single 6990's base clock is 830 MHz.  Its max OC is roughly 1060 to 1120 MHz: 27.71%.
A single 7970's base clock is 900 MHz, with a 1050 MHz boost.  Its max OC is roughly 1321 MHz w/ LN2: 46.78%.
A single 7990's base clock is 950 MHz, with a 1000 MHz boost.  Its max OC is roughly 1100 MHz: 10.00% (this may actually be higher...).

Averaged all together from above = 28.16%...

What's my point?
10% OC headroom: 1.1 x 5.632 TFLOPS = 6.1952 TFLOPS.
20% OC headroom: 1.2 x 5.632 TFLOPS = 6.7584 TFLOPS.
28.16% OC headroom: 1.2816 x 5.632 TFLOPS = 7.2180 TFLOPS.
30.00% OC headroom: 1.3 x 5.632 TFLOPS = 7.3216 TFLOPS.

So, if plausible, it may not be that difficult or impossible to surpass the 5.0 to 6.0 TFLOPS boundaries with the upcoming R9-990...  Sadly, only time will tell...
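The arithmetic above can be sketched in a few lines. The 2816-SP count and 1000 MHz boost are the rumoured figures from this thread, not confirmed specs, and the 2 FLOPs-per-clock factor assumes a fused multiply-add per shader per cycle:

```python
def fp32_tflops(shaders, clock_mhz, flops_per_clock=2):
    """Peak FP32 throughput: shader count x FLOPs per clock x clock speed."""
    return shaders * flops_per_clock * clock_mhz * 1e6 / 1e12

base = fp32_tflops(2816, 1000)  # rumoured 2816-SP part at a 1000 MHz boost
print(f"base: {base:.3f} TFLOPS")
for headroom in (0.10, 0.20, 0.2816, 0.30):
    print(f"+{headroom:.2%} OC: {base * (1 + headroom):.4f} TFLOPS")
```

Note these are theoretical peaks; real overclocks rarely scale throughput linearly with clock, since memory bandwidth stays fixed.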


----------



## Maban (Sep 29, 2013)

Anyone else notice the DVI ports on the R9 280X, 290, and 290X are DVI-D instead of DVI-I?


----------



## EarthDog (Sep 29, 2013)

Maban said:


> Anyone else notice the DVI ports on the R9 280X, 290, and 290X are DVI-D instead of DVI-I?


And?


----------



## Maban (Sep 29, 2013)

EarthDog said:


> And?



And...that means it can't output to VGA. It's just an interesting fact.


----------



## Casecutter (Sep 30, 2013)

HumanSmoke said:


> Or is it verboten to speculate


Oh no, speculation is all we have, whether in late July in my words, or now with you working from a new slide.  Just don't get all uppity when someone points out an alternate estimation. I just said it fades...

As to what you bolded in my words, you need to revisit the context of the question:


Prima.Vera said:


> I would love to see the 9970 having the same performance level as two 7970 cards.





Casecutter said:


> We're not going to see anything like 39% from just a re-spin on 28 nm. I figure 25% in hardware; what they find in a new release driver, who knows. If they gave us 7990 performance, which is 22% more than a Titan @ 2560x, while holding to the alleged 250 W... in a $550 card. Then where does that leave Volcanic Islands 4-5 months out?



I believe I used W1zzard's reference GTX 780 review, and that could've skewed the percentages a little.
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_780/26.html

All I was saying is that if they could deliver such performance (of a 7990), it would be much more than I could envision... I didn't say I was expecting it.  While I wrote Volcanic... I meant: what does that mean for the 20 nm process in a couple of months?  Most still thought Q1 '14; even in July it was thought Volcanic Islands would be 20 nm.
I exhaled a while back... take a breath and wait for the 15th.


----------



## EarthDog (Sep 30, 2013)

Maban said:


> And...that means it can't output to VGA. It's just an interesting fact.


Who the hell is buying a $400+ card to run on a monitor with only VGA inputs? If you can afford such a card, you better have a monitor with DVI on it, LOL!


----------



## The Von Matrices (Sep 30, 2013)

EarthDog said:


> Who the hell is buying a $400+ card to run on a monitor with only VGA inputs? If you can afford such a card, you better have a monitor with DVI on it, LOL!



I can see two (small) groups of people having problems with this:

1) The few die-hards still using 1600/1920x1200 CRTs
2) Reviewers who test monitors for latency by comparing an LCD with a CRT.


----------



## EarthDog (Sep 30, 2013)

Forget the voiceless minority.


----------



## The Von Matrices (Sep 30, 2013)

EarthDog said:


> Forget the voiceless minority.



I like compatibility but if there is no longer a large enough userbase to justify the legacy features then by all means remove them and use the savings toward features that more people will use.  Right now DVI-D to VGA converters cost around $200 but if you want to mix legacy hardware with new hardware then you need to be willing to pay for adapters.


----------



## a_ump (Sep 30, 2013)

The Von Matrices said:


> I like compatibility but if there is no longer a large enough userbase to justify the legacy features then by all means remove them and use the savings toward features that more people will use.  Right now DVI-D to VGA converters cost around $200 but if you want to mix legacy hardware with new hardware then you need to be willing to pay for adapters.



Where did you get that they cost $200? A simple adapter would work, wouldn't it? Those are only $5-15, and converter boxes are $45-90; all this with a simple Google search.

Also, I use a VGA monitor with my GTX 560; because of budget I couldn't get a new monitor. Just because people can get great hardware to go in the case doesn't mean they aren't stretching their limit. Hell, I don't plan to replace my current monitor until it breaks, so I may be part of that voiceless userbase  lol


----------



## The Von Matrices (Sep 30, 2013)

a_ump said:


> Where did you get that they cost $200? A simple adapter would work, wouldn't it? Those are only $5-15, and converter boxes are $45-90; all this with a simple Google search.
> 
> Also, I use a VGA monitor with my GTX 560; because of budget I couldn't get a new monitor. Just because people can get great hardware to go in the case doesn't mean they aren't stretching their limit. Hell, I don't plan to replace my current monitor until it breaks, so I may be part of that voiceless userbase  lol



The simple adapters only work for DVI-I/DVI-A ports (which already carry native analog output; the adapter just rearranges the pins); these cards have only DVI-D ports, which means no native analog output.  You are right; I overestimated the active adapters' cost; I did find a StarTech adapter for $45.99.

VGA has a resolution limitation of about 3 MP (QXGA 2048x1536) @ 60 Hz; if you are using a VGA monitor, you can't have a resolution above that, and even those types of monitors are rare.  These high-end cards are overkill for a single 2 MP or 3 MP 60 Hz monitor.  Rest assured, the mid-range and lower-end cards have DVI-I ports so that you can still use cheap VGA adapters, and they are a much better performance match for VGA resolutions.

EDIT: Interestingly enough, DisplayPort to VGA adapters are much cheaper than DVI-D to VGA adapters.  You can get a DisplayPort to VGA adapter for $24.71, although it is limited to 1920x1200 @ 60 Hz (not QXGA 2048x1536 @ 60 Hz).
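For a rough feel of why those analog ceilings sit where they do, here is a back-of-envelope pixel-clock estimate. Both the ~20% blanking overhead and the 400 MHz RAMDAC ceiling are ballpark assumptions (actual VESA timings and DAC limits vary), so treat this as a sketch, not datasheet math:

```python
RAMDAC_LIMIT_MHZ = 400  # typical analog output ceiling (an assumption)

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.20):
    """Approximate pixel clock: active pixels x refresh rate x blanking factor."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for w, h in ((1920, 1200), (2048, 1536)):
    clk = pixel_clock_mhz(w, h, 60)
    print(f"{w}x{h} @ 60 Hz needs ~{clk:.0f} MHz of {RAMDAC_LIMIT_MHZ} MHz available")
```

QXGA @ 60 Hz comes in comfortably under an assumed 400 MHz DAC, but going much past 3 MP or 60 Hz eats the remaining margin quickly, especially once real-world blanking and cable quality are factored in.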


----------



## TheoneandonlyMrK (Sep 30, 2013)

EarthDog said:


> Who the hell is buying a $400+ card to run on a monitor with only VGA inputs? If you can afford such a card, you better have a monitor with DVI on it, LOL!



Dunno, I just realised my new card has no HDMI... dohhh. I got a 7970 as they are too cheap, damn it; the R9 will be out in 3 mins to smash my new card in.

Yes, fine, I've a DVI monitor, but I didn't wanna watch the footy on that :shadedshu
In short, choice is good.


----------



## leeb2013 (Oct 8, 2013)

btarunr said:


> Unless AMD's PowerPoint skills suck, R9 290X Firestrike is under 8000.
> 
> http://www.techpowerup.com/img/13-09-26/264a.jpg
> 
> Notice the top block (which ends at 8000) is fading at the top. GTX TITAN's Firestrike score is ranging between 8,200 and 8,700.



It's got me completely stumped too, but no one else has even questioned it. 

I can't understand it; how is a Firestrike score of 8000 good? My $200 7950 gets a graphics score of over 8000!!

http://www.3dmark.com/3dm/1299529


----------

