# Radeon R9 290X Could Strike the $599.99 Price-point



## btarunr (Sep 27, 2013)

AMD's next-generation flagship graphics card, the Radeon R9 290X, could strike a US $599.99 (or 499.99€ / £399.99 before taxes) price-point, turning up the heat on NVIDIA's more expensive offerings, the GeForce GTX 780 and GTX TITAN. The card should be available from mid-October. Based on the new 28 nm "Hawaii" silicon, the card is expected to feature 2,816 GCN stream processors spread across 44 compute units. Other specifications include 176 TMUs, 44 ROPs, and a 512-bit wide GDDR5 memory interface holding 4 GB of memory, which likely achieves its >300 GB/s memory bandwidth with a 5.00 GHz (effective) memory clock. The company is expected to launch 6 GB variants of the card a little later.
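The >300 GB/s figure is straightforward to sanity-check: a GDDR5 bus delivers (bus width ÷ 8) bytes per effective transfer clock. A minimal Python sketch (the function name is purely illustrative):

```python
def gddr5_bandwidth_gbps(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Peak bandwidth in GB/s: bytes moved per transfer times transfers per ns."""
    return bus_width_bits / 8 * effective_clock_ghz

print(gddr5_bandwidth_gbps(512, 5.0))  # 320.0 GB/s -- matches the ">300 GB/s" claim
print(gddr5_bandwidth_gbps(384, 6.0))  # 288.0 GB/s -- the GTX 780, for comparison
```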





*View at TechPowerUp Main Site*


----------



## silapakorn (Sep 27, 2013)

If this is true, then Titan's price will make even less sense.


----------



## hardcore_gamer (Sep 27, 2013)

I don't give a fµ¢k about the memory bus as long as it beats Titan, especially at this price.


----------



## dj-electric (Sep 27, 2013)

300+ GB/s is enough for high-end gaming these days, and I'm totally fine with it.


----------



## fullinfusion (Sep 27, 2013)

The $600 price point was stated by erocker a while ago!

Time to cash in, E!


----------



## Prima.Vera (Sep 27, 2013)

tom_mili said:


> Only 384-bit? That tells the whole story about the site that claimed there would be a 6 GB version of the 290X. It would also be very weird to pair a 512-bit memory interface with 6 GB of VRAM using 1 Gb chips.
> 
> I agree that this thing should perform better than Titan while costing less.



What's the point of having 6 GB of VRAM on your card, when 99.9% of games ship with 32-bit executables and can only use a maximum of 3.5 GB of VRAM and 3.5 GB of system RAM?


----------



## fullinfusion (Sep 27, 2013)

Prima.Vera said:


> What's the point of having 6 GB of VRAM on your card, when 99.9% of games ship with 32-bit executables and can only use a maximum of 3.5 GB of VRAM and 3.5 GB of system RAM?


The point is... it has room to breeze through anything.


----------



## tom_mili (Sep 27, 2013)

Prima.Vera said:


> What's the point of having 6 GB of VRAM on your card, when 99.9% of games ship with 32-bit executables and can only use a maximum of 3.5 GB of VRAM and 3.5 GB of system RAM?



I don't know; maybe for e-peen, and peace of mind that it would run games until the next decade.

I don't even really benefit from my 3 GB card, yet.


----------



## RCoon (Sep 27, 2013)

tom_mili said:


> I don't know; maybe for e-peen, and peace of mind that it would run games until the next decade.
> 
> I don't even really benefit from my 3 GB card, yet.



But think of SKYRIM! I can annihilate 3 GB pretty fast.
Also, my 780 was brought to its knees (below 60 FPS) in Metro: Last Light regardless of a 1.2 GHz overclock.


----------



## Aquinus (Sep 27, 2013)

I want to see some real benchmarks. The lack of a 512-bit bus probably means that it doesn't need it. I would like to think that AMD's EEs know what they're doing, at least more so than everyone posting here, so I would stop getting bent out of shape over it. It's a decent price point that I would consider, but that completely depends on benchmarks.


----------



## erocker (Sep 27, 2013)

I wonder if it's true that they're going to do pre-orders on the BF4 editions before the full specs are even released?! No thanks!


----------



## buildzoid (Sep 27, 2013)

I only see one issue with the 384-bit bus claim, and that is that all the PCB pictures show 16 memory chips.
4 GB / 16 = 256 MB per chip; any other arrangement gives more than 4 GB. So either the bus is 512-bit, or they're using chips smaller than 256 MB, which I have yet to see on a modern GPU.
Also, Asus stated that the bus is 512-bit, link here: http://rog.asus.com/265242013/graphics-cards-2/amd-announces-new-r9-290x-and-new-graphics-lineup/
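The chip-count argument above can be sketched numerically. One added assumption not stated in the thread: standard GDDR5 chips expose a ×32 interface each (clamshell ×16 mode would halve that and double the chip count):

```python
# 16 memory chips visible on the PCB, 4 GB total (figures from the post).
chips = 16
total_mb = 4 * 1024

per_chip_mb = total_mb / chips  # 256 MB (2 Gb) per chip
bus_width = chips * 32          # 512-bit, assuming each chip runs in x32 mode

print(per_chip_mb, bus_width)   # 256.0 512
```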


----------



## Sempron Guy (Sep 27, 2013)

4 GB on a 384-bit memory bus? Now that is weird.


----------



## buggalugs (Sep 27, 2013)

The retail price is reasonable, only a little more than the $549 of the 7970, and the 7970 was much faster than any NVIDIA card at the time of release. NVIDIA's high pricing on Titan and the 780 is dragging prices up, it seems.

The retail price might not be the whole story, though. Depending on availability, stores could whack on an early-adopter tax, another $50-$80, if demand outstrips supply. It happened with the 7970 for the first couple of months. Hopefully there is plenty of stock, so I can get one for $599.


----------



## HumanSmoke (Sep 27, 2013)

Aquinus said:


> I want to see some real benchmarks. The lack of a 512-bit bus probably means that it doesn't need it.


Sounds about right. Tahiti certainly isn't bandwidth-constrained; it's more a raster-op deficiency, which seems to have been remedied with Hawaii.

RE: Pricing, according to Gibbo at OcUK...


> R290 X unfortunately I cannot release pricing or specification info yet, but price wise its similar to GTX 780 but slightly faster, right now, of course AMD could change the launch price at any time


and...


> R270 is HD 7950 re-boxed and these shall be reference design in beginning and prices from £170 region, so same as our lowest price 7950's. So OcUK is already at the new price, the newly boxed R270's will arrive middle-end October.
> 
> R280X is the HD 7970 GHz re-boxed and these shall be reference design in beginning and prices shall be around £230 region, so same again as our lowest priced 7970's. This makes our current Asus 7970's a bargain as they are TOP and Platinum cards, the equivalent of these in R280 X shall be in the £250-£300 region and available in their R280 X packaging around end of October.


----------



## ensabrenoir (Sep 27, 2013)

*not again....*

...oh... oh.... I was warned: as the launch gets closer and closer..... the specs will fade... lower and lower.....


----------



## progste (Sep 27, 2013)

399 would be crazy... please do it! =)


----------



## jigar2speed (Sep 27, 2013)

progste said:


> 399 would be crazy... please do it! =)



No no, 299 would be crazy... Please do it.

:shadedshu


----------



## ensabrenoir (Sep 27, 2013)

progste said:


> 399 would be crazy... please do it! =)



CRAZY GENEROUS!!!... too bad they're not owned by Vonage.

....that commercial might not be playing in every country.......


----------



## springs113 (Sep 27, 2013)

I see $549-599... more likely $609.99 all the way up to $639.99 from retailers like Newegg. It's 512-bit, no doubt, and we will get the specs by next week.


----------



## Mathragh (Sep 27, 2013)

Prima.Vera said:


> What's the point of having 6GB of VRAM on your card, when 99.9% of the games are with x32 bit exes, and can only use maximum of 3.5GB VRAM and 3.5GB system RAM??



Actually, DICE already stated that they will ship BF4 with a 64-bit executable. I expect a lot more game developers to go that route, since sticking with 32-bit won't make a lot of sense when you can address 8 GB on both new consoles.


----------



## Ikaruga (Sep 27, 2013)

btarunr said:


> the company is expected to launch 6 GB variants of the card a little later.



It's the early-adopter enthusiasts who are willing to buy anything new regardless of the price, so why not start with the best and most expensive version?


----------



## ensabrenoir (Sep 27, 2013)

Just read this:
http://www.legitreviews.com/amd-radeon-r9-290x-battlefield-4-edition-8000-specs-pre-order_125224

It makes me believe there will be a general-purpose 290 and a limited-edition Titan killer with a price to match....


----------



## Naito (Sep 27, 2013)

I really hope it does. As much as I like Nvidia, their prices have gone crazy. Competition means lower prices.


----------



## The Von Matrices (Sep 27, 2013)

Sempron Guy said:


> 4 GB on a 384-bit memory bus? Now that is weird.



It's a 512-bit bus, so 4 GB makes sense. I don't understand how they would put 6 GB on a 512-bit bus, though. Maybe reduce the bus width to 384-bit and clock it at 7 GHz to match the bandwidth of the 512-bit bus?
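The clock a narrower bus would need to match the 512-bit card's bandwidth is easy to solve for; `matching_clock_ghz` is a hypothetical helper, not anything from the thread:

```python
def matching_clock_ghz(ref_width_bits: int, ref_clock_ghz: float,
                       new_width_bits: int) -> float:
    """Effective clock that gives new_width_bits the same peak GB/s as the reference."""
    return ref_width_bits * ref_clock_ghz / new_width_bits

# 384-bit matching 512-bit @ 5 GHz: ~6.67 GHz, so 7 GHz would slightly exceed it.
print(matching_clock_ghz(512, 5.0, 384))
```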


----------



## gdubc (Sep 27, 2013)

Ikaruga said:


> It's the early-adopter enthusiasts who are willing to buy anything new regardless of the price, so why not start with the best and most expensive version?



It's because they know many simply cannot wait and will buy this, then sell it to buy the next best thing when it comes along!


----------



## the54thvoid (Sep 27, 2013)

ensabrenoir said:


> Just read this:
> http://www.legitreviews.com/amd-radeon-r9-290x-battlefield-4-edition-8000-specs-pre-order_125224
> 
> It makes me believe there will be a general-purpose 290 and a limited-edition Titan killer with a price to match....



From that article:



> This means that you’ll have to put down a deposit without knowing the price or clock speeds of the card that you will be purchasing



Let me see: buy a card with no official specifications - stupid. Buy a card with no official pricing - stupid.

What the hell are they playing at? I'm sure some people will blindly buy, but really? It's not very intelligent to buy any tech without first having it peer-reviewed.


----------



## Crap Daddy (Sep 27, 2013)

We still know nothing about the performance, and nothing definite about the price either. Apart from the "David Copperfield" Mantle trick with BF4 (come December), the card appears to be slightly faster than the 780, five months late to the party, and fifty bucks cheaper. That still means $600, which is a hell of a lot of money to spend just to play games.


----------



## NeoXF (Sep 27, 2013)

The most widely rumoured 512-bit @ 5,000 MHz (320 GB/s) makes no sense... why isn't it 384-bit @ 7,000 MHz (336 GB/s), which would be much cheaper? I mean, ffs, NVIDIA has GDDR5 that clocks that high on its middle-of-the-road high-end card, the GTX 770...

I'd like to see Samsung's 7.2 GHz GDDR5 chips on it...


Also, GDDR6 on PI, pl0x AMD...


----------



## TheHunter (Sep 27, 2013)

^
Because it was an ES GPU running at conservative speeds, partly so it maintains the 260 W TDP.

That 1,200 MHz memory will OC to 1,500 MHz easily; presto, nearly 400 GB/s.




And they can always do an 8 GB variant with that 512-bit bus.


----------



## chodaboy19 (Sep 27, 2013)

This is great news! It will definitely bring down the prices of the Titan and 780. I have been waiting for the 780 to drop in price to $399.


----------



## TheoneandonlyMrK (Sep 27, 2013)

NeoXF said:


> The most widely rumoured 512-bit @ 5,000 MHz (320 GB/s) makes no sense... why isn't it 384-bit @ 7,000 MHz (336 GB/s), which would be much cheaper? I mean, ffs, NVIDIA has GDDR5 that clocks that high on its middle-of-the-road high-end card, the GTX 770...
> 
> I'd like to see Samsung's 7.2 GHz GDDR5 chips on it...
> 
> ...



Except those same Sammys are now premium stock (damn fire BS), so ATM a wider bus with lower-cost memory might be wise. Who knows, eh?


----------



## MxPhenom 216 (Sep 27, 2013)

Should drop NVIDIA's prices if this is the case.


----------



## shovenose (Sep 27, 2013)

I am not an AMD fan, but maybe the R9 290X would be a good upgrade from my GTX 670 2GB.


----------



## Easy Rhino (Sep 27, 2013)

By the time this makes it to market, NVIDIA will have dropped its prices on the Titan and 780 and have a new, more powerful card out that is only SLIGHTLY more expensive. AMD is just way too far behind.


----------



## MxPhenom 216 (Sep 27, 2013)

Easy Rhino said:


> By the time this makes it to market, NVIDIA will have dropped its prices on the Titan and 780 and have a new, more powerful card out that is only SLIGHTLY more expensive. AMD is just way too far behind.



I don't think so. NVIDIA's next cards will be Maxwell, next year.


----------



## Casecutter (Sep 27, 2013)

Crap Daddy said:


> We still know nothing about the performance, and nothing definite about the price either. Apart from the "David Copperfield" Mantle trick with BF4 (come December), the card appears to be slightly faster than the 780, five months late to the party, and fifty bucks cheaper. That still means $600, which is a hell of a lot of money to spend just to play games.


I'm thinking like you: fifty bucks less than the GTX 780 isn't justifiable, even while being faster than the GTX 780 or sparring with Titan... $600 is now imaginably uglier than the original 7970 was at $550. If the $600 price is for a "BF4 Special Edition Box Set" I could see that, but reference cards need an MSRP of $550.

Matt Skynner said, "I can't reveal a price point, but we're looking at _more traditional enthusiast_ GPU price points. … So this next-generation line is targeting more of the _enthusiast market_ versus the ultra-enthusiast one."

A $600 MSRP, with OEM customs then even higher, does not hold true to his statement(s), and I will chastise AMD for it every chance I have until they come to their senses. NVIDIA might do it and get away with it, but that doesn't vindicate AMD for following, and it makes what Matt Skynner said a crock!


----------



## Easy Rhino (Sep 27, 2013)

MxPhenom 216 said:


> I don't think so. NVIDIA's next cards will be Maxwell, next year.



Right, like they have never released a card early to beat out the competition before...


----------



## EpicShweetness (Sep 27, 2013)

*(╯°□°）╯︵ ┻━┻*


----------



## MxPhenom 216 (Sep 27, 2013)

Easy Rhino said:


> Right, like they have never released a card early to beat out the competition before...



Right, like 20 nm is ready yet, which is what Maxwell is supposed to be on.

The only thing I think NVIDIA has up its sleeve is a fully-fledged GK110 chip called Titan Ultra, or whatever name they can think of. Count on the fact it won't be a 785.


----------



## Crap Daddy (Sep 27, 2013)

Given how this looks, I highly doubt we'll see a price drop on the 780. Remember there's also Batman bundled with all cards. As for a reaction, maybe prepare for another price shock: a dual GK110.


----------



## xvi (Sep 27, 2013)

..and I could get off the couch and do laundry.

..but I ain't.


----------



## the54thvoid (Sep 27, 2013)

MxPhenom 216 said:


> Right, like 20 nm is ready yet, which is what Maxwell is supposed to be on.
> 
> The only thing I think NVIDIA has up its sleeve is a fully-fledged GK110 chip called Titan Ultra, or whatever name they can think of. Count on the fact it won't be a 785.



Everyone's missing a trick here. GTX Titan with nothing more than a BIOS refresh: TDP of 300 W, voltage unlockable to 1.21 V (more if the base hardware spec is within tolerances), and better cooling, a la ACX from EVGA.

Titan has a lot of leg room. Alternatively, as I said before, a GTX 780 with its standard 3 GB and the same cores as a Titan, with the mentioned BIOS revisions. Let the AIBs throw in custom components (VRMs, etc.) and you have a GTX 785; or allow the AIBs to customise the Titan base product and have NVIDIA remove the DP compute to keep the K20X crowd happy.

Yes, NVIDIA can easily up Titan's performance by 10-15% from the BIOS alone. Let partners play with hardware components and it's up by 15-25%. (Look at Radrok and Khemist's Unigine Valley scores: about 110-115 FPS average at 1080p, 4xAA, Ultra quality.)


----------



## MxPhenom 216 (Sep 27, 2013)

Dropping prices on the 780 and Titan, and releasing a Titan Ultra would be the better route for Nvidia IMO.

More or less only because I want to see the 780 price drop so I can get another


----------



## arbiter (Sep 27, 2013)

Noticed one thing missing on this card: no CrossFire connector?


----------



## EarthDog (Sep 27, 2013)

the54thvoid said:


> Everyone's missing a trick here. GTX Titan with nothing more than a BIOS refresh: TDP of 300 W, voltage unlockable to 1.21 V (more if the base hardware spec is within tolerances), and better cooling, a la ACX from EVGA.
> 
> Titan has a lot of leg room. Alternatively, as I said before, a GTX 780 with its standard 3 GB and the same cores as a Titan, with the mentioned BIOS revisions. Let the AIBs throw in custom components (VRMs, etc.) and you have a GTX 785; or allow the AIBs to customise the Titan base product and have NVIDIA remove the DP compute to keep the K20X crowd happy.
> 
> Yes, NVIDIA can easily up Titan's performance by 10-15% from the BIOS alone. Let partners play with hardware components and it's up by 15-25%. (Look at Radrok and Khemist's Unigine Valley scores: about 110-115 FPS average at 1080p, 4xAA, Ultra quality.)


The Titan can barely survive on its own VRMs at stock, let alone trying to suck 10-15% more performance out of it on air for the masses. If I were NVIDIA or a board partner, I wouldn't release a Titan with much more than it already has, for fear of inordinately high RMA rates dramatically cutting into profits.



arbiter said:


> Noticed one thing missing on this card, No crossfire connection?


It is handled through the PCIe interface...


----------



## arbiter (Sep 27, 2013)

EarthDog said:


> It is handled through the PCIe interface...



Sounds like a performance killer to me.


----------



## EarthDog (Sep 27, 2013)

Quite the opposite for PCIe 3.0... there is more bandwidth available there than there was/is through the SLI/CrossFire connectors.

Now, PCIe 2.0... that may be another story. But I surely hope that, if it works, nobody is silly enough to buy a $600 Titan-matching beast and slap that bad boy on a PCIe 2.0 board.


----------



## haswrong (Sep 27, 2013)

NeoXF said:


> The most widely rumoured 512-bit @ 5,000 MHz (320 GB/s) makes no sense... why isn't it 384-bit @ 7,000 MHz (336 GB/s), which would be much cheaper? I mean, ffs, NVIDIA has GDDR5 that clocks that high on its middle-of-the-road high-end card, the GTX 770...
> 
> I'd like to see Samsung's 7.2 GHz GDDR5 chips on it...
> 
> ...



It's quite mind-boggling; I asked myself the same question. AMD said they were unable to clock the bus at 6 GHz stable... is it that hard to hire an engineer who can design a stable bus? I don't care if it's super-wide or just standard-wide, but if NVIDIA can make theirs stable, AMD shouldn't lag behind as they do with everything else. Let's call Tron to make a little inspection and report on the AMD-created environment for bits, bytes, and little helpless shaders.


----------



## erocker (Sep 27, 2013)

arbiter said:


> Sounds like a performance killer to me.



For a few hundred megabytes per second of bandwidth? Nah.


----------



## HumanSmoke (Sep 27, 2013)

EarthDog said:


> The Titan can barely survive on its own VRMs at stock, let alone trying to suck 10-15% more performance out of it on air for the masses. If I were NVIDIA or a board partner, I wouldn't release a Titan with much more than it already has, for fear of inordinately high RMA rates dramatically cutting into profits.


The other alternative is to re-release the Titan with the two missing power phases populated, as some vendors have done with the reference PCB for their own GTX 780s.

Reducing the VRAM from 6 GB to 3 GB and neutering the FP64 capability could serve as a differentiator between the two models.

There are a lot of permutations possible. It likely depends on AMD's final pricing of the 290X and 290, how long it would take to put into action, and whether NVIDIA sees the effort as viable versus the lifespan of the cards and their sales potential. Allowing AIBs to raise voltage limits on cards built to handle the increased power (the 8+2, 12+2, 16+2 configs) helps NVIDIA and the AIBs, but there still needs to be a reference card if NVIDIA wants widespread and continuing site-review benchmark PR.


----------



## Casecutter (Sep 27, 2013)

haswrong said:


> It's quite mind-boggling. *I asked myself the same question. AMD said* they were unable to clock the bus at 6 GHz stable... is it that hard to hire an engineer who can design a stable bus? I don't care if it's super-wide or just standard-wide, but if NVIDIA can make theirs stable, AMD shouldn't lag behind as they do with everything else. Let's call Tron to make a little inspection and report on the AMD-created environment for bits, bytes, and little helpless shaders.


Why do you say it's some problem with engineering or the ability to run stable? Who's your source... yourself?

It conceivably has more to do with the memory controllers within the chip and how they're implemented by either side. AMD may figure the cost of a wide bus offsets the cost of the memory needed to hit that spec. Or perhaps the supplier volume wasn't there for both NVIDIA and AMD (NVIDIA got there first, and AMD realizes they're going to need boatloads), and it was smarter to go wider and offer 4 GB with more bandwidth.



HumanSmoke said:


> There are a lot of permutations possible.


But can they do those quickly and then sell it for less...?


----------



## the54thvoid (Sep 27, 2013)

EarthDog said:


> The Titan can barely survive on its own VRMs at stock none the less trying to suck out 10-15% more performance out of it on air for the masses



That's pretty untrue, mate. We've had a good dialogue over how crippled Titan is, but it's not "barely" surviving at stock - that's just BS.

In your favour though, adequate cooling is required on the card to maintain clocks and keep the VRMs cool. Titans need better coolers (ACX, etc.) for consistently high clocks. I do forget sometimes that my card is under water. I figure if you buy a Titan, you buy water cooling too.

Anyway, isn't this about the fabled, mystical R9 290X? I really would like to see the bare PCB and see what AMD have built. No point having an awesome new chip and deviously good APIs coming if the reference card is a piece of crap. Let's have some robust, solid chokes and voltage circuitry that can take a beating.
And not that bloody blower fan......


----------



## Casecutter (Sep 27, 2013)

the54thvoid said:


> And not that bloody blower fan


The Titan and 780 reference coolers were fine with the blower in terms of noise (can't truly equate airflow, though); they sounded good. Hopefully AMD is using that as the benchmark to beat.


----------



## HumanSmoke (Sep 27, 2013)

Casecutter said:


> But can they do those quickly and then sell it for less...?


Are you asking a question or just paraphrasing what I wrote?


HumanSmoke said:


> There are a lot of permutations possible. Likely depends on AMD's final pricing of the 290X and 290, *how long it would take to put into action, and whether Nvidia see the effort viable versus the lifespan of the cards and sales potential.*


----------



## haswrong (Sep 27, 2013)

HumanSmoke said:


> Reducing the VRAM from 6GB to 3GB, and neutering the FP64 capability could serve to provide a differentiator between the two models.



Umm, didn't that already happen? (GTX 780)

I really like 6 GB of VRAM.
I really like the presence of FP64 capability.
I wouldn't be interested in nerfed hardware. I never was.
We need to move forward, not backward..


----------



## HumanSmoke (Sep 28, 2013)

haswrong said:


> Umm, didn't that already happen? (GTX 780)


It's not really the same thing on the spec sheet, even if the practical realities are somewhat closer - and I think the possible part we're discussing is aimed more at marketing bullet points than at any performance part missing from the inventory.
GTX 780 = 2,304 shaders, 3 GB GDDR5, 1:24 FP64
Titan = 2,688 shaders, 6 GB GDDR5, 1:3 FP64

The other possible combinations are therefore:
2,688 shaders, 3 GB GDDR5, 1:3 FP64, and 2,688 shaders, 6 GB GDDR5, 1:24 FP64.
You could also add 7 Gbps effective memory if GK110's memory controller could be QA'ed for the speed. Running out of spec on OC'ed cards is generally a whole lot different from reference validation.


haswrong said:


> I really like 6 GB of VRAM.


So buy the 6 GB version. Just as you like 6 GB, isn't it conceivable that someone else would be happy to sacrifice 3 GB for a 30-40% reduction in price?


haswrong said:


> I really like the presence of FP64 capability.


Same argument. See above.


haswrong said:


> I wouldn't be interested in nerfed hardware. I never was.


Titan: 2,688 shaders... K6000: 2,880 shaders. Titan is a salvage part... so you're lusting over a nerfed part already.


haswrong said:


> We need to move forward, not backward..


Tell that to Jen-Hsun and Rory, and provide an alternative income stream for them to recoup their loss of ROI.
It's a nice idea... but basically an idealized scenario totally divorced from reality:
HD 7870 XT (Tahiti LE): 75% enabled die (shaders). Introduced 5 months after the fully enabled part.
GTX 660 Ti (GK104-300): 88% enabled die (shaders). Introduced 5 months after the fully enabled part.
HD 6930 (Cayman CE): 83% enabled die (shaders). Introduced 12 months after the fully enabled part.
GTX 560 Ti 448SP (GF110-270): 88% enabled die (shaders). Introduced 13 months after the fully enabled part.
HD 5830 (Cypress LE): 70% enabled die (shaders). Introduced 5 months after the fully enabled part.


----------



## haswrong (Sep 28, 2013)

HumanSmoke said:


> nerfed


I'm lusting for a fully operational GK110 with Samsung memory, a custom PCB, and a clock of at least 1 GHz. For $300, like in the days of 3dfx. That's all; I'm humble!


----------



## MxPhenom 216 (Sep 28, 2013)

Saw a leaked pre-order page showing a price of ~$735 for the BF4 edition of this card.

http://www.overclock.net/t/1429858/taobao-asus-radeon-r9-290x-bundled-with-bf4-735

Probably not very credible, but still something to look at.


----------



## HumanSmoke (Sep 28, 2013)

MxPhenom 216 said:


> Saw a leaked pre order page with a price showing ~$735 for the BF4 edition of this card.
> http://www.overclock.net/t/1429858/taobao-asus-radeon-r9-290x-bundled-with-bf4-735
> Probably not very credible, but still something to look at.



The price might not mean a great deal in itself - maybe not even the comparison with other cards, if there's an "early adopter tax" applied, which is likely.

According to this site the R9 290X is $839.83, but as a comparison, the Gigabyte GTX 780 WF3 OC is $814 and the Asus DC2OC is $821.

If nothing else, it says that the e-tailer isn't one that will lure customers away from Newegg!


----------



## Crap Daddy (Sep 28, 2013)

Apparently NDA is lifted on October 2. Seems reasonable since on the 3rd pre-orders should start.


----------



## TheoneandonlyMrK (Sep 28, 2013)

HumanSmoke said:


> The price might not mean a great deal in itself- maybe not even the comparison with other cards if there's an "early adopter tax" applied which is likely.
> 
> According to this site the R9-290X is $839.83, but as a comparison, the Gigabyte GTX 780 WF3 OC is $814, and the Asus DC2OC is $821.
> 
> If nothing else, it says that the etailer isn't one that will lure customers from Newegg!



Retailers are always going to add an early-adopter tax if supply is a bit light versus demand. It's already shaking up the price structure even now... joy.


----------



## DRDNA (Sep 28, 2013)

I'm thinking this will be about the same in performance as a 7990, as the price of the 7990 has dropped to about what the 290X will sell for.


----------



## the54thvoid (Sep 28, 2013)

DRDNA said:


> I'm thinking this will be about the same in performance as a 7990 as the price on the 7990 has dropped to about the same as the  290X will be sold for.



Not a chance. I wish it were, though. I'd definitely buy one.


----------



## crazyeyesreaper (Sep 28, 2013)

Hmmm, 44 ROPs instead of 48? If that isn't a mistake in the news post, I smell a cut-down GPU, and this 290X isn't fully enabled. 44 ROPs is a weird number.

Most cards have 8, 16, 24, 32, 40, or 48 ROPs - usually a multiple of 8.

44 doesn't really fit in with that.


----------



## The Von Matrices (Sep 28, 2013)

crazyeyesreaper said:


> Hmmm, 44 ROPs instead of 48? If that isn't a mistake in the news post, I smell a cut-down GPU, and this 290X isn't fully enabled. 44 ROPs is a weird number.
> 
> Most cards have 8, 16, 24, 32, 40, or 48 ROPs - usually a multiple of 8.



If AMD used 4 ROPs per CU it would make sense.  Tahiti has 32 ROPs and 8 CUs; it would make sense that Hawaii with 11 CUs has 44 ROPs


----------



## W1zzard (Sep 28, 2013)

CUs and ROPs are completely decoupled nowadays, so you could have 1 ROP and 50 CUs (which makes no sense, of course).

Also, ROPs and CUs are not the same thing... ROPs = render output units, CUs = the thingies that have the shaders in them.


----------



## The Von Matrices (Sep 28, 2013)

W1zzard said:


> CUs and ROPs are completely decoupled nowadays, so you could have 1 ROP and 50 CUs (which makes no sense of course)
> 
> Also ROPs and CUs are not the same thing .. ROPs = rasterizers, CUs= thingies that have the shaders in them



I wasn't implying any direct link between the ROPs and CUs, just that AMD might have wanted to keep the same ratio as Tahiti.


----------



## Vario (Sep 28, 2013)

This thing is gonna crush NVIDIA's offerings. Glad to see AMD (or what's left of ATI) still putting out some nice hardware, even if the CPU offerings (don't kill me here, guys) aren't on par.


----------



## The Von Matrices (Sep 28, 2013)

Vario said:


> This thing is gonna crush nvidia's offerings.  Glad to see AMD (or whats left of ATI) still putting out some nice hardware even if the CPU offerings (don't kill me here guys) aren't on par.



I think the biggest unknown of this launch at this point is not what the R9 290X is or how it performs, but how NVIDIA chooses to respond. That is what I am most curious about.


----------



## dj-electric (Sep 28, 2013)

OK, bta, I get that you now know something you didn't earlier.

A 384-bit controller was mentioned here, and you were kinda angry about how AMD could not explain this at the conference. Now I see a bunch of comments got deleted and it mentions 512-bit?


----------



## erocker (Sep 28, 2013)

Dj-ElectriC said:


> OK. bta, i got that you now know something you didn't earlier.
> 
> A 384BIT controller was mentioned here an you were kinda angry about how AMD could not explain this at the conference. Now i see a bunch of comments got deleted and it mentions 512BIT. ?



I have my doubts that AMD even knows how many bits the memory bus is.


----------



## ensabrenoir (Sep 28, 2013)

*straight from the side of my neck... but I bet it's still true*



Vario said:


> This thing is gonna crush nvidia's offerings.  Glad to see AMD (or whats left of ATI) still putting out some nice hardware even if the CPU offerings (don't kill me here guys) aren't on par.



.....uh.... no.... It's been ages since we've seen any true crushing..... except of our hopes and dreams about new hardware's performance.

I expect 80% of NVIDIA's performance for a lesser price, or 5% better for the same price.


----------



## buildzoid (Sep 28, 2013)

ensabrenoir said:


> .....uh ....no.... Been ages since we've  seen any true crushing..... except our hopes and dreams about new hardware's performance.
> 
> i expect  80% of nvidia performance for a lesser p[rice or 5% better for the same price



We already have something with about 80% of the performance of a GTX Titan: it's called the 7970 GHz Edition, and it costs a third of the price. This is essentially 1.375 7970s, so it should be as fast as or faster than Titan.


----------



## crazyeyesreaper (Sep 28, 2013)

If thats the case then the 290x would have the same issues as Tahiti where the ROP count to Shader count makes no sense

640 shaders 16 rops   40 shaders per ROP
1280 shaders 32 rops 40 shaders per ROP
2048 shaders 32 rops 64 shaders per ROP
2816 shaders 44 rops 64 shaders per ROP

So in terms of efficiency of shaders to rop and its relation to performance the new GPU will likely hit the same wall as the Tahiti does.

at 2816 shaders 48 rops it drops to 58.6 shaders per ROP still not quite where it needs to be 

The way AMD usually designed a GPU was to start in the middle scale up and scale down

so 7870 = starting point 1280 shaders 80 TMU 32 ROPs
half a 7870 = 7770 640 40 16
scaling up would have been 1920 120 48  what we got was 2048 128 32

When it comes to GPUs there are of course diminishing returns however a balanced design tends to be better overall.

just look at the 7770 to 7870 to 7970
128bit > 256bit > 384bit
1gb > 2gb > 3gb
640 > 1280 > 2048
16 ROPs > 32 ROPs > 32 ROPs
40 TMUs > 80 TMUs > 128 TMUs

AMDs approach in the past would have been
128bit > 256bit > 384bit > 512bit
1gb > 2gb > 3gb > 4gb
640 > 1280 > 1920 > 2560
16 ROPs > 32 ROPs > 48 ROPs > 64 ROPs (even cut back by 8 to 56 ROPs, 2560 shaders still works out to ~45 shaders per ROP, and it would allow for a GPU with 3200 shaders / 200 TMUs / 64 ROPs = 50 shaders per ROP)
40 TMUs > 80 TMUs > 120 TMUs > 160 TMUs

You can see where things don't quite make sense. Granted, wafer size, die size, and yields of perfectly working chips all come into play, but you get the idea in terms of AMD's own designs and efficiencies.

Increasing shader count without a proportionate ROP count tends to result in issues.

The 5850 vs 5870 comes to mind: back then, 1440 shaders vs 1600 shaders at the same clock speeds gave only about a 2% performance difference due to the ROP limitation.

Normally with a die shrink each ROP can do more work, so it's been alright, but since we are still stuck at 28 nm I would have rather seen 48 ROPs for a better shader-to-ROP ratio. I am also rambling like mad and don't give a fuck. The long story short seems to be that 64 shaders per ROP is not nearly as efficient as 40 shaders per ROP.
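The ratios walked through above can be tabulated with a quick sketch (the 290X rows use the rumored 2816-shader figure; the 48-ROP configuration is the hypothetical one the post proposes):

```python
# Shaders-per-ROP ratios from the post above.
gpus = {
    "HD 7770": (640, 16),
    "HD 7870": (1280, 32),
    "HD 7970": (2048, 32),
    "R9 290X (rumored, 44 ROPs)": (2816, 44),
    "R9 290X (hypothetical, 48 ROPs)": (2816, 48),
}
for name, (shaders, rops) in gpus.items():
    print(f"{name}: {shaders / rops:.1f} shaders per ROP")
```

This reproduces the figures in the post: 40.0 for Cape Verde and Pitcairn, 64.0 for Tahiti and the rumored Hawaii, and 58.7 for the hypothetical 48-ROP part.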


----------



## MxPhenom 216 (Sep 28, 2013)

http://videocardz.com/46115/amd-radeon-r9-290x-cost-similar-gtx-780


----------



## fullinfusion (Sep 28, 2013)

MxPhenom 216 said:


> http://videocardz.com/46115/amd-radeon-r9-290x-cost-similar-gtx-780


ok and?


----------



## Frag_Maniac (Sep 28, 2013)

From purecain's closed thread...

"AMD Radeon R9 290X will support new technology TrueAudio, a truly programmable audio technology for game developers."

I was watching the stage demo in Hawaii yesterday and couldn't help but be skeptical whether everyone would even be able to take advantage of this feature, since it's common that many can't even successfully get HDMI audio out of their current cards, myself included.


----------



## arbiter (Sep 29, 2013)

Frag Maniac said:


> From purecain's closed thread...
> 
> "AMD Radeon R9 290X will support new technology TrueAudio, a truly programmable audio technology for game developers."
> 
> I was watching the stage demo in Hawaii yesterday and couldn't help but be skeptical whether everyone would even be able to take advantage of this feature, since it's common that many can't even successfully get HDMI audio out of their current cards, myself included.



you can, you just gotta go in sound panel (least on windows) and set audio output to hdmi port tv is on. But how many people use output like that? HTPC yea it would be useful but for a desktop you have a speaker set for sound.


----------



## TheGuruStud (Sep 29, 2013)

arbiter said:


> you can, you just gotta go in sound panel (least on windows) and set audio output to hdmi port tv is on. But how many people use output like that? HTPC yea it would be useful but for a desktop you have a speaker set for sound.



It's a lot easier than that.

In VLC you can easily select audio outputs.





I assume you meant switching back and forth was a pain in the ass.


----------



## Vario (Sep 29, 2013)

arbiter said:


> you can, you just gotta go in sound panel (least on windows) and set audio output to hdmi port tv is on. But how many people use output like that? HTPC yea it would be useful but for a desktop you have a speaker set for sound.
> 
> http://img46.imageshack.us/img46/8012/lebp.png



HDMI audio works perfectly on my 7850. Hooked it up to my monitor and forgot all about the fact that the monitor had speakers built in, LOL. Sound came right out; no need to change any settings.


----------



## Steevo (Sep 29, 2013)

Frag Maniac said:


> From purecain's closed thread...
> 
> "AMD Radeon R9 290X will support new technology TrueAudio, a truly programmable audio technology for game developers."
> 
> I was watching the stage demo in Hawaii yesterday and couldn't help but be skeptical whether everyone would even be able to take advantage of this feature, since it's common that many can't even successfully get HDMI audio out of their current cards, myself included.



I have never had any issues, even when using multiple audio outputs, Netflix on one monitor and game on another.


But it's easier to complain and whine instead of doing something productive, isn't it?

Have you tried RMAing your card or a different HDMI cable? Has the issue occurred with different drivers?


----------



## Mussels (Sep 29, 2013)

Might be time to upgrade my 5870s.


Oh, and I've never had HDMI audio issues either. Generally a bad receiver is the issue, or people not understanding the technology and its limitations (things like TVs having only 2.0 output, receiver before TV, etc.).


----------



## Patriot (Sep 29, 2013)

HumanSmoke said:


> Not really the same thing on the spec sheet even if the practical realities are somewhat closer- and I think the possible part we're discussing is more aimed at marketing bullet-points than any performance part missing from the inventory.
> GTX 780 = 2304 shaders, 3GB GDDR5, 1:24 FP64
> Titan = 2688 shaders, 6GB GDDR5, 1:3 FP64
> 
> ...



.....
7970 and 7950 were released ~10 days apart...
680 and 670 were 2 mo. apart...

They are not all just fused off chips... the die is smaller... a spread of performance is important.


----------



## HumanSmoke (Sep 29, 2013)

Patriot said:


> .....
> 7970 and 7950 were released ~10 days apart...
> 680 and 670 were 2 mo. apart...


And?
BTW: The 7970 was paper-launched on the 22nd of December, availability was from the 9th of January, and the 7950 launched on the 31st of January.


Patriot said:


> They are not all just fused off chips... *the die is smaller*..


Yeah? You got some proof of that? 
Is Tahiti LE a physically smaller die than Tahiti XT?
Is GK104-300 physically smaller than GK104-400?
Is Cayman CE physically smaller than Cayman XT?
Is GF110-270 physically smaller than GF110-375?
Is Cypress LE physically smaller than Cypress XT?


Crap Daddy said:


> Apparently NDA is lifted on October 2. Seems reasonable since on the 3rd pre-orders should start.


Well, if this is to be believed, the NDA lifts on the 15th of October. Could be an anxious two-week wait for those who stump up their credit card info for the pre-order.

I don't think I've ever bought something that I didn't know the price of, and didn't know the specifications of at the time I made payment. Don't think I ever will.


----------



## Prima.Vera (Sep 29, 2013)

Mussels said:


> Might be time to upgrade my 5870s.



Personally I am waiting for the 3xx series. So far there are absolutely no games that require new video card(s). My (and your) 5870 CrossFire, when working properly, is easily on par with a 7870, and in some cases even a 7950; especially in Unreal Engine based games.


----------



## Jaffakeik (Sep 29, 2013)

Hmm, looks very nice, but is it really needed? A 7970 still maxes out games these days.


----------



## noctuanhd14 (Oct 9, 2013)

I thought pre-orders for the 290X opened up on Oct 3rd? I just checked and noticed it's still not up. Did I miss some news announcement?


----------



## HumanSmoke (Oct 9, 2013)

64 ROP !




[May the source be with you]
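If the leaked 64-ROP figure is right, the shader-to-ROP ratio crazyeyesreaper worried about tightens considerably (quick sketch, still assuming the rumored 2816 shaders):

```python
# Ratio check: rumored 2816 shaders against the leaked 64-ROP figure.
shaders, rops = 2816, 64
ratio = shaders / rops
print(f"{ratio:.0f} shaders per ROP")  # 44 -- far better than 2816/44 = 64
```

That would land much closer to the 40-shaders-per-ROP balance of Cape Verde and Pitcairn discussed earlier in the thread.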


----------



## TheGuruStud (Oct 9, 2013)

Prima.Vera said:


> Personally I am waiting for the 3xx series. So far there are absolutely no games that require new video card(s). My (and your) 5870 CrossFire, when working properly, is easily on par with a 7870, and in some cases even a 7950; especially in Unreal Engine based games.



My CrossFired, OC'd 7950s say otherwise. Games are so unoptimized that you need dual high-end cards just to play maxed out at 1080p. Forget 1440p+ on almost anything; AA will devour your decent frame rates.


----------



## Prima.Vera (Oct 12, 2013)

TheGuruStud said:


> my xfire OCed 7950s say otherwise. The games are so unoptimized that you need dual high end cards just to play games maxed out at 1080. Forget 1440+ on almost anything. AA will devour your decent frame rates.



Sorry mate, but if you say that with OC'd 7950s in CrossFire you have problems running your games at 1080p, then you're doing something wrong, or it's time for you to dump that AMD processor. My 580/5850 cards pull a minimum of 40-50 fps in ALL games at Ultra quality.













BTW:

$592 for preordering a Sapphire 290X:

http://www.shopblt.com/cgi-bin/shop...110040015013_BTF3729P.shtml&order_id=!ORDERID!


----------



## TheGuruStud (Oct 12, 2013)

Prima.Vera said:


> Sorry mate, but if you say that with OC'd 7950s in CrossFire you have problems running your games at 1080p, then you're doing something wrong, or it's time for you to dump that AMD processor. My 580/5850 cards pull a minimum of 40-50 fps in ALL games at Ultra quality.
> 
> http://media.bestofmicro.com/X/A/369262/original/SkyrimUltraCPUBottleneck2013.pnghttp://media.bestofmicro.com/W/W/369248/original/F12012UltraCPUBottleneck2013.png
> 
> ...



Who plays F1?   And skyrimjob is an RPG :shadedshu   (those also appear to be two-threaded-only games)

I don't have problems at 1080p, but it does take two cards; one will not work. Dropping under 60 fps is not acceptable, which you left out. Minimum framerate is what matters, not the average or max.

Also, I have an OC'd spintel 4670K too. It's not any faster. And I believe nothing Tom's or Anand says, EVER, unless it's substantiated elsewhere (they've been shills for way too long).

I won't be buying it because of EA, but the Battlefield 4 beta never drops under 60 fps maxed out at 1080p, so I guess that's good news for a lot of people.


----------



## TheHunter (Oct 12, 2013)

And early retail pricing too; not bad, IMO. I think it will come down by 30-70€ later.


R9-290X for 615-670€
http://geizhals.at/eu/./?cat=gra16_512&xf=1440_R9+290X#xf_top


R9-290 for 500-550€
http://geizhals.at/eu/./?cat=gra16_512&xf=1440_R9+290#xf_top


----------



## NeoXF (Oct 13, 2013)

TheHunter said:


> And early retail pricing too; not bad, IMO. I think it will come down by 30-70€ later.
> 
> 
> R9-290X for 615-670€
> ...



I'm seeing 290Xs start from 603€ and 290s from 473€ right now... go even lower (and I suspect they will) and you've got a deal.


----------



## the54thvoid (Oct 14, 2013)

Well, that's Overclockers UK sold out of their Pre-Order BF4 editions...

http://www.overclockers.co.uk/showproduct.php?prodid=GX-015-AM&groupid=701&catid=56&subcat=1752


----------



## Prima.Vera (Oct 14, 2013)

I would love to see the card at $600 or 450€. Unfortunately that's a dream in the EU...


----------

