# AMD Readies Radeon R9 390X to Take on GeForce GTX 980



## btarunr (Sep 12, 2014)

It turns out that the big OEM design win liquid-cooling maker Asetek was bragging about is the Radeon R9 390X, and the "undisclosed OEM" is AMD. Pictures of a cooler shroud are doing the rounds on Chinese tech forums, revealing something similar in design to that of the Radeon R9 295X2, only made for a single-GPU card. The shroud has its fan intake pushed to where it normally sits on single-GPU cards, with cutouts for the PCIe power connectors, and a central one through which the liquid-cooling tubes pass. 

One can also take a peek at the base-plate of the cooler, which will cool the VRM and memory under the fan's airflow. The cooler design reveals that AMD wants its reference-design cards to run quieter "at any cost," even if that means liquid-cooling solutions that can be messy in multi-card CrossFire setups and in systems that already use liquid cooling for the CPU, leaving it to AIB partners to come up with air-cooled cards with meatier heatsinks. Other specs of the R9 390X are unknown, as is its launch date. It could be based on a member of the "Pirate Islands" family of GPUs, of which the new "Tonga" GPU driving the R9 285 is a part. A possible codename of AMD's big chip from this family is "Fiji."





*View at TechPowerUp Main Site*


----------



## RCoon (Sep 12, 2014)

Is this the unlocked cores thing that 8PACK accidentally blurted out on the OCUK forums? Something about the 290X chip not being fully unlocked.
I'd find it highly unlikely that adding a few extra cores to an existing archi would warrant a whole new series of cards, though. Merely speculation.


----------



## btarunr (Sep 12, 2014)

RCoon said:


> Is this the unlocked cores that 8PACK accidentally blurted out on the OCUK forums? Something about the 290X chip not being fully unlocked.
> I'd find it highly unlikely that adding a few extra cores to an existing archi would warrant a whole new series of card though. Merely speculation.



Nah, it's a new chip. Fiji. 

3840 GCN1.2 "Pirate Islands" cores, 384-bit GDDR5, 48 ROP, 6 GB, Ijustmadethatup, and didI?.


----------



## RCoon (Sep 12, 2014)

btarunr said:


> Nah, it's a new chip. Fiji.
> 
> 3840 GCN1.2 "Pirate Islands" cores, 384-bit GDDR5, 48 ROP, 6 GB, Ijustmadethatup, and didI?.



I'm not your internet troll errand boy anymore! I feel like AMD missed the boat and should have honored our memory and called it Zeus.


----------



## dj-electric (Sep 12, 2014)

If Fiji won't be dramatically more efficient than Hawaii, it's gonna turn nasty.


----------



## Kaotik (Sep 12, 2014)

Tonga is NOT Pirate Islands, it's Volcanic Islands.
Hawaii is NOT Volcanic Islands, it's Sea Islands.

Just check the AMD CodeXL if you don't want to take my word on it.


----------



## sgtheadhole (Sep 12, 2014)

http://wccftech.com/amd-hawaii-gpu-...chip-48-compute-units-3072-stream-processors/

Seems as though the 290X might have been disabled with some core parts...


----------



## RCoon (Sep 12, 2014)

Dj-ElectriC said:


> If Fiji won't be dramatically more efficient than Hawaii, it's gonna turn nasty.



Same node size, so it can't be that much more efficient. The only way to improve performance is to add more cores. More cores means more heat, and more heat means the stock cooler is an AIO!


----------



## HumanSmoke (Sep 12, 2014)

RCoon said:


> Is this the unlocked cores that 8PACK accidentally blurted out on the OCUK forums? Something about the 290X chip not being fully unlocked.
> I'd find it highly unlikely that adding a few extra cores to an existing archi would warrant a whole new series of card though. Merely speculation.





sgtheadhole said:


> http://wccftech.com/amd-hawaii-gpu-...chip-48-compute-units-3072-stream-processors/
> Seems as though the 290X might have been disabled with some core parts...


Well, if it's a simple matter of binning some Hawaii GPUs, why is production slated for the *first half of 2015*?


> *Press Release:*
> Thursday, August 14, 2014 — Asetek® today announced that it has secured a design win with an undisclosed OEM customer for a graphics liquid cooling product. The ambitious project is forecasted by the customer to result in 2-4 million dollars of revenue. *Shipping is scheduled to begin in the first half of 2015*. The design win continues Asetek's success in the growing graphics liquid cooling market.


That's a freaking long time to get a fully enabled die out the door, considering Hawaii has been selling for twelve months already.
If a single-GPU card NEEDS water cooling to tame the furnace, then the likely part IMO is the Bermuda/Fiji GPU. The time frame seems pretty much ballpark for AMD's big-die answer to GM 200.


----------



## buildzoid (Sep 12, 2014)

The cooler looks awesome. I wonder how many cores it will have, because if the R9 285 is any indication, a 3,000+ stream processor core would pull 300+ W. While that would explain the use of a CLC, a 300+ W card fighting Nvidia's ultra-low-power Maxwell stuff sounds like a terrible idea. I don't really care what power draw the card has as long as AMD and the AIBs find a way to cool it, but for many people it could be a major problem. From the looks of the 980's leaked performance, AMD would only need 3,072 SPs @ 1 GHz to win, but such a card doesn't make sense since it won't be much faster than the current stuff.


----------



## RCoon (Sep 12, 2014)

HumanSmoke said:


> Well, if it's a simple matter of binning some Hawaii GPUs why is production slated for the first half of 2015 ?



Hence why I said "merely speculation" and "highly unlikely".


----------



## Sony Xperia S (Sep 12, 2014)

Kaotik said:


> Tonga is NOT Pirate Islands, it's Volcanic Islands.
> Hawaii is NOT Volcanic Islands, it's Sea Islands.
> 
> Just check the AMD CodeXL if you don't want to take my word on it.



AMD CodeXL may be fake, wrong or intentionally confusing.

Hawaii cannot be in Sea Islands, otherwise these guys from AMD need to go back to school to learn some basic geography!

ahah rofl


----------



## TheDeeGee (Sep 12, 2014)

Well that wouldn't be hard for AMD, since the 980 is only 2% faster than a 780 Ti.

As much as I like NVIDIA, they failed big time with the 800 Series.


----------



## HumanSmoke (Sep 12, 2014)

Sony Xperia S said:


> Hawaii cannot be in Sea Islands, otherwise these guys from AMD need to go back to school to learn some basic geography!
> ahah rofl


Wouldn't be the first time. Cape Verde is a "Southern Islands" GPU yet Cape Verde is in the Northern hemisphere.


buildzoid said:


> The cooler looks awesome. I wonder how many cores it will have because if the R9 285 is any indication a 3000+ stream processor core would pull 300+ W and while it would explain the use of a CLC but a 300+W card fighting Nvidia's ultra low power Maxwell stuff sounds like a terrible idea.


Regardless of the article title, I would think a 390X would go up against the big Maxwell GM 200, not the GK 104 replacement GTX 980. If AMD need water cooling and 300+ W to stay competitive with a sub-200 W GTX 980, they may as well go back to school, since the disparity would be much worse than GTX 480 vs. HD 5870 was five years ago: somebody's heading in the wrong direction.


----------



## Sony Xperia S (Sep 12, 2014)

Svarog said:


> Well that wouldn't be hard for AMD, since the 980 is only 2% faster than a 780 Ti.
> 
> As much as i like NVIDIA, they failed big time with the 800 Series.



Man, that 2% is within the margin of statistical error. This is not a serious difference at all.



btarunr said:


> Nah, it's a new chip. Fiji.
> 
> 3840 GCN1.2 "Pirate Islands" cores, 384-bit GDDR5, 48 ROP, 6 GB, Ijustmadethatup, and didI?.



A jump from 2816 to 3840 SPs would make a noticeable performance improvement, given that those shaders are also further optimised.


----------



## RCoon (Sep 12, 2014)

Sony Xperia S said:


> Jump from 2816 to 3840 SPs will make a noticeable performance improvement,



He's joking. If you don't get it, google AMD R10 Zeus, and cry for my naivety.



Svarog said:


> Well that wouldn't be hard for AMD, since the 980 is only 2% faster than a 780 Ti.



These made up statistics don't help anyone.


----------



## FrustratedGarrett (Sep 12, 2014)

buildzoid said:


> The cooler looks awesome. I wonder how many cores it will have because if the R9 285 is any indication a 3000+ stream processor core would pull 300+ W and while it would explain the use of a CLC but a 300+W card fighting Nvidia's ultra low power Maxwell stuff sounds like a terrible idea. I don't really care what power draw the card has as long as AMD and the AIBs find a way to cool it but for many people it could be a major problem. From the looks of the 980's leaked perfoemance AMD would only need 3,072 SPs @ 1Ghz to win but such a card doesn't make sense since it won't be much faster than the current stuff.



Nvidia's ultra-low power? The Maxwell chip is a heavily cut-down chip with no 64-bit support and a 128-bit memory interface; it performs well at 1080p for the same reason the new R9 285 performs well at 2K. 
Last I checked, the R9 285 consumes around as much power as the GTX 760 while being 20%-40% faster depending on the game.


----------



## HumanSmoke (Sep 12, 2014)

FrustratedGarrett said:


> Nvidia's Ultra Low power? The Maxwell chip is a heavily cut down chip with *no 64 bit support*


LOL.
And just for the record, the GTX 750 Ti uses a fully enabled GM 107.


FrustratedGarrett said:


> Last I checked, the R9 285 consumes around as much power as the GTX760 while being...whatever...


Last I checked, the GTX 760 was a 2.5 year old GK 104 Kepler, not Maxwell.


----------



## Sony Xperia S (Sep 12, 2014)

RCoon said:


> He's joking.



I wouldn't be surprised at all, given what they just typed: *"...Other specs of the R9 390X are unknown, as is launch date. It could be..."*

It may be a well-calculated misleading marketing move from AMD to try to spoil NVIDIA's fun... 


----------



## buildzoid (Sep 12, 2014)

FrustratedGarrett said:


> Nvidia's Ultra Low power? The Maxwell chip is a heavily cut down chip with no 64 bit support and a 128 bit memory interface, it performs well at 1080P for the same reason the new R9 285 performs well at 2k.
> Last I checked, the R9 285 consumes around as much power as the GTX760 while being 20%-40% faster depending on the game:


The 760 isn't Maxwell; that's a cut-down GK104.


----------



## FrustratedGarrett (Sep 12, 2014)

buildzoid said:


> The 760 isn't maxwell that's a cut down GK104



I know. That chart compares GCN 1.2 to Kepler, and it shows that the better-performing GCN card actually consumes slightly less power than the Kepler one.


----------



## vega22 (Sep 12, 2014)

Sony Xperia S said:


> I wouldn't be surprised at all given what they just typed *"...Other specs of the R9 390X are unknown, as is launch date. It could be..."*
> It may be that well-calculated misleading marketing move from AMD to try to spoil nvidia's fun...



If they wanted to do that they could just sell the full-fat Hawaii core. I am sure it could beat the 780 Ti by 2% too :lol:


----------



## FrustratedGarrett (Sep 12, 2014)

HumanSmoke said:


> LOL.
> And just for the record, the GTX 750 Ti uses a fully enabled GM 107.
> 
> Last I checked, the GTX 760 was a 2.5 year old GK 104 Kepler, not Maxwell.



By "cut down" I meant a highly truncated design. The Maxwell chip has a 128-bit memory interface and no 64-bit floating-point or integer operation support.


----------



## Sony Xperia S (Sep 12, 2014)

marsey99 said:


> if they wanted to do that they could just sell the full fat hawaii core , i am sure it could beat the 780ti by 2% too :lol:



The full-fat Hawaii core is already being sold as the 290X; there is nothing additional they can do with it, even if the chip itself has other parts disabled.


----------



## vega22 (Sep 12, 2014)

Sony Xperia S said:


> The fully fat Hawaii core is already being sold as 290X, there is nothing additional which they can do over it even if the chip itself has other parts disabled in it.



Sorry, I must be really dense, so please bear with me a moment.

How exactly can it be the full-fat core if it has things removed?

How can anything with only a portion of itself active be classed as the full-fat version?

Side note: I'm guessing you missed the joke too....


----------



## Mathragh (Sep 12, 2014)

Lol, well at least this is a very good article by modern popular journalism standards!


----------



## Sony Xperia S (Sep 12, 2014)

marsey99 said:


> how exactly can it be the full fat core if it has things removed?



They are there, serving another function, I suppose.


----------



## RCoon (Sep 12, 2014)

I just realised AMD's GPUs are going to start sounding like Intel Extreme processors:

R9390X; just wait for the cut-down R9360X and R9330X


----------



## EarthDog (Sep 12, 2014)

RCoon said:


> Is this the unlocked cores that 8PACK accidentally blurted out on the OCUK forums? Something about the 290X chip not being fully unlocked.
> I'd find it highly unlikely that adding a few extra cores to an existing archi would warrant a whole new series of card though. Merely speculation.


Hell, look at all the rebrands with NO changes, lol... I wouldn't put it past them... but I doubt it at the same time.


----------



## d1nky (Sep 12, 2014)

Knew this was coming; like RCoon mentioned, there've been a few leaks and rumours. 

Looks good though.. maybe three with the X99 Asus Deluxe.... PORN!


----------



## fullinfusion (Sep 12, 2014)

Why I even read these GPU news articles anymore is anyone's guess.. I think I need to get my head checked. All I ever mostly see is pissing and moaning from certain parties. 

Such negativity lately on TPU is starting to be a real downer... 

Thanks OP for the news feed.
To me, red or green, I WELCOME the new hardware whatever it is; change is always nice and welcome in my books.. 

I won't flame anyone in general, but when I read certain comments I think to myself.... why don't they work for those companies??

There are real engineers, and then there are the Google know-it-alls


----------



## d1nky (Sep 12, 2014)

What made me truly impartial was trying out hardware from red, green and blue. I guess people can be scorned and vent that out on forums or be naive and inexperienced.

edit: cool looking leaf blower lol


----------



## Sony Xperia S (Sep 12, 2014)

When you start with this psychological nonsense, I start vomiting.


----------



## GhostRyder (Sep 12, 2014)

RCoon said:


> I just realised AMD's GPU's are going to start sounding like Intel Extreme processors:
> 
> R9390X, just wait for the cut down R9360X and R9330X


That would be awesome; time to start mixing and matching parts to try to confuse people 



fullinfusion said:


> Why I even read these GPU news articles anymore is anyone's guess.. I think I need to get my head checked. All I ever mostly see is pissing and moaning from certain parties.
> 
> Such negativity lately  on tpu is starting to be a real downer...
> 
> ...


^This, it's always the same people who complain article to article no matter what team or what is coming out. I agree with you!  Edit:

It's cool to see a reference model with an AIO, in all honesty, if that turns out to be true.  It's something that could provide the average user with a quiet machine and the max overclocks without having to worry about crazy setups and such.  Air cooling is still going to be a choice for those who want it, but now doing this is going to give people another option.  I really did love PNY when they did the LC systems on the GTX 580, as they worked great in my eyes, and I feel the 295X2 had a really nice reference concept.

This is also just a quick shot to spin up hype, which is common when the other company has something special coming soon.  We have to wait for more details to surface before claims are made.


----------



## ensabrenoir (Sep 12, 2014)

....You know, the only reason the 980 didn't seem to be noteworthy (besides power consumption) is because truly it is the 970, and the 970 is really a 960.   Nvidia knew AMD would release a 390, and now Nvidia will drop their *true* high-end card.....and price it accordingly


----------



## GhostRyder (Sep 12, 2014)

fullinfusion said:


> Ok smart ass show me one of my posts where I posted on an article bitching and complaining... Same ppl lmao
> 
> You can see who the suck ups are and the idiots... No pun intended... Go start searching away on all my so called article posts before pointing fingers troll man. But then I cant blame you if you didnt read what i said if full you didn't understand my post about welcoming new hardware lol


...I didn't say you were bitching and complaining, I meant I liked your comment and agreed with you that the same people come out complaining on every article no matter what comes out...

Sorry if my wording was off.


----------



## Hilux SSRG (Sep 12, 2014)

The cooler looks great.  I just can't imagine the 390/390X competing with the 970/980; seems that it would compete closer to the 780 Ti or its successor.


----------



## fullinfusion (Sep 12, 2014)

GhostRyder said:


> ...I didn't say you were bitching and complaining, I meant I liked your comment and agreed with you that the same people come out complaining on every article no matter what comes out...
> 
> Sorry if my wording was off.


Lol no worries, comment deleted.. I guess we share the same wording problem ... I have a habit of doing that too lol.. Thanks man


----------



## GhostRyder (Sep 12, 2014)

fullinfusion said:


> Lol no worries comment deleted.. I guess we share the same wording off ... I have a habbit of doing that too lol.. Thanks man


No problem, I just wanted to make sure there was not some confusion, because I completely agree with you .  I should have been more clear about what I was referring to.  I agree that people keep complaining about new hardware and take things too literally, or assume before full release.

I wish we could get through an article without the same few people starting arguments about new hardware.  I mean, just because something has a 6+8-pin connector does not automatically mean it's going to be power hungry, or because there is a reference AIO that the card is going to heat up hotter than the sun.  Sometimes things are put on a card to cater to enthusiasts or people who want more from it.  I for one love the idea, even if I would still put a water block on it and prefer cheaper reference cards for the reason of removing the cooler.


----------



## fullinfusion (Sep 12, 2014)

@GhostRyder No need to explain, because I do the same thing at times.. And shh, don't stir the worms up lol..

I can't see a picture of this new card till I get off roaming... But I sure hope there's one to see when I get home. It's going to be a pain in the butt for CrossFire users having two rads and such.. It'd be real cool if AMD designed the loop for CrossFire users like us, so all we need to do is a quick hose removal and replace with an added component, but till it's final, who knows.. I know my 650D case won't take two extra rads, but where there's a will there's a way.

Edit: curiously, I had to enable pictures, and there goes my monthly plan lol..
They need quick-disconnect fittings and a serviceable rad to refill the system.. Also a few extra small hoses and a few extra long ones, and CrossFire would be so easy with this card on a single 120 rad, providing it was thick enough and had the proper FPI and a pressure fan.. @amd I'm talking to you!


----------



## GhostRyder (Sep 12, 2014)

fullinfusion said:


> @GhostRyder No need explain because I do the same thing at times.. And shh don't stir the worms up lol..


Trying not to 



fullinfusion said:


> I can't see a picture of this new card till I get off roaming... But sure hope there's one to see when I get home. Its going to be a pain in the butt for crossfire users having 2 rads and such.. It be real cool if AMD designed the loop for crossfire users like us so all we need to do is a quick hose removal and replace with an added component but till its final who knows.. I know my 650d case won't take two extra rads but where there's a will there's a way.


That is one of the issues I would see with this; hopefully there will be air-cooled models from the other OEMs at launch or soon after.  Not everyone would like to deal with that type of cooler, or has a case the size of a car


----------



## Deadlyraver (Sep 12, 2014)

AMD is lagging behind; they need to pick up the pace in terms of design if they want the high-performance throne this year. All they are doing is relying on a cooling solution as the bridge to higher specs, when they should be looking into higher-quality components to reduce temperatures for overclocking.


----------



## HM_Actua1 (Sep 12, 2014)

Garbage


----------



## buildzoid (Sep 12, 2014)

Deadlyraver said:


> AMD is lagging behind, they need to pick up the pace in terms of design if they want the high-performance throne this year. All they are doing is relying on a solution to be the bridge to higher specs when they should be looking into higher-quality OEM components to reduce the temperatures towards overclocking.


The AMD cards have some of the highest-quality PCB components available on the market; it's just the cores that are power hungry.


----------



## fullinfusion (Sep 12, 2014)

One says garbage, another says lagging behind. 
The one that said garbage: why is it garbage? Your system spec says two Titans, no?  Does the new AMD card make you feel threatened in any kind of way, especially after dumping $800 on a G-Sync monitor?? I'm asking, not here to argue.  I betcha if it was from NVIDIA you'd be praising "hallelujah, finally!!" No?

I'm serious, these one-word posts shouldn't even be allowed.. 

And the other comment about lagging behind.. Who are they lagging behind, NVIDIA? Wasn't NVIDIA lagging a few years ago? 

Look at it this way: maybe AMD is doing it for a reason.. No one knows what the reason is unless you're God lol.. It's like anything in life: some excel at things where others go at their own pace. You also didn't hear the bitcoin miners complain about AMD cards and the coins they raked in, did ya. 

It all boils down to preference. Me, I like AMD, always have and always will. Why? Because I owned NVIDIA before and I didn't like their driver software to control the card, plus you needed other software to control clock speeds.. I just found AMD/ATI much better for me and have stuck with them since the 3800 series. Some say, well, NVIDIA don't have screen tear; I know my AMD card has zero issues in that regard, and why's that, you ask? It's called: before playing the game, go into the graphics settings and change the resolution.. Enter.. Save, then go back and set it to your normal setting, and problem solved 

Sorry, I ramble when bored, and it makes passing the miles much easier lol..

Anyone predict a price on these cards?
If they make a reference blower type, my guess would be around $400, and water around $489..
Unless mining goes full out again lol


----------



## Sony Xperia S (Sep 12, 2014)

fullinfusion said:


> Anyone predict a price on these cards?



It's in your *best* interest to ask the company for a lower price. Don't be more stupid than they are, when all the time, by all means, they only look at how to lower costs and gain more profit.

$400 would be quite nice, but I know they have the habit of pricing those at $549.99 at least. 

meh


----------



## utengineer (Sep 12, 2014)

Deadlyraver said:


> AMD is lagging behind, they need to pick up the pace in terms of design if they want the high-performance throne this year. All they are doing is relying on a solution to be the bridge to higher specs when they should be looking into higher-quality OEM components to reduce the temperatures towards overclocking.



NVIDIA owner here... In my opinion AMD already took the crown this year with the 295X2.  GPUBoss has the 295X2 and Titan Z practically equal in compute and synthetic performance.  With a $1,000 price versus the Z's $3,000, I would say NVIDIA got blind-sided.  Factor in the OEM water block AMD includes at $1,500, and they are still doing more at less cost.  The question is, can AMD do it again with the 390X versus the 980?  

What do you use to determine the mythical "Performance Crown?"


----------



## Sony Xperia S (Sep 12, 2014)

And for Mr. JHH even this is not enough; not long ago he took the time to explain how there was even more room for price increases.


----------



## the54thvoid (Sep 12, 2014)

Gosh, lots of angst in here.....

Everyone seems to be missing the point.  Why is Maxwell good?  More efficiency than Kepler.  Why is this good?  Because for the entire industry, efficiency is good.  For the end user, efficiency is good.  

This post is about a water loop design for a card from the outset.  That sets alarm bells ringing in this house.  The 295x2 was AMD's Lance of Ungodly Awesomeness.  It truly slayed Nvidia's most powerful beast and AMD deserve all the credit they can get for making green look dumb as fuck.

HOWEVER, if their next GPU flagship (single chip) requires a water loop as the base construct.... WTF?  Every chip vendor, be it Qualcomm, Nvidia, Intel, AMD's APUs etc., is driving toward efficiency.  It seems unreasonable to slap a water block on a card that didn't need it, so you have to assume that if this is for AMD's next single chip, it's already in the wrong place.

I'm all for a 390X to compete with 980 (or 980Ti, depending on release dates) but not if AMD need to cool it with a water solution.  

I'm going to fly against all of you people and call this out as FUD.  In the current ecosystem, any move to higher power consumption on a new architecture is a negative move.  And if it's not got higher consumption, why slap a water block on it?

Nah, something smells here.  I'll believe it only when I see it.


----------



## HumanSmoke (Sep 12, 2014)

FrustratedGarrett said:


> By cut down I meant a highly truncated design. The Maxwell chip has a 128 bit memory interface


Uncore typically accounts for just over half the area of a GPU die, and the largest part of the uncore is the memory interfaces and controllers. Obviously, the trade-off came down to decreasing vRAM I/O die area to increase yields, amongst other things. Pretty common practice for high-volume chips (see AMD's Cape Verde and Bonaire, for example).


FrustratedGarrett said:


> ...and no 64 bit floating point or integer operations support.


Ah, I see - well in that case.........you're still wrong. GM 107 does include FP64 support (1:32).  Obviously double precision is a must-have for a low-end card.


----------



## Sony Xperia S (Sep 12, 2014)

I am against what you say, and for what AMD want to do, because ultimately higher power consumption yields higher performance than if it were lower. 

I am positive towards aggressive performance jumps.


----------



## buildzoid (Sep 12, 2014)

the54thvoid said:


> Gosh, lot's of angst in here.....
> 
> Everyone seems to be missing the point.  Why is Maxwell good?  More efficiency than Kepler.  Why is this good?  Because for the entire industry, efficiency is good.  For the end user, efficiency is good.
> 
> ...



Well, Nvidia's 980 looks like a power-efficient sidegrade compared to the 780 Ti. AMD probably decided that it's a great chance to make a card that is much more powerful even if it pulls a ton of power. If they make a 3328-3840 SP GPU, it will be much faster than anything Nvidia is making; the only downside is that it'll probably pull 300+ W


----------



## the54thvoid (Sep 12, 2014)

Sony Xperia S said:


> I am against what you say and for what AMD want to do. Because ultimately higher power consumption equals to higher performance than if the former was lower.
> 
> I am positive towards aggressive performance jumps.



Worst case, for a 3 GB buffer: a stock 780 Ti versus the 290X Lightning.  The 780 Ti has a meagre 3% lead....







On W1zzard's average power chart, the lower-performing 290X uses 47 W more power. 






More power does not equate to more performance.  And even on the same arch, more power can lead to poorer returns of scale on performance due to design tolerances.

It is simply foolish and naive in today's tech world to move your design regime to a higher-power-draw chip.  Especially if you think it's going to need water cooling from the outset.
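For anyone who wants to check the arithmetic, here is a quick perf-per-watt comparison using the figures above (the 780 Ti's ~3% lead and the 290X's 47 W gap; the ~250 W baseline draw is an assumption for illustration, not a measured figure):

```python
# Quick perf-per-watt comparison using the thread's numbers.
# The 250 W average draw for the 780 Ti is an illustrative assumption;
# the 290X figure is that baseline plus the 47 W gap quoted above.

def perf_per_watt(relative_perf, watts):
    """Relative performance divided by average power draw."""
    return relative_perf / watts

gtx_780ti = perf_per_watt(1.03, 250.0)   # ~3% faster
r9_290x   = perf_per_watt(1.00, 297.0)   # baseline perf, 250 + 47 W

ratio = gtx_780ti / r9_290x
print(f"780 Ti perf/W advantage: {ratio:.2f}x")
```

On these assumed numbers the 780 Ti comes out a bit over 20% ahead on perf/W, and the conclusion holds for any plausible baseline wattage.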

The one caveat of rationale is that AMD have scored a cost effective cooling solution and simply want a product that is whisper quiet and very cool, i.e. it doesn't actually need to be water cooled.  That would be good, except if you already own a custom loop.

The second caveat (of technology lore) is that it is so much faster than what they have now they figure the power draw is acceptable.  But given what we've seen the past few years, that's unlikely (from either camp).

No, simply, no - if AMD are pushing the power limits skywards, that's a dumb move when everyone is demanding lower power.  And not to forget, there are legislators out there right now cracking down on excessive domestic power consumption.....


----------



## Lionheart (Sep 12, 2014)

Hitman_Actual said:


> Garbage



I agree.. Your comment is garbage!


----------



## GhostRyder (Sep 12, 2014)

the54thvoid said:


> Gosh, lot's of angst in here.....
> 
> Everyone seems to be missing the point.  Why is Maxwell good?  More efficiency than Kepler.  Why is this good?  Because for the entire industry, efficiency is good.  For the end user, efficiency is good.
> 
> ...


Just because it has a water cooler does not mean it was absolutely necessary.

Take, for example, the recent announcement of the 6+8-pin connectors on the 970 and 980, when people thought they would not need that much power.  But then it comes back to: do they even really need that?  Maybe so, or maybe not; it's just there to give extra headroom to the people buying it, or to relieve potential stress, similar to how the GTX 750 Ti can have the connector or not depending on overclocks.

Similarly, the R9 295X2's water cooler turned out to be great but not completely necessary, as we have seen with the R9 290X Dual Core Devil 13.

Just because there is something like a water cooler in reference does not mean it was absolutely necessary.  I think people need to look at this as something completely different and cool rather than "ZOMG ITZ GOING TO USE MOAR POWER".

I think it's more that AMD heard people complaining about reference coolers and said, OK, let's show them what we can do.


----------



## the54thvoid (Sep 12, 2014)

GhostRyder said:


> ....Just because there is something like a water cooler in reference does not mean it was absolutely necessary.  I think people need to look more towards this being something completely different and cool rather than "ZOMG ITZ GOING TO USE MOAR POWER".
> 
> I think its more of AMD heard people complaining about reference coolers and said ok lets show them what we can do.



Read my last few lines of my post 

The bit where I say, unless they don't actually need it.....


----------



## GhostRyder (Sep 12, 2014)

the54thvoid said:


> Read my last few lines of my post
> 
> The bit where I say, unless they don't actually need it.....


That post came after I had already started writing/finishing up the post I was working on, so I missed it lol .  But yeah, I doubt we are going to run into a card that requires something like three 8-pin connectors or anything crazy, on the reference at least (and a single GPU).


----------



## the54thvoid (Sep 12, 2014)

GhostRyder said:


> That post came after I had already started writing/finishing up the post I was working on so I missed it lol .  But yea I doubt we are going to run into a card that requires something like 3 8 Pin connectors or anything crazy on the reference at least (And a single GPU).



What Bullzoid said might not be far from the truth.  There is a chance it does have high-ish power consumption, because AMD have seen that the 750 Ti Maxwell was good on efficiency but not exactly fast (did it not roughly equate to a 660?).  With that, they figure the 980 might be a similar sideways upgrade.

The way I see it, Nv will maybe try to tout the 980 SLI pitch for 4K gaming with low power consumption. Frankly, that's what I'm looking for in my next GPU: 4K performance that doesn't eat power like a silicon valley troll.

But.... GM210 will probably be a very fast card.  Nv might not focus too much on its power draw.  Let's face it, as Humansmoke pointed out, the Asetek news article does state a 2015 release.  Next year will be the war of the big parts, and perhaps 390X versus 980 Ti could well be a fiery, ungodly eco-disaster!


----------



## HumanSmoke (Sep 12, 2014)

the54thvoid said:


> More power does not equate to more performance.  And even on the same arch, more power can lead to poorer returns of scale on performance due to design tolerances.


Very easily seen with overclocking. I'm not too sure that some people are aware of how transistor leakage scales with input power. A case of diminishing returns, as you've noted.
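The diminishing-returns point can be illustrated with a toy model (every constant below is invented for illustration, not measured from any real GPU): dynamic power scales roughly with V²·f, while leakage grows super-linearly with voltage, so the last 10% of clock speed bought with extra volts costs disproportionate power.

```python
# Toy model of GPU board power vs. clock and voltage. All constants are
# made-up fitting parameters, purely to show the shape of the curve.

def board_power(freq_mhz, volts, c=0.4, leak0=20.0, k=3.0, v_nom=1.0):
    """Dynamic power (~ C * V^2 * f) plus voltage-dependent leakage, in watts."""
    dynamic = c * volts**2 * freq_mhz          # switching power
    leakage = leak0 * (volts / v_nom) ** k     # leakage rises faster than V
    return dynamic + leakage

stock = board_power(1000, 1.00)
oc = board_power(1100, 1.15)   # +10% clock often needs a sizeable voltage bump
print(f"stock: {stock:.0f} W, overclocked: {oc:.0f} W "
      f"(+{(oc / stock - 1) * 100:.0f}% power for +10% clock)")
```

With these (hypothetical) numbers, a 10% overclock costs roughly 46% more power, which is the asymmetry the post is describing.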


the54thvoid said:


> The one caveat of rationale is that AMD have scored a cost effective cooling solution and simply want a product that is whisper quiet and very cool, i.e. it doesn't actually need to be water cooled.


Quite possibly. A good vapour chamber air cooler isn't overly cheap to produce, and if Asetek cut AMD a sweetheart deal that involves not just their next single GPU flagship but the series after that and dual-GPU parts, they'd likely cut their asking price for a long-term association.


the54thvoid said:


> The second caveat (of technology lore) is that it is so much faster than what they have now they figure the power draw is acceptable.  But given what we've seen the past few years, that's unlikely (from either camp).


The third caveat is that die shrinks and smaller process nodes tend to generate high localized temps and make the silicon somewhat more sensitive to voltage, transistor leakage, clock-speed limitation, and cooling. The AIO solution may be there to forestall any heat-dissipation issues with 16nmFF/16nmFF+ (and 14nm if they go that direction). Taking cooling out of the equation would give the chip design team more leeway in how much they pack into the GPUs.


the54thvoid said:


> No, simply, no - if AMD are pushing the power limits skywards, that's a dumb move when everyone is demanding lower power.  And not to forget, there are legislators out there right now cracking down on excessive domestic power consumption.....


Unfortunately, the R9 295X2 rather flies in the face of that reasoning (however logical it seems). The card obliterated the PCI-SIG spec and contravenes the electrical specification for PSU cabling. I wouldn't be surprised to see both AMD and Nvidia continue to butt up against the PCI-SIG spec at the high end and concentrate their perf/watt efforts on the lower-tier (volume) cards.
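For reference, the PCI-SIG budget being "obliterated" is simple arithmetic: 75 W from the x16 slot, 75 W per 6-pin, and 150 W per 8-pin auxiliary connector. A quick sketch (the ~500 W figure for the R9 295X2 is its commonly quoted board power, not a measurement of mine):

```python
# Power budget allowed by the PCIe CEM spec for a given connector layout.
PCIE_SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def spec_budget(six_pins=0, eight_pins=0):
    """Maximum board power, in watts, permitted by the connector configuration."""
    return PCIE_SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# R9 295X2: two 8-pin connectors, ~500 W board power.
budget = spec_budget(eight_pins=2)
print(f"R9 295X2 budget: {budget} W vs ~500 W draw, "
      f"so roughly {500 - budget} W over spec rides on the 8-pin cables")
```

That gap is why the card "contravenes the electrical specification for PSU cabling": the excess current has to flow through connectors rated for less.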


----------



## GhostRyder (Sep 12, 2014)

the54thvoid said:


> What Bullzoid said might not be far from the truth.  There is a chance it does have high-ish power consumption, because AMD have seen that the 750 Ti Maxwell was good on efficiency but not exactly fast (did it not roughly equate to a 660?).  With that, they figure the 980 might be a similar sideways upgrade.
> 
> The way I see it, Nv will maybe try to tout the 980 SLI pitch for 4K gaming with low power consumption. Frankly, that's what I'm looking for in my next GPU: 4K performance that doesn't eat power like a silicon valley troll.


Well, the 750 Ti seems to equate to around a 650 Ti Boost (a little above it) while consuming less power.

Nvidia also realized pretty fast that 3 GB on the 780 series was not enough for 4K, hence the push for the Titan/Titan Black/Titan-Z, advertised more at 4K gamers, including OEM builds.  This round, the 970 should be enough, based on its benches and extra RAM, to perform well in a 4K game in SLI at least (or should, based on where the others already fall).

Either way, I do not think any of us are running into an eco-disaster with the next series of cards.  There are limits that even AMD would not want to cross when it comes to power usage, because crossing them would create more risk.


----------



## Casecutter (Sep 12, 2014)

A mock-up SLA print painted with a rattle can, for who knows what? AND the crowd goes… Full Speculation Mode!

I’ll throw in my version of guesswork… Asetek has actually developed a fan/pump/closed-loop waterblock/radiator system that's going to be contained entirely within the envelope of that shroud.


----------



## HumanSmoke (Sep 12, 2014)

Casecutter said:


> A mock-up SLA that’s painted with rattle can for who knows what? AND the crowd goes… Full Speculation Mode!


Well, isn't that the nature of a tech site? I seem to remember people basing much more speculation on much less at-hand information in the past.


Casecutter said:


> I’ll throw my version of guesswork… Asetek actually has developed a fan/pump- closed loop waterblock- radiator system that's going to be all contained within the envelope of that shroud.


Good luck with that _speculation_. Radiator fin density would have to be very fine-pitched to fit within the confines of the ATX specification for add-in cards (especially when you take into account pump placement relative to the radial fan), and it would also need a prodigious fan rotation rate. Both seem at odds with the concept of a "quiet" solution, assuming it is feasible in the first place. If the mock-up is indicative of the final product, then the mid-shroud cut-out tends to point to an external tubing/rad option. If the mock-up isn't indicative, then all anyone has to go on is the contract price for one or more SKUs sometime next year, which may or may not be from AMD (basically Asetek's press release). If you're referring to Asetek's interposer design, it doesn't include the radiator.


----------



## HTC (Sep 13, 2014)

Maybe they're just opting for a water solution because they have VERY low expectations of their air coolers, *noise wise*. Pretty much ALL of the reviews thrashed the R9 290X and R9 290 over sound issues, stating the cards were good BUT the noise (as well as a shoddy stock cooler) held them back.

In the previous generation, they held themselves back with that poor-excuse-of-a-stock-cooler: maybe this time around they've actually learned from their mistake?

Ofc, it could be that the card DOES need this water solution and is STILL noisy... but one can hope they make it much quieter this time around...


Personally, I don't give a rat's ass who "wins" the performance war, as long as the difference to the competitor is small enough that they can battle it out price-wise...


----------



## fullinfusion (Sep 13, 2014)

Just hurry up, AMD, I has 2 slots to fill, plus winter is coming 
I need a room heater, and I'm not being a smart ass... What a way to heat a room!

And these -40°C Canadian winters, OMG are they great for overclocking


----------



## GhostRyder (Sep 13, 2014)

fullinfusion said:


> Just hurry up, AMD, I has 2 slots to fill, plus winter is coming
> I need a room heater, and I'm not being a smart ass... What a way to heat a room!
> 
> And these -40°C Canadian winters, OMG are they great for overclocking







Sorry had to make the joke when I saw that comment 

I agree, my room heater is waiting for the cold winter nights.  I love not having to run the heater much in the winter (I am a Texas man so excuse me ).

I wish we had more details surrounding this card.


----------



## LeonVolcove (Sep 13, 2014)

Should I wait for this card, or buy a Radeon R9 290 instead?


----------



## overpass (Sep 13, 2014)

That cooler is definitely inspired by the 295X behemoth. A refreshing change it was, an AIO closed-loop liquid cooler. Here's hoping Asetek makes it more configurable, so that CrossFire setups could combine the two radiators, redundant holes replaced by caps, into one elegant single loop: a liquid-cooling bridge included in place of the CrossFire bridge! It can be done, and it will be abso-freakinlutely awesome! And the tubing should be transparent, with red fluid clearly showing, laced with nano-particles that reflect UV light; and if possible, invent a fluid that changes color with temperature. That would be so cool


----------



## RejZoR (Sep 13, 2014)

I guess I'll be buying Radeon again after all. I'll have to wait for some benchmarks, but NVIDIA's lack of interest in actually boosting next-gen performance (a gen so friggin' awesome they skipped a model number) also means AMD won't be going mad with performance either, because there won't be any need to. Meaning we'll be friggin' stagnating in the GPU segment for yet another year or two. Yay.


----------



## fullinfusion (Sep 13, 2014)

GhostRyder said:


> Sorry had to make the joke when I saw that comment
> 
> I agree, my room heater is waiting for the cold winter nights.  I love not having to run the heater much in the winter (I am a Texas man so excuse me ).
> But a degree is a degree lower, so it for sure helps, lol
> ...


Texas in the winter is like a mild summer day here... Man some ppl lol


----------



## bubbleawsome (Sep 13, 2014)

I used to live in Texas. One 500 W PC kept half the house heated well.  My room was ~75°F all winter.


----------



## fullinfusion (Sep 13, 2014)

bubbleawsome said:


> I used to live in Texas. One 500w PC kept half the house heated well.  My room was ~75 all winter.


Texas is like Florida minus the big-ass spiders, lol.... @cadaveca, do you still have that picture of you benching out in your shed when it was like -30°C or worse, to show this lad what winter is about? lol


----------



## Sony Xperia S (Sep 13, 2014)

the54thvoid said:


> the lower performing 290X uses 47W more power



Honestly, I don't give a fuck about those +47 W.

Do you know why?

Because the time I spend using the card at those consumption levels is so little that it makes a negligible difference to my electricity bill. I am not greedy over a few dozen cents, or even a few euros.

And yes, you are right: there is a sweet spot where performance per watt is optimal, and beyond it you spend more power for a smaller performance increase.

But do you really give a fuck, or is it just for the sake of argument?


----------



## the54thvoid (Sep 13, 2014)

Sony Xperia S said:


> Honestly, I don't give a fuck about those +47 W.
> 
> Do you know why?
> 
> ...



1) I owned a Titan and now a Classified 780Ti - the expense to me isn't relevant.
2) It's the overall ethical stand I'm taking - It's better that the tech industry moves to lower power devices, for all our sakes.
3) I'm not arguing about anything, just stating facts and my opinions.  
4) I'm happy you don't give a fuck, have a balloon and some candy.


----------



## 1d10t (Sep 13, 2014)

When people start arguing efficiency vs. heat, there's no end to it. Look at the CPU side: Intel got "more efficient" and "way faster", yet people blindly forget those chips operate at nearly twice the temperatures AMD's do. I mean, come on: an Intel CPU's idle temperature is an AMD CPU's load temperature.
The same applies here. If AMD decide to slap closed-loop water cooling on their next single- or dual-GPU flagship, it clearly indicates legacy fan cooling is no longer a viable option. Yes, it will run hot, but not hellishly so if you've been there before. It seems AMD's move to be *FIRST* to embed closed-loop water cooling on their high-end GPUs will continue in the year ahead... and will likely be followed by nVidia sooner or later 
I don't mind more power hunger or excessive heat; I just need the ability to run MSAA, HBAO+, and SSAO across multiple monitors at reasonable prices


----------



## HumanSmoke (Sep 13, 2014)

1d10t said:


> I don't mind more power hungry or excessive heat


Just as well, since it's here to stay. Both AMD and Nvidia need GPGPU to survive - HSA, cloud computing/VDI, HPC, workstation. Nvidia built an empire from professional graphics, and while bang-for-buck might be OK for the consumer market (although in truth all it has done is allow AMD to retain parity), the pro markets are predicated upon high core count, high bandwidth parts. The pro markets also tend to pay the bills better than most other segments...5% of graphics unit sales account for 30% of discrete board revenue, and that percentage grows as the low end of the market gets gobbled up by IGPs. With OpenCL finally starting to see some kind of adoption, AMD don't really have any reason not to bifurcate their product line as Nvidia has done. Monolithic GPU for pro and high end consumer (where the large die and R&D is largely paid off by high ASP pro parts), and stripped down "good enough" mainstream/low end parts area-ruled for maximum yield and prioritized feature set.


----------



## Sony Xperia S (Sep 13, 2014)

the54thvoid said:


> It's the overall ethical stand I'm taking - It's better that the tech industry moves to lower power devices, for all our sakes.



Why do you think it's better? Because the current trend says so, or do you have another point about caring for the planet?

For the sake of it, it would be better if there were no devices at all that needed electricity from the grid. 

Keep in mind that lower instantaneous power consumption does not always mean lower absolute power consumption in the end. When you do the calculation, it might turn out that you actually burnt more power by using your device for a longer time.
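The point above is just energy = power × time: for a fixed amount of work, a lower-power device that takes longer can still use more energy overall. A minimal sketch with entirely hypothetical wattages and run times:

```python
# Energy billed by the utility is power x time, so "lower power" does not
# automatically mean "lower energy". All numbers below are hypothetical.

def energy_kwh(watts, hours):
    """Energy consumed, in kilowatt-hours."""
    return watts * hours / 1000.0

# Same fixed workload: the frugal card draws less but runs longer.
frugal = energy_kwh(180, 5.0)   # 0.90 kWh
hungry = energy_kwh(250, 3.0)   # 0.75 kWh
print(f"frugal card: {frugal:.2f} kWh, hungry card: {hungry:.2f} kWh")
```

Here the "power-hungry" card actually finishes cheaper, which is the scenario the post describes. (For gaming, where session length is fixed by the user rather than the workload, the comparison flips back to raw wattage.)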


----------



## RejZoR (Sep 13, 2014)

People think we spend more on electricity because our systems use more power and produce more heat. Well, that's dumb. I had my room radiators off pretty much the entire winter because the computer was entertaining me and also heating the room, so I didn't need to run an additional heating device, which in turn requires resources, and therefore money, to operate. Plus, GeForce cards have always been more expensive than Radeon cards, even if only by a tiny bit, so whatever you might save over five years of using the card was already spent when you bought the more expensive but more power-efficient card. So it doesn't really matter. The differences are too small to justify in the long run because, let's face it, most of us here don't keep the same graphics card for 4-5 years. For me, two years with the same one is already long. And when I do keep them longer, it's because graphics card vendors are dicking around with renaming old stuff, making tiny jumps, and expecting us to pay full price every time. Well, F that.


----------



## 1d10t (Sep 13, 2014)

HumanSmoke said:


> Just as well, since it's here to stay. Both AMD and Nvidia need GPGPU to survive - HSA, cloud computing/VDI, HPC, workstation. Nvidia built an empire from professional graphics, and while bang-for-buck might be OK for the consumer market (although in truth all it has done is allow AMD to retain parity), the pro markets are predicated upon high core count, high bandwidth parts. The pro markets also tend to pay the bills better than most other segments...5% of graphics unit sales account for 30% of discrete board revenue, and that percentage grows as the low end of the market gets gobbled up by IGPs. With OpenCL finally starting to see some kind of adoption, AMD don't really have any reason not to bifurcate their product line as Nvidia has done. Monolithic GPU for pro and high end consumer (where the large die and R&D is largely paid off by high ASP pro parts), and stripped down "good enough" mainstream/low end parts area-ruled for maximum yield and prioritized feature set.



From my point of view, nVidia needed GPGPU badly. AMD have already won the entire console market, compete strongly in the iGPU segment, and have "always had" competitive prices in discrete GPUs. To me, HSA, Mantle, or any other software-development derivative is just another extended capability of AMD GPUs. They might be useful, but they won't add something that wasn't already there.
For the professional market, the same applies, and there are two kinds of firm: the one that depends on hardware and the one that relies on its creativity. A $750 Quadro K4000 may look better than a $450 FirePro W5000, and might be faster, but the output is no different.
Bottom line: it's your choice. Efficiency is not the sole measurement, and hardware doesn't reflect intelligence.



RejZoR said:


> People think we spend more on electricity expenses because our systems use more power and produce more heat. Well that's dumb. I had my room radiators closed pretty much the entire winter because computer was entertaining me and also heating the room enough that i didn't really need to spend additional heating device which in turn requires resources to operate which at the end of the day cost money. Plus GeForce cards have always been more expensive, even if for tiny bit than Radeon cards so the stuff you might save in 5 years time of using the card was already wasted when you bought the more expensive but more power saving gfx card. So it doesn't really matter. The differences are too small to justify them in the long run because lets face it, most of us here don't have the same graphic card for 4-5 years. For me it's already long to have the same one for 2 years. And when i do have them for longer it's because graphic card vendors are dicking around with renaming old stuff and making tiny jumps and expecting us to pay full price every time. Well F that.



+1.


----------



## Recus (Sep 13, 2014)

Sony Xperia S said:


> I am against what you say and for what AMD want to do. Because ultimately, higher power consumption equals higher performance than if it were lower.
> 
> I am positive towards aggressive performance jumps.



Cool story, friendo.






Wilkes consumes more power but is slower than Tsubame.



1d10t said:


> When people start arguing eff vs heat,there's no end to this.Take a look at CPU side,Intel got "more efficient" and "way faster" yet people blindly forgot they operates nearly twice temperatures AMD had.I mean come on, Intel CPU idles temperature is AMD load temperatures.
> Same matters applied here.If AMD decide to slap some closed loop watercooling for theirs next single or dual or whatever GPU,it clearly indicates fan legacy are not viable option.Yes it will hot,but not as hell as if you've been there before.It's seem AMD *FIRST* move to embedding closed watercooling for their hig-end GPU's will continue year ahead...and likely followed by nVidia sooner or later
> I don't mind more power hungry or excessive heat,i just need the ability to do MSAA HBAO+SSAO multi monitor at reasonable prices



The GTX 480 is more power-hungry, hotter, and faster than the HD 5870, but everyone said the HD 5870 was better. Explain.


----------



## vega22 (Sep 13, 2014)

yea! my amd cpu can idle below ambient mofo!!!!

stick that in your pipe and smoke it intel :rofl:


----------



## Sony Xperia S (Sep 13, 2014)

Recus said:


> GTX 480 is more power hungry, hotter, faster than HD 5870 but everyone said that HD 5870 is better. Explain.



The GTX 480 had a lot of problems, and some people even claimed it was broken from the very beginning. Understand?

If everyone says something, it doesn't mean it is true; in that case, of course, the HD 5870 didn't have any of the problems of its counterpart.

But generally, humankind is so primitive exactly because of mutual agreement among the majority.


----------



## Recus (Sep 13, 2014)

Sony Xperia S said:


> GTX 480 had a lot of problems and some people even claimed that it was broken from the very beginning. Understand?
> 
> If everyone says something, it doesn't mean that it is true, of course in that case HD 5870 didn't have any of the problems of its counterpart.
> 
> But generally, human kind is so primitive exactly because of mutual agreements between most.



So high power usage and heat are problems for Nvidia but not for AMD?


----------



## the54thvoid (Sep 13, 2014)

Sony Xperia S said:


> GTX 480 had a lot of problems and some people even claimed that it was broken from the very beginning. Understand?
> 
> If everyone says something, it doesn't mean that it is true, of course in that case HD 5870 didn't have any of the problems of its counterpart.
> 
> But generally, human kind is so primitive exactly because of mutual agreements between most.



What is your native language, Xperia? Your English is very good, but it is structured in a 'foreign' way.  I'll read your posts as constructive due to a language difference, as opposed to you being 'eccentric'.


----------



## 1d10t (Sep 13, 2014)

Recus said:


> GTX 480 is more power hungry, hotter, faster than HD 5870 but everyone said that HD 5870 is better. Explain.



Price? 
Would you like to pay $100 more for an extra 21°C







and an extra 4.2 fps?




Nah... it's your call after all. I don't mind


----------



## Sony Xperia S (Sep 13, 2014)

the54thvoid said:


> What is your native language Xperia? Your english is very good but it is structured in a 'foreign' way.  I'll read your posts as constructive due to a language difference, as opposed to you being 'eccentric'



Yes, my native languages are different. But I thought it was correctly written. How should I structure it the same way a native Briton would?


----------



## the54thvoid (Sep 13, 2014)

Sony Xperia S said:


> Yes, my native languages are different. But I thought it was correctly written. How should I structure the very same as native British?



You structure it just fine - don't change it.


----------



## Steevo (Sep 13, 2014)

I hope they do, for competition's sake. Not that I could currently afford such a card, but it drives the price of all cards down and makes any card that doesn't reach that price/performance metric look silly or underpowered. See the Titan Black for a prime example.


----------



## newconroer (Sep 13, 2014)

Steevo said:


> I hope they do, for competition sake, not that currently I could afford such a card, but it drives the price of all cards down, and makes all cards that don't reach such price/performance metrics seem silly, or underpowered. See Titan Black for prime example.


I don't agree with this, Steve. Certain GPUs are out there just to pick up filler sales from a small niche of the market and consumer base, and an often ignorant one at that. 
Your example of the Titan Black (or any Titan, for that matter) is perfect. 
Between the options of SLI 680, 690, Titan, or Titan Black, the latter three are pointless unless there's some physical or specific reason impeding you from using 680 SLI.


----------



## arbiter (Sep 13, 2014)

Does anyone think the 390X is even close to being out? Since AMD released the 285 a few weeks ago, wouldn't it be kind of stupid for them to drop a 300-series card so close to it?


----------



## HumanSmoke (Sep 13, 2014)

arbiter said:


> Does anyone think the 390X is even close to being out? Since AMD released the 285 a few weeks ago, wouldn't it be kind of stupid for them to drop a 300-series card so close to it?


Well, according to Asetek's PR trumpeting, the cooling solution won't be out until next year, so if the 390X is AMD's answer to the GTX 980 it's either going to be verrrrrry late to the party or released with air cooling.....or, and this is probably the actual scenario - the 390X and its AIO is meant for a completely different market segment (vs the big Maxwell chip), and the GTX 980 analogue is actually the ~350-390mm^2 "Bermuda" chip as illustrated a few months ago by 3DCenter.

From my perspective it looks like AMD will be soldiering on with the 290/290X for a while yet. The only obvious imminent launch seems to be the fully enabled Tonga (R9 285X), which I'm guessing will drop once Nvidia show their hand and AMD tune the clocks.


1d10t said:


> and extra 4.2 fps?
> 
> 
> 
> ...


Why would you cherry-pick a single benchmark to highlight the supposed difference between cards? If you went to the trouble of finding the GTX 480 review, surely you'd use the actual aggregated performance summary, which would be a better indicator? Oops, scrub that: it shows a 10% advantage rather than the 3.6% you're trying to depict at that resolution.
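The gap between a cherry-picked result and an aggregated summary is easy to demonstrate. Review summaries combine many titles, and a geometric mean is the usual way to average relative performance, since it weights percentage differences equally. The per-game ratios below are invented purely for illustration, not taken from any review:

```python
# One friendly benchmark can understate an overall performance gap.
from math import prod

def geomean(xs):
    """Geometric mean: the standard aggregate for relative benchmark ratios."""
    return prod(xs) ** (1.0 / len(xs))

# Hypothetical per-game ratios of card A over card B (1.00 = tie).
ratios = [1.036, 1.15, 1.08, 1.12, 1.13]

print(f"cherry-picked game: +{(ratios[0] - 1) * 100:.1f}%")
print(f"aggregated (geomean): +{(geomean(ratios) - 1) * 100:.1f}%")
```

With these made-up numbers, the friendliest single game shows +3.6% while the aggregate shows roughly +10%, mirroring the discrepancy called out above.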


----------



## Steevo (Sep 14, 2014)

Hitman_Actual said:


> AMD=old n busted
> 
> Nvidia GTX980  = New hotness


Pray tell, fine sir, where might I purchase this new card?

Oh yes: much like truly awesome flying unicorns, it's currently in the land of fairy tales, where your dreams CAN come true, as long as you wish hard enough. 


Meanwhile, in reality,


newconroer said:


> I don't agree with this Steve. Certain GPUs are out there just to pick-up filler sales to small niche of the market and consumer base - and an often ignorant one at that.
> Your example of Titan Black (or any Titan for that matter) is perfect.
> Between the option of SLI 680, 690, Titan, or Titan Black, the latter three are pointless, unless there's some physical or specific reason impeding you using 680 SLI.




Yes, just like with many rebrands, ignorant people will blindly buy it because it's "new".  But those of us here who read reviews, for both the brands we use and those that compete, and who try to understand the strengths and weaknesses of each, should understand the what and the why, and should at the very least voice our opinions of mediocre, overpriced hardware that does nothing to improve the status quo or provide real-world benefits to us and mainstream consumers.


----------



## Chetkigaming (Sep 14, 2014)

28nm is busted; waiting for the Maxwell refresh.


----------



## GhostRyder (Sep 14, 2014)

arbiter said:


> Does anyone think the 390X is even close to being out? Since AMD released the 285 a few weeks ago, wouldn't it be kind of stupid for them to drop a 300-series card so close to it?


Not really. AMD is waiting to see the cards first and then make their own adjustments, on top of the fact that the cooler itself is not slated for anything in 2014.  Even so, the waiting changes little right now: recent results suggest the 980 will not be much faster than the 780 Ti, which means AMD can play the waiting game, since the 290X is plenty to stay relevant (as long as the price matches) while they work out something else.



Steevo said:


> I hope they do, for competition sake, not that currently I could afford such a card, but it drives the price of all cards down, and makes all cards that don't reach such price/performance metrics seem silly, or underpowered. See Titan Black for prime example.


Well, we need better prices, as lately the cost of the top end has gotten out of control.  It's at a point where I believe people need to step up and voice their problems with the pricing more than anything, since the same people whining about prices are often the first ones purchasing said cards in quantity (including some on this very thread).  I agree completely that pricing is getting ridiculous, and things like the Titans were ridiculous beyond belief.

I can see a bright light: hopefully the prices of the next generation of cards will not be too crazy.


----------



## 1d10t (Sep 14, 2014)

HumanSmoke said:


> Why would you cherry pick a single benchmark to highlight the supposed difference between cards? If you went to the trouble of finding the GTX 480 review, surely you'd use the actual aggregated performance summary which would be a better indicator? - Oops, scrub that - it shows a 10% advantage rather than the 3.6% you're trying to depict at that resolution.



I'm too lazy to dig deeper; it's already on the same page 
Let's start over again... Would you like to pay $100 more to have an astonishing 96°C








or pay less in trade for 10% slower? That's like 90 fps vs 100 fps (10 fps faster), or more likely 54 fps vs 60 fps (6 fps faster) 
Man, this whole percentage thingy makes them look good on paper 



Steevo said:


> Praytell fine sir, where might I purchase this new card?
> Oh yes, much like how truly awesome flying unicorns are its currently in the land of fairy tales, where your dreams CAN come true, as long as you wish hard enough



He's just trolling. Like I stated before...



1d10t said:


> hardware doesn't reflects intelligence


----------



## HumanSmoke (Sep 14, 2014)

1d10t said:


> I'm too lazy


I don't think I believe you. You can't be bothered including representative data when it's sitting right in front of you, after searching the database for two separate reviews and locating three separate pages within them. Colour me sceptical at the very least.
As for the GTX 480 and HD 5870, the cards themselves don't really interest me, so best not to derail the thread too much. The point being made is that the GF100 offered _marginally_ more performance for worse power and heat... 3-4 years on, the GK110 offers marginally more performance for _better_ power and heat. So it is less about individual cards than about a developing trend, and how priorities for what is good or bad depend upon who makes the card.

The point is, if for example 90+°C is bad for the GTX 480, then the same is true for any other card that hits those temps. Noise, cost and any features tend to be subjective to the user so while they may be the subject of discussion they won't hold a universal value for everyone - the same can be said for heat production also if the user puts the cards underwater, or is happy so long as the components don't exceed the IC thermal threshold limit.
The only real difference, apart from cost, is that Nvidia cut its teeth on large GPUs that leveraged their software ecosystem. They started out big and hot (G80 -> GT200 -> GF100/110 -> GK110), and power/heat have been fairly consistent factors over the last 8 years - an easy target, especially given that AMD pursued a small-die policy after the R600. Now that compute has suddenly become something to have, and AMD's GPUs have become more complex (and steadily larger), the same people who quite happily pointed out the disparity between the 144W of Cypress and the 257W of GF100, and the (smaller) gap between the 185W of the HD 6970 and the 226W of the GF110, have suddenly gone silent now that the 290X pulls 271W against the 269W of the 780 Ti. I'd point out that this isn't a one-way street either, since performance per watt wasn't a big talking point for people defending Fermi but suddenly became a must-have feature with Kepler and now Maxwell. Those people aren't any more immune from criticism than the hypocrites on the other side of the imaginary divide. The only real remedy for being taken to task is to apply the same metrics to all GPUs, irrespective of vendor.
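Applying the same metric to every card regardless of vendor is trivial to mechanize. In the sketch below, the board-power figures are the ones quoted in the post; the relative performance indices are placeholders, not review data:

```python
# Rank cards by a single uniform metric: performance per watt.
cards = {
    # name: (relative performance index [hypothetical], board power in watts)
    "R9 290X": (100, 271),
    "GTX 780 Ti": (104, 269),
}

# Same formula for every card, no vendor-specific exceptions.
ranked = sorted(cards.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
for name, (perf, watts) in ranked:
    print(f"{name}: {perf / watts:.3f} perf/W")
```

The ranking that falls out is whatever the data says it is; the point is that the formula doesn't change depending on whose logo is on the box.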


----------



## 1d10t (Sep 14, 2014)

HumanSmoke said:


> I don't think I believe you. You can't be bothered including representative data when its sitting right in front of you after searching the database for two separate reviews and locating three separate pages within those two separate reviews. Colour me sceptical at the very least.



I made a conclusion in my earlier post... duh 



HumanSmoke said:


> As for the GTX 480 and HD 5870 the cards themselves don't really interest me so best you don't try to derail the thread too much. The point being made is that the GF100 offered _marginally_ more performance for worse power and heat...3-4 years on, the GK110 offers marginally more performance for _better_ power and heat - so it is less about individual cards than a developing trend and how priorities for what is good or bad are dependant upon who makes the card.



So it's my fault then...


Recus said:


> GTX 480 is more power hungry, hotter, faster than HD 5870 but everyone said that HD 5870 is better. Explain.



Yeah... pretty much a 3-star n00b's fault for responding  



HumanSmoke said:


> The point is, if for example 90+°C is bad for the GTX 480, then the same is true for any other card that hits those temps. Noise, cost and any features tend to be subjective to the user so while they may be the subject of discussion they won't hold a universal value for everyone - the same can be said for heat production also if the user puts the cards underwater, or is happy so long as the components don't exceed the IC thermal threshold limit.



As you said, this is subjective, so your opinion is yours alone. I've never owned or tested an R9 290X, so I know nothing about it. I'm just happy to have two R9 290s - $250 cheaper than a GTX 780, or $500 if you SLI them - faster in multi-monitor when you crank the details up, in exchange for excessive heat. And please don't talk about power consumption, as if 10W could make any difference.


----------



## eidairaman1 (Sep 14, 2014)

I like hearing news of the 290X price slash; now I can get a Sapphire 290X Vapor-X.


----------



## Toothless (Sep 14, 2014)

With all this talk of energy usage, and comparing GPUs that aren't even out yet, it sure makes me wonder why my laptop raised the bill $40 last month...

Being serious: I'm happy that new GPUs are coming out around the time I should be starting work. New cool stuff to play with. I've used Intel, AMD, and Nvidia GPUs (Intel with their uber stronk integrated graphics ) and it really comes down to what you like best. I like Nvidia because the frames are smoother and they "normally" use less power, while AMD likes to pump their GPUs full of wattage and go for higher fps with blunt force. Though I like AMD for their attempts at making some smexy red and black cards. Mmmm, those colors are so nice together.


----------



## newconroer (Sep 14, 2014)

Steevo said:


> Yes, just like many rebrands, ignorant people will blindly buy due to it being "new".  But those of us here, who read reviews, both for the brands we use, and those who compete and try to understand the strengths and weaknesses in each, should understand what and why, and shouldn't at least voice our opinions of mediocre overpriced hardware that does nothing to improve status quo, or to provide real world benefits to ourselves and mainstream consumers.



That's exactly right. The more knowledge we gain and share, the less willing people will be to accept rebadges and incremental gains on what is classed as 'new' hardware.


----------



## bubbleawsome (Sep 14, 2014)

I'll admit, nvidia has always given me a smoother experience. But who cares, new cards! I don't see what everyone's problem is with this.


----------



## Steevo (Sep 14, 2014)

Empirical evidence based on a single user's subjective experience - I like it. While no one else may share your observations, you have properly written a disagreeing comment while adding value to the thread. Have an updog.


----------



## takomako (Sep 14, 2014)

Noob question, but will this beast work on PCIe 2.0?


----------



## GhostRyder (Sep 14, 2014)

takomako said:


> Noob question, but will this beast work on PCIe 2.0?


Yeah, it will, which is why many people are still running Sandy Bridge systems with PCIe 2.0 and new cards with no problems (since the CPU is more than enough for most games).


----------



## takomako (Sep 14, 2014)

GhostRyder said:


> Yeah, it will, which is why many people are still running Sandy Bridge systems with PCIe 2.0 and new cards with no problems (since the CPU is more than enough for most games).



I have an FX-8350; however, I wanted to know if it's going to work 100% given the difference in generations.


----------



## eidairaman1 (Sep 15, 2014)

takomako said:


> I have an FX-8350; however, I wanted to know if it's going to work 100% given the difference in generations.



Yes, it will work just fine.


----------



## GhostRyder (Sep 15, 2014)

takomako said:


> I have an FX-8350; however, I wanted to know if it's going to work 100% given the difference in generations.


Yes it will. I just meant that's why people aren't rushing to new platforms all the time, since PCIe 2.0 isn't fully saturated even at x8 speeds. PCIe 3.0 cards work just fine in a 2.0 slot and vice versa.
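For context, the theoretical link bandwidth bears this out. A minimal sketch from the standard PCIe signalling rates (5 GT/s with 8b/10b encoding for gen 2, 8 GT/s with 128b/130b for gen 3; these are spec numbers, not measured game traffic):

```python
# Theoretical per-direction PCIe bandwidth from the link parameters.
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding (80% efficiency).
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding (~98.5% efficiency).
def pcie_bandwidth_gb_s(gt_per_s, encoding_efficiency, lanes):
    # One transfer carries one raw bit per lane; divide by 8 for bytes.
    return gt_per_s * encoding_efficiency * lanes / 8

print(f"PCIe 2.0 x8 : {pcie_bandwidth_gb_s(5, 8/10, 8):.1f} GB/s")
print(f"PCIe 2.0 x16: {pcie_bandwidth_gb_s(5, 8/10, 16):.1f} GB/s")
print(f"PCIe 3.0 x16: {pcie_bandwidth_gb_s(8, 128/130, 16):.2f} GB/s")
```

So a 2.0 x16 slot still offers 8 GB/s each way, and games rarely push anywhere near even the 4 GB/s of a 2.0 x8 link, which is why the generation gap doesn't matter in practice.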


----------



## Casecutter (Sep 15, 2014)

HumanSmoke said:


> Good luck with that _speculation_.


You missed it, Sonny... that was a joke!


----------



## Serpent of Darkness (Sep 15, 2014)

I think the R9 390X will have a release date similar to the R9 290X's, which was in late 2013. NVIDIA will probably have a GTX 980 Ti out before December too. I can't see AMD ignoring the profit gains from the winter sales.

The GTX 980 is roughly a 10% increase over the GTX 780 Ti and an 11% increase over the R9 290X. If we assume this is true, AMD doesn't need to put a lot into the R9 390X to outperform the GTX 980, and the R9 390X may not even be the contender for the GTX 980. The real contender for the R9 390X, or same-tier product, is a GTX Titan Black 2/GTX 980 Ti. So the R9 380 and 380X will probably act as the equivalents of the GTX 980 reference.

If AMD follows the same trend, its 28nm GPUs will have a 30%-50% increase in the number of streaming processors. GCN 2.1... They will have the same or higher TDP, and a newer cooler to improve heat dissipation and reduce temperatures. Clock speeds will probably boost up to 1100-1150 MHz.

Come March 2015, NVIDIA will probably have a new GTX Titan out, and AMD will have the R9 390X2 out in May.


----------



## anubis44 (Sep 22, 2014)

btarunr said:


> Nah, it's a new chip. Fiji.
> 
> 3840 GCN1.2 "Pirate Islands" cores, 384-bit GDDR5, 48 ROP, 6 GB, Ijustmadethatup, and didI?.



It's likely to have 64 ROPs, not 48, since the Hawaii core has 64 ROPs.


----------



## Nihilus (Sep 29, 2014)

The R9 390x looks like Robocop's dick.


----------



## Steevo (Sep 30, 2014)

LOL. It had better be metal, hard, and unstoppable, and keep 'em coming, if they expect any action in the face of the 970/980s.


----------



## Mombasa69 (Dec 11, 2014)

Oh dear, here we go again: all the children screaming at each other about what's the best GPU. Same old nonsense I've been reading on forums for the last freaking ten years; just change the name of the GPU in the comments.

LOL AMD and NVIDIA are LAUGHING ALL THE WAY TO THE BANK, MUPPETS!


----------



## newconroer (Dec 11, 2014)

Mombasa69 said:


> Oh dear, here we go again: all the children screaming at each other about what's the best GPU. Same old nonsense I've been reading on forums for the last freaking ten years; just change the name of the GPU in the comments.
> 
> LOL AMD and NVIDIA are LAUGHING ALL THE WAY TO THE BANK, MUPPETS!



Sadly, yes they are. The endless revisions, renames, and rebadges... all the while they're already light years ahead in both capability and manufacturing, but they sandbag to play the market game.
The same is true of Intel and AMD.

The only slack I give them is the benefit of the doubt that manufacturing involving other companies creates efficiency and visibility issues around what's happening with your own products.
The market needs a small, vertically integrated company that handles every part of the GPU product from start to finish.


----------

