# AMD Fiji XT R9 390X Specs Leak



## birdie (Nov 11, 2014)

Instead of a thousand words.

In short:

- 4096 streaming processors
- 4GB HBM RAM with over 500GB/sec bandwidth
- Core: 1GHz
- RAM: 1.25GHz
- Process: TSMC 20nm (kinda implausible)


----------



## Schmuckley (Nov 11, 2014)

What makes you think that's the new GPU?
...and why is it being tested on a Crosshair IV?
It doesn't make much sense to me.


----------



## manofthem (Nov 11, 2014)

AMD needs to pull a rabbit out of their hat; I want to see some power efficiency or I'm going to Nvidia


----------



## GhostRyder (Nov 11, 2014)

birdie said:


> Instead of a thousand words.
> 
> In short:
> 
> ...


Very peculiar indeed, I am interested to see a little more information about this because these choices seem to be different from what I was expecting.


----------



## Dent1 (Nov 11, 2014)

manofthem said:


> AMD needs to pull a rabbit out of their hat; I want to see some power efficiency or I'm going to Nvidia



Power efficiency says the person running 2x R9 290's


----------



## happita (Nov 11, 2014)

Dent1 said:


> Power efficiency says the person running 2x R9 290's



All the more reason for people like him to complain about AMD's power efficiency.


----------



## chinmi (Nov 11, 2014)

Is power efficiency with AMD really that bad? How much US$ per month difference in the electric bill compared with using an Nvidia card?


----------



## Ebo (Nov 11, 2014)

I don't care about the electricity bill; I just use it, that's what it's there for. I never turn off my machines, and some lights never go dark; I couldn't care less. Let's say the difference was 100 kilowatt-hours a year: it wouldn't bother me at all. I would use the saved power on some other gadget instead. Who cares?


----------



## 64K (Nov 11, 2014)

chinmi said:


> Is power efficiency with AMD really that bad? How much us$ per month difference in electrical bill compared with using Nvidia card?



It depends on where in the USA you live and how many cards you're running, but to answer your question in general: if your AMD card were using 150 watts more than the equivalent Nvidia card, with the average US cost of 12 cents per kWh and an average of 20 hours of gaming a week, your monthly electric bill would go up by about $1.56.

So, not much.
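64K's back-of-the-envelope figure is easy to check. A minimal sketch (the 150W delta, 20 hours/week, and $0.12/kWh rate are the assumptions from the post above; `extra_monthly_cost` is just an illustrative name):

```python
# Estimate the extra monthly electricity cost of a card that draws more
# power under load than an alternative card.
def extra_monthly_cost(extra_watts, hours_per_week, usd_per_kwh):
    weeks_per_month = 52 / 12                                   # ~4.33 weeks
    extra_kwh = extra_watts / 1000 * hours_per_week * weeks_per_month
    return extra_kwh * usd_per_kwh

# 150 W more, 20 h/week of gaming, average US rate of $0.12/kWh:
print(f"${extra_monthly_cost(150, 20, 0.12):.2f} per month")  # $1.56 per month
```

The same function at a 34-cent European rate gives about $4.42 a month, so the delta scales linearly with the local tariff.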


----------



## the54thvoid (Nov 11, 2014)

Ebo said:


> I dont care about the electricity bill....



Good thing the industry does.  Lower power processing requirements are currently driving most new developments - everything is about power reduction as portability becomes more and more relevant.  But this is irrelevant, this topic is rumour - as much as the Titan II one is on the news section of TPU.  There's nothing in this OP that suggests anything about power draw - we don't know what architecture it is.  This could be more efficient than Hawaii - we don't know.

Let's hope the 390x is a Maxwell killer - otherwise we're facing more hideous prices from Nvidia's top range.


----------



## Mathragh (Nov 11, 2014)

chinmi said:


> Is power efficiency with AMD really that bad? How much us$ per month difference in electrical bill compared with using Nvidia card?


The power consumption itself usually isn't the problem. The consequences for a system in terms of cooling, noise, space, and power delivery, however, are usually more significant.

If this card has higher power draw but also uses an all-in-one cooling solution, then AMD trades worse power-delivery and space characteristics for better cooling, noise, and presumably performance. IMHO that isn't a bad trade-off to make when you're at the limits of what you can currently do technology- and architecture-wise.

Regarding electricity costs vs. power use: unless you're a really hardcore gamer who does little else on his/her PC, idle power use is way more important than load power use. AMD has traditionally done quite well on single-monitor idle power consumption, but has fared worse on multi-monitor use and media consumption.


----------



## Franzen4Real (Nov 11, 2014)

I don't think it is the cost difference of a higher powered card that people care about... it's the heat and noise levels to deal with.

edit: as Mathragh posted at the same time


----------



## Fluffmeister (Nov 11, 2014)

Indeed, I also prefer my room not to become a sauna when I game.


----------



## manofthem (Nov 11, 2014)

The power efficiency comes into play when you run your card(s) quite often. Since my aim has been geared toward crunching and folding, power has become a concern for me.

Running my CPU and GPUs 24/7 adds up quickly, pulling 550+ watts, and that's just one rig. So yes, power is a concern, but not really for mere gaming. If gaming were my only concern, I wouldn't be worried at all.


Thus, I would like to see AMD do better on their power efficiency on the cards, as many others would too.


----------



## INSTG8R (Nov 11, 2014)

I like how everyone forgets about Zero Core...


----------



## the54thvoid (Nov 11, 2014)

INSTG8R said:


> I like how everyone forgets about Zero Core...



It's good for long idle only. It doesn't help when you are in 3D applications or gaming. But again, the 390X has no mention of power usage, so any discussion of energy usage is speculation, which has been thrashed to death.


----------



## Fluffmeister (Nov 11, 2014)

Keep cool, keep all the power....


----------



## INSTG8R (Nov 11, 2014)

the54thvoid said:


> It's good for long idle only. It doesn't help when you are in 3D applications or gaming. But again, the 390X has no mention of power usage, so any discussion of energy usage is speculation, which has been thrashed to death.



Still saving power tho isn't it? NV has nothing even close. It doesn't take that long to kick in either.


----------



## TheGuruStud (Nov 11, 2014)

So, what you're saying with these specs is that I gotta have it?

You'd be damn right. Nvidia can eat a dong. I stuck a 660 Ti in a buddy's build because it was cheap, and it's pretty sad (stuttering and crashes). Nvidia's drivers are shit for older games; they work fine on new games. I see where their priority is (benchmarks).


----------



## GhostRyder (Nov 11, 2014)

the54thvoid said:


> It's good for long idle only. It doesn't help when you are in 3D applications or gaming. But again, the 390X has no mention of power usage, so any discussion of energy usage is speculation, which has been thrashed to death.


Well, that's how discussions of the 390X usually go:
Person 1: Hey, did you hear about the (insert new rumor)? Sounds cool, right, and could be nice for the new 390X?
Person 2: Yeah, but the 290X uses electricity and ran a bit hot, which is all that matters.
Person 1: But what does that have to do with the 390X? Is that relevant?

Thus the conversation goes on to become an off-topic discussion when that happens.



INSTG8R said:


> I like how everyone forgets about Zero Core...





INSTG8R said:


> Still saving power tho isn't it? NV has nothing even close. It doesn't take that long to kick in either.


Well, Zero Core is for idle and kicks in to make power usage very minimal, which is obvious and works.  Nvidia does, however, have an equivalent, which makes them all pretty much on par.



Fluffmeister said:


> Keep cool, keep all the power....


Dude, we get it, you're a fanboy and obviously do not care about the 390X, so you can stop.



64K said:


> It depends on where in the USA you live and how many cards you're running but to answer your question in general. If your AMD card was using 150 watts more than the Nvidia equivalent card. The average cost in the US for a kWh is 12 cents. If you game for an average of 20 hours a week then your monthly electric bill is going to go up ~$1.56
> 
> So, not much.


Yep, but some places have really high electricity costs, especially some countries with more than triple the normal U.S. rates, which could factor in, so I can understand some of what was said. (Though at that point my argument is: why get a high-end card if you're that concerned? Your money could obviously be used better elsewhere.)




the54thvoid said:


> Good thing the industry does.  Lower power processing requirements are currently driving most new developments - everything is about power reduction as portability becomes more and more relevant.  But this is irrelevant, this topic is rumour - as much as the Titan II one is on the news section of TPU.  There's nothing in this OP that suggests anything about power draw - we don't know what architecture it is.  This could be more efficient than Hawaii - we don't know.
> 
> Let's hope the 390x is a Maxwell killer - otherwise we're facing more hideous prices from Nvidia's top range.


I am with you on that, better efficiency also is great for the mobile market which could mean better laptop GPU's among other devices.


----------



## Fluffmeister (Nov 11, 2014)

GhostRyder said:


> Dude, we get it, you're a fanboy and obviously do not care about the 390X, so you can stop.



Just pointing out AMD care about power consumption, no need to get your knickers in a twist sweetness.

Although they are clearly full of shit too.


----------



## 64K (Nov 11, 2014)

GhostRyder said:


> Yep, but some places have really high electricity costs, especially some countries with more than triple the normal U.S. rates, which could factor in, so I can understand some of what was said. (Though at that point my argument is: why get a high-end card if you're that concerned? Your money could obviously be used better elsewhere.)



Yeah, I know. I have seen someone say they pay 40 cents per kWh in a European country iirc. I was just responding to the guy who asked how much US $ it would cost.


----------



## Mathragh (Nov 11, 2014)

Fluffmeister said:


> Just pointing out AMD care about power consumption, no need to get your knickers in a twist sweetness.
> 
> Although they are clearly full of shit too.


They care as much as Nvidia does.
The roles currently seem the reverse of what they were when that video came out, however. Back then you had the GTX 480, which for its time used lots of power, while AMD had the 5870, which was comparable in performance and 1.5 times as power-efficient on the same process.
AMD made use of the situation back then by producing that video. This time around, Nvidia seems to refrain from stuff like that, while also lowering the prices of their cards far more than they'd have to based on product positioning and demand.


----------



## manofthem (Nov 11, 2014)

Can't we discuss AMD, Nvidia, power, etc without insults and name calling? 

Boy I regret my initial comment in this thread


----------



## GhostRyder (Nov 11, 2014)

64K said:


> Yeah, I know. I have seen someone say they pay 40 cents per kWh in a European country iirc. I was just responding to the guy who asked how much US $ it would cost.


I know you knew and were speaking specifically to that person; I was just making a general comment.



Mathragh said:


> They care as much as Nvidia does.
> The roles currently seem the reverse of what they were when that video came out, however. Back then you had the GTX 480, which for its time used lots of power, while AMD had the 5870, which was comparable in performance and 1.5 times as power-efficient on the same process.
> AMD made use of the situation back then by producing that video. This time around, Nvidia seems to refrain from stuff like that, while also lowering the prices of their cards far more than they'd have to based on product positioning and demand.


Yep, but the problem is what people view as important in the moment, which reverses quite often.

I have to say, though, that the 390X having only 4GB of RAM may be a little disappointing to me, as I was hoping for more; then again, the new compression method effectively stretches any given amount of RAM further, which could be pretty nice either way.  I do hope power consumption goes down, but I'll take big performance numbers, especially at high resolutions, over low power draw for my needs (though I would prefer both in one package).


----------



## Fluffmeister (Nov 11, 2014)

Mathragh said:


> They care as much as Nvidia does.
> The roles currently seem the reverse of what they were when that video came out, however. Back then you had the GTX 480, which for its time used lots of power, while AMD had the 5870, which was comparable in performance and 1.5 times as power-efficient on the same process.
> AMD made use of the situation back then by producing that video. This time around, Nvidia seems to refrain from stuff like that, while also lowering the prices of their cards far more than they'd have to based on product positioning and demand.



Absolutely, and then as now, as stated in this very thread, if the performance is there, people will buy a card regardless of the power consumption and, for that matter, the price.

If people don't like the video they should take it up with AMD.


----------



## Sasqui (Nov 11, 2014)

manofthem said:


> Can't we discuss AMD, Nvidia, power, etc without insults and name calling?
> 
> Boy I regret my initial comment in this thread



Eh, someone had to say it.  I'm sticking with my power-gobbling 290Xs for a few years.  Until then, AMD does have their work cut out for them.


----------



## Ferrum Master (Nov 11, 2014)

If it is really 20nm, I'm sold; at last a real upgrade... old school, brute force, and guaranteed better specs without much thinking.

Otherwise I don't especially care, power-wise, just like most of us. It has just become a popular archetypal argument on tech forums. It draws more power, yes, but can we actually feel it? No. Open the darn window if you're anemic and live in a matchbox-sized flat. You feel a 50-100W difference in the room? Then quad-SLI and CFX users should order some ice cubes to sit on. The second thing is the driver bashing: both vendors have problems, yet it's always the same critters whining about obscure issues, and I bet 90% are hardware faults. RMA your PC.

Philosophers sigh...


----------



## Schmuckley (Nov 11, 2014)

No one questions why the motherboard used is a Crosshair 4?
Who's to say it's not one of these? 
http://www.newegg.com/Product/Product.aspx?Item=N82E16814103131


----------



## Ferrum Master (Nov 11, 2014)

Schmuckley said:


> No one questions why the motherboard used is a Crosshair 4?
> Who's to say it's not one of these?
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814103131



Do your math about streaming processors yourself.


----------



## erocker (Nov 11, 2014)

the54thvoid said:


> Good thing the industry does.  Lower power processing requirements are currently driving most new developments - everything is about power reduction as portability becomes more and more relevant.  But this is irrelevant, this topic is rumour - as much as the Titan II one is on the news section of TPU.  There's nothing in this OP that suggests anything about power draw - we don't know what architecture it is.  This could be more efficient than Hawaii - we don't know.
> 
> Let's hope the 390x is a Maxwell killer - otherwise we're facing more hideous prices from Nvidia's top range.


Industry and consumer are two different things. Unless it's something ridiculous, I don't care about power consumption either.


----------



## birdie (Nov 12, 2014)

Fudzilla says 390X will also be on a 28nm node.

It's all rumors, of course, but if it's true, it kinda sucks since there will be no 20nm GPUs in 2015, and we'll have to wait until 2016 before we get something on a 14/16nm process.


----------



## yogurt_21 (Nov 13, 2014)

So they're thinking a 7990 in single-GPU form? Interesting.


----------



## GhostRyder (Nov 13, 2014)

birdie said:


> Fudzilla says 390X will also be on a 28nm node.
> 
> It's all rumors, of course, but if it's true, it kinda sucks since there will be no 20nm GPUs in 2015, and we'll have to wait until 2016 before we get something on a 14/16nm process.


Not sure I believe it, mostly because it would make the idea of AMD holding out on releasing a new GPU all the more fruitless if that GPU would also be a 28nm chip.  The whole point, from what I had read, was to wait until 20nm was completely ready so they could undercut with it.  The problem is that if this is true, all that waiting would have been in vain and completely pointless (unless, of course, the Fiji chips really are that far behind).

But I am just speaking my opinion on that and nothing more.


----------



## Ebo (Nov 15, 2014)

64K said:


> It depends on where in the USA you live and how many cards you're running but to answer your question in general. If your AMD card was using 150 watts more than the Nvidia equivalent card. The average cost in the US for a kWh is 12 cents. If you game for an average of 20 hours a week then your monthly electric bill is going to go up ~$1.56
> 
> So, not much.



Where we live in Europe, we pay 34 cents per kWh, and that's actually cheap since we get a 25% deduction due to yearly use; I just checked our prices a few secs ago. Reducing power is in everyone's interest, but for a computer running 24/7/365 it doesn't really matter taken over a whole year.
Like written in the quote, the price difference is 1.56 a week; do that for 52 weeks and that gives you roughly 80 dollars a year. I dunno about you, but I don't have time to care about 80 dollars over a whole year; it simply doesn't matter. Where we live the difference would be roughly 3 times higher, so 240 dollars a year; come on, it still doesn't matter.


----------



## 64K (Nov 15, 2014)

Ebo said:


> Where we live in Europe, we pay 34 cents pr. KW and that actually cheap since we get a 25% deduction due to yearly use, i just checked our prices a few secs ago. To reduce power is in all peoples interest, but for a computer running 24/7/365 it dosent really matter taken over a whole year.
> Like written in the qoute price difference is 1.56 a week, then do that for 52 weeks to get a year, that gives you roughly 80 dollars. I dunno about you, but I dont have to time to care about 80 dollars for a whole year, it simply dosent matter. Where we live the difference would be roughly 3 times higher, and again 240 dollars in a year, comon, it still dosent matter.



I'm not sure that I'm understanding you Ebo. If you are paying 34 cents per kWh and use an AMD card that might draw 150 watts more than an Nvidia equivalent and game ~20 hours a week on average then it would increase your electric bill by $4.42 a month or $53 per year.


----------



## yogurt_21 (Nov 17, 2014)

64K said:


> I'm not sure that I'm understanding you Ebo. If you are paying 34 cents per kWh and use an AMD card that might draw 150 watts more than an Nvidia equivalent and game ~20 hours a week on average then it would increase your electric bill by $4.42 a month or $53 per year.


He used the monthly rate increase as the weekly rate increase. Still, though, $50 is a new game, so there's that. IMO the power consumption goes out the window when you compare it to A/C, electric heaters, electric dryers, pool pumps, household lighting/fans, etc.

13 kWh is nothing when you're normally billed for over 1200. So while power is a consideration, it won't make or break my choice. I will grant, though, that I'd likely have gotten a 970 rather than an R9 290 had they been available when I was buying in July.


----------



## Asourcious (Nov 17, 2014)

I hate it when people completely freak out when I recommend an AMD CPU or GPU and they say "OMG IF U BUY TEH AMD UR GUNNA SPEND MILLIONS OF DOLLARS ON POWAR AND U GUNNA HAVE TO GET A 1500 WATT POWAH SUPPLY BRO". Anyone else had this happen to them?


----------



## GreiverBlade (Nov 17, 2014)

Asourcious said:


> I hate it when people completely freak out when I recommend an AMD CPU or GPU and they say"OMG IF U BUY TEH AMD UR GUNNA SPEND MILLIONS OF DOLLARS ON POWAR AND U GUNNA HAVE TO GET A 1500 WATT POWAH SUPPLY BRO" Anyone else had this happen to them?


Well ... exactly the words of a friend ... when I got my 290 ... he runs a 780 and has an 850W PSU ... I run a 290 and have a 650W PSU ... we both have the same MVIIR and i5-4690K ... you get the idea? And yes, my 290 at 1000/1300 outperforms his stock 780 in nearly every test we ran ... he has 200W more PSU headroom but doesn't OC his 780.


----------



## Fluffmeister (Nov 17, 2014)

Asourcious said:


> I hate it when people completely freak out when I recommend an AMD CPU or GPU and they say"OMG IF U BUY TEH AMD UR GUNNA SPEND MILLIONS OF DOLLARS ON POWAR AND U GUNNA HAVE TO GET A 1500 WATT POWAH SUPPLY BRO" Anyone else had this happen to them?



Definitely, and now we can finally confirm the GTX 480 hate was just fanboyism too.


----------



## GhostRyder (Nov 17, 2014)

GreiverBlade said:


> well ... exactly the word of a friend ... when i took my 290 ... he run a 780 and has a 850w PSU ... i run a 290 and have a 650w PSU ... we both have the same MVIIR and i5-4690K ... you get the idea? and yes my 290 at 1000/1300 outperform his stock 780 in nearly any test we ran ... he has 200w more but doesn't OC his 780


Well, of course, power consumption is the hot topic right now and will probably stay that way for another 2-3 months, depending on the upcoming releases.



64K said:


> I'm not sure that I'm understanding you Ebo. If you are paying 34 cents per kWh and use an AMD card that might draw 150 watts more than an Nvidia equivalent and game ~20 hours a week on average then it would increase your electric bill by $4.42 a month or $53 per year.


Yeah, it's not enough to really shake a fist at... most people would not notice a difference in their electric bill between the high-end cards (depending on location, of course, outside of some high-cost areas).  I also still stand on the point that if electricity is a problem, you should probably look at a lower-end video card instead of the top anyway, as that is a much better way to save money on all fronts, especially seeing how many cards will handle 1080p Ultra gaming even at the lower end.

Either way, Fiji will be the thing that either ends this debate or keeps it going, depending on how much better it is (if at all) and how much power it uses.


----------



## the54thvoid (Nov 17, 2014)

Fluffmeister said:


> Definitely, and now we can finally confirm the GTX 480 hate was just fanboyism too.



So true.  

It will be interesting to see if the news that Nvidia reworked the silicon for GK210 for their newest compute part means that any further Maxwell development is 'distant'.  Or was it just far cheaper for them to refine a known quantity and keep GM200 held back for consumer later on?  This is relevant for the 390X as it could mean perhaps GM200 isn't as good as people are hoping, maybe the 390X will surprise us all in a nice way?

Time will tell.  In the meantime, let the misguided brand loyalists spout their nonsense, be it green or red.


----------



## HumanSmoke (Nov 18, 2014)

the54thvoid said:


> It will be interesting to see if the news that Nvidia reworked the silicon for GK210 for their newest compute part means that any further Maxwell development is 'distant'.  Or was it just far cheaper for them to refine a known quantity and keep GM200 held back for consumer later on?


Parallel development, I believe. The Kepler development team and the Maxwell team aren't a unified research pool as I understand it; the Kepler team will move to Pascal, the Maxwell team to Volta, obviously with sharing of information and movement between the teams.
GK210 looks like a natural development of GK110 (area-ruled refinements, doubled L1 cache and register file, etc.), and was probably taped out around the same time (April) as GM200. GK210 might also be a "Plan B", but I'm thinking it might also be a way to strengthen the existing Kepler-based enterprise market (higher GPU density per rack, variable-load programming through dynamic boost, etc.). I don't think it's a coincidence that the K80 is aimed at forestalling Xeon Phi adoption, judging by the launch partners.


the54thvoid said:


> This is relevant for the 390X as it could mean perhaps GM200 isn't as good as people are hoping, maybe the 390X will surprise us all in a nice way?


Anything is possible, although the 390X and its Fiji-based FirePro (which is probably more relevant given AMD's HSA compute commitment) aren't going to cure too many woes. Power usage shouldn't be an issue for the vast majority of potential users, but it is indicative of a company buying into the compute-at-any-cost ethos that Nvidia pioneered with the G80, GT200, and GF100. Where Nvidia seem to be actively bifurcating their product line, I'm more interested to see how AMD's Bermuda pans out. After all, a large die and large power budget are fine for enthusiasts and enterprise, but they don't make for a healthy profit line when priced down the consumer product stack, and for my money AMD desperately needs a second-tier chip that competes with GM204, especially for the mobile market, since the second-tier GPU becomes the halo product in laptops.


the54thvoid said:


> Time will tell.  In the meantime, let the misguided brand loyalists spout their nonsense, be it green or red.


Of which, 99.99% won't ever buy the Fiji or GM200....at least, not while it is the current flagship.


----------



## midnightoil (Nov 18, 2014)

Don't see why people are so fixated on 20nm ... HBM is a much bigger advancement than a node drop.

20nm will be 'ready' for this ... however, it depends on whether AMD designed the new series to work on a 'low-power' process or not, because there is no high-power 20nm bulk process anywhere: TSMC cancelled theirs, and with it went any possibility of 20nm NVIDIA chips, and maybe AMD ones too.


----------



## Blue-Knight (Nov 18, 2014)

64K said:


> If you game for an average of 20 hours a week then your monthly electric bill is going to go up ~$1.56
> 
> So, not much.


Let's say I run a computer x hours a day for one month, with a 150-watt difference from another machine with AMD hardware:

Energy cost here: $0.32/kWh
12 hours/day = +$17.28/month (or $207.36/year)
16 hours/day = +$23.04/month (or $276.48/year)
24 hours/day = +$34.56/month (or $414.72/year)

That would have a great impact on my monthly bill. I pay about $60/month. Power inefficiency costs a lot. :O
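For what it's worth, the per-month and per-year figures above do check out. A quick sketch, assuming 30-day months, a flat $0.32/kWh tariff, and a constant 150W delta, as in the post:

```python
# Reproduce the hours-per-day cost table for a 150 W power-draw difference.
RATE_USD_PER_KWH = 0.32
EXTRA_KW = 150 / 1000  # 150 W expressed in kW

for hours_per_day in (12, 16, 24):
    monthly = EXTRA_KW * hours_per_day * 30 * RATE_USD_PER_KWH
    print(f"{hours_per_day} h/day: +${monthly:.2f}/month (+${monthly * 12:.2f}/year)")
```

Note the yearly column is simply twelve 30-day months, i.e. a 360-day year; the difference from a true 365-day year is under 2%.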


----------



## RealNeil (Nov 18, 2014)

I'll wait for the actual release before I get excited about it.
Power is pretty cheap where I live, so 150W doesn't bring the bill up noticeably.

I'll just continue to buy the best that I can afford at the time. Today's GPUs are already pretty freakin' nice,.........




Oh, and I like this little runner.


----------



## eidairaman1 (Nov 18, 2014)

Sonic 1/2.



RealNeil said:


> I'll wait for the actual release before I get excited about it.
> Power is pretty cheap where I live, so 150W doesn't bring the bill up noticeably.
> 
> I'll just continue to buy the best that I can afford at the time. Today's GPUs are already pretty freakin' nice,.........
> ...


----------



## Lionheart (Nov 18, 2014)

manofthem said:


> Can't we discuss AMD, Nvidia, power, etc without insults and name calling?
> 
> Boy I regret my initial comment in this thread



This site used to be like that, but as of lately the fanboys & trolls are ruining this site


----------



## Scrizz (Nov 18, 2014)

Lionheart said:


> This site used to be like that, but as of lately the fanboys & trolls are ruining this site


yep... all the kids rolling in as computers become more popular. :/


----------



## Dbiggs9 (Nov 18, 2014)

I am a green Team fan boy! but i might buy one of these.


----------



## AsRock (Nov 18, 2014)

erocker said:


> Industry and consumer are two different things. Unless it's something ridiculous, I don't care about power consumption either.



I do, but buying a new card to replace one (or two) 290(X)s because of it is totally not worth it in the end, as you're not going to see the savings for years to come.

Just been playing FC3 for the last 45 minutes without any issue at all.


----------



## GhostRyder (Nov 18, 2014)

RealNeil said:


> I'll wait for the actual release before I get excited about it.
> Power is pretty cheap where I live, so 150W doesn't bring the bill up noticeably.
> 
> I'll just continue to buy the best that I can afford at the time. Today's GPUs are already pretty freakin' nice,.........
> ...


Power generally is like that even in some of the pricey areas, mostly because unless you run the card non-stop at those levels for extreme lengths of time, it won't move your bill much.  Though we should encourage lower power consumption, it's not the be-all that should be the only focus.



Lionheart said:


> This site used to be like that, but as of lately the fanboys & trolls are ruining this site


Same with a lot of sites as of late.

Fiji is going to be interesting, especially because of the AIO reference design and these specs.  Based on the specs and the shrink, I think Fiji is aimed beyond the smaller GM chips, at the possible Titan II more than anything.  But that is straight-up speculation/opinion based on the bare minimum of knowledge we have of the card.


----------



## RejZoR (Nov 18, 2014)

This is the reason why I'm holding off on buying a GTX 970. The R9 390 may turn out to be a killer, but even if it doesn't, it will force NVIDIA to lower their prices a bit, so it's really a win-win for consumers either way. Unless you already rushed out for a GTX...


----------



## eidairaman1 (Nov 18, 2014)

I'll sit out till the R9 500/600 series. By then maybe AMD will have a performance quad-core to octo-core.


----------



## GhostRyder (Nov 18, 2014)

eidairaman1 said:


> I'll sit out till the R9 500/600 series. By then maybe AMD will have a performance quad-core to octo-core.


Personally, I may be interested in this series, even though it would go against my usual principle of skipping a generation.  Though that's a stretch for me as well, as I will probably wait until the GTX 1080/R9 4XX series arrives before replacing my setup.


----------



## RealNeil (Nov 18, 2014)

GhostRyder said:


> Personally, I may be interested in this series, even though it would go against my usual principle of skipping a generation.  Though that's a stretch for me as well, as I will probably wait until the GTX 1080/R9 4XX series arrives before replacing my setup.



Your Tri-290X setup should be good for a long time. (in fact,...all of your system should be)


----------



## Frag_Maniac (Nov 18, 2014)

I'll patiently wait for real world results and to see what feature set it supports, the latter being a big priority for me on my next GPU.

I like that Nvidia's Pascal will have unified memory, stacked DRAM, and NVLink, and Shadowplay is well ahead of Raptr Game DVR.

So far it's Nvidia Pascal all the way for me. Even if AMD makes a valiant effort in matching features, they've been slipping on drivers again.


----------



## RealNeil (Nov 18, 2014)

Frag Maniac said:


> I'll patiently wait for real world results and to see what feature set it supports, the latter being a big priority for me on my next GPU



This makes sense to me. Wait to see, and know before you buy.


----------



## RejZoR (Nov 20, 2014)

Frag Maniac said:


> I'll patiently wait for real world results and to see what feature set it supports, the latter being a big priority for me on my next GPU.
> 
> I like that Nvidia's Pascal will have unified memory, stacked DRAM, and NVLink, and Shadowplay is well ahead of Raptr Game DVR.
> 
> So far it's Nvidia Pascal all the way for me. Even if AMD makes a valiant effort in matching features, they've been slipping on drivers again.



But then again there are NVIDIA's business practices and all their proprietary garbage. AMD, on the other hand, doesn't intentionally cripple the competition the way NVIDIA does. AMD's Mantle doesn't intentionally cripple NVIDIA; it just leaves them at the usual D3D level, and all the physics goodies are done through DirectCompute, which means they also work on NVIDIA graphics cards (and even Intel). Unlike PhysX, which again only works on NVIDIA cards. I just like to support and use the products of the company that is less of a douchebag, even if I don't get those 2 fps more in games. It's one of the reasons why I've been with AMD for the last couple of years: they've been less douchey than NVIDIA. And I know I'm not the only one who decides on products this way. Many people choose products based on factors that aren't purely performance-based...


----------



## vega22 (Nov 20, 2014)

Blue-Knight said:


> Let's say I let a computer run x hours a day for 1 month with 150 watts difference from another machine with AMD hardware:
> 
> Energy cost here, ¢/kWh = $0.32
> 12 hours = +$17.28 (or $207.36/year)
> ...



Now let's redo those figures with the real-world difference of about 30 watts under load (290X over 980).

An $80-a-year saving from replacing one 290X with one 980.

Works out a bit less once you convert the cash over here into £££, but still, I don't care either.

If you use this as an argument and still overclock... you funneh!
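For what it's worth, the $80 figure only lines up if the 30 W delta runs 24/7 at Blue-Knight's quoted $0.32/kWh rate; a quick sketch (the function name and figures are just illustrative, not from any post):

```python
def yearly_cost(extra_watts, rate_per_kwh, hours_per_day=24):
    """Yearly electricity cost of an extra constant load."""
    kwh = extra_watts / 1000 * hours_per_day * 365
    return kwh * rate_per_kwh

# 30 W extra, running 24/7, at $0.32/kWh
print(f"${yearly_cost(30, 0.32):.2f}/year")  # about $84
```

At 12 hours a day it halves to roughly $42, which is why the number is so sensitive to the usage assumptions.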


----------



## RCoon (Nov 20, 2014)

marsey99 said:


> now lets re do those figures with the real world difference of about 30 watts under load (290x over 980).



232 W average vs 156 W average for the 290X vs the 980 (TPU's review figures), and that's not even peak or maximum (unrealistic) measurements. So there's more than a 30 W difference in average power consumption - more like 75 W. That being said, a lot of countries have wildly different electricity costs.

But yeah, buying a card and overclocking the snot out of it, whilst also claiming you care so very much about power consumption, is wholeheartedly crazy. That's why I don't overclock anything anymore. I enjoy the cheap running costs of my PC.


----------



## Blue-Knight (Nov 20, 2014)

marsey99 said:


> if you use this as an argument and still overclock... you funneh!


I do not overclock.



RCoon said:


> But yeah, buying a card and overclocking the snot out of it, whilst also claiming you care so very much about power consumption, is wholeheartedly crazy.


I care about power consumption.

Even 30 watts will make a considerable difference. Let's see:
I am currently "burning" 95 watts on average (my monitor doesn't stay on 24/7). Keep in mind this is just the computer, not the other items in my house...

At $0.28/kWh (the rate is lower in winter, about $0.23). The rate varies each month, so let's say it is $0.28 on average...

95 watts:
+$19.60 in 1 month.
+$235.20 in 1 year.

125 watts:
+$25.20 in 1 month (+$5.60).
+$302.40 in 1 year (+$67.20).

It is not THAT much, but that $67.20 gets added to my yearly savings, and then I can buy more hardware at the end of the year.
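That arithmetic is easy to redo with your own wattage and rate; a minimal sketch (note a flat 30-day month gives about $19.15 for the 95 W case, so the $19.60 above implies slightly longer billing months):

```python
def monthly_cost(watts, rate_per_kwh, hours_per_day=24, days=30):
    """Electricity cost of a constant load over one month."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * rate_per_kwh

# the 24/7 scenario above, at $0.28/kWh
for watts in (95, 125):
    month = monthly_cost(watts, 0.28)
    print(f"{watts} W: ${month:.2f}/month, ${month * 12:.2f}/year")
```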


----------



## AsRock (Nov 20, 2014)

RCoon said:


> 232W average vs 156W average for 290X vs 980 (TPU's review figures), that's not even peak or maximum(unrealistic) measurements. So there's more than 30W difference in average power consumption. More like 75W's difference. That being said, a lot of countries have wildly different electricity costs.
> 
> But yeah, buying a card and overclocking the snot out of it, whilst also claiming you care so very much about power consumption, is wholeheartedly crazy. That's why I don't overclock anything anymore. I enjoy the cheap running costs of my PC.



232 W average for a 290X? I take it we're talking without VSync on (just the card usage), as from what I have seen my total system usage is normally no more than 230 W (170-250 W) with VSync on.

Also, if you leave the computer with a game running for a moment, did you know that by bringing up the menu or the game's map the power usage can drop (if minimizing isn't an option)? Offhand I found this with Skyrim and Just Cause 2; some even drop to near-idle usage.


----------



## RCoon (Nov 20, 2014)

Blue-Knight said:


> I do not overclock.
> 
> 
> I care about power consumption.
> ...



I also care about power consumption. But you are in the same boat as me, you care, and you don't overclock. I was responding to the previous post mentioning caring about power but also overclocking (in that it's hypocritical).



AsRock said:


> 232 W average for a 290X? I take it we're talking without VSync on, as from what I have seen my total system usage is normally no more than 230 W (170-250 W) with VSync on.
> 
> Also, if you leave the computer with a game running for a moment, did you know that by bringing up the menu or the game's map the power usage can drop (if minimizing isn't an option)? Offhand I found this with Skyrim and Just Cause 2; some even drop to near-idle usage.



I'm just taking the figures from TPU's reviews; I presume W1zzard turns VSync off for his averages. Yeah, in-game menus don't process anything besides a single screen, and not much animation, so it's most probably going to use a tiny percentage of the GPU.


----------



## 64K (Nov 20, 2014)

RCoon said:


> I also care about power consumption. But you are in the same boat as me, you care, and you don't overclock. I was responding to the previous post mentioning caring about power but also overclocking (in that it's hypocritical).
> 
> 
> 
> I'm just taking the figures from TPU's reviews; I presume W1zzard turns VSync off for his averages. Yeah, in-game menus don't process anything besides a single screen, and not much animation, so it's most probably going to use a tiny percentage of the GPU.



Yeah, W1zzard has said before he turns off VSync when he benches games.


----------



## RejZoR (Nov 20, 2014)

There's no way you're gonna save 80 bucks a year by replacing a 290X with a GTX 970. It's just not gonna happen, ever. Unless you're living in a country with the most ridiculous electricity prices.


----------



## 64K (Nov 20, 2014)

Looking at W1zzard's testing, I see that there is about a 110-watt difference at average and peak between a GTX 980 and an R9 290X.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/24.html

If you game for an average of 20 hours a week (some game more, some less), then saving around 110 watts would, over a year, save you:

| kWh cost | Yearly savings |
| --- | --- |
| 10 cents | $11.44 |
| 20 cents | $22.88 |
| 30 cents | $34.32 |
| 40 cents | $45.76 |

To save $80 a year you would have to be paying 70 cents per kWh. I pay on average 10 cents a kWh, so the savings don't really mean anything to me.
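Those numbers can be rechecked in a few lines (a sketch using the same assumptions as the post: a 110 W difference and 20 hours of gaming per week):

```python
def yearly_savings(watts_saved, rate_per_kwh, hours_per_week=20):
    """Yearly savings from a card drawing `watts_saved` fewer watts while gaming."""
    kwh_per_year = watts_saved / 1000 * hours_per_week * 52
    return kwh_per_year * rate_per_kwh

for rate in (0.10, 0.20, 0.30, 0.40):
    print(f"{rate * 100:.0f} cents/kWh: ${yearly_savings(110, rate):.2f}")

# kWh rate needed before the 110 W gap is worth $80/year
print(f"break-even rate: ${80 / (110 / 1000 * 20 * 52):.2f}/kWh")
```

The break-even print confirms the 70-cents-per-kWh claim.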


----------



## RealNeil (Nov 20, 2014)

$0.1002 per residential kWh is my rate here, so running a few PCs doesn't break the bank.


----------



## the54thvoid (Nov 20, 2014)

Can we stop talking about power consumption in a thread about AMD's next-architecture GPU? It's not relevant. I mentioned that a while back in my own post about power - it's not relevant.
What about any updated rumours on die size, shaders, frequency, etc.? That would be more in line with the OP than pan-global energy comparisons.


----------



## RealNeil (Nov 20, 2014)

Sorry, I was just responding to other posts.

I'd love to see some concrete specs on this card. If we find out soon enough, it may affect whether I buy a few GTX 970s.


----------



## the54thvoid (Nov 22, 2014)

Just read a post from a few days back at an unreliable source (Videocardz, which was in turn sourcing Fudzilla). It suggests that the next flagship Maxwell part won't see the light of day until after Fiji comes out. As someone who is brand agnostic and is looking for a good solution for 4K gaming, that would be really good for AMD. There are a few folk in my shoes (surely) who want to make the move to 4K but don't want to rely on 2- or 3-way GPU configurations. If Fiji does come out, and it performs well, it'd be a blow to NV for 4K.
It's possible as well that AMD might not repeat the mistake of Tahiti (7970) and price it as high as NV's previous flagship - recall the 7970 came in at well over £400 in the UK. If Fiji comes in at a good price, it might rip the carpet from under Nvidia, seeing as they are gouging on GTX 980 parts currently.


----------



## RealNeil (Nov 22, 2014)

Whoever has the flagship parts gouges. That's usually been so in the past, as I've seen it happen time and again.
I remember that I was pleasantly surprised at the price points of the R9 280X and R9 290X cards when they first came out.
I was thinking that Team Green was gonna ~have~ to lower prices for sure, considering the Radeons' far lower price for the performance. But they didn't have to do it after all, because of the Bitcoin-mining buyout of Radeon parts.
280 and 290 parts became so expensive that Team Green cracked a smile and went on with the same, or higher, prices.

I couldn't afford either side's good stuff.


----------



## DOA (Nov 23, 2014)

100Watts = accidentally leaving a light on in the attic


----------



## AsRock (Nov 23, 2014)

DOA said:


> 100Watts = accidentally leaving a light on in the attic



About time you got new light bulbs if yours are taking that much lol.


----------



## lukart (Nov 23, 2014)

Still, the 390 should come out on 28nm... not much energy savings, I'm guessing.


----------



## Xzibit (Nov 23, 2014)

the54thvoid said:


> Can we stop talking about power consumption in a thread about AMD's next architecture GPU? It's not relevant. I mentioned that a while back in my own post about power  -it's not relevant.
> *What about any updated rumours on die size, shaders, frequency etc.* That would be more in line with the OP instead of pan global energy comparisons.



Rumor comparisons:

Nvidia GM200 *[spec chart]*

AMD Fiji *[spec chart]*


----------



## DOA (Nov 23, 2014)

AsRock said:


> About time you got new light bulbs if yours taking that much lol.


All LED here except the attic and a few motion detectors. They come on so seldom it's not worth the cost of replacement.

So far the rumors I see are just what we'd expect: no die shrink, more power.


----------



## the54thvoid (Nov 23, 2014)

Xzibit said:


> Rumor comparisons
> 
> Nvidia GM200
> 
> ...



Yeah, I'd seen them, but the scores are all over the shop... I think they're early engineering samples without any driver optimisations, hence the big variance. For comparison, what do a 780 Ti and a 290X currently score in the same benchmarks?


----------



## anubis44 (Nov 23, 2014)

Mathragh said:


> They care as much as Nvidia does.
> The roles currently seem the reverse of what they were when that video came out, however. Back then you had the GTX 480, which for its time used lots of power, while AMD had the 5870, which was comparable in performance and was 1.5 times as power-efficient on the same process.
> AMD back then made use of the situation by producing that video. This time around Nvidia seems to refrain from stuff like that, while also lowering the prices of their cards way more than they'd have to based on product positioning and demand.



All nVidia has done is throw out the double-precision floating-point circuitry in their chips. It's not magic or rocket science. Their lower power consumption is the direct result of cutting out a feature. Whether that feature is important to you or not is your decision, but that's how they've done it. They aren't smarter than AMD, and they're certainly not wizards. AMD can do exactly the same thing if they want, and frankly, I wish they would, as I would still buy a gaming card based on such a chip, since I don't use double-precision floating point in any applications I use.

Much like a car company pulling the rear passenger seat out of their latest model and then advertising that it's 'even lighter, gets better fuel economy!', there's no mystery behind how they did it.


----------



## erocker (Nov 23, 2014)

At least Nvidia finally put a reasonable price on a gaming card with the GTX 970. You probably shouldn't buy one anubis44. Anyways, this thread isn't about Nvidia, stay on topic. I'm still waiting for the next batch of high-end cards from both companies.


----------



## eidairaman1 (Nov 23, 2014)

AMD hasn't made any announcements. We probably won't see any till January 2015.


----------



## AsRock (Nov 23, 2014)

erocker said:


> *At least Nvidia finally put a reasonable price on a gaming card with the GTX 970.* You probably shouldn't buy one anubis44. Anyways, this thread isn't about Nvidia, stay on topic. I'm still waiting for the next batch of high-end cards from both companies.



Said like nVidia's price drop/change had nothing to do with AMD having good prices way before.

Sorry, nVidia do do good stuff, but they will rob you blind first.


----------



## Fluffmeister (Nov 23, 2014)

Last time I checked, nVidia aren't holding a gun to people's heads forcing them to buy anything, although based on anubis44's most recent thread, plenty of people here are in la-la land when it comes to the mean old world of being a successful business.


----------



## GhostRyder (Nov 24, 2014)

erocker said:


> At least Nvidia finally put a reasonable price on a gaming card with the GTX 970.


Something that deserves praise, honestly, since it's such a good value card.


AsRock said:


> Said like nVidia's price drop\change had nothing to do with AMD having good prices way before.
> 
> Sorry but nVidia do do good stuff but they will rob you blind 1st.


I agree with you there.  AMD tends to release at better prices normally with few exceptions.  But then again both are out to make money so when push comes to shove prices change to reflect that.


Xzibit said:


> Rumor comparisons
> 
> Nvidia GM200
> 
> ...


This chart has been really cool to see, but I only hope the rumored specs turn out to be true, and hopefully on a different node.




the54thvoid said:


> Yeah, had seen them but the scores are all over the shop....  I think early engineering samples without any driver optimisations and hence, much variance.  In comparison, what does a 780ti and a 290x currently score in the same benchmarks?


Yeah, sometimes these engineering samples are not the best way to judge something, because we end up with results that do not match what the mainstream gets.


----------



## Xzibit (Nov 26, 2014)

the54thvoid said:


> Yeah, had seen them but the scores are all over the shop....  I think early engineering samples without any driver optimisations and hence, much variance.  *In comparison, what does a 780ti and a 290x currently score in the same benchmarks?*



I have no idea..

*Chiphell* leaked some benches.

*[benchmark charts]*

Card looks good power- and performance-wise. Have to wait on price.


----------



## THE_EGG (Nov 26, 2014)

Xzibit said:


> I have no idea..
> 
> *Chiphell* leaked some benches.
> 
> ...


If it turns out to perform around those figures, it will be a damn good card by AMD. It should also 'force' Nvidia to release a 'full-fat', so to speak, Maxwell 9xx series card, or at least drastically reduce GTX 980 prices to something more reasonable. IMO the 980 is overpriced, especially when 970s can be had for two-thirds (or less) of the price but perform only 10-20% worse (or less if OCed, obviously). Or AMD could try to milk the market and price it above 980 levels, due to it performing higher than the 980. I don't know, it's all speculation right now anyway. Looking forward to the release of this card.


----------



## HumanSmoke (Nov 26, 2014)

Xzibit said:


> Card looks good power wise and performance.


It will be a very early qualification sample *assuming* it isn't clickbait. I've never heard of the poster who "leaked" this (isn't Coolaler usually the serial benchmark leaker for AMD?).

If this is the big die + HBM Fiji, then 30% performance improvement over Hawaii doesn't look that interesting TBH, and the power consumption goes against the earlier stories regarding Asetek's reference AIO for the card.
Something smells a little off.


----------



## Mathragh (Nov 26, 2014)

Xzibit said:


> I have no idea..
> 
> *Chiphell* leaked some benches.
> 
> ...



Power- and performance-wise it could just as well be big Maxwell? Assuming there is some truth to these rumors at all.


----------



## SIGSEGV (Nov 27, 2014)

Xzibit said:


> I have no idea..
> 
> *Chiphell* leaked some benches.
> 
> ...



I'm impressed with the results, if they turn out to be true, because this card is just a mere midrange part from AMD (R9 380/X).


----------



## xorbe (Nov 27, 2014)

I thought GPUs were starting to plateau like CPUs, but I guess not just yet.


----------



## RCoon (Nov 27, 2014)

HumanSmoke said:


> the power consumption goes against the earlier stories regarding Asetek's reference AIO for the card.
> Something smells a little off.



It sure does. If Coolaler isn't part of this, it smells an awful lot like clickbait.


----------



## the54thvoid (Nov 27, 2014)

SIGSEGV said:


> i'm impressed with the results if it's gonna be true because this card is just mere midrange segment from amd (R9 380/X)



Where do you get that from? I see it referenced as 'Captain Jack', but apart from that - no naming of the GPU.

I'm also highly doubtful the slide is representative. I think it will easily perform 30% better than a 290X, but the power reduction seems extreme, as folk have said, for a card that is rumoured to ship with an AIO water cooler. Time will tell, and judging by the fact that it's almost December, no new cards (new GPU dies, as opposed to memory increases) this year. My upgrade itch remains unscratched...


----------

