# Next AMD Flagship Single-GPU Card to Feature HBM



## btarunr (Jan 13, 2015)

AMD's next flagship single-GPU graphics card, codenamed "Fiji," could feature High-Bandwidth Memory (HBM). The technology increases memory bandwidth using stacked DRAM while reducing the GPU pin-count needed to achieve that bandwidth, potentially reducing die size and TDP. Despite this, "Fiji" could feature a TDP hovering around the 300W mark, because AMD will cram in all the pixel-crunching muscle it can, at the expense of efficiency gains from other components, such as memory. AMD is expected to launch new GPUs in 2015 despite slow progress by foundry partner TSMC in introducing newer fabrication nodes, as the company's lineup is fast losing competitiveness to NVIDIA's GeForce "Maxwell" family.
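To make the bandwidth claim concrete: peak memory bandwidth is simply bus width times per-pin data rate, so HBM can run each pin far slower (and cheaper, power-wise) while still coming out ahead. A minimal sketch with illustrative figures (a 512-bit GDDR5 bus at 5 Gbps per pin, as on the R9 290X, versus four 1024-bit first-gen HBM stacks at 1 Gbps per pin; the HBM configuration is an assumption based on early reports):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes transferred per cycle * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# GDDR5 needs a fast, wide off-chip bus: 512-bit at 5 Gbps per pin (R9 290X-class)
gddr5 = peak_bandwidth_gbs(512, 5.0)      # 320.0 GB/s

# First-gen HBM: four 1024-bit stacks at a much slower 1 Gbps per pin
hbm = peak_bandwidth_gbs(4 * 1024, 1.0)   # 512.0 GB/s
```

The wide-but-slow approach is also where the pin-count and power savings in the article come from: each pin toggles at a fifth of the GDDR5 rate.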





*View at TechPowerUp Main Site*


----------



## GhostRyder (Jan 13, 2015)

I believe HBM is all but confirmed at this point (I guess things could always change), but as far as this GPU goes, it's going to be interesting what choices they make and how much power it truly draws.  Personally, as far as efficiency goes, I'm kind of mixed on the high end of cards, because while I want more efficiency, it's not the primary concern.  If they do go that route, I hope this top-tier card is seriously powerful and beats their previous generation by a significant margin.

The naming rumors are the curious part for me, because most leaks seem to point to calling the next "top" card the R9 380X, which would mean a 390X is in the shadows somewhere, and if so, where do these specs fall?  Most of this for me is just waiting for official confirmation, but I'm getting more and more intrigued by what I hear.


----------



## Steevo (Jan 13, 2015)

They did it with GDDR5, pulled a rabbit out of a hat for the most part, using new and unproven hardware.


----------



## happita (Jan 13, 2015)

With all the problems foundry partners have had shrinking process nodes down to 20nm, I just had to jump to my R9 290 from my previous 5850. The high end isn't where power consumption should be a worry; it's the mid/high, mid, and mid/lower-end segments that should be the priority in getting that watt number lower. If the newly appointed CEO Lisa Su turns this company around, and the launch of the next wave of cards AND CPUs/APUs is even a little bit successful, AMD will be in a much better position than it's currently in.


----------



## LocutusH (Jan 13, 2015)

These AMD hyper-space-nextgen-dxlevel13123-gcn technologies look so good on paper, but somehow always fail to show their strength when it comes to real-world games after release...


----------



## Ferrum Master (Jan 13, 2015)

They are betting the bank = maximum risk.

I guess they're betting on the process tech, no matter what; it will all pan out when FinFETs arrive, and the power question will fall away.

It may be an Intel-like Tick...


----------



## Petey Plane (Jan 13, 2015)

happita said:


> With all the problems foundry partners have had shrinking process nodes down to 20nm, I just had to jump to my R9 290 from my previous 5850. The high end isn't where power consumption should be a worry; it's the mid/high, mid, and mid/lower-end segments that should be the priority in getting that watt number lower. If the newly appointed CEO Lisa Su turns this company around, and the launch of the next wave of cards AND CPUs/APUs is even a little bit successful, AMD will be in a much better position than it's currently in.



The high end also needs to worry about power consumption, unless you want a single GPU pulling 600+ watts and needing a 280mm rad with 4 fans just to keep it cool.  Efficiency is just as important in the high-end, if not more so, because greater efficiency means more performance per watt.


----------



## happita (Jan 13, 2015)

Petey Plane said:


> The high end also needs to worry about power consumption, unless you want a single GPU pulling 600+ watts and needing a 280mm rad with 4 fans just to keep it cool.  Efficiency is just as important in the high-end, if not more so, because greater efficiency means more performance per watt.



I'm not saying it shouldn't be. I'm saying that no true enthusiast will look at power consumption as the main reason whether or not to purchase a high-end card for their system. EVERYONE likes lower power consumption, but when a company's offerings are pretty much all higher watt consuming products versus their competitors, it makes people wonder how efficient their design really is. I'm not knocking AMD, I'm just being realistic here. But at the same time, AMD is not really competing with Nvidia's Maxwell cards ATM, they're just lowering prices on their R9 series instead in the meantime.


----------



## the54thvoid (Jan 13, 2015)

Power draw itself is relative.  If this card can perform 50% faster than the 290X but only consume 10% more power, that's okay.  If it can just about drive a single 4K panel, it's a win.
The win/lose scenario kicks in with NV's part. But then again, the mythical, touted GM200 is also suggested to be nowhere near as efficient as GM204.  We'll all win if the new AMD card comes out on steroids.
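The 50%-faster / 10%-more-power scenario above is easy to put a number on; a quick back-of-the-envelope check (the figures are the post's hypotheticals, not measurements):

```python
# Hypothetical uplift vs. the 290X, as posed above
perf = 1.50   # 50% faster
power = 1.10  # 10% more power

# Relative performance per watt
perf_per_watt_gain = perf / power - 1  # ~0.36, i.e. roughly 36% better perf/watt
```

So even a 300W-class card can be a clear efficiency win if the performance scales faster than the power draw.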


----------



## GreiverBlade (Jan 13, 2015)

LocutusH said:


> These AMD hyper-space-nextgen-dxlevel13123-gcn technologies look so good on paper, but somehow always fail to show their strength when it comes to real-world games after release...



i can't see that ... my 290 still goes strong, and my friends who own a 770/780 Ti and a 970 are surprised that my $190 second-hand card (under water ... tho) is nearly toe-to-toe with the 780 Ti and 970; the 770 is out of its league. (i also had a 770 before, alongside a 7950.)


----------



## erocker (Jan 13, 2015)

300w? Cool, not bad.
New tech? Awesome
If price and performance are in line, my PCI-E slot is ready.


----------



## Mathragh (Jan 13, 2015)

So there are two people mentioned, both from AMD and working on stuff.

One page mentions the 380X and the other 300W. Why then do people all of a sudden conclude they're both working on the 380X? Wouldn't it make a lot more sense if the second person worked on the 390X?


----------



## Steevo (Jan 13, 2015)

LocutusH said:


> These AMD hyper-space-nextgen-dxlevel13123-gcn technologies look so good on paper, but somehow always fail to show their strength when it comes to real-world games after release...



Whut?

AMD/ATI delivered some pretty good products, like the 9xxx, X800, 4xxx, 5xxx, and 7xxx series, and the R9 is still great at 4K output. The change from VLIW to GCN was just a few years ago; their first success with it was the 7xxx series, and they still sell those chips as a very competitive offering today, which speaks volumes about how good it was and is.

Their most fatal flaw is, and remains, promising software advantages about two years before they are actually available, if ever. That, and their CPU cache latency issues.


----------



## Ferrum Master (Jan 13, 2015)

Petey Plane said:


> because greater efficiency means more performance per watt.



Guys, keep in mind that the R9 290 uses the first-gen 28nm HPM process and the 980 uses HPP.

They are like apples and oranges, really.


----------



## GreiverBlade (Jan 13, 2015)

Ferrum Master said:


> Guys keep in mind that R9 290 uses first gen 28nm HPM process and 980 uses HPP
> 
> They are like apples and oranges really.


So that means AMD is holding up on the performance side but losing on the consumption side with an older, nearly obsolete process? OK, the 980 is a really good card, but it's not so far from a well-OC'ed 290X, and quite a bit pricier.



happita said:


> they're just lowering prices on their R9 series instead in the meantime.


Well, that's a good idea, though, since you can find a 290 at a 770 price where I am, and a 290X at a 970 price ... short of a purely fanboy statement, it's wrong to say AMD can't compete (even if, as I wrote a bit above, the 970/980 are good products, of course).


----------



## Ferrum Master (Jan 13, 2015)

GreiverBlade said:


> So that means AMD is holding up on the performance side but losing on the consumption side with an older, nearly obsolete process? OK, the 980 is a really good card, but it's not so far from a well-OC'ed 290X, and quite a bit pricier.



Well, I guess nVidia paid the lion's share to TSMC to be its lovely puppy. It has always been like that, actually... The R9 290 has approximately 20% more transistors on board too, which heats it up but closes the performance gap at that heat cost. But hey... ATI was a Canadian company... a heater during cold winters... actually a two-in-one.

And they really must sell their R9 290s no matter what; unsold silicon is a bigger loss for them than silicon sold at a bargain. I bet they calculated everything as well as they could.


----------



## HumanSmoke (Jan 13, 2015)

Mathragh said:


> So there are two people mentioned, both from AMD and working on stuff.
> One page mentions the 380X and the other 300W. Why then do people all of a sudden conclude they're both working on the 380X? Wouldn't it make a lot more sense if the second person worked on the 390X?


My thoughts as well. 380X supposes a second-tier card, which doesn't gel with the initially high price of HBM. Another consideration: if the 380X is a 300W card, then the 390X is either way outside the PCI-SIG spec, or it's some way off in the future on a smaller process node (if the 380X is a 300W card on 20nm, then AMD has some serious problems with BoM).


Ferrum Master said:


> Guys keep in mind that R9 290 uses first gen 28nm HPM process and 980 uses HPP


TSMC doesn't have an HPP process. GM204 uses 28HPC (high performance mobile computing) since Maxwell is a mobile-centric architecture. The difference in efficiency is more a product of how Nvidia prioritizes perf/watt at the expense of double precision - so more a difference in design priorities at the ALU level, AFAIK.
From TSMC's own literature: *(chart of TSMC's 28nm process options omitted)*


----------



## Ferrum Master (Jan 13, 2015)

HumanSmoke said:


> My thoughts as well. 380X supposes a second-tier card, which doesn't gel with the initially high price of HBM. Another consideration: if the 380X is a 300W card, then the 390X is either way outside the PCI-SIG spec, or it's some way off in the future on a smaller process node (if the 380X is a 300W card on 20nm, then AMD has some serious problems with BoM).



Well, Boney, don't you think this number is just the theoretical engineering envelope (summing up the power connectors' maximum theoretical delivery current)? Do they actually have real mass-produced silicon from GloFo that shows real consumption numbers? I guess not... The best they have is still 28nm alpha silicon, or even earlier...

Thanks for correcting, but that graph is still kind of useless.


----------



## HumanSmoke (Jan 13, 2015)

Ferrum Master said:


> Well, Boney, don't you think this number is just the theoretical engineering envelope (summing up the power connectors' maximum theoretical delivery current)?


If AMD is working with 300W board power - even for an ES - what does that portend for a higher-tier card in the same series? When has a top-tier card used less power than the second-tier card in the same model series?


Ferrum Master said:


> Do they actually have real mass-produced silicon from GloFo that shows real consumption numbers? I guess not... The best they have is still 28nm alpha silicon, or even earlier...


AMD had at least a hot lot of silicon at least two months ago. By your reckoning, either AMD haven't made any headway with silicon in the interim (indicating a metal layer revision), or are taking a leisurely approach in revising the silicon.


Ferrum Master said:


> Thanks for correcting, but still that graph is kind of useless.


The chart wasn't provided to supply information on the processes (that's what the individual product briefs are for), it was provided to show what 28nm processes TSMC provides.


----------



## Ferrum Master (Jan 13, 2015)

HumanSmoke said:


> If AMD is working with 300W board power - even for an ES - what does that portend for a higher-tier card in the same series? When has a top-tier card used less power than the second-tier card in the same model series?
> 
> AMD had at least a hot lot of silicon at least two months ago. By your reckoning, either AMD haven't made any headway with silicon in the interim (indicating a metal layer revision), or are taking a leisurely approach in revising the silicon.



I am just trying to understand why the quote appeared on LinkedIn. Nobody says she didn't work on such a project, but nobody said what kind of tech node it used; it could be the catch, and the blooper, in this news.

The second thing is they're using GloFo now; we have no hard info on them and their silicon leakage at this stage. There may be many variables.

And on the speculation about the 380X: it's funny that it doesn't have the R9 class in front of it, isn't it?


----------



## snakefist (Jan 13, 2015)

One thing that tends to be overlooked in most comments (and even articles). 

_The silicon is cheap. The development is costly._

When an architecture manages to endure a long time with relatively small improvements (as GCN has), cards made on it make a significant profit, even with price reductions. Yesterday's flagship becomes mid-high, mid-range becomes entry, etc.

AMD offerings still generate profit, despite lowered price - and probably a good deal of it.

NVIDIA has done the same multiple times in the past - remember all the re-branding?

Not taking sides - the 970 and 980 are certainly excellent products, but they are enthusiast-level only; we have yet to see mid-range products (and we await them eagerly, if I may add).

These 'mysterious' AMD cards (I also suppose there are likely two of them) are also eagerly awaited - HBM is a technology that looks promising, but real-life tests should confirm to what extent.


----------



## dj-electric (Jan 13, 2015)

Here's my super-not-hardware-geeky answer to this storm of comments:

I don't care if it features HBM, LLP, or WTF.
If I receive a card with low noise, high performance, and a good price - I'm in.


----------



## W1zzard (Jan 13, 2015)

Dj-ElectriC said:


> If I receive a card with low noise, high performance, and a good price - I'm in.


GTX 970


----------



## dj-electric (Jan 13, 2015)

The GTX 970 is indeed a prime example.
All I need is for it to get shrunk and pack about twice the performance, and I'll make the leap.


----------



## Mathragh (Jan 13, 2015)

Dj-ElectriC said:


> The GTX 970 is indeed a prime example.
> All I need is for it to get shrunk and pack about twice the performance, and I'll make the leap.


Would also be optimal if we had more than one supplier of such a card.


----------



## GreiverBlade (Jan 13, 2015)

Ferrum Master said:


> ATI was a Canadian company... a heater during cold winters... actually a two-in-one
> 
> And they really must sell their R9 290s no matter what; unsold silicon is a bigger loss for them than silicon sold at a bargain. I bet they calculated everything as well as they could.



i am Swiss and live in the mountains ... i fit the customer profile for AMD/ATI cards (even if 44° is the max for my little princess now ... thanks, Kryografics). Thanks to that, i will be able to get a second 290 plus a PSU for less than jumping to Maxwell; even ordering another block + backplate + an auxiliary 140x60/65mm rad would put the total a bit below a 980. And no way would a 970 be tempting over an already-owned 290 (i was tempted in the beginning, but on second look it proved not to be a good idea, i.e. a side-grade).



Dj-ElectriC said:


> If I receive a card with low noise, high performance, and a good price - I'm in.





W1zzard said:


> GTX 970


well ... since i bought my 290 second-hand (not used too long, fortunately), even counting the price of the loop/block/backplate in the total, it's still under the price of a 970 for me ... so judging by the temps and silence of operation (on a 240x60mm rad), my card fits that description. But if you mean "straight out of the box," then yes, a 970 is fine. (not blaming nvidia ... blaming the greedy retailers/etailers around me)


----------



## Ferrum Master (Jan 13, 2015)

GreiverBlade said:


> get a 2nd 290



Busy putting Lollipop on my M8... And I also live further north; it's cold enough here...

But I mostly play Skyrim lately... and... you know... 2.5 years... and CFX still sucks there... 2.5 years... I've also lived with an SLI setup; I don't suggest multi-card setups to anyone anymore, except when they run a triple-monitor setup, i.e. they have no choice as it needs the horsepower.

I would sell the old one and get the 980, though...


----------



## GreiverBlade (Jan 13, 2015)

Ferrum Master said:


> Busy putting Lollipop on my M8... And I also live further north; it's cold enough here...
> 
> But I mostly play Skyrim lately... and... you know... 2.5 years... and CFX still sucks there... 2.5 years... I've also lived with an SLI setup; I don't suggest multi-card setups to anyone anymore, except when they run a triple-monitor setup, i.e. they have no choice as it needs the horsepower.
> 
> I would sell the old one and get the 980, though...


well, i said i *could* get one ... but a single 290 is way more than enough for 2015; i will jump on a second-hand 390X or 980 Ti or 1080 or whatever it will be called.

i also had a 580 SLI and tested a 7870 GHz CFX ... well, dual-card is not for me either. i am on a single 1080p monitor, so no need for more (eventually planning to go 21:9 29" for fun, or a 27" 1440p, but ... for now i am fine)


----------



## HumanSmoke (Jan 13, 2015)

Ferrum Master said:


> I am just trying to understand why the quote appeared on LinkedIn


LinkedIn is a networking and job marketplace. Given that AMD's workforce has a certain _fluidity_ to it, maybe every scintilla of CV-worthy background puts her one step closer to her next position.
OTOH, maybe she, like many normally level-headed people, suddenly turns into a gabbling fool as soon as she's let loose on a social networking site.


Ferrum Master said:


> Nobody says she didn't work on such a project, but nobody said what kind of tech node it used; it could be the catch, and the blooper, in this news. The second thing is they're using GloFo now; we have no hard info on them and their silicon leakage at this stage. There may be many variables.


Undoubtedly. Of the obvious candidates (assuming UMC's 28nm HLP/HPM isn't a consideration):
TSMC 28nm HPL/HPC  - both mature with good yields
TSMC 20nm SOC - Very unlikely for a large chip with a large power budget
GloFo 28nm SHP - Already in use for GCN ( Kaveri APU), and reported process for AMD GPUs in 2015
GloFo 20nm LPM - Very unlikely for a large chip with a large power budget, and reportedly being sidelined by GloFo as it concentrates on licensed 14nm-XM.


Ferrum Master said:


> And on the speculation about the 380X: it's funny that it doesn't have the R9 class in front of it, isn't it?


Maybe it's just a timesaving abbreviation. I know I do it myself, as do many others, so I would expect engineers who deal with the nomenclature constantly to do likewise.


----------



## Ferrum Master (Jan 13, 2015)

HumanSmoke said:


> maybe she, like many normally level-headed people, suddenly turns into a gabbling fool as soon as she's let loose on a social networking site.



That may be the sad story indeed.



HumanSmoke said:


> Maybe it's just a timesaving abbreviation. I know I do it myself, as do many others, so I would expect engineers who deal with the nomenclature constantly to do likewise.



I am an engineer myself, and I seldom fire off shortened part numbers in the area I work in and specialize in, because it causes so many misunderstandings; we tend to correct each other in those cases and even note specific revisions on each part number (as the number alone is not enough). I might have used a codename, but yes, it may be an option too...


----------



## trenter (Jan 13, 2015)

"Fiji could feature TDP hovering the 300W mark, because AMD will cram in all the pixel-crunching muscle it can, at the expense of efficiency from other components, such as memory." Could you be any more obvious with your bias? You don't know if the tdp will be 300w, and you certainly don't know what reason it may have for reaching that tdp. How about you wait until it has been tested before spouting nonsense.


----------



## Jorge (Jan 13, 2015)

AMD has a number of options for production of these cards which most people will be very happy with. There is a lot more than just stacked RAM and a smaller node to be had.


----------



## Steevo (Jan 13, 2015)

trenter said:


> "Fiji could feature TDP hovering the 300W mark, because AMD will cram in all the pixel-crunching muscle it can, at the expense of efficiency from other components, such as memory." Could you be any more obvious with your bias? You don't know if the tdp will be 300w, and you certainly don't know what reason it may have for reaching that tdp. How about you wait until it has been tested before spouting nonsense.




So you haven't read the sources, and instead joined to attack the wording of the report?

Good job.


----------



## HumanSmoke (Jan 13, 2015)

Jorge said:


> AMD has a number of options for production of these cards which most people will be very happy with. There is a lot more than just stacked RAM and a smaller node to be had.


Thanks for vague unquantifiable promises. I foresee a bright future for you at AMD's PR department.


----------



## Lionheart (Jan 14, 2015)

trenter said:


> "Fiji could feature TDP hovering the 300W mark, because AMD will cram in all the pixel-crunching muscle it can, at the expense of efficiency from other components, such as memory." Could you be any more obvious with your bias? You don't know if the tdp will be 300w, and you certainly don't know what reason it may have for reaching that tdp. How about you wait until it has been tested before spouting nonsense.



Lolz wtf?


----------



## GhostRyder (Jan 14, 2015)

trenter said:


> "Fiji could feature TDP hovering the 300W mark, because AMD will cram in all the pixel-crunching muscle it can, at the expense of efficiency from other components, such as memory." Could you be any more obvious with your bias? You don't know if the tdp will be 300w, and you certainly don't know what reason it may have for reaching that tdp. How about you wait until it has been tested before spouting nonsense.


I am pretty sure the thread OP has no bias and only based this on the source, so I would think twice if I were you before pinning an opinion like that on him.


----------



## ZoneDymo (Jan 14, 2015)

trenter said:


> "Fiji could feature TDP hovering the 300W mark, because AMD will cram in all the pixel-crunching muscle it can, at the expense of efficiency from other components, such as memory." Could you be any more obvious with your bias? You don't know if the tdp will be 300w, and you certainly don't know what reason it may have for reaching that tdp. How about you wait until it has been tested before spouting nonsense.



There are fanboys, and then there are guys who post responses like yours....


----------



## LAN_deRf_HA (Jan 14, 2015)

This is becoming a trend: AMD goes all out, then nvidia reacts with something they've been holding back. It seems like something has to give here. Hopefully getting stuck at 28nm screws with nvidia a little more than AMD, and they can gain some ground.


----------



## RejZoR (Jan 14, 2015)

I was thinking of just ordering a GTX 970, but now I'm hesitating again. Argh.


----------



## buggalugs (Jan 14, 2015)

This news came out a while ago. The design stage of these cards is past; they must be in manufacturing right now, with samples floating around in weeks...


----------



## hardcore_gamer (Jan 14, 2015)

Yes, but can it play at 4K ?


----------



## RejZoR (Jan 14, 2015)

hardcore_gamer said:


> Yes, but can it play at 4K ?



Most likely yes. With such memory it will have tons of bandwidth to support it. It's just up to the GPU design to utilize it now...


----------



## Vayra86 (Jan 14, 2015)

Not sure if anyone here mentioned this, but 300W board power is a ceiling more than anything, because GPU designs generally don't go past 300W due to PCIe and power-supply limitations.

To be honest I am not expecting miracles, if Tonga was anything to go by.
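For context, the 300W ceiling discussed here falls out of the PCI-SIG connector budget (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin); a quick sketch of the arithmetic for a typical high-end 6-pin + 8-pin card:

```python
# PCI-SIG power delivery limits, in watts
SLOT = 75       # PCIe x16 slot
SIX_PIN = 75    # 6-pin auxiliary connector
EIGHT_PIN = 150 # 8-pin auxiliary connector

# Common high-end single-GPU configuration: slot + one 6-pin + one 8-pin
budget = SLOT + SIX_PIN + EIGHT_PIN  # 300 W, the in-spec ceiling
```

Going past that means either adding connectors (as dual-GPU boards do) or drawing out of spec, which is why 300W keeps coming up as the practical limit.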


----------



## RCoon (Jan 14, 2015)

Vayra86 said:


> GPU designs generally don't go past 300W due to PCIe and power-supply limitations



The 295X2 obliterated all PCI-E power specifications and recommendations, so if they really wanted to, they could go past 300W.


----------



## Vayra86 (Jan 14, 2015)

RCoon said:


> 295x2 obliterated all PCI-E power specifications and recommendations, so if they really wanted to, they would go past 300W.



Yes, but that's a dual-GPU card...

The issue is that if you push a single GPU past the 300W mark, many people will run into issues with their power supplies, for example. It hurts sales; many systems would be incompatible.


----------



## RCoon (Jan 14, 2015)

Vayra86 said:


> Yes, but that's a dual-GPU card...
> 
> The issue is that if you push a single GPU past the 300W mark, many people will run into issues with their power supplies, for example. It hurts sales; many systems would be incompatible.



The stock 290X hits 282W peak and surpasses 300W in Furmark (unrealistic). That didn't hurt sales too much, and it was only 18W below 300 on a single-GPU card. They can get away with it, just not much higher.


----------



## rtwjunkie (Jan 14, 2015)

GhostRyder said:


> I am pretty sure the thread OP has no bias and only based this on the source, so I would think twice if I were you before pinning an opinion like that on him.



Well, you KNOW there is always an extreme fanboy who joins just to troll and dump on a thread, completely unaware that bta is not biased.  Happens on both sides of the fence, sadly, depending whether the news is about the green side or red side.


----------



## W1zzard (Jan 14, 2015)

RCoon said:


> The stock 290X hits 282W peak and surpasses 300W in Furmark (unrealistic). That didn't hurt sales too much


till the day nvidia released gtx 970 and 980


----------



## RCoon (Jan 14, 2015)

W1zzard said:


> till the day nvidia released gtx 970 and 980



Either AMD addresses performance, or they address power consumption. I don't imagine them addressing both in full.


----------



## RejZoR (Jan 14, 2015)

W1zzard said:


> till the day nvidia released gtx 970 and 980



You can't base that conclusion on cards whose releases are about a year apart...


----------



## the54thvoid (Jan 14, 2015)

RejZoR said:


> You can't base that conclusion on cards whose releases are about a year apart...



Yes, you can.  Sales are based on what each company has on offer right now, not what generation or release date they are.  The 290X is AMD's best card right now.  The GTX 980 is Nvidia's best card (sort of).  It's not an issue of which is newer.

It is AMD's problem that they don't have a performance competitor (on perf/watt), not the market's.  FWIW, I think their next card should hit the mark, based on the rumours so far.  I think it may be as fast as GM200, but it will consume more power.  But if it's a faster card and better at 4K, power draw be damned.  All that said, it's only my opinion.


----------



## 64K (Jan 14, 2015)

the54thvoid said:


> Yes, you can.  Sales are based on what each company has on offer right now, not what generation or release date they are.  The 290X is AMD's best card right now.  The GTX 980 is Nvidia's best card (sort of).  It's not an issue of which is newer.
> 
> It is AMD's problem that they don't have a performance competitor (on perf/watt), not the market's.  FWIW, I think their next card should hit the mark, based on the rumours so far.  I think it may be as fast as GM200, but it will consume more power.  But if it's a faster card and better at 4K, power draw be damned.  All that said, it's only my opinion.



Power draw is irrelevant to me as well. Even if my next card drew 300 watts, which is about 150 watts more than my present card, it wouldn't amount to much. I game an average of about 15 hours a week and my electricity costs 10 cents per kWh, so the difference would be a little less than $1 a month on my bill. What can you buy with $1 these days? A pack of crackers at a convenience store, I guess.
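The back-of-the-envelope math above checks out; reproducing it with the stated numbers (150 W extra, 15 hours/week, $0.10 per kWh):

```python
extra_watts = 150      # additional draw vs. the current card
hours_per_week = 15    # average gaming time
usd_per_kwh = 0.10     # quoted electricity price

# Monthly extra energy: watts -> kW, times hours per month (52 weeks / 12 months)
kwh_per_month = extra_watts / 1000 * hours_per_week * 52 / 12  # ~9.75 kWh
usd_per_month = kwh_per_month * usd_per_kwh                    # ~$0.98, just under $1
```

At double or triple the electricity price the conclusion barely changes: a couple of dollars a month on a high-end card's budget.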


----------



## lZKoce (Jan 14, 2015)

64K said:


> Power draw is irrelevant to me as well. Even if my next card drew 300 watts, which is about 150 watts more than my present card, it wouldn't amount to much. I game an average of about 15 hours a week and my electricity costs 10 cents per kWh, so the difference would be a little less than $1 a month on my bill. What can you buy with $1 these days? A pack of crackers at a convenience store, I guess.



+1, excellent argument. Love it when numbers speak.


----------



## GhostRyder (Jan 14, 2015)

rtwjunkie said:


> Well, you KNOW there is always an extreme fanboy who joins just to troll and dump on a thread, completely unaware that bta is not biased.  Happens on both sides of the fence, sadly, depending whether the news is about the green side or red side.


Yeah, and it gets quite old, especially when you get people intentionally talking/joking to cook up an argument.  It makes the thread cluttered and hard to get good information from.



Vayra86 said:


> Yes, but that's a dual-GPU card...
> 
> The issue is that if you push a single GPU past the 300W mark, many people will run into issues with their power supplies, for example. It hurts sales; many systems would be incompatible.


The TDP of an R9 290X is 290W according to the TechPowerUp database, so 10 more watts is not much of a difference.  On top of that, TDP is usually not representative of actual power usage anyway: the card uses less than 290 watts under load depending on fan speed (I'm speaking of the reference card), though the fans don't account for much of the power usage, and the GPGPU loads where I see these figures climb toward ~290W are not representative of real-world use most of the time.  Either way, I don't think we'll be sweating too much over a little bit higher.

I think focusing on performance is better in many ways, as long as you don't hit a point that is beyond crazy; otherwise you nullify part of your market by making them invest in more expensive parts to run already expensive parts.  Power draw matters to me, but more for the mobile class than anything, as the higher class of cards is aimed at the extreme range of setups (higher resolutions, Surround/Eyefinity, 3D, etc.).



64K said:


> Power draw is irrelevant to me as well. Even if my next card drew 300 watts, which is about 150 watts more than my present card, it wouldn't amount to much. I game an average of about 15 hours a week and my electricity costs 10 cents per kWh, so the difference would be a little less than $1 a month on my bill. What can you buy with $1 these days? A pack of crackers at a convenience store, I guess.


Bingo. Though to be fair, some places do have very high electricity costs compared to you or me, so I can see it somewhat, but even then, unless you're stressing your computer 24/7, it won't amount to much.


----------



## Ionut B (Jan 14, 2015)

All I want to say is, I think they should bring the temps down. I don't really care if they designed the chip to run at 100 degrees Celsius; Nvidia did the same with the 5xx series, which ran really hot. They forget I have other components in my system which may be affected by the GPU's high temperatures.
So, high temps mean more heat, which means less OC headroom for my CPU. All on air, btw.
80 is the maximum acceptable, imo.


----------



## TRWOV (Jan 14, 2015)

LocutusH said:


> These AMD hyper-space-nextgen-dxlevel13123-gcn technologies look so good on paper, but somehow always fail to show their strength when it comes to real world games after release...



If history repeats itself, HBM will become the new GPU memory standard, just like GDDR3 and GDDR5 did in the past (GDDR4 was kind of a misstep for ATi, and only they used it). I would say those are more than proven successes for ATi's R&D.  Unified shaders and tessellation also caught on, although ATi's tessellation engine (TruForm) didn't become the standard.


----------



## Sasqui (Jan 14, 2015)

Ionut B said:


> All I want to say is, I think they should bring the temps down. I don't really care if they designed the chip to run at 100 degrees Celsius. Nvidia did the same with the 5xx series, which ran really hot. They forget I have other components in my system which may be affected by the high temperatures of the GPU.
> So, high temps means more heat means less OC for my CPU. All on air btw.
> 80 is the maximum acceptable imo.



The 290x reference cooler is/was a complete piece of crap, both in design and manufacturing.


----------



## Pehla (Jan 14, 2015)

I'm beginning to believe in conspiracy theory!! Maybe Intel and/or Nvidia is paying those fabs not to give AMD a die shrink!! I mean, everyone else does it..., but not AMD...
Samsung goes to freaking 14nm..., Intel shrinks as well, Nvidia..., but not AMD..., there is something really weird in that picture!!
But don't judge me..., it's just a theory..., something that crossed my mind.


----------



## xvi (Jan 14, 2015)

Ionut B said:


> All I want to say is, I think they should bring the temps down. I don't really care if they designed the chip to run at 100 degrees Celsius. Nvidia did the same with the 5xx series, which ran really hot. They forget I have other components in my system which may be affected by the high temperatures of the GPU.
> So, high temps means more heat means less OC for my CPU. All on air btw.
> 80 is the maximum acceptable imo.


Eeehhh.. If I recall correctly, someone (I think Intel) has been doing research on high-temperature computing, with the theory that it may be cost-effective to design products that can run safely at rather high temperatures, the intended benefit being that the components become easier (or rather, cheaper) to cool. Imagine a CPU that throttled at, say, 150 °C and could run quite happily at 120 °C. The amount of thermal energy a heatsink dissipates increases with the thermal delta, so what if we increased that delta by making the hot side hotter? If AMD's FX chips and Intel's 46xx/47xx chips could run at those temps, we could probably use the stock cooler to achieve the same overclocks we see on high-end air, and high-end air in turn could push into new territory.

The problem with products that run hot isn't that they were designed to run hot, but more accurately that they were designed to run so close to their thermal limits. If those nVidia cards could run at 150 °C, they'd just turn down the fan speed and most everyone would be happy.


RejZoR said:


> I was thinking of just ordering a GTX 970, but now I'm hesitating again. Argh.


Exact same situation for me. I've been considering a 970, but only because it does really well in the one game I want to play and it plays nicely with some backlighting hardware I have. I'd prefer AMD, but even if the new card performs below expectations, at the very least, it should bump the GTX 970's price down.


----------



## W1zzard (Jan 14, 2015)

If you have a 200W heat load, the heat output to your system/room is the same (200W), no matter if the card is running cool but with high fan speed or warm with low fan speed.



64K said:


> my electricity costs


You still have heat dumped into your room / high fan noise


----------



## RejZoR (Jan 14, 2015)

I have a ridiculously low custom fan speed profile on my HD7950, so it's absolutely silent. It runs hot, but it's silent. So I frankly don't really care what TDP it has, as long as the cooler can deal with it at low RPM. Which means my next card will be a WindForce 3X again for sure.


----------



## FordGT90Concept (Jan 14, 2015)

We need to know more about the performance before we can judge whether 300 W is a bad thing or not.  If it has 3-5 times the performance of a 290X, I'd argue that power isn't going to waste.  When you can get one 300 W card that replaces two 200 W cards, I'd call that a win.


----------



## Sasqui (Jan 14, 2015)

FordGT90Concept said:


> If it has 3-5 times the performance of a 290X,



Very doubtful it's anywhere close to that magnitude (HBM only implies higher memory bandwidth)... unless they are talking about an architecture change or a seriously higher clock on the GPU, it'll probably be on the order of a 10%-25% improvement.   Just guessin'.


----------



## HumanSmoke (Jan 14, 2015)

RejZoR said:


> You can't base the findings on the fact that GPU's are like 1 year apart...


They both compete in the same market at the same time, and both are current (non-EOL) - therefore they can.

By your reasoning, Intel's latest 2-3 platform offerings shouldn't have reviews including AMD FX and 990X chipsets for comparison, since the AMD platform is over 2 (Vishera) and 3 (900-series chipset) years old.


RejZoR said:


> hardcore_gamer said:
> 
> 
> > Yes, but can it play at 4K ?
> ...


Bandwidth is only half the equation. HBM is limited to 4GB of DRAM in its first generation. Are you confident that 4GB is capable of holding the textures in all scenarios for 4K gaming?


Sasqui said:


> Very doubtful anywhere close to that magnitude (it only implies higher memory bandwidth) ...unless they are talking about an architecture change or serious higher clock on the GPU, it'll probably be on the order of 10%-25% improvement.  Just guessin'


That is likely a fairly low estimate, IMO. If the quoted numbers are right, Fiji has 4096 cores, which is a 45% increase over Hawaii. The wide memory I/O afforded by HBM, in addition to colour compression, should add further, as would any refinement in the caching structure - as was the case between Kepler and Maxwell - assuming it was accorded the priority that Nvidia's architects gave their project.


----------



## FordGT90Concept (Jan 14, 2015)

Sasqui said:


> Very doubtful anywhere close to that magnitude (it only implies higher memory bandwidth) ...unless they are talking about an architecture change or serious higher clock on the GPU, it'll probably be on the order of 10%-25% improvement.   Just guessin'


HBM should mean a smaller die is required to connect the memory, which translates to a lower TDP.  The TDP growth is not coming from the HBM; it is coming from elsewhere.

If the leaked information is to be believed, it has double the stream processors of the 280X, a 17% higher clock speed, and more than double the memory bandwidth.


----------



## Casecutter (Jan 14, 2015)

Correct me if I'm wrong: the flagship graphics card, codenamed "Fiji," would vie with the GM200 as the 390/390X, then "Bermuda" is said to become the 380/380X and vie with the 970/980, correct?

First, how does btarunr come up with _"Despite this, "Fiji" could feature TDP hovering the 300W mark..."_?  The source article said "the world’s first 300W *2.5D* discrete GPU SOC using stacked die High Bandwidth Memory and silicon interposer." Honestly, that doesn't sound like anything more than an internal engineering project rather than any inference to "Fiji."   It appears to be pure speculation/assumption, not grounded in any evidence of an imminent consumer product release.

I also would like btarunr to expound on _"despite slow progress from foundry partner TSMC to introduce newer silicon fabs"_, as that doesn't seem to come from the linked article.  It seems a slam on AMD for not having something for what is now four months since the 970/980?
We know that TSMC, "as normal," affected both companies' abilities; Nvidia basically had to hold to 28nm on mainstream parts, and possibly so will AMD for "Bermuda."  Is saying what he does a hint that there's some use of 16nm FinFET for the "flagship graphics" cards from both or either? (I don’t think that's going out on a limb.)  With 16nm FinFET, it would be strange for either side to really need to push the 300W envelope.  I believed AMD learned that approaching 300W is just too much for the thermal effectiveness of most reference rear-exhaust coolers (Hawaii).

Despite many rumors of that mocked-up housing, I don’t see AMD releasing a single-card "reference water" type cooler for their initial "Fiji" release; reference air cooling will remain.  I don't discount that they could provide a "Gaming Special" to gauge the market reaction to a "reference water" cooler as things progress, but not as the primary option.


----------



## HumanSmoke (Jan 14, 2015)

FordGT90Concept said:


> HBM should mean smaller die required too connect the memory which translates to lower TDP.


I think you missed the point of HBM. The lower power comes about due to the lower speed of the I/O (which is more than offset by the increased width). GDDR5 presently operates at 5-7 Gbps/pin; HBM as shipped now by Hynix operates at 1 Gbps/pin.



FordGT90Concept said:


> The TDP growth is not coming from the HBM, it is coming from elsewhere.


Maybe the 45% increase in core count over Hawaii?


Casecutter said:


> Correct me if wrong the Flagship graphics card, codenamed "Fiji" would via the GM200 as the 390/390X, then "Bermuda" is said to become the 380/380X and via the 970/980 correct?


There seem to be two schools of thought on that. Original roadmaps point to Bermuda being the second-tier GPU, but some sources are now saying that Bermuda is some future top-tier GPU on a smaller process. The latter raises the question: if so, what will be the second tier when Fiji arrives? Iceland is seen as entry level, and Tonga/Maui will barely be mainstream. There is a gap, unless AMD are content to sell Hawaii in the $200 market.


Casecutter said:


> So where does btarunr come up with, _"Despite this, "Fiji" could feature TDP hovering the 300W mark..."_?  It appears to be pure speculation/assumption, not grounded in any evidence?


I would have thought the answer was pretty obvious. btarunr's article is based on a Tech Report article (which is referenced as source). The Tech Report article is based upon a 3DC article (which they linked to) which does reference the 300W number along with other salient pieces of information.


Casecutter said:


> I also would like btarunr to expound on _"despite slow progress from foundry partner TSMC to introduce newer silicon fabs"_.With a 16nm FinFET would be strange for either side to really need push the 300W envelope?


TSMC aren't anywhere close to volume production of 16nmFF required for large GPUs (i.e. high wafer count per order). TSMC are on record themselves as saying that 16nmFF / 16nmFF+ will account for *1%* of manufacturing by Q3 2015.


Casecutter said:


> I believed AMD learned that approaching 300W is just too much for the thermal effectiveness of most reference rear exhaust coolers (Hawaii).


IDK about that. The HD 7990 was pilloried by review sites, the general public, and most importantly, OEMs, for power/noise/heat issues. It didn't stop AMD from going one better with Hawaii/Vesuvius. If AMD cared anything for heat/noise, why saddle the reference 290/290X with a pig of a reference blower design that was destined to follow the HD 7970 as the biggest example of GPU marketing suicide in recent times?
Why would you release graphics cards with little or no inherent downsides from a performance perspective, with cheap-ass blowers that previously invited ridicule? Nvidia proved that a blower design doesn't have to be some Wal-Mart-looking, Pratt & Whitney-sounding abomination as far back as the GTX 690, yet AMD hamstrung their own otherwise excellent product with a cooler guaranteed to cause a negative impression.


Casecutter said:


> Despite many rumors of that mocked up housing, I don’t see AMD releasing a single card "reference water" type cooler for their initial "Fiji" release, reference air cooling will maintain.  I don't discount they could provide a "Gaming Special" to find the market reaction as things progress for a "reference water" cooler, but not primarily.


That reference design AIO contract Asetek recently signed was for $2-4m. That's a lot of AIO's for a run of "gaming special" boards don't you think?


----------



## RejZoR (Jan 14, 2015)

How exactly was HD7970 a GPU marketing suicide? HD7970 was awesome and still is considering its age.


----------



## FordGT90Concept (Jan 14, 2015)

HumanSmoke said:


> I think you missed the point of HBM. The lower power comes about due to the lower speed of the I/O ( which is more than offset by the increased width). GDDR5 presently operates at 5-7Gbps/pin. HBM as shipped now by Hynix is operating at 1Gbps/pin


It's not per pin.  It's 128 GiB/s per HBM chip with up to 1 GiB density.


----------



## HumanSmoke (Jan 14, 2015)

RejZoR said:


> How exactly was HD7970 a GPU marketing suicide?


As I said:


HumanSmoke said:


> Why would you release graphics cards with little of no inherent downsides from a performance perspective, with cheap-ass blowers that previously invited ridicule?


Of the "cons" outlined in the reference card review, price was what AMD could charge, perf/watt was a necessary trade off for compute functionality, and PowerTune/ZeroCore weren't a big influence which leaves...





Now, are you going to tell me that the largest negative gleaned from reviews, users, and tech site/forum feedback WASN'T due to the reference blower shroud?
Do you not think that if AMD had put more resources into putting together a better reference cooling solution that the overall impression of the reference board - THE ONLY OPTION AT LAUNCH - might have been better from a marketing and PR standpoint? How many people stated that they would only consider the HD 7970 once the card was available with non-reference cooling - whether air or water?


----------



## HumanSmoke (Jan 14, 2015)

FordGT90Concept said:


> That's backwards.  128 GiB/s per 1 Gb (128 MiB) chip.  4 of them stacked up gets 4 Gb (512 MiB) and 512 GiB/s effective rate.    Stick 8 of those on the card and you still have 512 GiB/s and 4 GiB of RAM or be ridiculous and stick 16 of them on the card on two memory controllers for 1 TiB/s and 8 GiB of RAM.


FFS. First generation HBM is limited to four 1GB stacks (256MB * 4 layers)


FordGT90Concept said:


> It's not pin.  It's 128 GiB/s per HBM chip with up to 1 GiB density.


I was referring to the effective data rate (also see the slide above). Lower effective memory speed = lower voltage = lower power envelope, as SK Hynix's own slides show.
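The per-pin and per-stack numbers being argued back and forth here reconcile once bus width is factored in. A quick sketch using the figures quoted in this thread (1 Gbps/pin over a 1024-bit first-gen HBM stack interface; 5 Gbps/pin GDDR5 on the 290X's 512-bit bus), not official spec values:

```python
# Bandwidth = bus width (pins) x data rate per pin; divide by 8 for bytes.
def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8

hbm_stack  = bandwidth_gbs(1024, 1.0)  # one HBM1 stack -> 128 GB/s
hbm_card   = 4 * hbm_stack             # four stacks -> 512 GB/s
gddr5_290x = bandwidth_gbs(512, 5.0)   # 290X-style GDDR5 -> 320 GB/s

print(hbm_stack, hbm_card, gddr5_290x)
```

So HBM's 1 Gbps/pin and "128 GB/s per stack" describe the same thing: the slow, very wide interface is what lets the voltage drop while aggregate bandwidth still goes up.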





EDIT: Sorry about the double post. Thought I was editing the one above.


----------



## Xzibit (Jan 14, 2015)

HumanSmoke said:


> That reference design AIO contract Asetek recently signed was for $2-4m. That's a lot of AIO's for a run of "gaming special" boards don't you think?



Maybe they plan on AIO everything from now on... It does look like Asetek with sleeves on the tubes.





*EDIT*:





The release of this card also lines up with the Asetek announcement.  Not saying AMD won't have an AIO cooler, but at least with EVGA we have proof in a product.


----------



## HumanSmoke (Jan 14, 2015)

Xzibit said:


> Maybe they plan on AIO everything from now on... It does look like Asetek with sleeves on the tubes.


Asetek cooling does seem like the new black. I think you're right that the EVGA card is using an Asetek cooler, judging by comments on the EVGA forum and views of the card without the shroud in place.
If Asetek's cooling becomes the de facto standard for AMD's reference cards, it stands to reason that others will follow suit. To my eyes it certainly looks cleaner than Arctic's hybrid solution - but then, I'm not a big fan of Transformers movies either.


Xzibit said:


> The release of this card also lines up with the Asetek announcement.  Not saying AMD wont have a AIO cooler but at least with EVGA we have proof in a product.


Well, the Asetek announcement for the $2-4m contract specifies an OEM (Nvidia or AMD), not an AIB/AIC, so chances are the EVGA contract isn't directly related, any more than Sycom or any of the other outfits adding Asetek units to their range. The fact that the card pictured is an Nvidia reference GTX 980 rather than an EVGA-designed product would also tend to work against the possibility.
Having said that, I'm sure EVGA would love to have sales that warrant committing to a seven-figure contract for cooling units for a single SKU.


----------



## FordGT90Concept (Jan 14, 2015)

HumanSmoke said:


> FFS. First generation HBM is limited to four 1GB stacks (256MB * 4 layers)
> 
> 
> 
> ...


Better document:
https://hpcuserforum.com/presentations/seattle2014/IDC_AMD_EmergingTech_Panel.pdf
819.2 Mb/s to 1,228.8 Mb/s

The fourth slide shows 30w for HBM vs 85w for GDDR5.

Edit: From what I gather, the power savings come from the memory logic sitting on the package rather than out on the board.  Everything doesn't have to travel as far to get what it needs, and that substantially cuts power requirements in addition to improving performance by way of reduced latency.


----------



## HumanSmoke (Jan 14, 2015)

FordGT90Concept said:


> Edit: From what I gather, the power savings come from the logic board being on the chip rather than off chip.  Everything doesn't have to go as far to get what it needs and that substantially cuts power requirements in addition to improving performance by way of reducing latency.


The power savings are certainly helped by moving off-die to the interposer, as is latency (although trace-distance latency is minor compared to the larger decrease due to the slower data rate; latency increases with data rate - for example, CAS 3 or 4 is common for DDR2, while DDR3 (the basis for GDDR5) is closer to 8-10 cycles).
The large power savings are also data-rate related (as Hynix themselves highlight). It is no coincidence that vRAM started to take a significant portion of the total board power budget (~30%) with the advent of faster GDDR5 running in excess of 5 Gbps, or the corresponding rise of LPDDR3 and 4 for system RAM as data rates increased and the need to reduce voltage became more acute.
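The ~30% figure lines up with the slide numbers quoted earlier in the thread (85 W for GDDR5 vs 30 W for HBM); the ~290 W total board budget below is my illustrative assumption for a Hawaii-class card, not a measured value:

```python
# Memory's share of total board power: slide figures quoted upthread
# (GDDR5 85 W, HBM 30 W) against an assumed ~290 W board budget.
board_w = 290
gddr5_w, hbm_w = 85, 30

print(f"GDDR5 share: {gddr5_w / board_w:.0%}")  # ~29%
print(f"HBM share:   {hbm_w / board_w:.0%}")    # ~10%
```

That 55 W delta is roughly what lets a card spend more of a fixed 300 W envelope on the GPU itself.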


FordGT90Concept said:


> Better document:
> https://hpcuserforum.com/presentations/seattle2014/IDC_AMD_EmergingTech_Panel.pdf


I think you'll find that SK Hynix's own presentation (PDF) is somewhat more comprehensive.


----------



## RejZoR (Jan 15, 2015)

HumanSmoke said:


> As I said:
> 
> Of the "cons" outlined in the reference card review, price was what AMD could charge, perf/watt was a necessary trade off for compute functionality, and PowerTune/ZeroCore weren't a big influence which leaves...
> 
> ...



Then how come my HD7950 is the quietest card I've ever owned? I had it clocked at 1175/7000 and it was pretty much inaudible even during gaming. Pay that extra 20 bucks and get a card with a proper heatsink, not that crap blower heatsink, and every card will be silent. Without exceptions.


----------



## HumanSmoke (Jan 15, 2015)

RejZoR said:


> Then how come my HD7950 is the most silent card I've ever owned? I had it clocked at 1175/7000 and it was pretty much inaudible even during gaming? Pay that extra 20 bucks and take a card with proper heatsink and *not that crap blower heatsink* and every card will be silent. Without exceptions.


Given that you also seem to think the reference blower is crap, *what part of this did you not understand? :*


HumanSmoke said:


> Now, are you going to tell me that the largest negative gleaned from reviews, users, and tech site/forum feedback WASN'T due to the *reference blower shroud*?
> Do you not think that if AMD had put more resources into putting together *a better reference cooling solution* that the overall impression of the *reference board - THE ONLY OPTION AT LAUNCH* - might have been better from a marketing and PR standpoint ? How many people stated that *they would only consider the HD 7970 once the card was available with non-reference cooling *- whether air or water?



The whole point I made was that an otherwise excellent board* was let down by an ill-executed REFERENCE COOLER. What could have been a near-perfect product was deliberately sabotaged by its own manufacturer with a substandard cooler... a point that you just AGREED WITH, but are still arguing against?!? How about stepping away from the keyboard, taking a deep breath, and counting to potato.

This whole point seems totally lost on you, and quite frankly I don't have any interest in further reiterating what I've already written. You're either trolling or have a poor understanding of written English - neither of which I can influence. The subject is closed as far as I'm concerned.



*





HumanSmoke said:


> Why would you release graphics cards *with little of no inherent downsides from a performance perspective*, with cheap-ass blowers...


----------



## RejZoR (Jan 15, 2015)

Sorry for missing a few words, for fuck's sake; it's not like the world will end because of it, jesus.


----------



## HammerON (Jan 15, 2015)

Alright, chill out and be respectful. Please do not use offensive language.
It also would not hurt to stop challenging every comment/opinion made, as that is exactly what they are...
Opinions!!!


----------



## GhostRyder (Jan 15, 2015)

RejZoR said:


> Then how come my HD7950 is the most silent card I've ever owned? I had it clocked at 1175/7000 and it was pretty much inaudible even during gaming? Pay that extra 20 bucks and take a card with proper heatsink and not that crap blower heatsink and every card will be silent. Without exceptions.





RejZoR said:


> Sorry for missig few words for fucks sake, it's not like world will end because of it, jesus.


I would cease arguing on this point; as stated above, if you have a good opinion on an AMD product you're going to have to fight certain people, as that's against the rules.  The new rules state you have to make a joke on every article with a modified AMD logo in a picture to be part of the thread; otherwise you spend pages arguing frivolous points.

Though the blowers are not amazing, they are not the worst thing ever; honestly, I have seen some of the HD 7970 variants with blowers, and even left on auto they are not as loud as I feel they are made out to be, though they definitely want aftermarket coolers.  Also, the HD 7970 GHz Edition was launched later (June vs. January) and was not the original launch version; even though some models still had the standard blower, aftermarket variants with the same clocks or more were already available at the time, as the only major point was to bring the clocks up.


Xzibit said:


> Maybe they plan on AIO everything from now on... It does look like Asetek with sleeves on the tubes.
> 
> 
> 
> ...


I think Asetek/AIOs in general are becoming the new thing, mostly because the tech is finally available cheaply enough while providing something actually worthwhile.  By that I mean it's reliable enough to trust as a standard cooler, to the point where you do not have to worry that half of them will be returned for faulty pumps or otherwise.  In the past, the cards with AIOs I can remember are mostly the PNY LC GTX 580 (both variants) and the Zotac GTX 580.   The PNY was a good card (I had a pair of those), but from what I heard and saw the Zotacs were completely unreliable, as most of the feedback on sites that sold them mentioned the pump failing after only a month or so of use (though the Zotac was a full-cover-block type of AIO).  I think it's the next step people are taking because of the ever-growing demand for silence, overclocking becoming so simple, boost clocks becoming so adaptive and reliant on temps, and the fact that cases are so watercooling-ready that size is easier to accommodate versus having a massive tri-slot cooler or similar (massive dual-slot cards).

I like it honestly, but at the same time I just think it's more of a pain for people who like to put their own custom coolers on (like me, with waterblocks); but as long as the costs do not skyrocket, it's better in the end for everyone.


----------



## RejZoR (Jan 15, 2015)

The only problem I'll have here is my miniATX case, in which the CPU radiator is already taking up space. In theory I could place one on the exhaust, but it's gonna be tight. So hopefully the R9-380X/390X will also come with a WindForce 3X cooler, which I think should be sufficient.


----------



## GhostRyder (Jan 15, 2015)

RejZoR said:


> The only problem I'll have here is the miniATX case in which CPU radiator is already taking place. In theory I could place one on the exhaust, but it's gonna be tight. So, hopefully R9-380X/390X will also come with WindForce3X cooler which I think should be sufficient.


They will of course, since not everyone wants an AIO, though it might be a little while before those versions come out.  I think the AIO is cool, but the need for air coolers will always be there, so they won't ever give that up; we will always see something air-wise unless there is some dramatic change no one can see coming.  I love the WindForce coolers, especially the new ones; if nothing else, they just look fantastic!


----------



## Fluffmeister (Jan 16, 2015)

Loving my 970, but I'd happily go AMD again if this paper tiger can deliver. Now AMD and their Radeon brand are firmly considered the budget option these days... any chance I could pick this bad boy up for $300?


----------



## The N (Jan 18, 2015)

Well, $300 may be possible at the initial launch. But AMD never wants to go further into loss, as they are right now - suffering too much in the market, with even lower sales due to the 970/980.  $300 would be an ideal price for the consumer. I personally think the 380X, "The BAD BOY," will blast the market with a price/performance ratio near or equal to Maxwell's.


----------



## HumanSmoke (Jan 18, 2015)

Purported Fiji 3DMark 11 benchmark (via Chiphell)

About the same score as a well OC'ed (water) 780 Ti


----------



## Xzibit (Jan 18, 2015)

A card with that much performance would be well north of $1,000

The stock version of EVGA's 780 Ti Kingpin Edition wasn't that much better than their FTW.

You would need
EVGA GTX 780 Ti kingpin edition $850 MSRP
450watts
L2N Bios
Added expense of a custom water loop $$
Hope you get a 661 base overclock

If this is even true and it comes in a stock variant from either side I feel a few wallets getting lighter.


----------



## Fluffmeister (Jan 18, 2015)

Xzibit said:


> I feel a few wallets getting lighter.



Let's hope so, it's sad to see AMD in dire straits.

Besides, competition as we all know is healthy, and whilst I wouldn't mind another 970, the bloody things keep going up in price.


----------



## HumanSmoke (Jan 18, 2015)

Xzibit said:


> A card with that much performance would be well north of $1,000
> 
> Stock version of EVGA 780 Ti Kingpin Edition wasn't that much better then there FTW.
> You would need
> ...


Or you could bang a GTX 980 into a watercooling loop for much the same score....or amp up the clocks for 8327.

Either way, you'd expect the next large die GPUs to do significantly better than the ones presently in use. Once AMD and Nvidia tune their respective top tier GPUs (yield/clock, drivers, BIOS), you should expect the present range to be left in the dust.


Fluffmeister said:


> Let's hope so, it's sad to see AMD in dire straits.
> Besides competition as we all know is healthy and whilst I wouldn't mind another 970, the bloody things keep going up in price.


970's have thankfully stayed relatively static price-wise here, but it's a smaller market less prone to large-volume sales. Some other markets are seeing Maxwell's popularity feed itself. As Barron's noted back in November:


> In terms of quantifying high-end share gains and magnitude, our supply-chain conversations indicate that Nvidia’s GeForce GTX 980/970 has comprised over 80% of high-end card shipments to channel partners since mid-September.


This despite AMD's 290/290X/295X2 dropping prices and continued game bundle offerings. The GTX 970 (especially) and 980 are the new black, and arriving in time for the holiday sales just made the pervasiveness that much more apparent.


----------



## The N (Jan 19, 2015)

HumanSmoke said:


> 970's have thankfully stayed relatively static price wise here, but its a smaller market less prone to large volume sales. Some other markets are seeing Maxwell's popularity feed itself. As Barron's noted back in November:



The 970 proves to be the most appropriate choice among many high-end cards when you talk about price/performance. But I would say Nvidia is charging way too little for ultra-high performance - that's extremely encouraging for NVIDIA.

But here in our country the 970 is available at high price tags - MSI and GIGABYTE at 55K, or $550 - which is really high for us. In the international market (Newegg/Amazon) it's $350-$400, which is damn worthy. Whereas AMD's cards go at cheap rates.


----------



## HumanSmoke (Jan 19, 2015)

The N said:


> but here in our country 970 available at high tags like MSI and GIGABYTE @55K or $550. what is really high for us. but in international market newegg/amazon its $350-$400. which is damn worthy. Whereas AMD's at cheap rate.


Yeah, prices in New Zealand (where I am) are similarly high, mainly because of a 15% goods and services tax - not that it matters too much. Between having parts shipped direct and mail forwarding companies like MyUS, I tend to get parts at US prices or below (Just shipping to factor in).


----------



## The N (Jan 19, 2015)

HumanSmoke said:


> Yeah, prices in New Zealand (where I am) are similarly high, mainly because of a 15% goods and services tax - not that it matters too much. Between having parts shipped direct and mail forwarding companies like MyUS, I tend to get parts at US prices or below (Just shipping to factor in).



Rightly said. Shipping expenses and customs are heavily involved in the changing price of any product. Taxes are high here, so local markets charge too much for a single product; after paying those taxes and duties, they nearly double the price of every product they bring in.

Like, if a 970 costs them, let's say, 40K, then they tag it 10K or 15K above their cost just to make an enormous profit out of it. They know the product is in high demand. But a few sources do not charge extra; some resellers provide the service cheaply, with no extra profit.


----------



## Super XP (Jan 19, 2015)

Ferrum Master said:


> Well because I guess nVidia paid a lions share to TSMC to be his lovely puppy. It is been always like that actually... The R9 290 has approximately 20% transistors on board too, that heats up, but still reduces the performance gap at their heat cost. But hey... ATI was a Canadian company... heater during cold winter... actually a two in one
> 
> And actually they must sell their R9 290 no matter what, unsold silicon is a more loss for them than sold for a bargain. I bet they calculated everything as good they can.


ATI was a very innovative and technologically superior company; they had many firsts in the industry. Hopefully AMD's new CEO gets AMD back to running the way it's supposed to - lean and mean - and its restructuring bears fruit.


----------



## Fluffmeister (Jan 19, 2015)

HumanSmoke said:


> Purported Fiji 3DMark 11 benchmark (via Chiphell)
> 
> 
> 
> ...



Definitely a fake; it seems it's actually CrossFired 290s, so yeah, not really worth $1000, and it falls firmly into the clutching-at-straws category. 

http://www.3dmark.com/3dm11/9300727


----------



## HumanSmoke (Jan 19, 2015)

Fluffmeister said:


> Definitely a fake, seems that it's actually CF'd 290's, so yeah not really worth $1000 and falls firmly into the clutching at straws category.
> 
> http://www.3dmark.com/3dm11/9300727


Debunked!!!
Thought it might be a little optimistic. A reference 290X scores ~4700-5000, so a 45% increase in core count would tend to put it around 7200, plus whatever increases the memory subsystem bestows, assuming the clocks stay roughly equal.
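The back-of-the-envelope estimate above can be written out; the score range and the +45% core-count figure come from the post, while linear scaling of the score with shader count is an assumption:

```python
# Naive linear scaling of a 3DMark 11 graphics score with shader count,
# assuming clocks stay roughly equal.
ref_low, ref_high = 4700, 5000  # reference R9 290X score range (from the post)
core_scale = 1.45               # rumoured +45% core count

est_low = round(ref_low * core_scale)    # 6815
est_high = round(ref_high * core_scale)  # 7250
print(est_low, est_high)                 # prints "6815 7250"
```

That puts the high end of the range right around the 7200 mark quoted above, before any gains from the memory subsystem.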


----------



## Xzibit (Jan 19, 2015)

Definitely not $1000, more like $600. Looking at the base core you can get two of *these* and achieve the same thing or faster.


----------



## HumanSmoke (Jan 19, 2015)

Xzibit said:


> Definitely not $1000, more like $600. Looking at the base core you can get two of *these* and achieve the same thing or faster.


Halo parts aren't about perf/price, they're about performance.  More than a few people bought the 290X at the expense of the 290, so budgetary considerations can be secondary for a lot of people. The market Fiji (and GM200) are aimed at is those people who 1. want the next best thing, or 2. want a benchmark queen - in which case a single card might not cut it. Combining two 290's might give you the performance of a single 380X(?), but double/triple up on the new cards and the older boards look less competitive.
From a perf/price standpoint, many would likely wonder why they'd bother buying 290's new. The market is flooded with second-hand cards (which will surely be added to when Fiji drops) at even cheaper prices.


----------



## Xzibit (Jan 19, 2015)

HumanSmoke said:


> Halo parts aren't about perf/price, they're about performance.  More than a few people bought the 290X at the expense of the 290, so budgetary considerations can be secondary for a lot of people. The market Fiji (and GM200) are aimed at is those people who 1. want the next best thing, or 2. want a benchmark queen - in which case a single card might not cut it. Combining two 290's might give you the performance of a single 380X(?), but double/triple up on the new cards and the older boards look less competitive.
> From a perf/price standpoint, many would likely wonder why they'd bother buying 290's new. The market is flooded with second-hand cards (which will surely be added to when Fiji drops) at even cheaper prices.



Nothing new. Same thing can be said for every iteration of cards from both sides over the years.

Why even bother applying it to only one side ?


----------



## HumanSmoke (Jan 19, 2015)

Xzibit said:


> Why even bother applying it to only one side ?


Because this is an AMD thread.
Because I replied to a post where you used AMD examples.
I also mentioned GM 200 as equally apropos as an aside.

#notrocketscience


----------



## Xzibit (Jan 19, 2015)

HumanSmoke said:


> Because this is an AMD thread.
> Because I replied to a post where you used AMD examples.
> I also mentioned GM 200 as equally apropos as an aside.
> 
> #notrocketscience



How cute.  I always find it delightful when people over 50 discover hashtagging.


----------



## Fluffmeister (Jan 19, 2015)

HumanSmoke said:


> Debunked!!!
> Thought it might be a little optimistic. A reference 290X scores ~4700-5000, so a 45% increase in core count would tend to put it around 7200, plus whatever increases the memory subsystem bestows, assuming the clocks stay roughly equal.



I'm just impressed with those GTX 980 scores you posted, and it isn't even a paper tiger.


----------



## anolesoul (Jan 27, 2015)

_*NOT ...holding my breath... for its actual release; but I am "hopeful" that AMD will also release support for DDR4. Or what is to come after.*_


----------



## xfia (Jan 27, 2015)

it would be awesome to see what an AMD APU could do with DDR4


----------



## anolesoul (Jan 27, 2015)

_*Most definitely...would 100% agree! *_


----------



## xfia (Jan 27, 2015)

Carrizo launches with FM3 for desktop and DDR4. Samsung puts up 14nm for both ends on Carrizo


----------



## HumanSmoke (Jan 27, 2015)

xfia said:


> Carrizo launches with FM3 for desktop and DDR4. Samsung puts up 14nm for both ends on Carrizo


Incorrect. Carrizo supports DDR3 and is 28nm.

You are more than likely thinking of Zen, the generation after Carrizo, which is mooted to be specced for DDR4 support and made on Globalfoundries' (Samsung) 14nm-XM process.


----------



## xfia (Jan 27, 2015)

I know...  that's why I used the faces with beers haha   

How awesome would FM3 with DDR4 be if the CPU side went up to 8 threads and had some IPC to turn heads.


----------



## Prima.Vera (Jan 29, 2015)

I need to understand this 4096-bit bus of the 380X/390X...


----------



## TRWOV (Jan 29, 2015)

HBM's I/O is 1024 bits per stack. Currently four stacks is the maximum, hence 4096-bit. It's not an actual 4096-bit bus; it will be connected by a regular 256-bit bus, but it would act as if it were 4096-bit. Think of it like GDDR5 vs GDDR3: same speed and interface, just that GDDR5 can send more data per clock.
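The width arithmetic can be sketched with peak-bandwidth figures. The 1 Gb/s-per-pin rate for first-generation HBM and the 5 Gb/s GDDR5 rate on the 290X are published specs used here as assumptions:

```python
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width (bits) x per-pin data rate (Gb/s) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# First-gen HBM: 1024-bit interface per stack at 1 Gb/s per pin.
per_stack = mem_bandwidth_gbs(1024, 1.0)   # 128.0 GB/s per stack
four_stacks = 4 * per_stack                # 512.0 GB/s for a 4096-bit config

# R9 290X for comparison: 512-bit GDDR5 at 5 Gb/s per pin.
gddr5_290x = mem_bandwidth_gbs(512, 5.0)   # 320.0 GB/s
print(per_stack, four_stacks, gddr5_290x)
```

So the wide-but-slow HBM stacks end up well ahead of the narrow-but-fast GDDR5 setup despite the much lower per-pin clock.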


----------



## xfia (Jan 29, 2015)

8k gaming??


----------



## TRWOV (Jan 29, 2015)

Not with 4GB. HBM 2.0 can't come fast enough... and, for a change, AMD should keep HBM to themselves, at least for a while, instead of making it a standard like GDDR3/4/5. All this "doing the work for everyone" mindset is killing them.


----------



## xfia (Jan 29, 2015)

I still think 4K is better suited for things like editing and artists. Have you seen the 5K Mac with a 295M?


----------



## HumanSmoke (Jan 29, 2015)

TRWOV said:


> Not with 4GB. HBM 2.0 can't come fast enough... and, for a change, AMD should keep HBM to themselves, at least for a while, instead of making it a standard like GDDR3/4/5.


How can they keep another company's product to themselves? HBM is primarily an SK Hynix creation; Hynix partnered with AMD on devising the spec. The JEDEC specification was ratified over a year ago (JESD235).


TRWOV said:


> All this "doing the work for everyone" mindset is killing them.


Nvidia might feel the same way regarding ShadowPlay, G-Sync, the frame pacing algorithm, the entire GPGPU co-processor industry, and discrete mobile graphics card modules, amongst a few other technologies, while a list of Intel-developed tech that AMD presently uses would probably fill this entire page. It's very far from a one-sided arrangement.


----------



## TRWOV (Jan 29, 2015)

oops, wasn't aware that it was a standard already


----------



## Xzibit (Jan 29, 2015)

I believe it's timing exclusivity, which results in being first to market.

My guess is the 380s & 390s will be HBM, with lower tiers still using GDDR5.  The talked-about late-Q1-to-Q2 300 series release would put HBM 2 production in line with a 400 series release in Q3 2016/Q1 2017


----------



## HumanSmoke (Jan 29, 2015)

TRWOV said:


> oops, wasn't aware that it was a standard already


Yeah, that one got rushed through. Less to do with graphics cards than with getting established ahead of the competition for adoption, I think. Wide I/O 2 isn't that far off either, so Hynix needed to get the product front and centre.


----------



## Xzibit (Jan 29, 2015)

Looks like the Fixer is back.

Maybe he is going to fix cards that aren't using their full 4GB perhaps


----------



## HumanSmoke (Jan 29, 2015)

Xzibit said:


> Looks like the Fixer is back. Maybe he is going to fix cards that aren't using their full 4GB perhaps


Well I doubt it's to fix AMD's financial situation. Maybe they should trade in Fixer 3 for Miracle Worker 1.

If it's supposedly graphics related then it ain't happening anytime soon. Of course, Gibbo stipulates "new product" so I guess there's a chance that the HD 7970 gets a fourth rebranding as a 300 series card!


----------



## Xzibit (Jan 29, 2015)

HumanSmoke said:


> Well I doubt it's to fix AMD's financial situation. Maybe they should trade in Fixer 3 for Miracle Worker 1.
> 
> If it's supposedly graphics related then it ain't happening anytime soon. Of course, Gibbo stipulates "new product" so I guess there's a chance that the HD 7970 gets a fourth rebranding as a 300 series card!



The Fixer videos have never announced anything.  They just seem to be a cheesy stab at humor.

Didn't he pay you a visit?


----------



## xfia (Jan 29, 2015)

I feel like I wasted 3 minutes of my life... don't tell me that is actually something AMD made


----------



## HumanSmoke (Jan 29, 2015)

Xzibit said:


> The Fixer videos have never announced anything.  They just seem to be a cheesy stab at humor.


I much preferred Roy Taylor's cheesy stab at humour. Pity that AMD didn't appreciate his humour, since they demoted him soon after.
After AMD's last publicity extravaganza, I wonder if the latest video is preparation for the announcement of the R9 285's launch in Somalia.


Xzibit said:


> The Fixer videos have never announced anything.


So, just a placeholder for the dust-laden press release that proclaims three consecutive profitable financial quarters!


Xzibit said:


> Didn't he pay you a visit?


If he did, he didn't get past the front door, as with any other door-knocking religious zealots or panhandlers who happen by.


xfia said:


> I feel like I wasted 3 minutes of my life..  don't tell me that is actually something AMD made


Don't underestimate AMD's marketing. This is a company that marketed a supposedly high-end platform with a cheesy comic book blatantly announcing AMD's insecurity regarding Intel.


----------

