# Radeon R9 390X Taken Apart, PCB Reveals a Complete Re-brand



## btarunr (Jun 15, 2015)

People with access to an XFX Radeon R9 390X graphics card took it apart to take a peek at its PCB. What they uncovered comes as no surprise: the underlying PCB is identical in design to AMD's reference PCB for the Radeon R9 290X, down to the location of every tiny SMT component. At best, the brands on the chokes and the larger conductive polymer capacitors differ; and there are 4 Gbit (512 MB) GDDR5 chips under the heatspreader, making up the standard 8 GB of memory. The GPU itself, codenamed "Grenada," looks identical to the "Hawaii" silicon which drove the R9 290 series. It's highly unlikely that it features updated Graphics Core Next 1.2 stream processors, as older rumors suggested.
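The memory figure is easy to sanity-check; a minimal sketch, assuming the sixteen GDDR5 packages of the reference 290X board layout (the chip count is taken from the reference design, not measured here):

```python
# Sanity-check of the memory configuration, assuming the reference
# 290X/390X board's sixteen GDDR5 packages.
chips = 16                 # GDDR5 packages on the reference PCB (assumption)
gbit_per_chip = 4          # 4 Gbit = 512 MB per package

total_gbit = chips * gbit_per_chip     # 64 Gbit in total
total_gbyte = total_gbit // 8          # 8 bits per byte
print(f"{total_gbyte} GB")             # -> 8 GB
```

The same arithmetic rules out "512 Gbit" chips: a single one of those would already be 64 GB.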



 



*View at TechPowerUp Main Site*


----------



## Xzibit (Jun 15, 2015)

If you were fast enough to look at the Sapphire site before they took it down, one of the new features they were touting was "*Dynamic Frame Rate Control*", so it might be in the new driver or tied to the new firmware/BIOS.

*VideoCardz - Sapphire shows off Radeon 300 series, confirms Radeon R9 Fury X*


----------



## manofthem (Jun 15, 2015)

I think this video sums up how I feel about this.  AMD, this one is for you


----------



## HumanSmoke (Jun 15, 2015)

Xzibit said:


> If you were fast enough to look at the Sapphire site before they took it down.  One of the new features they were touting is "*Dynamic Frame Rate Control*" so it might be in the new driver or tied to the new firmware/bios.


Good work, AMD. Only 6+ months after they started work on the software. Très impressive.


btarunr said:


> It's highly unlikely that it features updated Graphics CoreNext 1.2 stream processors, as older rumors suggested.


I'm figuring that if they couldn't be bothered to at least update HDMI beyond the 1.4 spec, the chances of reworked silicon must be approaching zero, although at least one forum poster is hoping you're wrong - unless footwear is the national dish of Slovenia.


----------



## NC37 (Jun 15, 2015)

AMD made folks wait all this time...sigh...sorry but AMD, this is for you... 

That Fury better be damn competitive in prices.


----------



## GC_PaNzerFIN (Jun 15, 2015)

A couple of years have passed: same cards, new price. And you wonder why you are leaking market share to the competition?


----------



## Chaitanya (Jun 15, 2015)

Looks like AMD has turned into serious meh, and switching to the GTX 970 was the correct decision.


----------



## Prima.Vera (Jun 15, 2015)

Disgusting.


----------



## NC37 (Jun 15, 2015)

Chaitanya said:


> Looks like AMD has turned into serious meh, and switching to the GTX 970 was the correct decision.



And you'll likely be trading that in very quickly if Pascal performs as it is hinted to. Both the 900 and 300 series are just stopgaps. The only plus side of all this is that AMD at least has the decency to give more VRAM. nVidia caps everyone at 4GB or forces you to pay an ultra premium for more.


----------



## wiak (Jun 15, 2015)

Well, their new lineup looks like this:
R9 Fury X (2015 Fiji)
R9 Fury (2015 Fiji)
R9 390X 8GB (2013 Hawaii)
R9 390 8GB (2013 Hawaii)
R9 380 4GB (2014 Tonga)
R7 370 2GB (2012 Pitcairn)
R5 360 2GB (2013 Bonaire) 

Next year it might look like this:
R9 Fury Maxx (2016)
R9 Fury Maxx (2016)
R9 Fury X (2015 Fiji)
R9 Fury (2015 Fiji)
R9 390X 8GB (2013 Hawaii)
R9 390 8GB (2013 Hawaii)
R9 380 4GB (2014 Tonga)


----------



## ensabrenoir (Jun 15, 2015)




----------



## the54thvoid (Jun 15, 2015)

Interesting to see pricing. If they've done nothing but rebadge the 8GB 290X up an entire 'model range', this is pretty disingenuous. 
All to give their new card a stand-out prestige name? I can understand the business model, but at least now perhaps some specific people won't be so quick to champion AMD as a transparent and honest company.
They're not breaking the law, but they're certainly marketing old tech as new.


----------



## RejZoR (Jun 15, 2015)

For fuck's sake, who gives a shit if it *looks* like the R9 290X and is *unlikely to use GCN 1.2*?

People supposedly already bought cards at Best Buy and were too fucking dumb to test them against R9 290X benchmarks. Dafaq!?
Some guy even said it was barely catching the R9 290X. Seriously, dude, if it's a rebrand it would be IDENTICAL, not "barely" catching it.

So saying the silicon looks the same as Hawaii based on eyeballing the GPU from half a meter away feels like a massive pile of bull manure. People don't realize that a chip just 0.5 mm larger on each axis can hold thousands and thousands of new transistors.
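A rough back-of-the-envelope check of that claim (Hawaii's ~438 mm² die area is the published figure; the ~14 million transistors per mm² density is an assumed ballpark for 28 nm GPUs, not an exact number):

```python
import math

die_area_mm2 = 438.0        # Hawaii's published die area (approximate)
density_per_mm2 = 14e6      # assumed ballpark density for 28 nm GPUs

side_mm = math.sqrt(die_area_mm2)          # ~20.9 mm for an idealized square die
grown_mm2 = (side_mm + 0.5) ** 2           # die grown 0.5 mm on each axis
extra_mm2 = grown_mm2 - die_area_mm2       # ~21 mm^2 of new silicon
extra_transistors = extra_mm2 * density_per_mm2

print(f"~{extra_transistors / 1e6:.0f} million extra transistors")
```

By that estimate, "thousands and thousands" is a big understatement: half a millimeter per axis buys on the order of hundreds of millions of transistors.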

And then there's the DX12 factor. With the R9 200 series we were still in the DX11 era, so they could pull off a rebrand without any regrets. This time around, making DX12 exclusive to the Fury series would be the dumbest business decision in the history of Radeon graphics cards. That's why I find it hard to believe. Unless Fury costs $300 and we'll all buy that one without even thinking about anything else. Which is, again, such an unlikely scenario that it's basically nonexistent.


----------



## the54thvoid (Jun 15, 2015)

RejZoR said:


> For fucks sake, who gives a shit if it *looks* like R9-290X and that it's *unlikely to use GCN 1.2*
> 
> People supposedly already bought cards in BestBuy and they were too fucking dumb to test it against R9-290X benchmarks. Dafaq!?
> Some guy even said it was barely catching R9-290X. Like seriously dude, if it's a rebrand, it would be IDENTICAL, not "barely" catching it.
> ...



I'm still going by the vast majority of tech sites saying 'rebrand'. And the 290X is capable of running DX12, is it not? Or is that your point?
Either way, it surely looks more and more like, other than firmware and software, it's a 290X rebrand. Again, going by the vast majority of tech sites.

Also, firmware is enough to make a card faster; it's why folks flash 'em.


----------



## HumanSmoke (Jun 15, 2015)

RejZoR said:


> People supposedly already bought cards in BestBuy and they were too fucking dumb to test it against R9-290X benchmarks. Dafaq!?
> Some guy even said it was barely catching R9-290X. Like seriously dude, if it's a rebrand, it would be IDENTICAL, not "barely" catching it.


The guy at [H] that bought the card benched it on a 4770K-based system; the 3DMark FireStrike score was near identical to the 290X at approximately the same clocks using the same CPU. Now, considering this is not only linked in bta's article but also in the post I put up yesterday, who deserves to be called dumb? If you think benchmark runs, even on the same system, are IDENTICAL from run to run, I'd suggest you actually try running 3DMark sometime.
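Whether two scores count as "near identical" can be made precise. A minimal sketch, with hypothetical FireStrike scores and an assumed 2% run-to-run noise threshold (both are illustrative values, not measurements):

```python
def within_run_variance(score_a: float, score_b: float,
                        tolerance: float = 0.02) -> bool:
    """True when the scores differ by less than `tolerance` of their mean,
    i.e. the gap is within typical run-to-run benchmark noise."""
    mean = (score_a + score_b) / 2
    return abs(score_a - score_b) / mean < tolerance

# Hypothetical scores: a 290X run versus a 390X run at similar clocks.
print(within_run_variance(10150, 10280))   # -> True: within the noise
print(within_run_variance(10150, 11200))   # -> False: a genuine gap
```

With a check like this, "barely catching" and "identical" can easily describe the same pair of numbers.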


RejZoR said:


> So saying the silicon looks the same as Hawaii based on looking at the GPU from half a meter distance feels like a massive pile of bull manure. People don't realize that only 0.5mm larger chip on each axis can result in thousands and thousands of new transistors.


Then again, your argument sinks from sight when people have managed to flash a 290X with a 390X BIOS and have the card POST. That generally doesn't happen when you flash one GPU with a BIOS from another.


RejZoR said:


> And also the DX12 factor. With R9-200 series, we were still within the DX11 era. They could pull off rebrand without any regrets. This time around, making DX12 exclusive to Fury series


Rubbish. AMD have listed their cards as DirectX 12 compatible since the preliminary spec was announced.








RejZoR said:


> that would have to be the dumbest business decision in history of Radeon graphic cards. That's why I find it hard to believe.


Well, if you'd paid attention you would have realized that AMD conferred DX12 status to every 200 series R7 and R9 some time ago, along with the OEM 300 series rebrands of the 200 series.


----------



## RejZoR (Jun 15, 2015)

A new BIOS doesn't make the card faster. It just raises the clocks...

As for DX12, even a Radeon HD 4870 can run on the DX12 API. Just without ANY DX12 features...

EDIT:
OK, if AMD has really rebranded the R9 290X directly into the R9 390, then they've officially fucked it up. Like a batshit insane fuckup...


----------



## Breit (Jun 15, 2015)

What's the problem with rebrands, as long as they are placed in the right spot in the product stack, meaning at least one tier below where the part previously sat? Say last year your money bought you an R9 280X; this year it gets you a 290X (now the 390X)... You can't argue that it isn't more performance for the money. 

I mean, it's industry standard. Even nVidia does it all the time.

And recycling PCB designs is also a common habit. The GTX 670, GTX 760 and GTX 970 all used nearly identical PCBs...


----------



## ironcerealbox (Jun 15, 2015)

No surprises here, as it has been expected for a month (maybe a little longer). AMD's best chance now is the Fury series of cards that will be released in less than 48 hours (my time), and the Arctic Islands Rx 400 series with Samsung's 14nm FinFET and SK Hynix HBM2 (probably 2016). The former MIGHT help, but it feels like something they should have released AT LEAST 3 months ago, just to throw a monkey wrench into the market after the Titan X excitement (or earlier, assuming they could have). I get the weird feeling that the cards have been ready for some time and they simply chose not to release them at a more opportune moment (3 months ago). Maybe the software wasn't up to par with the hardware?

If they fail to deliver on THOSE series of GPUs (Fury and Arctic Islands), then it is most likely over. I would like to see AMD more competitive than they are now, to keep Nvidia in check price-wise. I shouldn't speak for everybody, but I get the feeling that more than half of us are waiting for next year, with 16nm/14nm FinFET and better stacked memory (HBM).

For now, my Nvidia cards are doing fine as long as I don't use the newer drivers they released. It's bad enough that Nvidia has pulled some pretty shady stunts lately. I will probably switch to AMD in 2016, ASSUMING both companies are on equal terms by then... and I can say that only because Samsung WILL deliver 14nm FinFET to AMD and SK Hynix WILL deliver HBM2 (to both Nvidia and AMD).


----------



## Chaitanya (Jun 15, 2015)

NC37 said:


> And you'll likely be trading that in very quickly if Pascal performs as it is hinted to be. Both 900 and 300 series are just so stopgap. The only plus side of all this is, AMD at least has the decency to give more VRAM. nVidia cuts everyone to 4GB or forces you to pay ultra premium for more.


I upgraded the PC that was running the GTX 670; my other PC is running an R9 290. Both of my monitors are 1920×1200, and I don't give a rodent's rear about the "crippled" RAM allocation of the 970. On that monitor the 970 is more than enough at any detail level for most of the games I play. If Pascal works as promised, I might think about replacing that R9 290 with something equivalent.


----------



## RejZoR (Jun 15, 2015)

There's nothing wrong with rebranding the mid and low end. Those people don't care about the tech itself anyway; they just want affordable cards. But when you start rebranding high end, and God forbid enthusiast level, then you know they're selling mist...


----------



## mirakul (Jun 15, 2015)

RejZoR said:


> New BIOS doesn't make cards faster. It just makes clocks faster...
> 
> As for DX12, even Radeon HD4870 can run on DX12 API. Just without ANY DX12 features...
> 
> ...


The R9 290/290X support more DX12 features than the GTX 980, FYI.
Anyway, it has been a vicious cycle for the red camp: lack of money > lack of R&D > lack of GPUs > lack of money again, and so on.
Hopefully the rumor of a move to Samsung is true. That would mean I'd have to dye my cape blue, but hey...


----------



## Breit (Jun 15, 2015)

RejZoR said:


> It's nothing wrong if you rebrand mid and low end. Those ppl don't care about tech itself anyway, they just want affordable cards. But when you start rebranding high end and god forbid enthusiast level, then you know they are selling mist...


Right.
But as we all know, Fury will be the new high end, and that's not a rebrand. What we don't know yet is pricing. So why does everyone rant about this rebrand thing without knowing AMD's pricing plans? Only the pricing decides whether it's crap or not.


----------



## RejZoR (Jun 15, 2015)

Are you sure about that one? I know AMD supports more DX12 features with Tahiti than both Kepler and Maxwell 1, but I'm not so sure about Maxwell 2...


----------



## RejZoR (Jun 15, 2015)

Breit said:


> Right.
> But as we all know, Fury will be the new high end. And that's not a rebrand. What we don't know is pricing yet. So why does everyone rant about this rebrand thing without knowing AMDs plans on pricing? Only the pricing decides if it is crap or not.



Fury is not high end; Fury is enthusiast level. Unless the vanilla Fury sells for around 350€, which I somehow find very unlikely...


----------



## Ebo (Jun 15, 2015)

That's the same bull coming from all over the mighty interweb.

*Nobody* knows yet what minor tweaks have been made to the R9 390/390X. 

AMD has always said that they would do a respin of the R9 290X/290.

As for Fury, everybody can dream of a Ferrari with a brand new engine, but only a few will actually pay what it costs.
I'm one of those guys: *if* the card delivers on performance, that's all I think about. I will still buy a new card next year anyway, when HBM2 is in line.


----------



## ensabrenoir (Jun 15, 2015)

The issue here to me is that... wait, didn't AMD sorta do this before? Correct me if I'm wrong, but wasn't my 6870 more of an upgrade from a 5850 than from a 5870, performance-wise at least? At least with the green team the numbers went down: the 680 became the 770 and the new chip took the top spot. Rebrands ain't bad when they're clearly defined, priced accordingly, and when they manage to squeeze a noticeable increase in performance out of them.


----------



## Breit (Jun 15, 2015)

RejZoR said:


> Fury is not high end, Fury is enthusiast level. Unless vanilla Fury gets sold for around 350€ which I find very unlikely somehow...


If you see it that way, then the 290X was also enthusiast level when it came out and has now been propagated down to your high-end category. I see no problem.

For me, enthusiast level has to be something insane like the Titan X, Titan Z and so on: horrendously overpriced for only a little gain in performance. The best of the best, if you will. That doesn't even include the 980 Ti. This is high end, and so, hopefully, will be Fury.


----------



## RejZoR (Jun 15, 2015)

No, the R9 295X2 was enthusiast level back then...


----------



## Rowsol (Jun 15, 2015)

Oh my...  I will weep for you, AMD.


----------



## Breit (Jun 15, 2015)

RejZoR said:


> No, the R9 295X2 was enthusiast level back then...


The R9 295X2 had a launch price of $1500. Sure, it was enthusiast level. Let's hope Fury won't be that pricey...


----------



## RejZoR (Jun 15, 2015)

The only way I can see this working is if they sell the R9 390X for less than 300€ (preferably well below 300). GTX 970s go for as low as 310€. If the R9 290X was only "just" catching it before, they can't possibly ask more for it than NVIDIA charges for a faster GTX 970... It would make ZERO sense...


----------



## Breit (Jun 15, 2015)

The 290X is already cheaper than the GTX 970; I see no reason why this should change with a new name.
At least you get an 8GB framebuffer for the same price, instead of the 3.5GB on a GTX 970...


----------



## RejZoR (Jun 15, 2015)

Maybe in the US. Here in Europe, the majority of R9 290Xs are still way more expensive than the GTX 970. That's why no one is buying them.

I wonder how much more they'll charge for those extra 4GB of VRAM...


----------



## Frick (Jun 15, 2015)

RejZoR said:


> Maybe in US. Here in Europe, majority of R9-290X are still way more expensive than GTX 970. That's why no one is buying them.
> 
> I wonder for how much more they'll charge those extra 4GB of VRAM...



Here they are often cheaper than the 970.


----------



## rooivalk (Jun 15, 2015)

RejZoR said:


> Only way I can see this to work is if they sell R9-390X for less than 300€ (preferably a lot below 300). GTX 970's go for as low as 310€. If R9-290X was catching it "just" before, they can't possibly ask more for it than NVIDIA sells a faster GTX 970... It would make ZERO sense...


After denial and anger, now you're bargaining...





sorry, can't resist xD


I'd buy a 390X 8GB, though, if they sell it below the 970's price. In my place the 290X is more expensive than the 970 too.


----------



## chinmi (Jun 15, 2015)

RIP AMD... bye bye... you won't be needed anymore in this world AMD...


----------



## Vayra86 (Jun 15, 2015)

Obvious R9 390X being obvious.

Saw it coming ever since the R9 290X release, knowing its power signature, to be honest. They were already pretty much at board power limits. The only reason Nvidia is taking 28nm further than AMD is Maxwell: a good move (again) versus a complete standstill at AMD. Either the Tonga optimizations were not ported to the R9 290X, or they are already in there, which only goes to show what a tremendous power hog it is. AMD's stance on performance has just been wrong ever since they kept scaling up the 7970.


----------



## RejZoR (Jun 15, 2015)

rooivalk said:


> After denial and anger, now you're bargaining...
> 
> 
> 
> ...



Sorry, but I'm not bargaining with anything. I'm looking at this realistically. Initially it was said to be a card with the same unit count as the R9 290X, but with new, more efficient shaders and framebuffer compression. I wouldn't have any problems with that; the R9 290X was capable enough as it was, it was just missing that final "zeng" on top. Without that, you're buying an R9 290X 8GB Edition Speciale, almost 2 years later. Would you pay 450 for it knowing a 4GB version of the same thing sells for less than 300, just with a different name and no other changes? Be real, plz...


----------



## Breit (Jun 15, 2015)

RejZoR said:


> Sorry, but I'm not bargaining with anything. I'm looking at this realistically. Initially it was said to be a card with same unit count as R9-290X, but with new more efficient shaders and framebuffer compression. I wouldn't have any problems with that, R9-290X was capable enough as it was, it was just missing that final "zeng" on top. Without that, you're buying a R9-290X 8GB Edition Speciale, almost 2 years later. Would you pay 450 for it knowing a 4GB version of the same thing is getting sold for less than 300, just with different name and no other changes? Be real plz...



You just don't know the pricing yet. Save your judgement until we know the launch price for sure.
Besides, I'm also from Europe (Germany, to be exact), and here the pricing for an R9 290X is almost identical to a GTX 970 (~320€).


----------



## RejZoR (Jun 15, 2015)

And what does that tell you? Would you pay the same price for a 1-year-old chip, or would you rather have a brand new chip like Maxwell 2? Be aware that when NVIDIA launched the GTX 900 series, the R9 200 series was already 1 year old...


----------



## Vayra86 (Jun 15, 2015)

RejZoR said:


> And what does that tell you? Would you pay same price for a 1 year old chip or would you rather have a brand new chip like Maxwell 2? Be aware, when NVIDIA launched GTX 900 series, R9-200 series was already 1 year old...



The discussion on this is also a year old and the AMD/Nvidia market share movement has underlined the truth.


----------



## RejZoR (Jun 15, 2015)

Sorry, my knowledge of English language doesn't include understanding of poetry...


----------



## Breit (Jun 15, 2015)

To be honest, I don't care if a chip is old or brand new. All I care about regarding chips is performance and features for a given amount of money. If AMD can compete here with a 2-year-old chip, then I'm OK with that.

If nVidia rebranded their original Titan Black as a Titan Light and offered it for, let's say, $300 or less, I probably might consider getting one (or two) of those...


----------



## RejZoR (Jun 15, 2015)

Well, and we are at the price again. Totally not a "told you so" moment...


----------



## R-T-B (Jun 15, 2015)

chinmi said:


> RIP AMD... bye bye... you won't be needed anymore in this world AMD...



We need them very much. What we don't want is for them to go under, and from the looks of things they are VERY close, if they are penny-pinching like this.

I know this sounds pathetic, but it may be more important than ever to buy their shit now, just to help them survive...

...or maybe we can depend on some newcomer to make a GPU... hur hur. I'm looking at you, SiS. Aren't they still around?


----------



## Vayra86 (Jun 15, 2015)

R-T-B said:


> We need them very much.  What we don't want is for them to go out, and from the looks of things, they are VERY close if they are penny pinching like this.
> 
> I know this sounds pathetic, but it may be more important than ever to buy their shit now just to help them survive...
> 
> ...or maybe we can depend on some newcomer to make a GPU...  hur hur.  I'm looking at you, SiS.  Aren't they still around?



I have never REALLY understood this stance on AMD's survival.

If a company makes shitty products, or no longer improves its product lines, why would it have any right to survive in a competitive market? Goodwill? Surviving on the pockets of customers who get subpar products in return? If anything, those customers are just supporting AMD's *standstill*, not its survival. A company like this simply cannot survive without design wins, and AMD's last design win was a decade ago...

Honestly if AMD fails to impress, another company will take its place. x86 is big enough, someone will simply take over the company for its patents and technologies and continue onwards.

As for the 'beautiful' AMD philosophy of open source and market standardization, I would brand it utopian and surreal. It hasn't worked; it may work for FreeSync, but it won't work for AMD in terms of profit. Profit = survival. Lack of profit = inevitable death.


----------



## R-T-B (Jun 15, 2015)

I'll agree with you that it's flawed logic. I just don't want a monopoly, damnit, lol.

Samsung can't buy AMD fast enough...  but at this rate, why would they want to?


----------



## RejZoR (Jun 15, 2015)

R-T-B said:


> We need them very much.  What we don't want is for them to go out, and from the looks of things, they are VERY close if they are penny pinching like this.
> 
> I know this sounds pathetic, but it may be more important than ever to buy their shit now just to help them survive...
> 
> ...or maybe we can depend on some newcomer to make a GPU...  hur hur.  I'm looking at you, SiS.  Aren't they still around?



One thing is avoiding AMD because you're a fanboy. Another is them showing the middle finger to customers and losing them that way. The R9 390X is a big middle finger. They were even so fucking lazy they couldn't put GCN 1.2 and framebuffer compression on top of the R9 290X. I never asked for more shaders, more TMUs and more ROPs. I just wanted GCN 1.2 and framebuffer compression, and I'd have bought it instantly, even without HBM. But now they can keep it. Great business right there, AMD. No card sold to me. Not sure what they're trying to achieve...


----------



## R-T-B (Jun 15, 2015)

Yeah, I think my previous argument (BUY OR AMD WILL DIE!) was pretty poor, as noted. So don't tear me apart too hard, gentlemen... it's a matter of being between a rock and a hard place as a consumer.


----------



## Vayra86 (Jun 15, 2015)

R-T-B said:


> We need them very much.  What we don't want is for them to go out, and from the looks of things, they are VERY close if they are penny pinching like this.
> 
> I know this sounds pathetic, but it may be more important than ever to buy their shit now just to help them survive...
> 
> ...or maybe we can depend on some newcomer to make a GPU...  hur hur.  I'm looking at you, SiS.  Aren't they still around?





R-T-B said:


> I'll agree with you it's flawed logic.  I just don't want a monopoly damnit, lol.
> 
> Samsung can't buy AMD fast enough...  but at this rate, why would they want to?



Patents, the x86 license, the lucrative server business, and craploads of technology that could serve them well in both the ARM and x86 markets. Samsung hasn't got a lot of in-house GPU knowledge, for example, but does make its own Exynos chips.

All of the above can be profitable business. Also: size. Samsung thrives on controlling markets through its overwhelming capacity in production, R&D and marketing. The success of the Galaxy phones is a simple example of that: they just pound the market with six thousand models and something is bound to succeed.


----------



## R-T-B (Jun 15, 2015)

One can hope.


----------



## Phobia9651 (Jun 15, 2015)

Vayra86 said:


> Honestly if AMD fails to impress, another company will take its place. x86 is big enough, someone will simply take over the company for its patents and technologies and continue onwards.



From what I know, the x86 license is not transferable through a takeover/buyout of AMD.
And no company would take the risk of diving into CPU or GPU development when facing competitors who have been at it for decades.
Sad but true.


----------



## Breit (Jun 15, 2015)

Vayra86 said:


> I have never REALLY understood this stance on AMD's survival.
> 
> If a company makes shitty products, or no longer improves its product lines, why would it have any right to survive in a competitive market?



The problem is that without AMD the market has no competition anymore, which means next to no innovation, no progress, and high prices from the lone survivor (nVidia in this case). No one really wants that.


----------



## Vayra86 (Jun 15, 2015)

Breit said:


> The problem is that without AMD the market does not have competition anymore, which means nearly no innovation, no progress and high prices from the lone survivor (nVidia in this case). No one really want that.



If you had read my post you would know this is a flawed argument.

If Intel and Nvidia 'get' the market to themselves, they will also face the consequences of attaining a monopoly. They are both *much* better off with a weak AMD than with no AMD. Besides, the market cannot afford to stagnate further; the node delays are stagnating enough as it is, and ARM is just around the corner, breathing down x86's neck. Just a few examples of what really drives the market, as opposed to the usual x86-centric thinking where it is the only thing in existence.

EVEN IF Intel and Nvidia get x86 to themselves, and EVEN IF AMD just dies in misery instead of being taken over, the resulting situation will be temporary at best. ARM will take its place, or a new x86 competitor; hell, maybe even a company like Qualcomm will consider stepping up. ARM is already invading the server business, and does so with great scalable solutions with very low overhead. ARM's answer to the performance gap with x86 is quite similar to AMD's stance on performance increases: chaining more cores and using modular design, much like the FX line and the approach to big-die GPUs. Meanwhile, Intel is trying to force itself into the ARM business but will never survive on that alone; their business is not tuned for it.


----------



## Breit (Jun 15, 2015)

I'm not sure ARM will ever enter the x86 market.
Maybe someday in the future they might be able to compete on performance with their architecture, but building an entire ecosystem around it is quite another feat, and very unlikely to happen anytime soon.


----------



## FordGT90Concept (Jun 15, 2015)

Remember that it is really TSMC that deserves the hate, far more so than AMD. Their lack of an upgraded process in over two years means all chips are still on 28nm. Yes, AMD should at least have upgraded the silicon to GCN 1.2.


----------



## Vayra86 (Jun 15, 2015)

Breit said:


> I'm not sure ARM will ever be entering x86 market.
> Maybe someday in the future they might be able to compete in performance with their architecture, but building an entire ecosystem around it is quite another feat which is very unlikely to happen anytime soon.



*It is already happening.* Remember Windows RT? Windows 10 is also being built to run on ARM phones and x86 devices alike. The current generation of Windows Phone already runs on ARM.

Regardless of AMD's future, ARM won't stop at its current markets alone. In many ways it surpasses x86 in how it can be configured: every single niche in the market can have its own custom-designed SoC, and it won't even break the bank to do so.

As for ARM entering the x86 market... below is an example from last year:

http://www.cavium.com/newsevents_Ca...ation_Data_Center_and_Cloud_Applications.html


----------



## FordGT90Concept (Jun 15, 2015)

Windows supported IA-64 too... which was a flop.


----------



## R-T-B (Jun 15, 2015)

FordGT90Concept said:


> Windows supported IA64 too...which was a flop.



And Windows NT supported PPC... as did its then main competitor, OS/2. Lulz.


----------



## Vayra86 (Jun 15, 2015)

Of course. All this support served Windows well in the long run, as it still is the dominating OS all over the world.

ARM is a different animal though, because it has its own market as well, and there is overlap: both ARM and x86 devices can do one or the other. That is also the reason it is such a huge threat to x86 but not the other way around: x86 is old and expensive, while ARM is fresh, highly customizable, and geared towards a market of many competitors instead of just two.

With every passing hour, Intel and AMD's existence is becoming less important and has already shifted from 'vital' to 'nice to have' for large portions of the market.


----------



## Breit (Jun 15, 2015)

A bit off-topic, but whatever... 
Are you really interested in universal apps? I mean all native apps are compiled with a specific architecture in mind and cannot be executed on different hardware, even if the operating system has a kernel that runs on said hardware. We are talking about high end hardware here, not tablet/mobile hardware. Sure that niche market is becoming more and more interesting because most people just don't need the performance that is possible with today's chips and are happy with angry birds type games. But for instance a real demanding game has to be a native application and so does every other demanding application that can put the performance of a chip to good use (think of rendering, content creation etc.).
As far as I know, the full Windows 10 will not be available for ARM. Windows RT is a dead end. The Windows 10 you are talking about is only for mobile/tablet and has very little in common with the desktop Windows (besides its name).


----------



## GhostRyder (Jun 15, 2015)

The PCB being the same really is not proof that this is not GCN 1.2; however, it now points more towards this being a rebrand with 8GB of GDDR5 and some improvements to the silicon that allow better clocks all around. Sad, to say the least, but if priced right it could be a decent deal.

Hopefully once these cards are all over the market we can get 100% confirmation so there will be no doubt left either way.


----------



## Katanai (Jun 15, 2015)

Ahem... I remember seeing a really long thread when the GTX 970 "lies" were discovered, filled with a lot of name calling and hate. I'm just sitting here, sipping on some tears, and adding to this thread. Let's see where it will go...


----------



## yotano211 (Jun 15, 2015)

What is everyone talking about? I lost my mind after the 3rd post.


----------



## FordGT90Concept (Jun 15, 2015)

I think if AMD folded and wasn't bought up, ARM Holdings would step up and introduce processors that can tango with Intel on x86. Microsoft would bring back the ARM version of Windows, and companies like Qualcomm would release desktop ARM CPUs. The rise in x86 prices would make ARM very, very attractive.


You'd think the transistor count would have to change if Grenada had GCN 1.2 or even 1.3.  As far as we know, it did not.  The fact that they recycled the device ID suggests that the internal features haven't changed at all.
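The device-ID point is easy to check for yourself: the PCI vendor/device ID pair is reported by the silicon itself, so a genuine rebrand shows up with the same ID as the old part. A minimal sketch in Python (the ID table here is illustrative, based on entries in the public pci.ids database; verify against the live database before relying on it):

```python
# Sketch: spotting a rebrand from PCI IDs. The table is illustrative,
# taken from the public pci.ids database -- verify before relying on it.

AMD_VENDOR_ID = 0x1002

# device_id -> (silicon, products marketed under that ID)
KNOWN_IDS = {
    0x67B0: ("Hawaii XT", ("R9 290X", "R9 390X")),   # one die, two product names
    0x67B1: ("Hawaii PRO", ("R9 290", "R9 390")),
}

def identify(vendor_id: int, device_id: int) -> str:
    """Return the silicon name behind a PCI ID pair, or 'unknown'."""
    if vendor_id != AMD_VENDOR_ID:
        return "unknown (not AMD)"
    return KNOWN_IDS.get(device_id, ("unknown", ()))[0]

# A recycled device ID is strong evidence of unchanged silicon:
print(identify(0x1002, 0x67B0))  # Hawaii XT
```

On Linux the live values can be read with `lspci -nn` or from `/sys/bus/pci/devices/<slot>/vendor` and `.../device`; GPU-Z shows the same pair on Windows.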


----------



## Casecutter (Jun 15, 2015)

RejZoR said:


> There's nothing wrong with rebranding the mid and low end. Those ppl don't care about the tech itself anyway; they just want affordable cards. But when you start rebranding the high end, and God forbid the enthusiast level, then you know they are selling mist...


It is regrettable, but when there could be something out there, lurking in the mist, it plays with the mind…  While naysayers can minimize AMD this round, the knowledge that AMD's FinFET and HBM2 designs might still be alive and out there, that unknown... plays on folks, and that is good.



Vayra86 said:


> If a company makes shitty products, or no longer improves its product lines, why would it have any right to survive in a competitive market? Goodwill, surviving on the pockets of its customers who get subpar products in return.


 
I don't believe you can say "makes shitty products," at least on the GPU side (the CPU side is a completely different letdown; let's not get into that bigger topic). AMD remained in the fray up until the 970/980, and then mostly got bested on efficiency (keep in mind everything non-gaming was forfeited).  It's more an issue with TSMC (and having only them): while Tahiti was good when it came out, it was widely known TSMC had issues. The GK104 got held up while TSMC fixed their problems, and that clean-up enabled the 7970 GHz; the clock I believe AMD always intended to hit became viable in the yields. One thing AMD got wrong at the time was trying to get in front of Nvidia with Tahiti. I don't know if AMD actually knew TSMC had issues with 28nm that could be fixed, but jumping in front worked out to be a disservice. Had they held back and launched right alongside Nvidia's release, their offerings and pricing might have been seen as less deficient.

Hawaii, as it turns out in AIB custom cards, was competitive, and competitive with GK110 even if 5-7 months tardy; when it arrived, Nvidia cut prices and kept above it with the Ti release.  You need to consider that Nvidia can/must spend on R&D like crazy to support the lucrative HPC corporate contracts.  For AMD to even deliver a competitive part vying for Nvidia's ever-ramping enthusiast gaming segment is a huge undertaking and cost, given AMD has essentially zero of the corporate HPC market.

Am I disappointed that AMD couldn't find the money/justification to re-spin Hawaii (and Pitcairn) into something newer? Most definitely.  Do I admire AMD for building Fiji and innovating on HBM? Absolutely, but not at the cost of sacrificing the mainstream, their bread-and-butter market share.  The more I pull from the information, AMD's failings are not from a lack of technical innovation or expertise; it's the fact that upper management has not (for many years) sculpted the company, and has "screwed the pooch" in various ways.  The reason we aren't seeing a true GCN 1.2 Grenada probably goes back more than a year, and while it was not a great/huge improvement, some big shots saw it as an easy decision... it wasn't providing enough change.  I think some held hope (internally and externally) for GloFo to have something ready on 28SHP, but that avenue appeared still blocked. Then in September 2014, with volumes/market share going into a "full funk," the bean counters figured the only way to maintain an effective price was to keep the foot on the gas with current stuff: don't waste time/cash/engineering on a new layout, and forgo the cost of a new tape-out at TSMC.  The problem I have is, if that was the direction, then a 390X should've been on the market last March-April, unless there's some big revelation tied to holding until E3. (I doubt it.)

There's also a kind of arms-race mentality: if we can't bring a halo product, we'll lose hearts and minds. Though the idea of just chasing Nvidia is fleeting, especially if you're not shoring up the flanks: you're not enticing new gamers, while ignoring the mainstream leaves them feeling "uninhibited" about looking to the other side.

Though to conclude: this mindset of "just go away, AMD, you complicate the market by being in it" falls more under "be careful what you wish for." Can't you just go about what you want and choose to ignore that AMD is there? But the naysayers can't/don't, as what else would they do... play games?


----------



## RejZoR (Jun 15, 2015)

It's not what we wish for, it's what AMD is digging for themselves...


----------



## Casecutter (Jun 15, 2015)

RejZoR said:


> It's not what we wish for, it's what AMD is digging for themselves...


In most of these situations it's some big shots having nothing more than a quarterly view... and hoping to deflect it from sticking to their legacy.


----------



## HumanSmoke (Jun 15, 2015)

Casecutter said:


> Can't you just go about what you want and choose to ignore that AMD is there? But the naysayers can't/don't, as what else would they do... play games?


Well, that is hardly vendor-specific around here, is it? And the proponents of such a point of view are often the same people who structure their posting about other IHVs thus: point out a perceived fault > segue into dire prophecy about company viability > throw the company a small bone in a superficial attempt at even-handedness > end the post with even more dire prophecy and doubts about the company's survival.
I can show working (examples) if this seems like a gross generalization.

It is the nature of mainstream forums, which are less MIT than Deadwood circa 1870, to over-simplify, not look at the bigger picture, and apportion blame according to popular opinion rather than seek out the correct information - which takes time, a wide range of source material generally not sexy enough (lack of pictures, lack of flamebait, a high degree of technical material requiring further reading in and of itself), and a continuing deep interest to constantly overwrite/supplement the information previously gleaned. That won't ever change, because unless you enjoy getting into the nuts and bolts of the industry, it will always be easier to parrot whatever some other guy says.

Having said that, the whole rebranding thing should be pretty easy to understand. AMD simply don't have the R&D resources to fight a three-front war (x86, GPU, RISC) without spreading themselves exceedingly thin, having to prioritize some projects and cancel others. The company, flushed with success after K6 and K7, simply overreached. They decided they could take on Intel and Nvidia in head-to-head battles simultaneously, sinking their first real profit into an overpriced ATI when it probably made more sense to expand their foundry business (and open it to third-party fabbing) and licence ATI's graphics IP. AMD also lost sight of what bought the company its success - namely acquiring IP (principally from DEC) and appealing to a budget market that hadn't really existed under Intel's rule. Those two factors dissolved pretty quickly. K8 slipped, K10 was evolutionary rather than revolutionary, Bulldozer's timetable was a disaster, and the $2bn+ debt burden the company saddled itself with effectively meant their foundry business floundered, while R&D suffered as people left when projects slipped (or were cancelled) - or as cost-cutting measures. ATI, when AMD acquired it, was a basket case thanks to the slow gestation of R600 and Nvidia's near-flawless architectural (and marketing) execution of G80. Everything that has happened since is a product of the seeds sown in 2005-06 - arguably a little before, since AMD missed a prime opportunity for greater revenue by stalling on outsourcing chip production* to a third-party foundry when their own fabs were capacity constrained.

Judging by popular opinion rather than fact, a certain faction of the forum membership seems to pin all AMD's woes on Intel, Nvidia, TSMC, and GlobalFoundries - seldom (if ever) looking at the decisions made by the company itself in the past that laid the foundations for the present situation - something that is bound to continue as AMD outsources design (people seem to forget that the Synopsys deal also meant AMD offloading 150 R&D engineers, for example) at the expense of in-house bespoke R&D. No doubt the same people will continue to lay the bulk of the blame at the feet of third parties, rather than AMD's BoD and previous strategic planning efforts.

*  AMD were allowed under the Intel cross-licence agreement to outsource up to 20% of their chip production to third party foundry partners


----------



## Xzibit (Jun 16, 2015)

Chinese anyone ?


----------



## HumanSmoke (Jun 16, 2015)

Xzibit said:


> Chinese anyone ?


I would, but I know I'll just feel like another benchmark an hour later.

Couldn't be bothered deciphering that chart, or looking too deeply at it since the parts are known quantities, but the 390X numbers look pretty much ballpark for 290X vs. 980. I'd assume the gap over the 970 denotes 4K benchmarks.


----------



## BiggieShady (Jun 16, 2015)

FordGT90Concept said:


> You'd think the transistor count would have to change if Grenada had GCN 1.2 or even 1.3. As far as we know, it did not. The fact that they recycled the device ID suggests that the internal features haven't changed at all.


Additionally, AMD never put version numbers on their GCN iterations, nor did they ever brag about architectural changes inside the GCN core.
All the version numbers were added by tech sites and consumers.
Realistically, all AMD could do after the R&D cuts was optimize the transistor layout without changing the architecture ... they didn't gain much efficiency, but they did get a smaller die, and with that the chance to make the big Fury chip on the same 28nm node.
Everyone is bitching right now about rebrands, but all things considered AMD has done the best they could. 
Too bad this can't be a marketing slogan: "Give us your money, we did the best we could!"


----------



## steven21uk (Jun 16, 2015)

Fake Fake Fake where is the box this card came out from and as the 390X isn't released until the 25th this is prob a nvidda fan boy or nvidda trying to put u off as there scared of its results when it comes out so don't look into this load of rubbish and if it is true I want to see more proof 2 pics don't show u anything ive seen stuff like this before with processors and video cards before the release dates also fake gpu z pics are easily edited so that's prob a fake as well. I would wait till after the release date for the truth


----------



## HumanSmoke (Jun 16, 2015)

steven21uk said:


> Fake Fake Fake where is the box this card came out from and as the 390X isn't released until the 25th this is prob a nvidda fan boy or nvidda trying to put u off as there scared of its results when it comes out so don't look into this load of rubbish and if it is true I want to see more proof 2 pics don't show u anything ive seen stuff like this before with processors and video cards before the release dates also fake gpu z pics are easily edited so that's prob a fake as well. I would wait till after the release date for the truth


Oh dear, you really are a special one, aren't you?
If you'd bothered to click on the links right below the article, you'd see that not only is the box shown, but the XFX rep is involved in the thread.
What is more odd is that you seem to be the only person in the Western world unaware that Best Buy is selling 300-series cards ahead of the official launch (which is where the people posting the pictures, benchmarks, and BIOS dumps bought their cards), and Legit Reviews actually sent people to check this out personally.


steven21uk said:


> so don't look into this load of rubbish


What a coincidence! That was my first reaction on attempting to decipher your appallingly bad grammar, but in the interests of keeping even people like yourself informed, I persevered in order to enlighten you on what a hyperlink at the bottom of an article signifies.

What an awesome first post, steven; you may now hold the TPU record for the longest string of characters making the least amount of sense. Your certificate will be mailed out.


----------



## steven21uk (Jun 16, 2015)

HumanSmoke said:


> Oh dear, you really are a special one, aren't you?
> If you'd bothered to click on the links right below the article, you'd see that not only is the box shown, but the XFX rep is involved in the thread.
> What is more odd is that you seem to be the only person in the Western world unaware that Best Buy is selling 300-series cards ahead of the official launch (which is where the people posting the pictures, benchmarks, and BIOS dumps bought their cards), and Legit Reviews actually sent people to check this out personally.
> 
> ...


----------



## PCGamerDR (Jun 16, 2015)

wiak said:


> well their new lineup looks like this
> R9 Fury X (2015 Fiji)
> R9 Fury (2015 Fiji)
> R9 390 8GB (2013 Hawaii)
> ...



TBH they seem to have aged well, the most impressive one being Pitcairn, that good ol' HD 7870. Oh, the memories. -cryeverytime-


----------



## Vayra86 (Jun 17, 2015)

Casecutter said:


> It is regrettable, but when there could be something out there, lurking in the mist it plays with the mind…  While naysayers can minimize AMD this round, knowing AMD or a their design of FinFet and HBM2 might live and still out there, that unknown... it plays on folks, and that is good.
> 
> I don't believe you can say "makes shitty products" at least in the GPU (CPU completely different letdown let’s not get into that bigger topic);



I remember similar sounds coming from many people shortly after the FX processors were announced - the first time, and the second time when they updated the arch. Only a year later did we universally conclude that the FX line was a shitty product.

Can you not see the parallel between the FX back then and the lack of innovation in GPU right now? AMD has started a trickle down attempt with HBM and forced itself into a third rebrand because HBM is too costly for anything but the top end card. Not a smart decision when Nvidia shows the world there is efficiency to be gained, and not just a little of it either.

It happens every time with AMD. Their lack of R&D is converted into a higher production cost, and they just fail to see it. Again. And Again. AMD's answer to lacking efficiency gains is 'more cores' which results in higher cost AND a product that is less competitive across the board. And because the product is less competitive, it goes for a lower price. All aspects of this way of doing business result in reduced margins and profit. Evidenced by the yearly results.

This is the essence of the problem. Now, to look ahead: what is the core setup of Zen again? Oh yeah, a many-core design.


----------



## BiggieShady (Jun 17, 2015)

Vayra86 said:


> Not a smart decision when Nvidia shows the world there is efficiency to be gained, and not just a little of it either.


Nah, AMD did improve efficiency in Fiji by a healthy margin (compared to the 200/300 series), but that is highly dependent on clocks; even GM200 loses much of its efficiency in factory-overclocked 980 Ti cards ... the Nano card is 2x as efficient as the 290X, and Fury X is 1.5x more efficient ... all in all, given their position this was a pretty good launch IMO, since they couldn't have a complete lineup made out of Fiji derivatives.


Vayra86 said:


> This is the essence of the problem. Now, to look ahead: what is the core setup of Zen again? Oh yeah, a many-core design.


Of course it is a multi-core design - what else would it be? The point is they are moving away from the long-pipeline design and from sharing modules between cores.


----------



## HumanSmoke (Jun 17, 2015)

BiggieShady said:


> ... all in all, given their position this was a pretty good launch IMO, since they couldn't have complete lineup made out of Fiji derivatives.


Personally I'd rate it a pretty good launch when the cards actually launch (and live up to the marketing of course). At the moment I'd rate it a pretty good tease.


----------



## Vayra86 (Jun 17, 2015)

BiggieShady said:


> Nah, AMD did improve efficiency in Fiji by a healthy margin (compared to the 200/300 series), but that is highly dependent on clocks; even GM200 loses much of its efficiency in factory-overclocked 980 Ti cards ... the Nano card is 2x as efficient as the 290X, and Fury X is 1.5x more efficient ... all in all, given their position this was a pretty good launch IMO, since they couldn't have a complete lineup made out of Fiji derivatives.
> 
> Of course it is a multi-core design - what else would it be? The point is they are moving away from the long-pipeline design and from sharing modules between cores.



Heheh my previous comment was written before I read about the announcement of the Nano and stuff 

Things have just changed.


----------



## Casecutter (Jun 17, 2015)

Vayra86 said:


> Can you not see the parallel between the FX back then and the lack of innovation in GPU right now?


Lack of innovation in GPU... like a smaller 28nm die and implementation of HBM that could best the larger GM200 isn't?

AMD's R&D woes from making Bulldozer *are not* the same as the cash-flow troubles the GPU side is saddled with today. Sure, the loss of profit on the CPU side has had an effect, but it's the cumulative weight of a lot of bad decisions that bears on the GPU side. To even draw that parallel is simplistic.

I don't see the FX and these GPUs as drawing any parallel other than their implementation of asynchronous shading, and attaining actual use of the multi-core approach they hoped the software-programming world would "get on board" with, back 7 years ago.

AMD is known to push technology well ahead of the industry (aka 64-bit); the idea of true "multi-core" has been the courageous path, and AMD went "all in" on chip designs believing that if they built it, the industry would change... NOT.  Not to say there weren't shortcomings even in the "core design" that hampered multi-path coding - yes, there were.  It was more that AMD neglected to comprehend that single-core (hyper-threaded) thinking was "super embedded," and software writing wasn't going to change to their ideas without a hook.

When we now look at DX12, asynchronous shading, the FX processor's multi-core, and the reason AMD introduced Mantle… their direction is apparent.  They've in some ways now nudged/forced Microsoft to step up and offer the advances that AMD's technologies have been building toward (kind of 64-bit all over again).  We need to step back and realize the David-and-Goliath(s) dynamic in the industry: the giants hold to what they like until they can offer it themselves, and in that vein the Intel/Nvidia consortium was not into Microsoft going to a low-level API. This time I see AMD pressing: Mantle challenged Microsoft, showing they might be best to move, or see their dominance in PC gaming (DX) brought down by other groups, the likes of Steam Machines, Linux, even Apple, as they all seem to be looking at low-level API gaming.


----------



## BiggieShady (Jun 17, 2015)

HumanSmoke said:


> At the moment I'd rate it a pretty good tease.


Fair enough, though I don't doubt it's plenty fast; they did show Fury X running Sniper Elite 3 on a 5K screen on stage.


----------



## HumanSmoke (Jun 17, 2015)

Casecutter said:


> Lack of innovation in GPU... like a smaller 28nm die and implementation of HBM that could best the larger GM200 isn't?


I wouldn't pop the champagne corks just yet.
AMD's Fiji is less than 1% smaller than GM 200 - so while technically correct, I wouldn't tout it as a plus. The other point is that we've yet to see Fiji benchmarked, and the only implementations of GM 200 seen are a full part that is wattage constrained by a large framebuffer and adherence to the ATX spec, and a salvage part. Neither are representative of the zenith of the architecture. I am going to hazard a guess that if Nvidia plumbed the Titan X for dual 8-pin input and raised the board power limits accordingly, you would see some significant uplift.
If you were talking SKUs, I wouldn't make the comparison, since Nvidia tend to value the ATX and PCI-SIG spec and their warranties over outright performance, but you were talking architecture - and while Fiji and HBM is undoubtedly a big step forward, just like GM 200 is a scaled up GM 204 which made concessions in design itself, I suspect that Fiji grew out of Tonga and could well be making concessions of its own.
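The connector arithmetic behind that guess comes straight from the ATX/PCIe electromechanical limits: 75 W through the x16 slot, 75 W per 6-pin, 150 W per 8-pin. A quick sketch (spec ceilings only - real cards can and do exceed them - with the Titan X's 6+8-pin input layout as the point of comparison):

```python
# Nominal power limits (watts) per the PCIe electromechanical spec.
SLOT = 75    # delivered through the x16 slot itself
PIN6 = 75    # 6-pin auxiliary connector
PIN8 = 150   # 8-pin auxiliary connector

def board_power_limit(*aux: int) -> int:
    """Spec board-power ceiling: slot power plus all auxiliary inputs."""
    return SLOT + sum(aux)

titan_x = board_power_limit(PIN6, PIN8)  # 6+8-pin layout
dual_8  = board_power_limit(PIN8, PIN8)  # hypothetical 8+8-pin rework
print(titan_x, dual_8)  # 300 375
```

So the hypothetical dual 8-pin rework buys a 25% higher in-spec power budget, which is the headroom being alluded to above.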


Casecutter said:


> AMD's R&D woes from making Bulldozer *are not* the same as the cash-flow troubles the GPU side is saddled with today. Sure, the loss of profit on the CPU side has had an effect, but it's the cumulative weight of a lot of bad decisions that bears on the GPU side.


True. AMD's woes stem from decisions made at board level ten years ago.


----------



## Vayra86 (Jun 18, 2015)

Casecutter said:


> Lack of innovation in GPU... like a smaller 28nm die and implementation of HBM that could best the larger GM200 isn't?
> 
> AMD's R&D woes from making Bulldozer *are not* the same as the cash-flow troubles the GPU side is saddled with today. Sure, the loss of profit on the CPU side has had an effect, but it's the cumulative weight of a lot of bad decisions that bears on the GPU side. To even draw that parallel is simplistic.
> 
> ...



It's more that AMD is ahead of the curve and suffering for it. Their timing simply isn't optimal. They act like an industry leader while they are not even close to being such a company - not in size, assets, time-to-market ability, or production capacity. Call it courageous; I'd rather call it unwise. Because they are trying to be ahead of the curve, they bleed millions every year. Mantle has done absolutely nothing to further AMD's position in the market, and by the time the market has adjusted to DX12, they will have no real advantages left over the competition because they have shown all their cards way ahead of time. Nvidia and Intel have ample opportunity to adjust now, and can do so WITHOUT bleeding market potential, having used all the time before that to maximize profit.

Don't get me wrong, I like how AMD's philosophies are now coming to fruition, but the expenses are way too high.


----------



## Casecutter (Jun 18, 2015)

Vayra86 said:


> Don't get me wrong, I like how AMD's philosophies are now coming to fruition, but the expenses are way too high.


 
Leadership is less about holding course than about being the "scrappy" one, taking the path less traveled.  We need AMD to be in the contest and to work such innovative maneuvers.  You more or less advocate they should just be good, get back in line, and wait their turn - which is what the "Titans" want: an opportunity to hold a thumb on them. A lot of good that would do, even less than the "one-upmanship" that, yes… they hardly achieve now.  Be good little companies and don't bring your innovative technology into our arena; we industry leaders are doing it "our way."

Innovation is the price of greatness, and yes, sometimes it doesn't pay off.  And sometimes it takes bold moves: when Apple, on the verge of collapse, brought back Steve Jobs, he had to make nice with Bill Gates. Every Apple fan decried the relationship, but it got us the Apple we have today.  I think this time AMD wrangled Microsoft into including such "innovations," which they weren't intending to include because the "industry leaders" had the ear of MS.  Then AMD said: here's what gamers could have, and it isn't, and doesn't need to be, tied just to MS and their ecosystem... 

Philo Farnsworth, an Idaho farm boy, developed the working concept of TV at the age of 15, built the prototype, and got the patent. He even won patent injunctions against RCA (the industry leader in radio), who poured exorbitant amounts of money into stifling him in court, completely stopping him from ever getting into production.  RCA finally found a way to circumvent those patents, and the general public doesn't know or credit Farnsworth with the invention of TV.

I suppose I see it differently, as being more the basis of an American spirit of innovation.


----------



## Vayra86 (Jun 18, 2015)

Casecutter said:


> Leadership is less about holding course than about being the "scrappy" one, taking the path less traveled.  We need AMD to be in the contest and to work such innovative maneuvers.  You more or less advocate they should just be good, get back in line, and wait their turn - which is what the "Titans" want: an opportunity to hold a thumb on them. A lot of good that would do, even less than the "one-upmanship" that, yes… they hardly achieve now.  Be good little companies and don't bring your innovative technology into our arena; we industry leaders are doing it "our way."
> 
> Innovation is the price of greatness, and yes, sometimes it doesn't pay off.  And sometimes it takes bold moves: when Apple, on the verge of collapse, brought back Steve Jobs, he had to make nice with Bill Gates. Every Apple fan decried the relationship, but it got us the Apple we have today.  I think this time AMD wrangled Microsoft into including such "innovations," which they weren't intending to include because the "industry leaders" had the ear of MS.  Then AMD said: here's what gamers could have, and it isn't, and doesn't need to be, tied just to MS and their ecosystem...
> 
> ...



I see what you mean, and there is much to be said for progressive thinking like you suggest. And you got it right: I think it has a lot to do with the idea of the American Dream, making something out of nothing, etc. I am from Holland, and we Dutch are known for being tight with spending  Definitely a cultural influence there


----------



## Casecutter (Jun 18, 2015)

Vayra86 said:


> Dutch are known for being tight on spending


I'm not a Trump!!! 

The American Dream is... different: the spirit to dream and develop things; though there's something to be said for when America still had some... idea of frugality.


----------



## Casecutter (Jun 18, 2015)

Casecutter said:


> Lack of innovation in GPU... like a smaller 28mm Die and implantation of HBM that *could best* the larger GM200 isn't?


 


HumanSmoke said:


> I wouldn't pop the champagne corks just yet.


Oops, I thought I read it as 569 mm² somewhere else, not the 596 mm² in the TPU database.
So true, not a "whit-worth" of difference.


----------



## XFXSupport (Jun 18, 2015)

Ebo said:


> *Nobody* knows yet what minor tweaks have been made on the card R9 390/390X.





Here's what we do know.  

1.  The memory is sped up by 500-1000 MHz; advances in circuitry made this possible.
2.  The VRM cooler is heavy; rough estimates are 20-30°C cooler than the previous 290s.
3.  Bigger cooler, more copper.

More on that next week when I have parts on hand and reviews start coming in.
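If the quoted memory bump holds up, the bandwidth math for Hawaii/Grenada's 512-bit bus is simple: bus width in bytes times the effective GDDR5 data rate. A sketch, assuming the widely reported 5 Gbps (290X) and 6 Gbps (390X) effective rates - treat the 390X figure as provisional until reviews land:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes moved per pin-transfer times rate."""
    return (bus_width_bits / 8) * data_rate_gbps

r9_290x = peak_bandwidth_gbs(512, 5.0)  # reference 290X
r9_390x = peak_bandwidth_gbs(512, 6.0)  # reported 390X spec
print(r9_290x, r9_390x)  # 320.0 384.0
```

That is a 20% bandwidth uplift from the memory clock alone, with no change to the bus or the GPU.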


----------



## Breit (Jun 19, 2015)

XFXSupport said:


> Here's what we do know.
> 
> 1.  The memory is sped up by 500-1000 MHz; advances in circuitry made this possible.
> 2.  The VRM cooler is heavy; rough estimates are 20-30°C cooler than the previous 290s.
> ...



Or you can just read reviews that are already out: http://www.overclock3d.net/reviews/gpu_displays/msi_r9_390x_gaming_8g_review/1


----------

