Monday, June 15th 2015

Radeon R9 390X Taken Apart, PCB Reveals a Complete Re-brand

People with access to an XFX Radeon R9 390X graphics card took it apart to peek at its PCB. What they uncovered comes as no surprise - the underlying PCB is identical in design to AMD's reference PCB for the Radeon R9 290X, down to the location of every tiny SMT component. At best, the brands on the chokes and the bigger conductive polymer capacitors differ; and 4 Gbit (512 MB) GDDR5 chips sit under the heatspreader, making up the standard 8 GB memory amount. The GPU itself, codenamed "Grenada," looks identical to the "Hawaii" silicon that drove the R9 290 series. It's highly unlikely that it features updated Graphics Core Next 1.2 stream processors, as older rumors suggested.
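As a quick sanity check on the memory configuration described above, here is a rough back-of-the-envelope sketch (assuming the reference 512-bit bus with one 32-bit GDDR5 package per channel; the per-package density is an assumption, not something read off the teardown photos):

# Back-of-the-envelope check of the 8 GB figure, assuming a 512-bit bus
# populated with 32-bit-wide, 4 Gbit (512 MB) GDDR5 packages.
bus_width_bits = 512        # Hawaii/Grenada reference memory bus (assumed)
chip_width_bits = 32        # one GDDR5 package per 32-bit channel
chip_density_gbit = 4       # assumed density per package

chips = bus_width_bits // chip_width_bits    # 16 packages
total_gb = chips * chip_density_gbit / 8     # 16 x 4 Gbit = 64 Gbit = 8 GB
print(f"{chips} chips x {chip_density_gbit} Gbit = {total_gb:.0f} GB")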
Sources: 1, 2

89 Comments on Radeon R9 390X Taken Apart, PCB Reveals a Complete Re-brand

#51
Phobia9651
Vayra86Honestly if AMD fails to impress, another company will take its place. x86 is big enough, someone will simply take over the company for its patents and technologies and continue onwards.
From what I know, the x86 license is not transferable through a takeover/buyout of AMD.
No company would take the risk of diving into CPU or GPU development when facing competitors who have been at it for decades.
Sad but true.
Posted on Reply
#52
Breit
Vayra86I have never REALLY understood this stance on AMD's survival.

If a company makes shitty products, or no longer improves its product lines, why would it have any right to survive in a competitive market?
The problem is that without AMD the market has no competition anymore, which means next to no innovation, no progress, and high prices from the lone survivor (nVidia in this case). No one really wants that.
Posted on Reply
#53
Vayra86
BreitThe problem is that without AMD the market has no competition anymore, which means next to no innovation, no progress, and high prices from the lone survivor (nVidia in this case). No one really wants that.
If you had read my post you would know this is a flawed argument.

If Intel and Nvidia 'get' the market for themselves, they will also face the consequences of attaining a monopoly. They are both *much* better off with a weak AMD than they are with no AMD. Besides, the market cannot afford to stagnate further; the node delays are already causing enough stagnation as it is, and ARM is just around the corner breathing down x86's neck. Those are just a few examples of what really drives the market, as opposed to the usual x86-centric thinking that treats it as the only thing in existence.

EVEN IF Intel and Nvidia get x86 for themselves, and EVEN IF AMD just dies in misery instead of being taken over, the resulting situation will be temporary at best. ARM will take its place, or a new x86 competitor; hell, maybe even a company like Qualcomm will consider stepping up. ARM is already invading the server business and does so through great scalable solutions with very low overhead. ARM's solution to the performance gap with x86 is quite similar to AMD's stance on performance increases: chaining more cores together and using modular design, much like the FX line and the approach to big-die GPUs. Meanwhile, Intel is trying to force itself into the ARM business but will never survive on that alone; their business is not tuned for that.
Posted on Reply
#54
Breit
I'm not sure ARM will ever enter the x86 market.
Maybe someday in the future they might be able to compete in performance with their architecture, but building an entire ecosystem around it is quite another feat, and one which is very unlikely to happen anytime soon.
Posted on Reply
#55
FordGT90Concept
"I go fast!1!11!1!"
Remember that it is really TSMC that deserves the hate, far more so than AMD. Their lack of an upgraded process in over two years means all chips are still on 28 nm. Yes, AMD should have at least upgraded the silicon to GCN 1.2.
Posted on Reply
#56
Vayra86
BreitI'm not sure ARM will ever enter the x86 market.
Maybe someday in the future they might be able to compete in performance with their architecture, but building an entire ecosystem around it is quite another feat, and one which is very unlikely to happen anytime soon.
It is already happening. Remember Windows RT? Windows 10 is also being built to run on ARM phones and x86 devices alike. Current-gen WP already runs on ARM.

Regardless of AMD's future, ARM won't stop at its current markets alone. In many ways it surpasses x86 in how it can be configured; every single niche in the market can have its own custom-designed SoC, and it won't even break the bank to do so.

In terms of ARM entering the x86 market... here below is an example from last year:

www.cavium.com/newsevents_Cavium_Introduces_ThunderX_A_2.5_GHz_48_Core_Family_of_Workload_Optimized_Processors_for_Next_Generation_Data_Center_and_Cloud_Applications.html
Posted on Reply
#57
FordGT90Concept
"I go fast!1!11!1!"
Windows supported IA64 too...which was a flop.
Posted on Reply
#58
R-T-B
FordGT90ConceptWindows supported IA64 too...which was a flop.
And Windows NT supported PPC... as did its then main competitor, OS/2. Lulz.
Posted on Reply
#59
Vayra86
Of course. All this support served Windows well in the long run, as it is still the dominant OS all over the world.

ARM is a different animal though, because it has its own market as well, and there is overlap: both ARM and x86 devices can do one or the other. That is also the reason it is such a huge threat to x86, but not the other way around, because x86 is old and expensive, while ARM is fresh, highly customizable, and geared towards a market of many competitors instead of just two.

With every passing hour, Intel and AMD's existence is becoming less important and has already shifted from 'vital' to 'nice to have' for large portions of the market.
Posted on Reply
#60
Breit
A bit off-topic, but whatever... ;)
Are you really interested in universal apps? I mean, all native apps are compiled with a specific architecture in mind and cannot be executed on different hardware, even if the operating system has a kernel that runs on said hardware. We are talking about high-end hardware here, not tablet/mobile hardware. Sure, that niche market is becoming more and more interesting, because most people just don't need the performance possible with today's chips and are happy with Angry Birds-type games. But, for instance, a really demanding game has to be a native application, and so does every other demanding application that can put the performance of a chip to good use (think of rendering, content creation, etc.).
As far as I know, the full Windows 10 will not be available for ARM. Windows RT is a dead end. The Windows 10 you are talking about is only for mobile/tablet and has very little in common with the desktop Windows (besides its name).
Posted on Reply
#61
GhostRyder
The PCB being the same really is not proof that this is not GCN 1.2; however, it now points more towards this being a rebrand with 8 GB of GDDR5 and some improvements to the silicon that allow better clocks all around. Sad, to say the least, but if priced right it can be a decent deal.

Hopefully once these cards are all over the market we can get 100% confirmation so there will be no doubt left either way.
Posted on Reply
#62
Katanai
Ahem... I remember seeing a really long thread when the GTX 970 "lies" were discovered, filled with a lot of name calling and hate. I'm just sitting here, sipping on some tears, and adding to this thread. Let's see where it will go...
Posted on Reply
#63
yotano211
What is everyone talking about? I lost my mind after the 3rd post.
Posted on Reply
#64
FordGT90Concept
"I go fast!1!11!1!"
I think if AMD folded and wasn't bought up, ARM Holdings would step up and introduce processors that can tango with Intel's x86 chips. Microsoft would bring back the ARM version of Windows, and companies like Qualcomm would release desktop ARM CPUs. The rise in x86 prices would make ARM very, very attractive.


You'd think the transistor count would have to change if Grenada had GCN 1.2 or even 1.3. As far as we know, it did not. The fact that they recycled the device ID suggests that the internal features haven't changed at all.
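The device ID angle is easy to check for yourself. Below is a rough sketch (my own, not from the article) that pulls the vendor:device pair from lspci on a Linux box; the 1002:67b0 Hawaii XT ID is an assumption used purely for illustration, so verify the exact IDs against the pci.ids database before reading anything into a match.

# Rough sketch: compare the installed Radeon's PCI device ID against an
# assumed Hawaii XT ID. Requires Linux with lspci (pciutils) installed.
import re
import subprocess

HAWAII_XT_ID = "1002:67b0"  # assumed vendor:device pair for Hawaii XT (R9 290X)

def radeon_device_ids():
    """Return the vendor:device IDs of AMD display controllers from lspci -nn."""
    out = subprocess.run(["lspci", "-nn"], capture_output=True, text=True).stdout
    ids = []
    for line in out.splitlines():
        if "VGA compatible controller" in line:
            # lspci -nn prints numeric IDs in brackets, e.g. [1002:67b0]
            m = re.search(r"\[(1002:[0-9a-f]{4})\]", line, re.IGNORECASE)
            if m:
                ids.append(m.group(1).lower())
    return ids

if __name__ == "__main__":
    for dev in radeon_device_ids():
        verdict = "matches" if dev == HAWAII_XT_ID else "differs from"
        print(f"{dev} {verdict} the assumed Hawaii XT device ID")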
Posted on Reply
#65
Casecutter
RejZoRThere's nothing wrong with rebranding the mid and low end. Those people don't care about the tech itself anyway; they just want affordable cards. But when you start rebranding the high end, and god forbid the enthusiast level, then you know they are selling mist...
It is regrettable, but when there could be something out there lurking in the mist, it plays with the mind… While naysayers can minimize AMD this round, the knowledge that an AMD design on FinFET with HBM2 might still be out there, that unknown... plays on folks, and that is good.
Vayra86If a company makes shitty products, or no longer improves its product lines, why would it have any right to survive in a competitive market? Goodwill, surviving on the pockets of its customers who get subpar products in return.
I don't believe you can say "makes shitty products", at least on the GPU side (the CPU side is a completely different letdown; let's not get into that bigger topic). AMD remained in the fray up until the 970/980, and only then got bested on efficiency (keep in mind everything non-gaming was forfeited). It's more an issue of TSMC (and of only having them): while Tahiti was good when it came out, it was widely known TSMC had issues. The GK104 got held up while TSMC fixed their problems, and that clean-up provided the 7970 GHz Edition, the clock I believe AMD always intended to hit once it became viable in the yields. One thing AMD got wrong at that time was trying to get in front of Nvidia with Tahiti. I don't know if AMD actually knew TSMC had issues with 28nm that could be fixed, but jumping in front worked out to be a disservice. Had they held back and launched right alongside Nvidia's release, their offerings and pricing might have been seen as less deficient.

Hawaii, as it turns out, was competitive in AIB custom cards, and competitive with GK110 even if 5-7 months tardy; when it arrived, Nvidia cut prices and kept above it with the Ti release. You need to consider that Nvidia can (and must) spend on R&D like crazy to support the lucrative HPC corporate contracts. For AMD to even deliver a competitive part vying for Nvidia's ever-ramping enthusiast gaming segment is a huge undertaking and cost, given AMD has close to zero of the corporate HPC market.

Am I disappointed that AMD couldn't find the money/justification to re-spin Hawaii (and Pitcairn) into something newer? Most definitely. Do I admire AMD for building Fiji and innovating on HBM? Absolutely, but not at the cost of sacrificing the mainstream, their bread-and-butter market share. The more I pull from the information, AMD's failings are not from a lack of technical innovation or expertise; it's the fact that upper management has not (for many years) sculpted the company, and has "screwed the pooch" in various ways. The reason we aren't seeing a true GCN 1.2 Grenada probably goes back more than a year, and while it wouldn't have been a great/huge improvement, some big shots saw it as an easy decision... it wasn't providing enough change. I think some held hope (internally and externally) for GloFo to have something ready on 28SHP, but that direction appeared still blocked. Then in September 2014, with volumes/market share going into a "full funk", the bean counters figured the only way to maintain an effective price was to keep the foot on the gas with current stuff, aiming not to waste time/cash/engineering on a new layout and forgoing the cost of a new tape-out at TSMC. The problem I have is that if that was the direction, then a 390X should've been on the market last March-April, unless there's some big revelation tied to holding until E3. (I don't see one.)

There's also a kind of arms-race mentality: if we can't bring a halo product, we'll lose hearts and minds. Though the idea of just chasing Nvidia is fleeting, especially when you're not shoring up the flanks; you're not enticing new gamers, while ignoring the mainstream leaves them feeling "uninhibited" about looking to the other side.

Though to conclude, this mindset of "just go away AMD, you complicate the market by being in it"… falls more under "be careful what you wish for". Can't you just go about what you want and choose to ignore that AMD is there? The naysayers can't/don't, as what else would they do... play games?
Posted on Reply
#66
RejZoR
It's not what we wish for, it's what AMD is digging for themselves...
Posted on Reply
#67
Casecutter
RejZoRIt's not what we wish for, it's what AMD is digging for themselves...
In most of these situations it's some big shots having nothing more than a quarterly view... and hoping to deflect it from sticking to their legacy.
Posted on Reply
#68
HumanSmoke
CasecutterCan't you just go about what you want and choose to ignore that AMD is there? The naysayers can't/don't, as what else would they do... play games?
Well, that is hardly vendor-specific around here, is it? And the proponents of such a point of view are also often the same people who structure their posts about other IHVs thus: point out a perceived fault > segue into dire prophecy about company viability > throw the company a small bone in a superficial attempt at even-handedness > end the post with even more dire prophecy and doubts about the company's survival.
I can show working (examples) if this seems like a gross generalization.

It is the nature of mainstream forums, which are less MIT than Deadwood circa 1870, to over-simplify, not look at the bigger picture, and apportion blame according to popular opinion rather than seek out the correct information - which takes time, a wide range of source material that generally isn't sexy enough (lack of pictures, lack of flamebait, a high degree of technical material requiring further reading in and of itself), and a continuing deep interest in constantly overwriting/supplementing the information previously gleaned. That won't ever change, because unless you enjoy getting into the nuts and bolts of the industry, it will always be easier to parrot whatever some other guy says.

Having said that, the whole rebranding thing should be pretty easy to understand. AMD simply doesn't have the R&D resources to fight a three-front war (x86, GPU, RISC) without spreading itself exceedingly thin, having to prioritize some projects and cancel others. The company, flushed with success after K6 and K7, simply overreached. It decided that it could take on Intel and Nvidia in head-to-head battles simultaneously, sinking its first real profit into an overpriced ATI when it probably made more sense to expand the foundry business (and open it to third-party fabbing) and license ATI's graphics IP. AMD also lost sight of what brought the company its success - namely acquiring IP (principally from DEC) and appealing to a budget market that hadn't really existed under Intel's rule. Those two factors dissolved pretty quickly. K8 slipped, K10 was evolutionary rather than revolutionary, Bulldozer's timetable was a disaster, and the $2bn+ debt burden the company saddled itself with effectively meant the foundry business floundered, while R&D suffered as people left when projects slipped (or were cancelled), or as part of cost-cutting measures. ATI, when AMD acquired it, was a basket case thanks to the slow gestation of R600 and Nvidia's near-flawless architectural (and marketing) execution of G80. Everything that has happened since is a product of the seeds sown in 2005-06 - arguably a little before, since AMD missed a prime opportunity for greater revenue by stalling on outsourcing chip production* to a third-party foundry when its own fabs were capacity-constrained.

Judging by popular opinion rather than fact, a certain faction of the forum membership seems to pin all of AMD's woes on Intel, Nvidia, TSMC, and GlobalFoundries - seldom (if ever) looking at the decisions made by the company itself in the past that laid the foundations for the situation in the present. That is bound to continue as AMD outsources design (people seem to forget that the Synopsys deal also meant AMD offloading 150 R&D engineers, for example) at the expense of in-house bespoke R&D. No doubt the same people will continue to lay the bulk of the blame at the feet of third parties, rather than AMD's BoD and its previous strategic planning efforts.

* AMD were allowed under the Intel cross-licence agreement to outsource up to 20% of their chip production to third party foundry partners
Posted on Reply
#70
HumanSmoke
XzibitChinese anyone ?
I would, but I know I'll just feel like another benchmark an hour later.

Couldn't be bothered deciphering that chart, or looking too deeply at it since the parts are known quantities, but the 390X numbers look pretty much ballpark for the 290X vs the 980. I'd assume the gap over the 970 denotes 4K benchmarks.
Posted on Reply
#71
BiggieShady
FordGT90ConceptYou'd think the transistor count would have to change if Grenada had GCN 1.2 or even 1.3. As far as we know, it did not. The fact that they recycled the device ID suggests that the internal features haven't changed at all.
Additionally, AMD never put version numbers on their GCN iterations, nor did they ever brag about architectural changes inside the GCN core.
All the version numbers were added by tech sites and consumers.
Realistically, all AMD could do after the R&D cuts was optimize the transistor layout without changing the architecture... they didn't gain much efficiency, but they did get a smaller die, and with that the chance to make the big Fury chip on the same 28nm node.
Everyone is bitching right now about rebrands, but all things considered, AMD has done the best they could.
Too bad this can't be a marketing slogan: "Give us your money, we did the best we could!"
Posted on Reply
#72
steven21uk
Fake Fake Fake where is the box this card came out from and as the 390X isn't released until the 25th this is prob a nvidda fan boy or nvidda trying to put u off as there scared of its results when it comes out so don't look into this load of rubbish and if it is true I want to see more proof 2 pics don't show u anything ive seen stuff like this before with processors and video cards before the release dates also fake gpu z pics are easily edited so that's prob a fake as well. I would wait till after the release date for the truth :)
Posted on Reply
#73
HumanSmoke
steven21ukFake Fake Fake where is the box this card came out from and as the 390X isn't released until the 25th this is prob a nvidda fan boy or nvidda trying to put u off as there scared of its results when it comes out so don't look into this load of rubbish and if it is true I want to see more proof 2 pics don't show u anything ive seen stuff like this before with processors and video cards before the release dates also fake gpu z pics are easily edited so that's prob a fake as well. I would wait till after the release date for the truth :)
Oh dear, you really are a special one, aren't you?
If you'd bothered to click on the links right below the article, you'd see that not only is the box shown, but the XFX rep is involved in the thread.
What is more odd is that you seem to be the only person in the Western world who is unaware that Best Buy is selling 300-series cards ahead of the official launch (which is where the people posting the pictures, benchmarks, and BIOS dumps bought their cards), and that Legit Reviews actually sent people to check this out personally.
steven21ukso don't look into this load of rubbish
What a coincidence! That was my first reaction on attempting to decipher your appallingly bad grammar, but in the interests of keeping even people like yourself informed, I persevered in order to enlighten you on what a hyperlink at the bottom of an article signifies.

What an awesome first post steven, you may now hold the TPU record for the longest string of characters making the least amount of sense. Your certificate will be mailed out.
Posted on Reply
#74
steven21uk
HumanSmokeOh dear, you really are a special one, aren't you?
If you'd bothered to click on the links right below the article, you'd see that not only is the box shown, but the XFX rep is involved in the thread.
What is more odd is that you seem to be the only person in the Western world who is unaware that Best Buy is selling 300-series cards ahead of the official launch (which is where the people posting the pictures, benchmarks, and BIOS dumps bought their cards), and that Legit Reviews actually sent people to check this out personally.

What a coincidence! That was my first reaction on attempting to decipher your appallingly bad grammar, but in the interests of keeping even people like yourself informed, I persevered in order to enlighten you on what a hyperlink at the bottom of an article signifies.

What an awesome first post steven, you may now hold the TPU record for the longest string of characters making the least amount of sense. Your certificate will be mailed out.
just look at best buy no 300 series cards there ?????????????????????????
Posted on Reply
#75
PCGamerDR
wiakwell their new lineup looks like this
R9 Fury X (2015 Fiji)
R9 Fury (2015 Fiji)
R9 390X 8GB (2013 Hawaii)
R9 390 8GB (2013 Hawaii)
R9 380 4GB (2014 Tonga)
R7 370 2GB (2012 Pitcairn)
R5 360 2GB (2013 Bonaire)

next year it might look like
R9 Fury Maxx (2016)
R9 Fury Maxx (2016)
R9 Fury X (2015 Fiji)
R9 Fury (2015 Fiji)
R9 390X 8GB (2013 Hawaii)
R9 390 8GB (2013 Hawaii)
R9 380 4GB (2014 Tonga)
TBH they seem to have aged well, the most impressive one being Pitcairn - that good ol' HD 7870, oh the memories. -cryeveritym-
Posted on Reply