Monday, June 15th 2015

Radeon R9 390X Taken Apart, PCB Reveals a Complete Re-brand
People with access to an XFX Radeon R9 390X graphics card took it apart to take a peek at its PCB. What they uncovered comes as no surprise: the underlying PCB is identical in design to AMD's reference PCB for the Radeon R9 290X, down to the location of every tiny SMT component. At best, the brands on the chokes and the bigger conductive polymer caps differ; there are also 4 Gbit (512 MB) GDDR5 chips under the heatspreader, making up the standard 8 GB memory amount. The GPU itself, codenamed "Grenada," looks identical to the "Hawaii" silicon which drove the R9 290 series. It's highly unlikely that it features updated Graphics Core Next 1.2 stream processors, as older rumors suggested.
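A quick sanity check on the memory figure (my own back-of-the-envelope arithmetic, not from the article), assuming the Hawaii/Grenada 512-bit memory bus and the standard 32-bit GDDR5 chip interface:

```python
# Back-of-the-envelope check (assumptions: 512-bit bus, 32-bit GDDR5 chip I/O)
bus_width_bits = 512
bits_per_chip = 32                        # one GDDR5 chip drives 32 data lines
chips = bus_width_bits // bits_per_chip   # -> 16 chips on the PCB
density_gbit = 4                          # 4 Gbit (512 MB) per chip
total_gb = chips * density_gbit / 8       # Gbit -> GB
print(f"{chips} chips x {density_gbit} Gbit = {total_gb:.0f} GB")  # 16 x 4 Gbit = 8 GB
```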
Sources:
1, 2
89 Comments on Radeon R9 390X Taken Apart, PCB Reveals a Complete Re-brand
No company would take the risk of diving into CPU or GPU development when it would face competitors who have been at it for decades.
Sad but true.
If Intel and Nvidia 'get' the market for themselves, they will also face the consequences of attaining a monopoly. They are both *much* better off with a weak AMD than they are with no AMD. Besides, the market cannot afford to stagnate further; the node delays are already causing enough stagnation as it is, and ARM is just around the corner, breathing down x86's neck. Those are just a few examples of what really drives the market, as opposed to the usual x86-centric thinking that treats x86 as the only thing in existence.
EVEN IF Intel and Nvidia get x86 for themselves, and EVEN IF AMD just dies in misery instead of being taken over, the situation resulting from that will be temporary at best. ARM will take AMD's place, or a new x86 competitor will; hell, maybe even a company like Qualcomm will consider stepping up. ARM is already invading the server business, and it does so through great scalable solutions with very low overhead. ARM's solution to the performance gap with x86 is quite similar to AMD's stance on performance increases: chaining more cores together and using modular design, much like the FX line and the big-die approach to GPUs. Meanwhile, Intel is trying to force itself into the ARM business, but it will never survive on that alone; its business is not tuned for that.
Someday in the future they might be able to compete on performance with that architecture, but building an entire ecosystem around it is quite another feat, and one that is very unlikely to happen anytime soon.
Regardless of AMD's future, ARM won't stop at its current markets alone. In many ways it surpasses x86 in how it can be configured; every single niche in the market can have its own custom-designed SoC, and it won't even break the bank to do so.
As for ARM entering the x86 market, here below is an example from last year:
www.cavium.com/newsevents_Cavium_Introduces_ThunderX_A_2.5_GHz_48_Core_Family_of_Workload_Optimized_Processors_for_Next_Generation_Data_Center_and_Cloud_Applications.html
ARM is a different animal, though, because it has its own market as well, and there is overlap: both ARM and x86 devices can take on each other's workloads. That is also the reason ARM is such a huge threat to x86 but not the other way around: x86 is old and expensive, while ARM is fresh, highly customizable, and geared towards a market of many competitors instead of just two.
With every passing hour, Intel's and AMD's existence becomes less important; it has already shifted from 'vital' to 'nice to have' for large portions of the market.
Are you really interested in universal apps? I mean, all native apps are compiled with a specific architecture in mind and cannot be executed on different hardware, even if the operating system has a kernel that runs on said hardware. We are talking about high-end hardware here, not tablet/mobile hardware. Sure, that niche market is becoming more and more interesting, because most people just don't need the performance that is possible with today's chips and are happy with Angry Birds-type games. But a really demanding game, for instance, has to be a native application, and so does every other demanding application that can put the performance of a chip to good use (think of rendering, content creation, etc.).
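To make the "compiled for a specific architecture" point concrete, here is a minimal sketch (my illustration, not from the post, assuming a Linux/ELF system): the target CPU is baked into every executable's header, and the loader refuses anything that doesn't match the machine it runs on.

```python
# Minimal sketch (assumption: Linux/ELF binaries; my illustration, not the
# poster's). A native binary records its target architecture in the ELF
# header's e_machine field; the OS loader rejects a mismatch.
import struct

# A few e_machine values from the ELF specification
MACHINES = {0x03: "x86", 0x28: "ARM (32-bit)", 0x3E: "x86-64", 0xB7: "AArch64"}

def elf_target_arch(path):
    with open(path, "rb") as f:
        header = f.read(20)           # ELF ident + type + machine fields
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF binary")
    # e_machine: 16-bit field at offset 18 (little-endian on typical systems)
    (machine,) = struct.unpack_from("<H", header, 18)
    return MACHINES.get(machine, hex(machine))

print(elf_target_arch("/bin/ls"))     # e.g. 'x86-64' on a typical desktop
```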
As far as I know, the full Windows 10 will not be available for ARM. Windows RT is a dead end. The Windows 10 you are talking about is only for mobile/tablet and has very little in common with the desktop Windows (besides its name).
Hopefully once these cards are all over the market we can get 100% confirmation so there will be no doubt left either way.
You'd think the transistor count would have to change if Grenada had GCN 1.2 or even 1.3. As far as we know, it did not. The fact that they recycled the device ID suggests that the internal features haven't changed at all.
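For anyone who wants to check the recycled-device-ID claim themselves, here is a rough sketch (mine, not the poster's, assuming Linux with sysfs) that lists AMD display adapters and their PCI device IDs. Hawaii XT (R9 290X) is commonly listed as vendor 0x1002, device 0x67B0; a "new" card reporting the same device ID is, at the silicon level, the same ASIC.

```python
# Rough sketch (assumption: Linux with sysfs mounted at /sys): list AMD
# display adapters with their PCI device IDs. A rebrand shows up as a
# "new" card reporting an old device ID.
from pathlib import Path

AMD_VENDOR_ID = "0x1002"

for dev in Path("/sys/bus/pci/devices").iterdir():
    vendor = (dev / "vendor").read_text().strip()
    pci_class = (dev / "class").read_text().strip()
    # 0x03xxxx is the PCI display-controller class
    if vendor == AMD_VENDOR_ID and pci_class.startswith("0x03"):
        device = (dev / "device").read_text().strip()
        print(f"{dev.name}: vendor={vendor} device={device}")
```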
Hawaii, as it turns out, was competitive in AIB custom cards, and competitive with GK110 even if it arrived 5-7 months tardy; when it came, Nvidia cut prices and kept above it with the Ti release. You need to consider that Nvidia can (and must) do R&D like crazy to support its lucrative corporate HPC contracts. For AMD, even being able to deliver a competitive part vying for Nvidia's ever-ramping enthusiast gaming segment is a huge undertaking and cost, given that AMD has essentially zero of the corporate HPC market.
Am I disappointed that AMD couldn't find the money/justification to re-spin Hawaii (and Pitcairn) into something newer? Most definitely. Do I admire AMD for building Fiji and innovating on HBM? Absolutely, but not at the cost of sacrificing the mainstream, their bread-and-butter market share. The more I pull from the information, the more it seems AMD's failings are not from a lack of technical innovation or expertise; it's the fact that upper management has not (for many years) sculpted the company well and has "screwed the pooch" in various ways. The decision that keeps us from seeing a true GCN 1.2 Grenada probably goes back more than a year, and while a re-spin wouldn't have been a great/huge improvement, some big shots saw it as an easy call... it wasn't providing enough change. I think some held hope (internally and externally) for GloFo to have something ready on 28SHP, but that direction appeared still blocked. Then in September 2014, with volumes/market share going into "full funk", the bean counters figured the only way to maintain an effective price was to keep the foot on the gas with the current stuff, aiming not to waste time/cash/engineering on a new layout and to forgo the cost of a new tape-out at TSMC. The problem I have is that if that was the direction, then a 390X should've been in the market last March-April, unless there's some big revelation tied to holding until E3. (I don't see one.)
There's also a kind of arms-race mentality: if we can't bring a halo product, we'll lose hearts and minds. Still, the idea of just chasing Nvidia is fleeting, especially if you're not shoring up the flanks: you're not enticing new gamers, while ignoring the mainstream leaves them feeling "uninhibited" about looking to the other side.
Though, to conclude, this mindset of "just go away, AMD, you complicate the market by being in it"... falls more into "be careful what you wish for" territory. Can't you just go about what you want and choose to ignore that AMD exists? The naysayers can't, or don't, because what else would they do... play games?
I can show my working (with examples) if this seems like a gross generalization.
It is the nature of mainstream forums, which are less MIT than Deadwood circa 1870, to over-simplify, to not look at the bigger picture, and to apportion blame according to popular opinion rather than seek out the correct information. The latter takes time, a wide range of source material that is generally not sexy enough (lack of pictures, lack of flamebait, a high degree of technical material requiring further reading in and of itself), and a continuing deep interest to constantly overwrite and supplement the information previously gleaned. That won't ever change, because unless you enjoy getting into the nuts and bolts of the industry, it will always be easier to parrot whatever some other guy says.
Having said that, the whole rebranding thing should be pretty easy to understand. AMD simply don't have the R&D resources to fight a three-front war (x86, GPU, RISC) without spreading themselves exceedingly thin, having to prioritize some projects and cancel others. The company, flush with success after K6 and K7, simply overreached. They decided that they could take on Intel and Nvidia in head-to-head battles simultaneously, sinking their first real profit into an overpriced ATI when it probably made more sense to expand their foundry business (and open it to third-party fabbing) and license ATI's graphics IP. AMD also lost sight of what bought the company its success, namely acquiring IP (principally from DEC) and appealing to a budget market that hadn't really existed under Intel's rule. Those two factors dissolved pretty quickly. K8 slipped, K10 was evolutionary rather than revolutionary, Bulldozer's timetable was a disaster, and the $2bn+ debt burden the company saddled itself with effectively meant their foundry business floundered, while R&D suffered as people left, whether because projects slipped (or were cancelled) or as part of cost-cutting measures. ATI, when AMD acquired it, was a basket case thanks to the slow gestation of R600 and Nvidia's near-flawless architectural (and marketing) execution of G80. Everything that has happened since is a product of the seeds sown in 2005-06, and arguably a little before, since AMD missed a prime opportunity for greater revenue by stalling on outsourcing chip production* to a third-party foundry when their own fabs were capacity-constrained.
Judging by popular opinion rather than fact, a certain faction of the forum membership seems to pin all AMD's woes on Intel, Nvidia, TSMC, and GlobalFoundries, seldom (if ever) looking at the decisions made by the company itself in the past that laid the foundations for the situation in the present. That is bound to continue as AMD outsource design (people seem to forget that the Synopsys deal also meant AMD offloading 150 R&D engineers, for example) at the expense of in-house bespoke R&D. No doubt the same people will continue to lay the bulk of the blame at the feet of third parties, rather than AMD's BoD and previous strategic planning efforts.
* AMD were allowed, under the Intel cross-licence agreement, to outsource up to 20% of their chip production to third-party foundry partners.
Couldn't be bothered deciphering that chart, or looking too deeply at it since the parts are known quantities, but the 390X numbers look pretty much ballpark for the 290X vs the 980. I'd assume the gap over the 970 denotes 4K benchmarks.
All the GCN version numbers were added by tech sites and consumers.
Realistically, all AMD could do after the R&D cuts was optimize the transistor layout without changing the architecture... they didn't gain much efficiency, but they did get a smaller die, and with that the chance to make the big Fiji chip on the same 28nm node.
Everyone is bitching about rebrands right now, but all things considered, AMD has done the best they could.
Too bad this can't be a marketing slogan: "Give us your money, we did the best we could!"
If you'd bothered to click on the links right below the article, you'd see that not only is the box shown, but the XFX rep is involved in the thread.
What is more odd is that you seem to be the only person in the Western world who is unaware that Best Buy is selling 300-series cards ahead of the official launch (that is where the people posting the pictures, benchmarks, and BIOS dumps bought their cards), and Legit Reviews actually sent people to check this out personally. "What a coincidence!" was my first reaction on attempting to decipher your appallingly bad grammar, but in the interest of keeping even people like yourself informed, I persevered in order to enlighten you on what a hyperlink at the bottom of an article signifies.
What an awesome first post, steven; you may now hold the TPU record for the longest string of characters making the least amount of sense. Your certificate will be mailed out.