Monday, February 9th 2015

Radeon R9 380X Based on "Grenada," a Refined "Hawaii"
AMD's upcoming Radeon R9 380X and R9 380 graphics cards, with which it wants to immediately address the GTX 980 and GTX 970, will be based on a "new" silicon codenamed "Grenada." Built on the 28 nm silicon fab process, Grenada will be a refined variant of "Hawaii," much in the same way as "Curacao" was of "Pitcairn," in the previous generation.
The Grenada silicon will have the same specs as Hawaii - 2,816 GCN stream processors, 176 TMUs, 64 ROPs, and a 512-bit wide GDDR5 memory interface, holding 4 GB of memory. Refinements in the silicon over Hawaii could allow AMD to increase clock speeds to outperform the GTX 980 and GTX 970. We don't expect the chip to be any more energy efficient at its final clocks than Hawaii; AMD's design focus appears to be performance. AMD could also save itself the embarrassment of a loud reference-design cooler by opening the chip up to quiet custom-design cooling solutions from AIB (add-in board) partners from day one.

In other news, the "Tonga" silicon, which made its debut with the performance-segment Radeon R9 285, could form the foundation of the Radeon R9 370 series, consisting of the R9 370X and the R9 370. Tonga physically features 2,048 stream processors based on the more advanced GCN 1.3 architecture, 128 TMUs, 32 ROPs, and a 384-bit wide GDDR5 memory interface. Both the R9 370 and R9 370X could feature 3 GB of memory as standard.
The only truly new silicon with the R9 300 series is "Fiji." This chip will be designed to drive AMD's high-end single- and dual-GPU graphics cards, and will be built to compete with NVIDIA's GM200 silicon and the GeForce GTX TITAN-X it will debut with. Fiji features 4,096 stream processors based on the GCN 1.3 architecture - double that of "Tonga" - 256 TMUs, 128 ROPs, and a 1024-bit wide HBM memory interface, offering 640 GB/s of memory bandwidth. 4 GB could be the standard memory amount. The three cards AMD will carve out of this silicon are the R9 390, the R9 390X, and the R9 390X2.
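As a rough sanity check on those bandwidth figures, here is a minimal back-of-the-envelope sketch; the 5 Gbps GDDR5 per-pin rate is an assumption for illustration (in line with Hawaii-based cards), not a figure from the report.

    def bandwidth_gbs(bus_width_bits, data_rate_gbps):
        # Peak memory bandwidth (GB/s) = bus width (bits) x per-pin data rate (Gbps) / 8
        return bus_width_bits * data_rate_gbps / 8.0

    # Hawaii/Grenada-style 512-bit GDDR5 bus at an assumed 5 Gbps per pin:
    print(bandwidth_gbs(512, 5.0))   # 320.0 GB/s

    # Working backwards from the quoted Fiji numbers: 640 GB/s over a 1024-bit
    # interface implies an effective per-pin rate of 640 * 8 / 1024 Gbps.
    print(640 * 8 / 1024)            # 5.0 Gbps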
Source:
3DCenter.org
156 Comments on Radeon R9 380X Based on "Grenada," a Refined "Hawaii"
That is not even close to appropriate.
With the physics I was speaking from experience as an Nvidia user (some of us just switch between brands and are not glued down).
And no, the way smoke reacts to you in PhysX-enabled Batman AA is not realistic at all, it becomes this palpable substance that almost rolls off Batman, that is just overdoing it massively.
Glowing flying orbs with every special attack in Warframe make no sense/are not realistic either, it's just shiny colored orbs (ooh so pretty right?).
Maaaaybe Mirror's Edge can be considered a visual improvement with it on, with the breaking glass and tearing cloth etc.
Although on that note, I am playing Splinter Cell 1 on the GameCube again and I'm mighty impressed with the way cloth reacts in that old old game, and that without PhysX.
I like what PhysX COULD do for us, but the way it is, exclusive to Nvidia, it's going nowhere, unless Nvidia were to borderline buy the development of an entire game so they can build it around PhysX from the start.
But that will not happen so it will be just a gimmicky addition and never what it should be.
Oh well, Havok 2.0 is coming still, maybe that will move some mountains.
AMD is a muuuuuch smaller company and does not nearly have the research resources Intel has.
It might be the only competition Intel has but calling it competition is pushing it.
Luckily the pricing makes up for that, making them all viable options though.
XBO and PS4 officially support PhysX, so developers can implement it in any game. Unfortunately those consoles have no power, which will make that rather difficult.
Totally changes the game?
Ermmm you mean some extra particles that bounce away when you shoot something, or some particle-based water flowing somewhere?
Because that does not change the game in any way shape or form.
It's exactly the same gimmicky nonsense that PhysX does in Warframe.
Hell, in that article they do not refer to the PhysX as "effects" for nothing; that's all it adds, some effects.
It adds nothing but some orbs flying around, while it could be the entire basis for how things are built up and react (ya know... physics), like those tech demos they show of it.
The fact that you can turn it off is pretty much the dead giveaway that it in fact does not "totally change the game", because a game that is built around that PhysX would not work without it.
You cannot turn Havok off in, for example, HL2, because the game would not function anymore if that were the case.
I played the game for the first time without PhysX. When I played it the second time, I was blown away. There are hundreds of those particles, if not thousands, in huge firefights. Also, a lot of debris actually stays on the ground and interacts with your shots and grenades.
Best PhysX implementation I have ever seen.
It does not change the gameplay, it changes the visuals. If you cannot appreciate it, then you must be really spoiled.
If the 380x or the 390x is very good then I might jump over to AMD
Mantle: Duh. It's an interface for their specific hardware. Do you expect Atheros to write drivers that work with Broadcom chips, too?
American Megatrends and Phoenix BIOSes are proprietary tech, and have been since Phoenix devised their BIOS and started charging $290,000 per vendor license and $25 per BIOS ROM chip in May 1984. "An interface for their specific hardware" is pretty much a definition of proprietary. How is this any different from expecting Nvidia to write engine code for AMD's driver and hardware? Because one thing is certain: AMD has no interest in using, nor supporting, PhysX.
2. I doubt nVidia ever offered the implementation details without requesting a bunch of money in exchange.
You think it's okay to complain that nVidia would have to license Mantle, but whine about AMD not having licensed PhysX in the same breath? :rolleyes: Seems you don't understand your own argument. It's hardware limited just like 4k over HDMI, 9k jumbo packets, and AVX. It requires DisplayPort 1.2a because that's where the specification exists to be implemented. A claim of it being "proprietary" is more like "My hardware is too old for this new stuff! Why can't I run this x64 AVX code on my Pentium 4?! I'm gonna whine about it!" nVidia supports DisplayPort; there's absolutely nothing stopping them from creating their own Adaptive-Sync driver-side implementation (and they could even call it Expensive$ync if they want to confuse people who don't understand that it is just an implementation of a damn standard just like FreeSync).
WRT FreeSync "certification", that's no different than any other certification. If you want to stick somebody else's brand name on your product, you've got to get permission from them. That's how trademarks work, silly.
When AMD announced FreeSync, they claimed some current monitors supported it with no new hardware. Well, that is AMD PR marketing for ya. In the end, FreeSync is a proprietary implementation of the standard. No matter how you cut it, it's still proprietary.
"AMD freesync tech is a unique AMD hardware/software"
^ Another way to say proprietary
Shit, GTA 4 has better physics than that game, and on a good system today it runs really well.