Monday, June 22nd 2015

AMD "Fiji" Block Diagram Revealed, Runs Cool and Quiet

AMD's upcoming flagship GPU silicon, codenamed "Fiji," breaks ground on new technologies such as HBM (high-bandwidth memory), stacked memory-on-package, and a specialized substrate layer called the interposer, which connects the GPU to that memory. More on the "Fiji" package and its memory implementation in our older article. Its block diagram (a manufacturer-drawn graphic showing the GPU's component hierarchy) reveals a scaling-up of the company's high-end GPU launches of the past few years.

"Fiji" retains the quad Shader Engine layout of "Hawaii," but packs 16 GCN Compute Units (CUs), per Shader Engine (compared to 11 CUs per engine on Hawaii). This works out to a stream processor count of 4,096. Fiji is expected to feature a newer version of the Graphics CoreNext architecture than "Hawaii." The TMU count is proportionately increased, to 256 (compared to 176 on "Hawaii"). AMD doesn't appear to have increased the ROP count, which is still at 64. The most significant change, however, is its 4096-bit HBM memory interface, compared to 512-bit GDDR5 on "Hawaii."
At its given clock speeds of up to 1050 MHz core and 500 MHz memory (512 GB/s bandwidth) on the upcoming Radeon R9 Fury X graphics card, "Fiji" offers a single-precision compute throughput of 8.6 TFLOP/s, greater than the 7 TFLOP/s rated for NVIDIA's GeForce GTX Titan X. The reference board may draw power from a pair of 8-pin PCIe power connectors, but don't let that scare you: its typical board power is rated at 275 W, just 25 W more than the GTX Titan X, for about 22% higher single-precision throughput (though the two companies may use different methods to arrive at those numbers).
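For those who want to check the math, the headline figures fall straight out of the specs above. A quick back-of-the-envelope sketch in Python (assuming the standard GCN figures of 64 stream processors per CU, and counting one fused multiply-add as two FLOPs per stream processor per clock):

  # Back-of-the-envelope check of the "Fiji" numbers quoted above.
  shader_engines = 4          # same quad Shader Engine layout as "Hawaii"
  cus_per_engine = 16         # up from 11 on "Hawaii"
  sps_per_cu = 64             # standard for GCN compute units
  stream_processors = shader_engines * cus_per_engine * sps_per_cu
  print(stream_processors)    # 4096

  core_clock_hz = 1050e6      # up to 1050 MHz on the R9 Fury X
  flops_per_clock = 2         # one fused multiply-add counts as two FLOPs
  tflops = stream_processors * flops_per_clock * core_clock_hz / 1e12
  print(round(tflops, 1))     # 8.6 (TFLOP/s, single-precision)

  bus_width_bits = 4096       # HBM memory interface
  memory_clock_hz = 500e6     # 500 MHz, double data rate
  bandwidth_gb_s = bus_width_bits / 8 * memory_clock_hz * 2 / 1e9
  print(round(bandwidth_gb_s))  # 512 (GB/s)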

AMD claims that the reference cooling solution will pay heavy dividends in temperatures and noise. In a typical gaming scenario, temperatures will hover around 50°C, with noise output under 32 dB. To put these into perspective, the reference NVIDIA GeForce GTX Titan X sees its load temperatures reach 84°C, with its fan putting out 45 dB, in our testing. The cooling solution is confirmed to feature a Nidec Servo-made 120 mm fan. As with all AMD flagship graphics cards of the past few generations, the Radeon R9 Fury X will feature dual BIOS and ZeroCore (which powers down the GPU when the display-head is idle, and completely powers down non-primary GPUs in CrossFire setups until 3D loads warrant the driver powering them back up).

The Radeon R9 Fury X will be priced at US $649.99, and should be generally available within the next three weeks or so.
Source: Hispazone

73 Comments on AMD "Fiji" Block Diagram Revealed, Runs Cool and Quiet

#26
Amgal
Love it.

"Cool and quiet"

...when there's a CLC waterblock attached to it.
#27
moproblems99
repman244And? My point still stands that the Titan could be using less power even if it runs at higher temperature.
Your point is true, but I was just adding to it. Not to mention, a cooler-running chip should be more stable, last longer, and not throttle like the Titan X or 290X.
#28
Evildead666
chinmiStill too expensive, slow and bad driver support... Better buy a 980ti for that amount of money.
Nvidia still wins
And next year when nvidia start using HBM memory amd will surely die, because it can't keep up with the advancement in technology that nvidia r&d team have.
Still, this is a good card... But not as good as the 980ti.
What driver problems is your Fury/X experiencing?
I'm sure we'd all like to know.

~36 hrs left and the reviews will tell all...
#29
KarymidoN
Cheaper than Titan X... let's see if it's faster...
*grabs popcorn*
#30
Casecutter
chinmibecause it can't keep up with the advancement in technology
Really? AMD appears they could this time... on a shoestring... but yeah, maybe that might still come to fruition sometime in Q1 of, what, 2017...
#31
semantics
I still wonder if water cooling was the only way around HBM heat management. Not saying the card runs extra hot, just that the chip is more heat-sensitive because of HBM, and this was a slapped-together solution to the problem.
#32
$ReaPeR$
this looks awesome!!!! can't wait for the reviews!!!
#33
GreiverBlade
P4-630Wow, that's a very comfortable operating temperature while gaming!
just like my 290 ... rarely seen above 48° :D (ok ok ... not the same level of water cooling tho ...)
#34
ensabrenoir
ZoneDymoShould the title not be "AMD Does not trust its own hardware and has to desperately resort to watercooling otherwise their heater might melt" ?
.......naaaah: AMD does not trust its customers' water cooling skills ........so they do it for them!!
#35
RejZoR
AmgalLove it.

"Cool and quiet"

...when there's a CLC waterblock attached to it.
Well, the fact is, it'll be cooler and quieter. So it's not a lie no matter how you turn it. Fiji is not an R9 290X, mind you. And FYI, the Titan X is not exactly a quiet card either. So why not just give the card an AIO and solve two problems entirely, while giving users extra headroom for overclocking? I've been using an AIO on my CPU for like 2 years now and I wouldn't trade it for any air cooler. If I had a decently sized case (but I love my cute miniATX), I'd be using an AIO on my graphics card already. Fury might even become a reason to invest time into modding my case for it...
#36
GreiverBlade
ensabrenoir.......naaaah: AMD does not trust its customers' water cooling skills ........so they do it for them!!
say that to my 290 ... :cry: (that was my 1st water cooling attempt ... )

I wonder if I can fit 2 AIO rads in my Air 540 ... in the front, in place of the Phobya G-Changer 240 V2 (60 mm)

pfah! I'd prefer 2 Fury X or Nano with a custom block and a single-slot shield, so I can reuse my loop and put my 290 and the Kryographics block + backplate to rest on my shelf as "the best card ever for 2 years and still going strong" (or just "the best bang-for-buck card")
#37
FrustratedGarrett
ZoneDymoShould the title not be "AMD Does not trust its own hardware and has to desperately resort to watercooling otherwise their heater might melt" ?
Well, water-cooling in this instance has nothing to do with the TDP of the chip. Water cooling is used to cool the memory chips which have a way smaller footprint than GDDR5 and thus the rate at which heat needs to be transfered out of these chips has to be a lot higher than what it was with the previous GDDR5 chips.

As for the claim that AMD doesn't trust their own CPUs, which sounds true, I don't think you're getting the whole point. Under a good multi-core-aware graphics API, their 8-core FX CPUs do better than Intel's 4-core CPUs. DX11 is not that... and all we have now are DX11 games, so they had to use their competitor's CPUs to drive two Fury X graphics cards in their custom Fury system.

Whether you're a shill, a troll, or just misinformed, I hope I have taught you something, son :)
#38
Aquinus
Resident Wat-man
FrustratedGarrettWell, water-cooling in this instance has nothing to do with the TDP of the chip. Water cooling is used to cool the memory chips, which have a much smaller footprint than GDDR5, so the rate at which heat needs to be transferred out of them has to be a lot higher than it was with the previous GDDR5 chips.

As for the claim that AMD doesn't trust their own CPUs, which sounds true, I don't think you're getting the whole point. Under a good multi-core-aware graphics API, their 8-core FX CPUs do better than Intel's 4-core CPUs. DX11 is not that... and all we have now are DX11 games, so they had to use their competitor's CPUs to drive two Fury X graphics cards in their custom Fury system.

Whether you're a shill, a troll, or just misinformed, I hope I have taught you something, son :)
There is some logic to what he's saying. The general trend for many years now has been to throw power consumption under the bus for the sake of performance. There is no reason to believe that isn't still the case as AMD has not demonstrated anything to the contrary in quite some time. However, I do have to say that ZoneDymo needs to tone down the bias a bit.

Side note: Cool 'n Quiet? What the hell is this, a CPU? I think the Athlon 64 in my attic wants its technology back. :)
#39
FrustratedGarrett
AquinusThere is some logic to what he's saying. The general trend for many years now has been to throw power consumption under the bus for the sake of performance. There is no reason to believe that isn't still the case as AMD has not demonstrated anything to the contrary in quite some time. However, I do have to say that ZoneDymo needs to tone down the bias a bit.

Side note: Cool 'n Quiet? What the hell is this, a CPU? I think the Athlon 64 in my attic wants its technology back. :)
I did not find his post consistent, to say the least. Nvidia obviously spent some time after "Kepler" (which is a marketing term, BTW) ironing out their chip fabrication to reduce power leakage, and the result was "Maxwell," another marketing term. Anyhow... an extra $15 on a whole year's power bill isn't much to talk about.

AMD were more efficient in their 3000, 4000, 5000 and 6000 series. They were slightly less efficient with their 7000 series, but now they've caught up and seem to have done quite a good job with Fiji.
#40
Aquinus
Resident Wat-man
FrustratedGarrettAMD were more efficient in their 3000, 4000, 5000 and 6000 series. They were slightly less efficient with their 7000 series
Hardly. I would say GCN helped in that respect. The problem is that AMD kept pushing it harder, so the power savings simply got turned into performance while maintaining the same power envelope. Also, for 1080p, I think there is a clear winner on which camp is more efficient with its power. With respect to Fury's power consumption, that has yet to be seen. We don't know how power hungry it is because there is still no info on the GPU's performance. We should know tomorrow though. :) Good thing too, I want to replace these 6870s.

I just took the 390X review because it was right up front, but it's another great example of how AMD would rather suck down the amps, which is why I'm not overly optimistic that there will be anything groundbreaking. Also consider the AIO water cooler: that's a sign that the GPU's TDP might be a bit on the high side, which would be consistent with what AMD has been doing.
#41
RejZoR
Fiji is not the same as Hawaii. That's like saying Maxwell is the same as Kepler...
#42
FrustratedGarrett
Don't give these performance/watt charts when a couple of GameWorks/Nvidia-poisoned games can swing things to their side by a great deal. Here's a chart of load power consumption instead. Keep in mind that, on average, your graphics card is under full load less than 20% of the day.
If you look at the chart, you'll see that the Titan X consumes about the same as the 290X while being ~40% faster.

I personally wouldn't care about 40% more power under load for the same performance if the product is considerably cheaper. Again, an extra $10-$15 in power bills over a one-year period isn't much to talk about.
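For what it's worth, that ballpark roughly checks out. A minimal sketch of the arithmetic, assuming a hypothetical 100 W load-power gap between two cards, about three hours of full load per day, and a $0.12/kWh electricity rate (all assumed figures, not measurements):

  # Rough annual electricity cost of a hypothetical 100 W load-power gap.
  # All inputs are assumptions for illustration, not measured values.
  extra_watts = 100           # assumed load-power difference between two cards
  load_hours_per_day = 3      # assumed gaming time, well under 20% of the day
  rate_per_kwh = 0.12         # assumed electricity rate, $/kWh

  extra_kwh_per_year = extra_watts / 1000 * load_hours_per_day * 365
  print(round(extra_kwh_per_year * rate_per_kwh, 2))  # 13.14 (dollars/year)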
#43
FrustratedGarrett
RejZoRFiji is not the same as Hawaii. That's like saying Maxwell is the same as Kepler...
Those are marketing terms. Fiji is a GCN chip with optimizations here and there to improve performance where it matters. Same goes for Nvidia's chips.
#44
RejZoR
What's more important is idle power consumption, something AMD doesn't have in check for multi-monitor setups. I don't care since I have just one monitor, but still...
#45
Aquinus
Resident Wat-man
RejZoRWhat's more important is idle power consumption, something AMD doesn't have in check for multi-monitor setups. I don't care since I have just one monitor, but still...
...but I like my 200 watt idle power consumption. :laugh:
How much of that do you think is my 6870s? :p Actually, #2 is in ULV, so I suspect most of it is the primary, which is the non-reference one.
FrustratedGarrettDon't give these performance/watt charts when a couple of GameWorks/Nvidia-poisoned games can swing things to their side by a great deal. Here's a chart of load power consumption instead. Keep in mind that, on average, your graphics card is under full load less than 20% of the day.
If you look at the chart, you'll see that the Titan X consumes about the same as the 290X while being ~40% faster.

I personally wouldn't care about 40% more power under load for the same performance if the product is considerably cheaper. Again, an extra $10-$15 in power bills over a one-year period isn't much to talk about.
You see, the problem with that is the way Hilbert at Guru3D calculates power consumption versus how W1zz actually measures it. Hilbert does some math to approximate the GPU's power usage. If you actually read his reviews, he explains it (a quick sketch of that arithmetic follows the quote):
HilbertLet's have a look at how much power draw we measure with this graphics card installed. The methodology: We have a device constantly monitoring the power draw from the PC. We simply stress the GPU, not the processor. The before and after wattage will tell us roughly how much power a graphics card is consuming under load. Our test system is based on an eight-core Intel Core i7-5960X Extreme Edition setup on the X99 chipset platform. This setup is clocked to 4.40 GHz on all CPU cores. Next to that we have energy saving functions disabled for this motherboard and processor (to ensure consistent benchmark results). We'll be calculating the GPU power consumption here, not the total PC power consumption.

Power consumption Radeon R9-390X
  1. System in IDLE = 94W
  2. System Wattage with GPU in FULL Stress = 342W
  3. Difference (GPU load) = 248W
  4. Add average IDLE wattage ~10W
  5. Subjective obtained GPU power consumption = ~ 258 Watts
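The arithmetic behind that estimate is simple subtraction on wall-socket readings; here is a minimal sketch of it in Python, using the numbers quoted above:

  # Guru3D-style delta estimate: wall power before vs. during GPU stress.
  # The numbers are the ones quoted in the excerpt above.
  system_idle_watts = 94      # whole system idle, measured at the wall
  system_stress_watts = 342   # whole system with only the GPU stressed
  gpu_idle_watts = 10         # average idle draw added back in

  gpu_load_watts = system_stress_watts - system_idle_watts + gpu_idle_watts
  print(gpu_load_watts)       # 258, the "subjective obtained" figure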
Then you have @W1zzard, who has gone through painstaking work to figure out the actual power draw of the GPU from the PCI-E slot and PCI-E power connectors, as described in his reviews:
W1zzardFor this test, we measure the power consumption of only the graphics card via the PCI-Express power connector(s) and PCI-Express bus slot. A Keithley Integra 2700 digital multimeter with 6.5-digit resolution is used for all measurements. Again, the values here only reflect the card's power consumption as measured at its DC inputs, not that of the whole system.

We use Metro: Last Light as a standard test representing typical 3D gaming usage because it offers the following: very high power draw; high repeatability; is a current game that is supported on all cards; drivers are actively tested and optimized for it; supports all multi-GPU configurations; test runs in a relatively short time and renders a non-static scene with variable complexity.

Our results are based on the following tests:
  • Idle: Windows 7 Aero sitting at the desktop (1920x1080) with all windows closed and drivers installed. Card left to warm up in idle mode until power draw was stable.
  • Multi-monitor: Two monitors connected to the tested card, both using different display timings. Windows 7 Aero sitting at the desktop (1920x1080+1280x1024) with all windows closed and drivers installed. Card left to warm up in idle mode until power draw was stable. When using two identical monitors with same timings and resolution, power consumption will be lower. Our test represents the usage model of many productivity users, who have one big screen and a small monitor on the side.
  • Blu-ray Playback: Power DVD 9 Ultra is used at a resolution of 1920x1080 to play back the Batman: The Dark Knight Blu-ray disc with GPU acceleration turned on. Measurements start around timecode 1:19, which has the highest data rates on the BD with up to 40 Mb/s. Playback keeps running until power draw converges to a stable value.
  • Average: Metro: Last Light at 1920x1080, representing a typical gaming power draw. Average of all readings (12 per second) while the benchmark was rendering (no title/loading screen). In order to heat up the card, we run the benchmark once without measuring power consumption.
  • Peak: Metro: Last Light at 1920x1080, representing a typical gaming power draw. Highest single reading during the test.
  • Maximum: Furmark Stability Test at 1280x1024, 0xAA. This results in a very high no-game power-consumption that can typically be reached only with stress-testing applications. We report the highest single reading after a short startup period. Initial bursts during startup are not included, as they are too short to be relevant.
Power consumption results of other cards on this page are measurements of the respective reference design.
In all candor, who do you think has gone through more work to answer this question? Hilbert used a Kill-A-Watt and did some math; W1zz used meters on the GPU itself. I think some credit goes to W1zz for going to such lengths to give us detailed information, for those of us (sp. most of us) who don't have the hardware to get a real number rather than just doing some math with a Kill-A-Watt, and I think the amount of data W1zz provides speaks to that in detail.

Lastly, Hilbert doesn't even describe what kind of test he uses to figure out power consumption. All he says is that he "stresses the GPU to 100%." All things considered, that's a bit shadier than W1zz giving us the full lowdown.
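To make the contrast concrete, the direct method boils down to averaging many DC-side samples taken while the benchmark renders, rather than inferring anything from wall power. A minimal sketch, with made-up readings standing in for the multimeter's output:

  # Direct-measurement style: average and peak over per-sample DC readings
  # (the methodology above takes 12 readings per second during the benchmark).
  # The sample values below are invented for illustration.
  readings_watts = [271.8, 269.3, 274.1, 280.6, 266.9, 275.2]

  average_watts = sum(readings_watts) / len(readings_watts)
  peak_watts = max(readings_watts)
  print(round(average_watts, 1))  # "Average": typical gaming draw
  print(peak_watts)               # "Peak": highest single reading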
#46
fullinfusion
Vanguard Beta Tester
MxPhenom 216I would take that cooling right off and put a full coverage block on it right away. Water cooling the way it's meant to be done.
That comment made me kinda chuckle, well almost, but whatever :rolleyes:

Now, without you going apeshit on me, let me ask you why you think it needs a full-coverage block. The memory is right beside the GPU, so why full coverage? Ahh, but it is full coverage... just not the coverage that's meant to be, see what I just did there :rockout: Does the water cooler cover just the GPU and not the 4 memory modules? :wtf: ... I'm about 99.99999% sure the block covers the entire GPU and memory.

Tbh, I can't wait 3 more weeks to get my new card... I want it now. Gaming Evolved, see what I did there :laugh:

Ok, fun times over :peace:
#47
arbiter
ChaitanyaIt has been a long time since I read GPU temps this low, and this quiet, for a stock card. I must say it's a job well done by AMD engineers for at least making the card cool and quiet under load.
Any GPU can be made to run cool when a water cooler is applied. Shoot, a GTX 980 with a water cooler on it topped out at 43°C while clocked at 1600 MHz.
Tetsudo77AMD developed this card with many times less budget than Nvidia, mind you. Technically speaking, GCN is superior to anything Nvidia can develop. AMD drivers tend to increase their cards' performance over time while NV drivers don't.
This comes from an owner of a GTX 970.
AMD drivers increase performance while NV drivers don't? That comment is so full of stupidity I don't even know where to start or how to. As for GCN being more advanced: most of their cards didn't even support some of their own new tech, FreeSync for example.
#49
MxPhenom 216
ASIC Engineer
fullinfusionThat comment made me kinda chuckle, well almost, but whatever :rolleyes:

Now, without you going apeshit on me, let me ask you why you think it needs a full-coverage block. The memory is right beside the GPU, so why full coverage? Ahh, but it is full coverage... just not the coverage that's meant to be, see what I just did there :rockout: Does the water cooler cover just the GPU and not the 4 memory modules? :wtf: ... I'm about 99.99999% sure the block covers the entire GPU and memory.

Tbh, I can't wait 3 more weeks to get my new card... I want it now. Gaming Evolved, see what I did there :laugh:

Ok, fun times over :peace:
It's not about need; water cooling, to me, is mostly about aesthetics (seeing how hardware doesn't run all that hot anymore to warrant it, no one needs it), and I find AIOs, unless it's the Swiftech kits, rather ugly. If I already have a water cooling loop, why would I want a part that just adds more clutter in terms of tubes, and another radiator to mount, when I can just strip off the stock cooling and throw a nice full waterblock on it to integrate into my already awesome water cooling setup?

Also, I am more worried about the GPU + VRM temps than the memory (maybe HBM is different, but I know the VRMs on my 780 run hotter than the memory), which is where full cover comes into play.

Also, temps will be even better in a real water cooling loop. My 360 and 240 rad setup laughs at that dinky 120 mm.
#50
okidna
MxPhenom 216W1zzard has probably had the card for the last few weeks. Watching and laughing as we all speculate.
W1zzardI don't have a 390 non-X yet from anyone. MSI sent a 380 and 370 too, but these are lower priority.

In the queue: EVGA 980 Ti SC+, PowerColor 390X PCS+, Zotac 980 Ti AMP!, and Fury X when I get one
Jun 20, 2015 at 12:41 AM