Technical question based on statement:
If AMD designed it to work with a water AIO from the start (and not air cooling), does that mean the chip produces a lot of heat? On such a small PCB, would this create heat-dissipation problems?
I think that about sums it up. The big problem, it seems, is that the HBM stacks sit in close proximity to the GPU (and under the same heatsink), while the PCB itself acts as a heatsink for the VRMs. As someone else noted, the high localised temperature is the same scenario, albeit not as drastic, as found on the 295X2: heat from all sides, minimal internal airflow, and reliance on a cold plate on one side to heatsink the entire card. The heat buildup is, by my reckoning, largely behind the decision to voltage lock the GPU and clock lock the HBM at Hynix's default settings. We may never know unless PowerColor or some other vendor gets a full-cover waterblock version of the card out, plus sanction from AMD to relax the voltage lock (BTW: wasn't there a huge furore when Nvidia voltage locked their cards?).
I genuinely think Fury (non-X) is air cooled at slower clocks because it would struggle thermally.
I concur. AMD's GPUs have about the same power requirement as Nvidia's, but over the last few generations have had greater issues with thermal dissipation (maybe the higher transistor density?). The ideal situation would be (TECs aside) a heatsink fed directly by a vapour chamber, with the HBM stacks cooled by fan air and ramsinks, but I think the HBM stacks' proximity to the GPU makes that a tricky assembly job, as well as adding some unwelcome (for the vendors) expense from tighter machining tolerances.
I really don't think there is any headroom for air cooling, and even then, the water cooling isn't allowing higher clocks. It could be an HBM integration issue, with heat affecting performance? Who knows.
A bad memory IC on a GDDR5 card means the RMA involves removing and replacing that chip. An RMA'd Fury X probably means the whole package gets binned and the GPU salvaged for assembly into another interposer package by Amkor. Warranty returns mean bad PR in general, but on a small-volume halo product like the Fury X the physical cost could also be prohibitive.
But isn't that more a built-in of Nvidia GameWorks' ShadowWorks? Honestly, I personally find Very High looks very... unnatural.
Doesn't matter in the greater scheme of things. How many AMD users vehemently deride, ignore, and refuse to buy any Nvidia-sponsored title? Yet the titles still arrive, and if they are AAA titles, they sell. If they sell, tech sites are bound, by common sense and page views if nothing else, to run performance reviews of them (and, if they're popular enough, include them in their benchmark suites). Benchmarking involves the highest image quality at playable settings for the most part, and the highest in-game image quality in general.
Now, bearing that in mind, and seeing the Dying Light numbers, what do you suppose Nvidia are likely to shoehorn into any of their upcoming sponsored games for maximum quality settings?
Medium is about the upper limit where you get "defined but softened" shadows that are more true to life. Even the differences between Low and Medium are hardly noticeable.
In practical terms, no, it doesn't. In marketing terms? Well, that's an entirely different matter. DiRT Showdown's advanced lighting enabled lens flares to warm the heart of J.J. Abrams. Minimal advancement in gaming enjoyment (less, if overdone lens flare is a distraction), yet it became the default testing scenario.
There have been comparisons to the Gigabyte GeForce GTX 980 Ti G1 review W1zzard did prior to this. Looking at that on the subject of power under gaming, it's revealing.
The G1's average and peak draw were 23% higher than the reference 980 Ti's: Avg 211 vs. 259W; Peak 238 vs. 293W.
The Fury numbers... Avg 246W; Peak: 280W
One is a non-reference card sporting a very high 20% overclock out of the box; the other is a reference card. A 20% overclock equates to 23% more power usage (and 27% higher performance than the Fury X at 4K). What kind of wattage do you suppose a Fury X would consume at 1260 MHz? Now, that observation aside, what the hell does your long-winded power usage diatribe have to do with my quote that you used to launch into it?
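For what it's worth, that rhetorical question can be sketched as a quick back-of-envelope. This assumes, purely for illustration, that a hypothetical 20% Fury X overclock (1050 MHz stock to 1260 MHz) would cost the same ~23% extra power the G1 showed over the reference 980 Ti; that linear carry-over is my assumption, not a measured figure.

```python
# Back-of-envelope from the G1 vs. reference 980 Ti numbers above.
ref_avg, g1_avg = 211, 259      # watts, average gaming draw
ref_peak, g1_peak = 238, 293    # watts, peak gaming draw

avg_scale = g1_avg / ref_avg    # ~1.23 for the G1's 20% overclock
peak_scale = g1_peak / ref_peak # ~1.23

# Hypothetical: apply the same scaling to the Fury X's measured draw.
fury_avg, fury_peak = 246, 280
print(round(fury_avg * avg_scale))    # ~302 W average
print(round(fury_peak * peak_scale))  # ~345 W peak
```

Rough as it is, it suggests a 1260 MHz Fury X would be pushing past 300 W average, which rather underlines the thermal-headroom point made earlier in the thread.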