# Radeon R9 Fury X Faster Than GeForce GTX 980 Ti at 4K: AMD



## btarunr (Jun 19, 2015)

AMD, in its press documents, claimed that its upcoming flagship single-GPU graphics card, the Radeon R9 Fury X, will be faster than NVIDIA's recently launched GeForce GTX 980 Ti at 4K Ultra HD resolution. This puts to rest speculation that its 4 GB of video memory hampers performance against its competitor with 6 GB of memory. According to the graph below, extracted from AMD's press material, the R9 Fury X is faster than the GTX 980 Ti even in the most memory-intensive games at 4K, including Far Cry 4, The Witcher 3: Wild Hunt, Crysis 3, Assassin's Creed Unity, and Battlefield 4. Bigger gains are shown in other games. In every game tested, the R9 Fury X offers frame rates of at least 35 fps. The Radeon R9 Fury X will launch at $649.99 (the same price as the GTX 980 Ti) next week, with market availability within the following three weeks.





*View at TechPowerUp Main Site*


----------



## Xzibit (Jun 19, 2015)

Popcorn
Chair
Let the show begin


----------



## jigar2speed (Jun 19, 2015)

If this is coming from AMD itself, I am highly doubtful about the Fury X beating the GTX 980 Ti.


----------



## dj-electric (Jun 19, 2015)

How about we all keep it quiet until next week, eh?


----------



## bobbavet (Jun 19, 2015)

Why did the product slide produced earlier by AMD state 54 fps for Crysis, yet this graph shows around 45 fps?


----------



## Aceman.au (Jun 19, 2015)

So it doesn't even compete with the Titan X? What....


----------



## RejZoR (Jun 19, 2015)

I think that might be a job for *AMD Radeon Fury Maxx*


----------



## Basard (Jun 19, 2015)

bobbavet said:


> Why did the product slide produced earlier by AMD state 54 fps for Crysis, yet this graph shows around 45 fps?



Crysis 1 vs 3 maybe? I dunno, don't remember the slide...  Besides, this slide is more like 35 fps.


----------



## xfia (Jun 19, 2015)

bobbavet said:


> Why did the product slide produced earlier by AMD state 54 fps for Crysis, yet this graph shows around 45 fps?


Just higher settings. Remember, DX12 games will have far better rendering efficiency, so games will look better and perform better; that could very well be around what Sleeping Dogs shows.


----------



## nem (Jun 19, 2015)

FURY vs TI


----------



## zsolt_93 (Jun 19, 2015)

The only game where it seems substantially better is Sleeping Dogs, and that is like 10 FPS in a pretty old game. Not sure if that one is DX11. It is always nice to have competition, but this seems too little for the disadvantages you get with the Fury X. Maybe next gen, with 8 GB of HBM2 and air cooling across the whole lineup (letting the customer decide if they want to go water), even if it's just a die shrink of this, will be of more value. I hope the Nano comes out okay and is at least at the level of the 980, so there is something new for the more budget-oriented people to buy apart from the obvious rebrands that are filling the gap. I cannot understand why they just couldn't cut the chip down and release more cut-down versions instead of rebranding this whole lineup, but as I think it through, there are 3 SKUs based on Fiji, which is pretty much the same number a Maxwell chip has; it is just the naming that skewed the whole lineup.


----------



## ensabrenoir (Jun 19, 2015)

Until some actual third-party reviews arrive...


----------



## xfia (Jun 19, 2015)

zsolt_93 said:


> The only game where it seems substantially better is Sleeping Dogs, and that is like 10 FPS in a pretty old game. Not sure if that one is DX11. It is always nice to have competition, but this seems too little for the disadvantages you get with the Fury X. Maybe next gen, with 8 GB of HBM2 and air cooling across the whole lineup (letting the customer decide if they want to go water), even if it's just a die shrink of this, will be of more value. I hope the Nano comes out okay and is at least at the level of the 980, so there is something new for the more budget-oriented people to buy apart from the obvious rebrands that are filling the gap. I cannot understand why they just couldn't cut the chip down and release more cut-down versions instead of rebranding this whole lineup, but as I think it through, there are 3 SKUs based on Fiji, which is pretty much the same number a Maxwell chip has; it is just the naming that skewed the whole lineup.


It's actually more like it will be at least that much. A Microsoft engineer mentioned how GCN is very well optimized for lighting effects, and performance can increase 30 percent.


----------



## ZoneDymo (Jun 19, 2015)

Basard said:


> Crysis 1 vs 3 maybe? I dunno, don't remember the slide...  Besides, this slide is more like 35 fps.



He meant to type Far Cry 4, and yeah, that shows 45 fps here, but as shown earlier:





They claimed 54 fps, quite a big difference.


----------



## xfia (Jun 19, 2015)

ZoneDymo said:


> He meant to type Far Cry 4, and yeah that shows 45 fps  here but as shown earlier:
> 
> 
> 
> ...


They don't really consider AA necessary at 4K, especially with one GPU, since it's Ultra HD and most people will never see the pixels unless they are very close, like VR or very large displays.


----------



## sakai4eva (Jun 19, 2015)

xfia said:


> They don't really consider AA necessary at 4K, especially with one GPU, since it's Ultra HD and most people will never see the pixels unless they are very close, like VR or very large displays.


Yeah... What's the point of AA when you have massive pixel density?
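For a rough sense of the density being argued about, here is a quick pixels-per-inch calculation (my own illustration, not from the thread; the 28-inch panel size is just an assumed example):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a display of the given resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixel count along the diagonal
    return diagonal_px / diagonal_in

# A hypothetical 28" 4K monitor vs. a 28" 1080p one
print(round(ppi(3840, 2160, 28)))  # ~157 PPI
print(round(ppi(1920, 1080, 28)))  # ~79 PPI
```

At roughly double the pixel density of 1080p on the same panel size, aliased edges are much harder to resolve at normal viewing distances, which is the point being made here.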


----------



## revanchrist (Jun 19, 2015)

I think AMD has a winner here with the Fury X, since it beats the 980 Ti even in those NVIDIA GameWorks titles. Even if you think AMD might be exaggerating a bit in these slides for marketing purposes or whatever, a tie in those GameWorks titles is still a job very well done.


----------



## the54thvoid (Jun 19, 2015)

Unfortunately there is an old Norse saying, "Press decks for products are like axes in a window - sharp and shiny as they are, you can only trust them when rending limbs from bone"

In other words, 5 more days.  But I have no doubt it'll be fast, probably faster, stock versus stock but I doubt in all games.  If it actually is faster then that's good because it means they've conquered Nvidia's Gamehurts.
But, remember the Norsemen.


----------



## m6tzg6r (Jun 19, 2015)

So it's faster in 4K? Well, the 13 people who game in 4K are probably happy to hear that.


----------



## sakai4eva (Jun 19, 2015)

m6tzg6r said:


> So its faster in 4K? Well the 13 people who game in 4K are probably happy to hear that.


Well, for someone like me who doesn't game in 4K but would like to in the future, getting a card that is capable of doing that now makes it possible to do incremental upgrades that don't break the bank every once in a while.

I'm not excited, but I have a small warm, fuzzy feeling that says "Go AMD, go!"


----------



## Haytch (Jun 19, 2015)

m6tzg6r said:


> So its faster in 4K? Well the 13 people who game in 4K are probably happy to hear that.


I am happy to hear that; I'll be happier when I see it!


----------



## xfia (Jun 19, 2015)

Haytch said:


> I am happy to hear that, ill be happier when i see that!


When things get memory intensive it will take over, and that actually happens a lot at 4K with GDDR5; higher core boost clocks matter less above 1080p because of it. Maybe not so much with a beast 512-bit bus, 8 GB of VRAM, and compression.
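The bus-width point is easy to put in numbers. A sketch of the standard peak-bandwidth arithmetic (bus width in bytes times effective data rate); the clocks used are the publicly quoted figures for a 512-bit GDDR5 card like the 390X and for the Fury X's HBM:

```python
def peak_bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_bits / 8) * data_rate_gtps

# 390X-style GDDR5: 512-bit bus at 6 Gbps effective
print(peak_bandwidth_gbs(512, 6.0))   # 384.0 GB/s

# Fury X HBM1: 4096-bit aggregate bus at 1 Gbps effective
print(peak_bandwidth_gbs(4096, 1.0))  # 512.0 GB/s
```

The wide-but-slow HBM layout is how the Fury X gets more bandwidth than a "beast" 512-bit GDDR5 bus despite a much lower memory clock.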


----------



## mirakul (Jun 19, 2015)

zsolt_93 said:


> The only game where it seems substantially better is Sleeping Dogs, and that is like 10 FPS in a pretty old game. Not sure if that one is DX11. It is always nice to have competition, but this seems too little for the disadvantages you get with the Fury X. Maybe next gen, with 8 GB of HBM2 and air cooling across the whole lineup (letting the customer decide if they want to go water), even if it's just a die shrink of this, will be of more value. I hope the Nano comes out okay and is at least at the level of the 980, so there is something new for the more budget-oriented people to buy apart from the obvious rebrands that are filling the gap. I cannot understand why they just couldn't cut the chip down and release more cut-down versions instead of rebranding this whole lineup, but as I think it through, there are 3 SKUs based on Fiji, which is pretty much the same number a Maxwell chip has; it is just the naming that skewed the whole lineup.



I dare say that Sleeping Dogs used more DirectX 11 tech than some GameWorks titles from Ubi$oft. It's a beautiful and fun game, though not overhyped like the junk from the Ubi$oft-NVIDIA combo.


----------



## GreiverBlade (Jun 19, 2015)

Aceman.au said:


> So it doesn't even compete with the Titan X? What....


Why should they compete with a card that is not even a gaming card (but that people buy as a gaming card), that is 4% faster (averaged) at 2160p than a 980 Ti, and costs $450 more?


Now, if the Fury X is effectively faster at 4K and the Titan X is only marginally faster than a 980 Ti, then the Fury X will be in line with the Titan X (case-by-case scenario) and cost less, since it's price-aligned to the 980 Ti.

Yep, the Fury line turned out quite good in the end (albeit with 4 GB... but it's like NVIDIA with their magic "we don't need a 512-bit memory bus": after all, a 256-bit card was on par with or slightly beating a 512-bit card, despite having 3.5 GB, but that's another story...).

Waiting for the reviews, and ESPECIALLY the Nano.


----------



## ZoneDymo (Jun 19, 2015)

xfia said:


> They don't really consider AA necessary at 4K, especially with one GPU, since it's Ultra HD and most people will never see the pixels unless they are very close, like VR or very large displays.



Why bring up AA? Neither slide says anything about it, just "Ultra settings".
Unless you are thinking AA is not used in AMD's slide but is used in the one in this article. Could be; it could also be that AMD was rather generous with their earlier claims.


----------



## xfia (Jun 19, 2015)

ZoneDymo said:


> Why bring up AA? Neither slide says anything about it, just "Ultra settings".
> Unless you are thinking AA is not used in AMD's slide but is used in the one in this article, could be, could also be that AMD was rather generous with their earlier claims.


They pretty well explain the performance segments enough to understand what the *small print means; they honestly just lay it all out. They are more than kicking ass with the Fury cards, and the R9 Nano crushes NVIDIA in performance per watt, with 250% of the 290X's performance per watt while being more powerful. Excellent for lower-power and smaller form factor machines.


----------



## mirakul (Jun 19, 2015)

Don't know why you didn't include the setting


----------



## xfia (Jun 19, 2015)

mirakul said:


> Don't know why you didn't include the setting


Looks like the only goal was to keep it above 30-ish with those settings. That is fine if you want it, but I think most people really just like 60 fps. FreeSync helps you pick your own standard for frame rates while being smoother and not locked into a refresh rate. I like 50-60 fps (Hz).


----------



## HumanSmoke (Jun 19, 2015)

the54thvoid said:


> Unfortunately there is an old Norse saying, "Press decks for products are like axes in a window - sharp and shiny as they are, you can only trust them when rending limbs from bone"
> In other words, 5 more days.  But I have no doubt it'll be fast, probably faster, stock versus stock but I doubt in all games.  If it actually is faster then that's good because it means they've conquered Nvidia's Gamehurts.
> But, remember the Norsemen.


I'm hoping it is faster than the 980 Ti and Titan X. It might be the ONLY way to get a fully enabled GM 200 with balls-to-the-wall voltage control and high clocks at a reasonable price.


----------



## the54thvoid (Jun 19, 2015)

HumanSmoke said:


> I'm hoping it is faster than the 980 Ti and Titan X. It might be the ONLY way to get a fully enabled GM 200 with balls-to-the-wall voltage control and high clocks at a reasonable price.



Yeah.  The fabled 980ti 'Metal'?

It'll be a race to core meltdown at 500watts per card.  Fury X-tremely hot versus Titan X-orbitant.


----------



## Kaynar (Jun 19, 2015)

Unfortunately, right now I'm totally stuck with G-Sync (in a good and a bad way): since it's SO MUCH SUPERIOR to nothing, and apparently a little better than FreeSync, I am just forced to buy NVIDIA. So I just hope that NVIDIA's pricing on the 980 Ti will be something under 700 euros because of the Fury X competition.

But the above graphs are only for 4K; what about 1080p and 1440p? They are much less memory-intensive resolutions, and therefore the advantage of AMD's new memory might not be that important at those lower resolutions.


----------



## ZoneDymo (Jun 19, 2015)

xfia said:


> They pretty well explain the performance segments enough to understand what the *small print means; they honestly just lay it all out. They are more than kicking ass with the Fury cards, and the R9 Nano crushes NVIDIA in performance per watt, with 250% of the 290X's performance per watt while being more powerful. Excellent for lower-power and smaller form factor machines.



I'm assuming English is not your first language.
That is quite a hard-to-read response you posted, I'm afraid.
I'm guessing you mean to say that "Ultra settings*" means "not really Ultra", or Ultra-ish... but as mirakul's link pointed out, they are running it with SMAA in that test, which is post-processing AA that does pretty much nothing to hurt performance.

So the claimed 54 fps vs. the 45 fps the slide in this article shows still stands, and so does my previous statement that it seems AMD was rather generous with their earlier claims.

Also, to respond to:
"looks like the only goal was to keep it above 30ish with the settings.. it is fine if you want to but i think most people really just like 60fps. freesync helps you with picking your standard for frame rates while being smoother and not locked into a refresh rate. i like 50-60fps(hz)"

FreeSync is something you want to use alongside V-Sync, so yes, you are still locked in.
FreeSync alone can still introduce screen tearing when the frame rate goes past the refresh rate (which, I don't have to explain, is something you do not want).
FreeSync makes the experience of sub-optimal fps a smoother, better one.


----------



## ZoneDymo (Jun 19, 2015)

Kaynar said:


> Unfortunately right now i'm totally stuck with G-Sync (in a good and bad way) since its SO MUCH SUPERIOR to nothing, and apparently a little better than FreeSync, that I am just forced to buy nVidia. So I just hope that nVidia pricing on the 980Ti will be something under 700 Euros because of the Fury X competition.
> 
> But the above graphs are only for 4K, what about 1080p and 1440p? They are much less memory intensive resolutions and therefore the advantage of AMD's new memory might not be that important on those lower resolutions.



Idk man, from all I've seen it's equal to FreeSync visually; the only difference is that G-Sync costs 1 fps.


----------



## mirakul (Jun 19, 2015)

ZoneDymo said:


> idk man, from all I seen its equal to freesync in visual aid, the only difference is that G-sync costs 1 fps.


G-Sync costs $100 more in most cases.


----------



## ZoneDymo (Jun 19, 2015)

mirakul said:


> G-sync costs 100$ more in most case.



right right that as well


----------



## Kaynar (Jun 19, 2015)

mirakul said:


> G-sync costs 100$ more in most case.



Uhmmm, comparing similar models, I'd say a G-Sync screen is about $200 more than a FreeSync one.



ZoneDymo said:


> idk man, from all I seen its equal to freesync in visual aid, the only difference is that G-sync costs 1 fps.



Isn't FreeSync limited to around 30 fps min and 90 fps max, though?


----------



## xfia (Jun 19, 2015)

ZoneDymo said:


> I'm assuming English is not your first language.
> That is quite a hard to read reponds you posted I'm afraid.
> Im guessing you mean to say that "Ultra settings *" means "not really Ultra", or Ultra-ish... but as mirakul's link pointed out, they are running it with SMAA in that test, which is post processing AA which does pretty much nothing to hurt performance.
> 
> ...


What? No no no.
FreeSync is the GPU tech, not the standard; Adaptive-Sync is the standard, and you don't use V-Sync with it.
FreeSync has complete frame rate control for 4K, and up to 90-something fps (Hz) at 1440p, while syncing the refresh rate down to like 9 fps (Hz).
FreeSync is practically perfect... it's G-Sync, minus the 1 frame of latency.
Edit: yeah, those are the full specs, but displays are different.


----------



## ZoneDymo (Jun 19, 2015)

Kaynar said:


> uhmmm comparing similar models, i'd say a gsync screen is about 200$ more than a freesync.
> 
> 
> 
> Isn't freesync limited to around 30fps min and 90 fps max thought?



According to this: http://wccftech.com/amd-freesync-nvidia-gsync-verdict/#ixzz3dVAAHDs0

"Another difference between FreeSync and G-Sync is in the flexibility of the effective refresh rate range. G-Sync is capable of refresh rates that range from 30Hz to 144Hz while the FreeSync spec is capable of refresh rates that range from 9Hz to 240Hz"

According to AMDs website: http://support.amd.com/en-us/search/faq/222

"Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz."
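Those quoted ranges boil down to a simple containment check: the display reports a min/max refresh window, and variable refresh only tracks the GPU while the frame rate stays inside it. A toy sketch of that logic (the ranges are the ones quoted above from AMD's FAQ; the function name is mine):

```python
# Potential DisplayPort Adaptive-Sync ranges quoted in AMD's FAQ, as (min_hz, max_hz)
RANGES = [(36, 240), (21, 144), (17, 120), (9, 60)]

def in_vrr_window(fps: float, display_range: tuple) -> bool:
    """True if the display can match its refresh rate to this frame rate."""
    lo, hi = display_range
    return lo <= fps <= hi

# On a 21-144 Hz panel: 90 fps is tracked; 150 fps exceeds the window
# (so a cap or V-Sync is needed to avoid tearing); 15 fps falls below it.
print(in_vrr_window(90, (21, 144)))   # True
print(in_vrr_window(150, (21, 144)))  # False
print(in_vrr_window(15, (21, 144)))   # False
```

This is why the spec's range and a given monitor's usable range are two different things: the check runs against whatever window that particular panel reports.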


----------



## ZoneDymo (Jun 19, 2015)

xfia said:


> what? no no no
> freesync is the gpu tech not the standard but adaptive sync is and you dont use vsync with it
> freesync has complete frame rate control for 4k and up to like 90 something fps(hz) at 1440p while syncing the refresh rate down to like 9fps(hz)
> freesync is practically perfect.. its gsync with the 1 frame of latency.



Honestly, I'm beginning to think you barely understand what I'm writing.

1. FreeSync is GPU tech, yes; it's AMD's technology based on the Adaptive-Sync standard built into DisplayPort.
2. You DO use V-Sync with it, unless you want screen tearing (which you do not want).

3. Resolution (4K, 1440p, etc.) has nothing to do with it at all.
FreeSync has an enormous range, but the monitors supporting it so far don't even come close, and it's those that determine how far FreeSync is usable.

It's like buying a pump that can easily push 100 liters of water per minute: you still need a tube large enough to move the water through, and if the tube is too thin you won't be pushing through the 100 liters.

4. FreeSync would be perfect if you did not need V-Sync at all; if it meant smooth gameplay, no screen tearing, and no mouse latency.


----------



## xfia (Jun 19, 2015)

It's possible to get screen tearing, but the monitors are so well specced on top of FreeSync that it's practically perfect.

You know tearing happens when you run above your refresh rate, right? If you're getting rapid frames like that, above your refresh rate, you could still cap your frames and get better latency.

The scenario shown is strange in itself because of how FreeSync works, so it would have to be down to how that display is specced.


----------



## Supercrit (Jun 19, 2015)

Implying that it is slower at all other resolutions? That's not the kind of statement to reassure people with.


----------



## ZoneDymo (Jun 19, 2015)

xfia said:


> It's possible to get screen tearing, but the monitors are so well specced on top of FreeSync that it's practically perfect.
> 
> You know tearing happens when you run above your refresh rate, right? If you're getting rapid frames like that, above your refresh rate, you could still cap your frames and get better latency.
> 
> The scenario shown is strange in itself because of how FreeSync works, so it would have to be down to how that display is specced.



"you know tearing happens when you run above your refresh rate right?"
Yes... what do you think V-Sync does? Honestly...
"if your having rapid frames like that above your refresh rate you could still cap your frames and get better latency."
Yeah, you cap it... WITH V-SYNC. There aren't a lot of other ways to cap frame rate, and even fewer of them actually work.


----------



## nunyabuisness (Jun 19, 2015)

Let's put a water-cooled Fury X against an air-cooled 980 Ti, with stock clocks.

We know that the EVGA 980 Ti HYBRID beats the Fury X in Fire Strike 4K, which is a much fairer comparison.
The air-cooled Fury will be about the same as or slower than a 980 Ti ACX 2.0 or a G1 Gaming!


----------



## Aquinus (Jun 19, 2015)

I think I'm going to wait patiently for @W1zzard to do a review.  I know what source I can trust and I suspect internal benchmarks will be a bit biased, considering that's human nature. Until we start hearing from third parties with respect to benchmarks that are confirmed to be legitimate, it doesn't really mean a whole lot. I still remember AMD's "benchmarks" from pre-Bulldozer roll-out and I'm skeptical considering recent history. Not to say I don't want to believe in AMD, let's just say I don't have faith in their assessments of their own hardware.



			
techreport.com said:

> Of course, these numbers are supplied by AMD, and it's possible that they've been cherry-picked to present an overly positive picture. Even so, they seem to paint a winning picture for Fiji. We'll be verifying these results independently in our upcoming Fury X review.


----------



## 64K (Jun 19, 2015)

HumanSmoke said:


> I'm hoping it is faster than the 980 Ti and Titan X. It might be the ONLY way to get a fully enabled GM 200 with balls-to-the-wall voltage control and high clocks at a reasonable price.



I've been thinking NVIDIA has planned to do that all along if the Fury X is faster. They may do it anyway; probably for $50 more, but not for a few months. Most of the non-reference 980 Tis haven't shown up for sale yet, except the EVGA. They need to unload their salvage GM200s first.


----------



## mirakul (Jun 19, 2015)

nunyabuisness said:


> Let's put a water-cooled Fury X against an air-cooled 980 Ti, with stock clocks.
> 
> We know that the EVGA 980 Ti HYBRID beats the Fury X in Fire Strike 4K, which is a much fairer comparison.
> The air-cooled Fury will be about the same as or slower than a 980 Ti ACX 2.0 or a G1 Gaming!


If they are at the same price point ($650), of course they will be put against each other. I doubt that you could find any custom 980 Ti at $650, btw.


----------



## rooivalk (Jun 19, 2015)

nunyabuisness said:


> Let's put a water-cooled Fury X against an air-cooled 980 Ti, with stock clocks.
> 
> We know that the EVGA 980 Ti HYBRID beats the Fury X in Fire Strike 4K, which is a much fairer comparison.
> The air-cooled Fury will be about the same as or slower than a 980 Ti ACX 2.0 or a G1 Gaming!


As long as the price is the same, it's a fair comparison.

But we also know the custom air-cooled 980 Ti is already a beast (see TPU's own review of the Gigabyte 980 Ti G1). At about 17% faster than stock clocks on average, it's probably faster than the Fury X.

It will be interesting to see what an AIB Fury is capable of, and whether 4 GB of HBM can handle Shadow of Mordor with HQ textures.


----------



## the54thvoid (Jun 19, 2015)

64K said:


> I've been thinking Nvidia has planned to do that all along if Fury X is faster. They may do it anyway. Probably for $50 more but not for a few months. Most of the non-reference 980 Ti haven't showed up for sale yet except the EVGA. They need to unload their salvage GM200s first.



They can do it if they want, which is a bit annoying. A full Titan core, 8-pin power, higher TDP, AIB coolers, and higher stock clocks would make a card 10-20% faster than a 980 Ti.
I'll hang on for Fury, but given the leaked AMD benches, I see a Classified 980 Ti on water being my next card, unless Fury has decent OC headroom (which AMD implies it has).
A very good day for enthusiasts. Not so much for those that buy rebrands.


----------



## HisDivineOrder (Jun 19, 2015)

I notice the two games I'd most like to see are absent on this benchmark list:

Dying Light
Grand Theft Auto V

Those are the games that most use VRAM, too.


----------



## uuuaaaaaa (Jun 19, 2015)

nunyabuisness said:


> Let's put a water-cooled Fury X against an air-cooled 980 Ti, with stock clocks.
> 
> We know that the EVGA 980 Ti HYBRID beats the Fury X in Fire Strike 4K, which is a much fairer comparison.
> The air-cooled Fury will be about the same as or slower than a 980 Ti ACX 2.0 or a G1 Gaming!



The reference Fury X will most likely beat the reference 980 Ti at the same price point. It will also be quieter and run cooler. That AIO is capable of handling a TDP of 500 W, and the reference board has a VRM capable of delivering around 400 amps, which translates more or less into a TDP in the 400 W range. These are all hints that the Fury X might actually be quite a nice overclocker.


----------



## btarunr (Jun 19, 2015)

m6tzg6r said:


> So its faster in 4K? Well the 13 people who game in 4K are probably happy to hear that.



Over 2.5 million 4K monitors have been sold to end-users so far.


----------



## HM_Actua1 (Jun 19, 2015)

face palm.....Apples to oranges..... Fiji to Maxwell.

wait until Pascal drops. Have your tissues ready.


----------



## TheGuruStud (Jun 19, 2015)

Using V-Sync with FreeSync?

Use a frame limiter. It's even built in to a lot of newer games that I have.
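A frame limiter just spaces frame submissions by a fixed time budget instead of syncing to the display. A minimal sketch of the idea (the 60 fps target and the structure are my own illustration, not any game's actual implementation):

```python
import time

def frame_budget(target_fps: float) -> float:
    """Seconds each frame is allowed to take."""
    return 1.0 / target_fps

def run_capped(frames: int, target_fps: float = 60.0) -> None:
    """Render `frames` frames, sleeping so we never exceed `target_fps`."""
    budget = frame_budget(target_fps)
    next_deadline = time.monotonic() + budget
    for _ in range(frames):
        # ... render the frame here ...
        remaining = next_deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)  # idle out the rest of this frame's time slot
        next_deadline += budget

start = time.monotonic()
run_capped(6, target_fps=60.0)
elapsed = time.monotonic() - start
print(f"{elapsed:.3f}s for 6 frames")  # roughly 0.1 s at a 60 fps cap
```

Capping below the panel's maximum refresh keeps the frame rate inside the variable-refresh window, which is why a limiter can replace V-Sync for tear prevention at the top end.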


----------



## the54thvoid (Jun 19, 2015)

I can see the TPU benchmark forums starting to get interesting again. It's been NVIDIA on top for ages now. This might be good for some red/green jolly japes.

This Hexus review shows the Zotac AMP Overclock (not the AMP Extreme version, mind).

Given the AMD slide puts the 980 Ti at about 33 fps (not far off Hexus's 31-36), it shows a decent lead at 40-44 fps for an overclocked 980 Ti (>20% faster than stock). Exciting times ahead; I hope the Fury X is as capable at overclocking as Maxwell. It makes my purchasing decisions harder, though.


----------



## v12dock (Jun 19, 2015)

Hitman_Actual said:


> face palm.....Apples to oranges..... Fiji to Maxwell.
> 
> wait until Pascal drops. Have your tissues ready.



2H 2016. That also gives AMD time to refine HBM, move to 16 nm FinFET, and make architectural improvements.


----------



## moproblems99 (Jun 19, 2015)

GreiverBlade said:


> why should they compete with a card that is not even a gaming card (but that people buy as a gaming card) that is 4% faster (averaged) at 2160p than a 980Ti and cost 450$ more ~



Didn't the Titan X get neutered for compute? That makes it a $1000+ 12 GB gaming card.


----------



## xorbe (Jun 19, 2015)

btarunr said:


> Over 2.5 million 4K monitors have been sold to end-users so far.



Living room TVs, or computer screens hooked up to gaming PCs?


----------



## 64K (Jun 19, 2015)

the54thvoid said:


> They can do it if they want, which is a bit annoying. A full Titan core, 8 pin power, higher tdp, AIB coolers and higher stock clocks. Would make a 10-20% faster (than 980ti) card.
> I'll hang on for Fury but given the leaked AMD benches, I see a Classy 980ti on water being my next card, unless Fury has decent OC headroom (which AMD imply it has).
> A very good day for enthusiasts.  Not so for those that buy rebrands.



An OC Classified 980 Ti on water cooling will be a beast indeed!


----------



## btarunr (Jun 19, 2015)

xorbe said:


> Living room TVs, or computer screens hooked up to gaming PCs?



I believe I used the word "monitor," so that excludes televisions.


----------



## N3M3515 (Jun 19, 2015)

HumanSmoke said:


> I'm hoping it is faster than the 980 Ti and Titan X. It might be the ONLY way to get a fully enabled GM 200 with balls-to-the-wall voltage control and high clocks at a reasonable price.



Wasn't the titan x a fully enabled GM 200?


----------



## TheGuruStud (Jun 19, 2015)

N3M3515 said:


> Wasn't the titan x a fully enabled GM 200?



He means that if Fury is better than the Titan X, it could force NVIDIA to release a higher-clocked, OC-friendly version of the Titan X: the Titan XXX.


----------



## the54thvoid (Jun 19, 2015)

TheGuruStud said:


> He means if fury is better than titan x, then it could force nvidia to release a higher clocked, OC friendly version of titan x - titan xxx



You laugh but you know what - they bloody might well just do something as 'guff' as call it 'Triple X'.

But then Fury can do the exact same thing.  Battle of the hardcore Pr0n.


----------



## xfia (Jun 19, 2015)

http://www.maximumpc.com/ces-2015-amd-demonstrates-freesync-technology-and-nano-pc-video/
Got the scientist here showing you should have your settings high enough to be in the 45-60 fps range. I don't know where that slide came from, but it's not something they really like to say about how it works; as mentioned, a lot of games sync frames and have pretty good dynamic frame rate. They are releasing driver-based dynamic frame rate control soon enough.
I think the difference in fps we see from AMD may be down to experience with Catalyst.
Why not settings like this if a game easily runs maxed out above your refresh rate?


Or why not settings like this in a more balanced scenario?


What if an APU needs a little boost in performance?


----------



## arbiter (Jun 19, 2015)

bobbavet said:


> Why did the product slide produced earlier by AMD state 54 fps for Crysis, yet this graph shows around 45 fps?


That is pretty suspect, how they claim one fps figure one day and then 20% less the next. AMD's marketing and tech sides haven't really been on the same page about anything for a while.


m6tzg6r said:


> So its faster in 4K? Well the 13 people who game in 4K are probably happy to hear that.





sakai4eva said:


> Well, for someone like me who doesn't game in 4k but would like to in the future, getting a card that is capable of doing that now would make it possible for incremental upgrades that doesn't break the bank every once in a while.


It does seem, from the settings page, that the settings were tuned to keep VRAM usage under 4 GB.



xfia said:


> maybe not so much with a beast 512bit bus, 8gb vram and compression.


The compression was a feature of GCN 1.2; the 390X is GCN 1.1, while the 380 is GCN 1.2. Even then, benchmarks of the 390X show only marginal performance boosts from a higher memory clock: just 5 to 8% for what is a 20% memory boost. The 380 is only 256-bit.



mirakul said:


> I dare say that Sleeping Dogs used more Directx 11 tech than some Gameworks titles from Ubi$oft. It's a beautiful and fun game,


Sleeping Dogs was a complete CRAP of a game. Graphics were crap, controls were worse than GTA4; it made GTA4 look like a well-running game.



Kaynar said:


> uhmmm comparing similar models, i'd say a gsync screen is about 200$ more than a freesync.





xfia said:


> freesync has complete frame rate control for 4k and up to like 90 something fps(hz) at 1440p while syncing the refresh rate down to like 9fps(hz)
> freesync is practically perfect.. its gsync with the 1 frame of latency.


I'll group your two posts together since both are talking about the same stuff. G-Sync does cost more, because it's the difference between two techs: one that was worked on for years to perfect it, and one that was thrown together in a month to compete. FreeSync is perfect? Yeah, sure, if you don't mind ghosting, or tearing when fps drops under 40. And before you try to blame the ghosting on the panel: it's not the panel's fault. NVIDIA took the time and effort to test all types and models of panels to see which ones work well doing VRR and which ones suck, and made a list of ones that are good to use for G-Sync. G-Sync only has a small fps loss, around 1%, on Kepler cards, since part of the work had to be done in drivers because Kepler lacked certain hardware. But that still makes for a ton more NVIDIA cards supporting G-Sync than AMD has for FreeSync. It's been confirmed that the R7/R9 370(X) doesn't even support FreeSync, which is pretty sad. I would call that pretty unacceptable.


----------



## HumanSmoke (Jun 19, 2015)

the54thvoid said:


> I see a Classy 980ti on water being my next card, unless Fury has decent OC headroom (which AMD imply it has).


Word has it that AMD won't allow the memory to be overclocked, and AMD's own benchmarks show that while the core can be overclocked, the net gain isn't overly spectacular.
From the AMD press deck: Fury X overclocked by 100 MHz (a 9.5% overclock).





Seems in line with other current GPUs, but the 9.5% overclock margin isn't that impressive.
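A back-of-the-envelope check of those press-deck numbers is self-consistent; the only assumption here is the Fury X's widely reported 1050 MHz stock core clock:

```python
# Sanity-check the press-deck overclocking figures.
# Assumption: Fury X stock core clock of 1050 MHz.
stock_mhz = 1050
oc_bump_mhz = 100

oc_percent = oc_bump_mhz / stock_mhz * 100
print(f"{oc_bump_mhz} MHz on {stock_mhz} MHz = {oc_percent:.1f}% overclock")
# → 100 MHz on 1050 MHz = 9.5% overclock
```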


----------



## TheGuruStud (Jun 19, 2015)

Arbiter's trolling skills are insane.


----------



## Ferrum Master (Jun 19, 2015)

HumanSmoke said:


> Word has it that AMD won't allow the memory to overclocked, and AMD's own benchmarks show that while the core can be overclocked, the net gain isn't overly spectacular.
> From the AMD press deck. Fury X overclocked by 100MHz (9.5% overclock)
> 
> 
> ...



Do you know what the CPU and resolution were on that bench? If 4K... you know... it's good... 1080p isn't... and most importantly, I believe this card needs a hell of a high-clocked CPU... you know, it's an AMD.

ADD:

And overclocking the memory... for more bandwidth... on HBM? Naah... I would have locked it too.


----------



## arbiter (Jun 19, 2015)

TheGuruStud said:


> Arbiter's trolling skills are insane.


Sad how one can speak truths yet be called a troll.


----------



## xfia (Jun 19, 2015)

I won't argue that Adaptive-Sync allows for a wide array of OEM customization.
G-Sync does show some advantage at this point, so NVIDIA pulls ahead for now, but for how long?


----------



## the54thvoid (Jun 19, 2015)

HumanSmoke said:


> Word has it that AMD won't allow the memory to overclocked, and AMD's own benchmarks show that while the core can be overclocked, the net gain isn't overly spectacular.
> From the AMD press deck. Fury X overclocked by 100MHz (9.5% overclock)
> 
> 
> ...





Ferrum Master said:


> You know what was the CPU and resolution on that bench? If 4K... you know... it is good... 1080p ain't... and most importantly, I believe this card needs a hell of a CPU clocked high... you know it is an AMD.
> 
> ADD.
> 
> And overclocking memory... for more bandwidth... on HBM? Naah... i would locked it too.



The CPU is a 5960X. I read the linked source. If the source info is true, the overclock is quite feeble. A 980 Ti can gain 20% over stock performance.......

Still, awaiting Wednesday.

@W1zzard - when you bench (when you publish what you have benched), can you do an apples-to-apples, balls-to-the-wall overclock on an intensive game, Fury X versus 980 Ti, both at max OC? Neutral, non-GameWorks, and ultra everything so VRAM usage is high. This would be good to see.


----------



## Ferrum Master (Jun 19, 2015)

the54thvoid said:


> A 980ti can go 20% over stock in performance.......



Yes, but we also saw that 20% on machines with quite juicy clocks.


----------



## N3M3515 (Jun 19, 2015)

HumanSmoke said:


> Word has it that AMD won't allow the memory to overclocked, and AMD's own benchmarks show that while the core can be overclocked, the net gain isn't overly spectacular.
> From the AMD press deck. Fury X overclocked by 100MHz (9.5% overclock)
> 
> 
> ...



Bad news for AMD then: the Gigabyte 980 Ti G1 Gaming is already 15% faster than a stock 980 Ti, and has room for 14% more according to W1zzard's review.
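Those two figures compound rather than add; a quick sketch using only the 15% and 14% numbers above:

```python
# Overclock gains multiply, they don't add:
# 15% factory OC over stock, then 14% more manual headroom.
factory = 1.15
manual = 1.14

total = factory * manual  # 1.311, i.e. 31% over a stock 980 Ti
print(f"Total over stock: {(total - 1) * 100:.0f}%")
# → Total over stock: 31%
```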


----------



## GreiverBlade (Jun 19, 2015)

moproblems99 said:


> Didn't the Titan X get neutered for compute?  Which makes it a $1000+ 12GB gaming card.


Well... nobody (except some enthusiasts with more money than usual) considers a $1000 single-GPU board for gaming... NVIDIA played dirty with the neutered-compute maneuver.
For me the Titan X is not a gaming card.



Hitman_Actual said:


> face palm.....Apples to oranges..... Fiji to Maxwell.
> 
> wait until Pascal drops. Have your tissues ready.


Funny one... Fiji is Maxwell's contender, not the two-year-old chips that still hold their own and populate the 3xx line.

The generation after Fiji will be Pascal's contender.


----------



## Bansaku (Jun 19, 2015)

bobbavet said:


> Why did the Product slide produce earlier by AMD state 54fps for Crysis and yet this graph shows around 45fps?



Dyslexia?


----------



## GhostRyder (Jun 20, 2015)

arbiter said:


> i'll groups your 2's post together since both talkin same stuff. G-sync does cost more cause its difference between 2 techs of 1 that was worked on for years to prefect it and 1 that was thrown together in a month to compete. Freesync is perfect? Yea sure if you don't mind Ghosting, or tearing when fps drops under 40fps. Before you try to blame the ghost on the panel its not the panels fault its not. Nvidia took time and effort to test all types and models of panels to see which ones work well doing VRR and which ones suck and made a list of ones that are good to use for G-sync. G-sync only has a small fps lose which is only around 1% on kepler cards since they had to a part of the work in drivers since kepler cards lacked hardware to do a certain thing. But that does make a ton of nvidia cards that support g-sync then AMD had for freesync. its been confirmed that r7/r9 370(x) card doesn't even support freesync, that is pretty sad. I would call that pretty unacceptable.


OK, first of all, you need to actually research FreeSync and G-Sync before speaking about issues, instead of making up issues to try to make one sound significantly superior to the other... G-Sync has had plenty of complaints about flickering, ghosting, etc. as well, so don't act like G-Sync is this perfect entity. Also, you're really complaining that the R7 370 does not support FreeSync? While I don't like that it doesn't, how many people do you see running off to buy a 1440p 144 Hz FreeSync monitor and then grabbing an R7 370? It would be the same as seeing someone grab a GTX 750 Ti (or 760) and do the same thing...



Hitman_Actual said:


> face palm.....Apples to oranges..... Fiji to Maxwell.
> 
> wait until Pascal drops. Have your tissues ready.


Please explain how this is apples to oranges? These are this generation's contenders???



TheGuruStud said:


> He means if fury is better than titan x, then it could force nvidia to release a higher clocked, OC friendly version of titan x - titan xxx


If they do that, I'll post it here and now: I will purchase three of them (call it the XXX Titan).

I want to see the card's overclocking performance. That is what will matter in the end, along with whether there are any aftermarket variants for better overclocking (Lightning).


----------



## Dieinafire (Jun 20, 2015)

This is a problem with 4k and the Fury

http://www.guru3d.com/news-story/amd-radeon-fury-x-doesnt-have-hdmi-2-support.html


----------



## Xzibit (Jun 20, 2015)

TheGuruStud said:


> He means if fury is better than titan x, then it could force nvidia to release a higher clocked, OC friendly version of titan x - titan xxx



I vote they call it the Tit-an S


----------



## the54thvoid (Jun 20, 2015)

Xzibit said:


> I vote they call it the Tit-an S



Well, with the performance they'd excel with I'd buy two and have a nice set of Tits beside me for when the wife is out.


----------



## BiggieShady (Jun 20, 2015)

the54thvoid said:


> Well, with the performance they'd excel with I'd buy two and have a nice set of Tits beside me for when the wife is out.


Medical emergency, help, instruction manual said to insert my tits into pcie slots.


----------



## nem (Jun 21, 2015)

Enjoy !


AMD RuleZ!!!


----------



## Caring1 (Jun 21, 2015)

I'm curious: are the four machine-screw threads in the end of the card there for any particular reason?
They can't be for a support bracket, as the card is too short to require one.


----------



## the54thvoid (Jun 21, 2015)

Sure is a nice looking card but I've seen enough pics now. Want Wednesday to arrive for the info we all want.


----------



## jigar2speed (Jun 21, 2015)

I didn't know TechPowerUp allowed porn on their website...


----------



## Initialised (Jun 21, 2015)

jigar2speed said:


> I didn't knew Techpowerup allowed Porn on their website ...



Yeah, I just e-jizzed all over my keyboard.

Then I found the user guide and fired the second barrel!

http://support.amd.com/Documents/amd-radeon-r9-fury-x.pdf


----------



## the54thvoid (Jun 21, 2015)

Initialised said:


> Yeah, I just e-jizzed all over my keyboard.
> 
> Then I found the user guide and fired the second barrel!
> 
> http://support.amd.com/Documents/amd-radeon-r9-fury-x.pdf



You have premature problems? The manual has nothing of note in it....


----------



## Bytales (Jun 21, 2015)

ZoneDymo said:


> idk man, from all I seen its equal to freesync in visual aid, the only difference is that G-sync costs 1 fps.


 
Well, I now own a G-Sync monitor, the Asus ROG Swift, paired with a 750 Ti, just to get a taste of G-Sync; the plan was to get two Titan Xs. But as of now I am selling my monitor to switch to FreeSync and a Radeon Fury X.

I'm getting a 32-inch IPS 4K Samsung FreeSync monitor, and at first one Fury X, later two, when they add FreeSync support for CrossFire to the driver.

Why? I always wanted a 32-inch 4K monitor; IPS is better than TN; with the new frame-rate target I can keep all games at 60 fps in 4K with two GPUs; and most importantly of all, I also get two HDMI inputs for my consoles, the PS4 I already own and a future Xbox One.
With G-Sync, the ASUS ROG Swift can only be used over DisplayPort, and I was forced to use my 55" Samsung TV with the PS4 while staying at the desktop (I use mouse and keyboard for console 3D shooters), and the TV is mounted above the monitor. Which is bad!

Now I see that the Radeon Fury X, with its 4 GB of memory, is enough for 4K (I'm not using AA), and it has a $650 launch price.
As it was, I was going to need to spend 2200 EUR on two Titan Xs and another 250 EUR on water blocks.

Going AMD, I am spending 1100 EUR on the monitor and 1300 EUR on GPUs, while selling my G-Sync monitor for about 600. So I am spending about 1800 EUR compared to 2500, and I get IPS, 4K, HDMI in, and FreeSync, not to mention a 32-inch monitor.

Guess which one is the better option!
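Taking those figures at face value, the comparison works out as follows (all numbers are the rough estimates quoted above, not market prices):

```python
# NVIDIA route: two Titan Xs plus water blocks (estimates in EUR).
nvidia_total = 2200 + 250

# AMD route: monitor + two Fury Xs, minus resale of the G-Sync monitor.
amd_total = 1100 + 1300 - 600

print(f"NVIDIA: {nvidia_total} EUR, AMD: {amd_total} EUR, "
      f"difference: {nvidia_total - amd_total} EUR")
# → NVIDIA: 2450 EUR, AMD: 1800 EUR, difference: 650 EUR
```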


----------



## Bytales (Jun 21, 2015)

Dieinafire said:


> This is a problem with 4k and the Fury
> 
> http://www.guru3d.com/news-story/amd-radeon-fury-x-doesnt-have-hdmi-2-support.html


 
No it's not. DisplayPort is the future. Besides, you're going to have 4K on your desktop monitor; if you want to watch movies on your 4K TV from the PC, do as I do: convert them to MKV, copy them to a memory stick, insert the stick into that smart TV of yours, and bam, you've got what you were looking for.
You've got to use your brain for a minute, boy, not just look at standards and compare 1.4 with 2.0.
Theory is one thing; practice is something else entirely!


----------



## Vlada011 (Jun 21, 2015)

Everything is excellent except the sleeving and shrink tubing... God, how that leaves a cheap look.
Is it possible for someone to remove that?
Compared with the new Corsair AIO systems and the EVGA Hybrid it looks very bad.
But the card itself looks excellent. The LEDs and BIOS switch are very nice options for a reference card.
These AMD models will not be available in my country except under brands such as ASUS, MSI, and Sapphire.


----------



## Durabo (Jun 22, 2015)

Bytales said:


> Well, now i own a G-Sync Monitor, the Asus ROG Swift, paired with a 750ti. Just to get a taste of G_sync, the plan was to get two Titan X. But as of now i am selling my monitor, just to switch to freeSync and and Radeon Fury X.
> 
> Im getting a 32Inch IPS 4k Samsung Freesync Monitor, and at first one Fury X, later on two when they add freesync driver for Crossfire X.
> 
> ...



I am a long-time lurker on this site, but this comment was so stupid I had to create an account to answer it. So congratulations on that.

There is a monitor that is IPS and equal to the ROG Swift, the Acer XB270HU, with the same price and same specs, so the IPS argument is invalid from the start. You can also buy it today, as it was released in May 2015. Now for the 4K part: there are lots of monitors that are 4K IPS with G-Sync set to launch this year, so you could get any of those too. But G-Sync and FreeSync don't make sense at 60 Hz if you have two Titan Xs or Fury Xs anyway, so 1440p IPS 144 Hz with G-Sync makes more sense. Even then, you won't need G-Sync or FreeSync with that powerful a graphics combination for the foreseeable future, since it will be able to output 144 fps in most games anyway. The Asus PG279Q (1440p, IPS, 144 Hz, G-Sync) and Asus PG27AQ (4K, IPS, 60 Hz, G-Sync) both have HDMI and DisplayPort inputs.

Let's look at the graphics cards. Who said you have to get two Titan Xs? You can easily get two 980 Tis, which have the same performance and better OC headroom. Congratulations, you just saved 700 dollars; see, your price argument has already collapsed. Performance-wise, it seems the Fury X will only just beat reference 980 Tis. But non-reference 980 Tis are already 15% faster at 4K, and they can be overclocked another 15% on the stock fan and stock voltage (check W1zzard's Gigabyte 980 Ti G1 Gaming review). So with extra voltage, a roughly 30% performance increase should be attainable with almost all 980 Tis. Overclocking-wise, water-cooled 980 Tis with unlocked voltage will be on their own level.

As for the water-cooling argument: you said you need water blocks for the Titans, so you have a custom loop, as I do. You will need water blocks for the Fury Xs too, since their stock coolers are closed loops with no fittings compatible with anything on the market. Even if you want to run them outside your loop - which makes little sense, since their pumps and blocks will definitely be worse than what you can buy - you will need two extra 120 mm fan mounts, which I assume you won't have since you already have other radiators. AMD probably won't include instructions, screws, or tools to fit them in one slot.

So let's see: both cards need blocks and both have the same price. While this is speculation, a water-cooled Fury X will not be close to water-cooled 980 Tis (see the second paragraph). So you will have better performance, better noise output, and better power consumption for the same price with the NVIDIA cards. Also, as you said, you have to wait for new drivers with AMD cards, whereas NVIDIA already has SLI support with G-Sync. We all know how fast AMD is with new drivers, and everybody and their blind and deaf cousin knows SLI is far superior to CrossFire due to driver and scaling issues.

So there you have it.


----------



## Lucas_ (Jun 22, 2015)

jigar2speed said:


> If this is what AMD says, I am highly doubtful about Fury X beating GTX 980Ti.



I'm guessing they will beat them badly this time... Let's wait and see.


----------



## Bytales (Jun 22, 2015)

In reply to Durabo.
First of all, my comment is not stupid; it's just my point of view. Everyone is entitled to their point of view, and what makes sense for someone doesn't necessarily make sense for you, for we are all different and think differently. However, the truth is singular and unchangeable. And the truth is:
1) The Asus ROG Swift PG279Q cannot be bought at this time (22.6) here in Germany. The same goes for the other G-Sync Asus monitor you listed.
2) Apart from the 34-inch Acer Predator monitor with G-Sync, I didn't know there were other monitors out there with both G-Sync and HDMI.
3) I personally have a problem with buying products based on incomplete chips - products that are basically cut down from the factory - which is why the 980 Ti is a no-go for me, and why I have dual 2690 v3 Xeons in my rig. That leaves me with either the 980 or the Titan X.
4) I want to get the top-of-the-line card, since I have never in my life owned one and want to see how it feels, even if it doesn't make sense from an expense point of view; so I would be stuck with the Titan X.
5) Using a water block on the Titan X would in the end waste two extra slots in a dual-SLI config, as the card cannot be made single-slot thanks to the DVI connector, which I won't use anyway.
6) The nature of the stuff mounted in my case wouldn't allow me to install two 980 Kingpins (which could be made single-slot through water cooling) because they are too wide, so I need normal-width graphics cards.
7) The fact that the Fury X could be made single-slot through water cooling would allow me to install three of them in CrossFire. Furthermore, the card's shorter length, thanks to HBM, would let me use the built-in USB 3.0 headers on the motherboard, since the card is not long enough to obstruct them. That would free another PCI slot currently occupied by a USB 3.0 PCI Express card with two 19-pin headers, which in the end would allow me to install a fourth Fury X.
8) As you can clearly see, in my eyes the Fury X is better suited to my build, as I could install four of them as opposed to only two Titan Xs.
9) A 34-inch monitor doesn't fit my desk; it takes a maximum of 32 inches, as I built it with a 32-inch maximum monitor size in mind. And the 32-inch Samsung monitor is on the market and can already be bought here in Germany.
10) The disadvantages of the Fury X are something I can live with. The lack of HDMI 2.0 doesn't bother me. The 4 GB of memory seems to be enough for 4K (as the leaked benchmarks have so far shown - I say "seems", as we still need to see what actual reviews say), not to mention that, if I'm lucky, the memory could become addable with the advent of DX12.
11) You have probably figured out by now that what you assume may not necessarily represent reality. This also holds for your false assumption that I probably won't be able to install two Fury Xs because I would need two 120 mm radiator mounts and, with the radiators I already have, there would be no space left. The reality is far worse than you could have imagined, and I can still fit two more 120 mm radiators. Here is why:
From top to bottom, all 17 of my case's 5.25" bays are filled with 5.25"-bay devices; three of them don't have devices per se installed, but a 120 mm fan.
In total, my radiators are:
a) one 140.1 dual-circuit radiator
b) one 140.2 dual-circuit radiator
c) one 140.2 single-circuit radiator
d) two 140.3 radiators
plus approximately 30 fans and 4 pumps.
And I could still install two Fury Xs with their radiators. Once I do, I'll probably post a photo, because the probability is very high that you won't believe me.

Probably my "stupid" comment makes a lot more sense now. And you have become a wiser man. That is what I wish for you.


----------



## Captain_Tom (Jun 22, 2015)

HumanSmoke said:


> I'm hoping it is faster than the 980 Ti and Titan X. It might be the ONLY way to get a fully enabled GM 200 with balls-to-the-wall voltage control and high clocks at a reasonable price.



Ugh.  I will never understand this:  If this beats the Titan X ->  JUST GET THIS!


----------



## btarunr (Jun 22, 2015)

Xzibit said:


> I vote they call it the Tit-an S



Or the Titan-K (sounds like Titanic?), because beyond Titan-X, any further clocks will sink the perf/Watt and thermals of the GM200.


----------



## xfia (Jun 22, 2015)

btarunr said:


> Or the Titan-K (sounds like Titanic?), because beyond Titan-X, any further clocks will sink the perf/Watt and thermals of the GM200.


I would say Smoke is just that much of a nerd, he has to have it in its full uncut form. All nerds around here.


----------



## Dalkamyr (Jun 22, 2015)

Can't wait to see them benchmarks...


----------



## Xzibit (Jun 22, 2015)

btarunr said:


> Or the Titan-K (sounds like Titanic?), because beyond Titan-X, any further clocks will sink the perf/Watt and thermals of the GM200.



*MORE VOLTAGE LOCK AHEAD!!!*


----------



## HumanSmoke (Jun 22, 2015)

btarunr said:


> Or the Titan-K (sounds like Titanic?), because beyond Titan-X, any further clocks will sink the perf/Watt and thermals of the GM200.


Who cares? You think people who buy EVGA Classifieds, Gigabyte G1s, and Galaxy HOFs worry about perf/watt? Why would I care about performance per watt when I run two overclocked GTX 780s? Did I suddenly wander into



Spoiler










Careful with that warm milk bta, don't want to burn yourself.

Half the people on this site who shop for enthusiast graphics lost interest in that discussion when the GTX 480 arrived, and the other half haven't given a shit about perf/watt and thermals since the day AMD's GPUs outgrew their reference coolers. 95°C operation? No problem, it's designed for that! (Although Raja Koduri was rather less sanguine: "_there are countries where people can live in 50 Celsius. 95 Celsius? I'm not sure anything would survive_"...so I guess the arrival of a water-cooled Fury will make it relevant again for the red team  )


Captain_Tom said:


> Ugh.  I will never understand this:  If this beats the Titan X ->  JUST GET THIS!


You know what, plenty of cards will beat Titan X. If history is any indicator, Titan X (and the Fury) will be mainstream-level performance in 12-18 months. The difference between you and me is that I want to see the absolute best wrung out of an architecture right here, right now, at the right price - and competition is a means to that end. Nvidia pushed AMD to produce the Fury X, and I am hoping the Fury X pushes Nvidia to produce a stronger SKU, which just might, in turn, lead to AMD offering an even faster/stronger alternative... etcetera, etcetera.
Sorry if being a hardware enthusiast causes you to stroke out, but if bta gets that GPU site up and running, you can share an eWarmMilk and chat about the best method of removing cat hair from a velour La-Z-Boy

/jk............................................(maybe)


----------



## Durabo (Jun 22, 2015)

Bytales said:


> In reply to Durabo.
> First of all my comment is not stupid, its just my point of view. Everyone is entitled to their point of view and what makes sense for someone doesnt necessaraly mean it makes sense for you for we are all different a d think differently. However the truth is singular and uncheangeble. And the truth is:
> 1)The Asus RogSwift PG279q cannot be bought at this time 22.6 here in germany. The same can also be said for the other gsync asus monitor you listed.
> 2)apart from the 34inch predator acer monitor that has g sync and 3840x1440 i didnt knew there are other monitors out there with g sync and hdmi.
> ...



First of all, I would like to apologize if I insulted you personally in any way. Your comment looked like a typical fanboyish comment, since you ended it with "which is better, you decide?" while ignoring the fact that the GTX 980 Ti exists. Ignoring the GTX 980 Ti without context, I am sure you will agree, looks extremely stupid. Now onto your new points.

1-2) Both the PG279Q and PG27AQ are set to feature HDMI along with DisplayPort, but you are right, they are not yet available on the market. There is an ultrawide Acer Predator that is, though. I would still get the XB270HU among them since, again, with two Furies there is no need for a FreeSync monitor at 60 Hz.
3-4) These are, for lack of a better word, puzzling to me. The GTX 980 is an "incomplete" Maxwell too; it is just designed on a smaller die. Why you *have* to get the full chip while the one that is almost half as cheap performs admirably compared to its bigger brother is beyond me. 12 GB of VRAM is useless, at least for gaming as of now; the 6 GB on the 980 Ti is more than enough, and the card runs cooler.
5) This might be a valid point, but I cannot argue since I don't know your configuration. But bear in mind that, according to point 11 of this comment, you are planning to use the Furies with their stock closed-loop cooling, which occupies *two slots*, while a 980 Ti (or Fury X, for that matter) with another block, say from EKWB, will occupy only one slot. So you are either lying, which I find highly unlikely, or you are confused about whether to use stock or custom water blocks with the Furies.
6) You have no reason to get any model of GTX 980 anyway.
7) First of all, there is absolutely no need for a 3rd or 4th graphics card, since 3-way and 4-way SLI/CrossFire has only a very slight advantage over a dual-card setup at best; at worst it performs worse than one card. So while it's not my place to decide what goes in your build, I suggest you not get more than two cards under almost any circumstance. The only reason to get more than two is scientific computing, for which you should get a different motherboard and Quadro/FirePro cards anyway. With that out of the way: if the shorter length of the Fury X helps your build, you should probably choose it, since you can't go wrong with either the Fury X or the 980 Ti at this point. I just don't like it when you compare the Titan X with the Fury X and imply there are no cheaper alternatives from the green camp. The 980 Ti probably won't fall behind the Furies, and with water cooling I am pretty sure it can outperform them. Even if it cannot, we will see a price decrease and possibly a new card from NVIDIA, so it is a win-win situation for us.
By making the cards single-slot, you mean cutting the part where they connect to the chassis. I am pretty sure you can cut it along with the DVI header the same way you can with the AMD cards; with AMD you just aren't cutting any connectors, only a metal plate. Looking at my water-cooled GTX 780 Tis, they occupy less than one slot, and the metal bracket with the DVI connector looks like it could be cut easily with a Dremel. I don't get why you'd want to do that, though: if you have that many radiators, you have a gigantic case and a motherboard with plenty of slots, which should provide enough space to dissuade you from such solutions.
8) Again, you might be surprised to see two Titan Xs outperform your Furies, but that's not mine to decide.
9) See 1-2. The Asus PG27AQ, which is 27", 4K, and has G-Sync, will probably suit you best, but it probably won't be out until later this year, so no arguments from me here.
10) Yup, 4 GB of HBM will be enough for 4K.
11) I believe you, but I would like to see a photo anyway. And you should really, really, really connect those Furies to your loop rather than using their stock solution with inferior blocks, radiators, and pumps.


----------



## Bytales (Jun 23, 2015)

The plan would have been to use two Furies stock while waiting for custom water blocks, at which point they become single-slot.
The GTX 980 is the full GM204 chip, just as the Titan X is also a full chip.

1) If I'm selling my ASUS ROG Swift, I am either going 4K, which for me means 32 inches, or the other way, 144 Hz IPS at 2560x1440. To be honest I'm still pondering between 2560x1440 @ 144 Hz and 4K @ 60 Hz. I had no luck attaining a constant 100+ fps with my 750 Ti on my G-Sync monitor, but I got enough of a taste of G-Sync to see what it's made of.
I mean, gaming at 4K @ 60 fps can't be that bad, can it? Too bad there aren't any 4K @ 120 Hz monitors out there, although DisplayPort 1.3 is supposed to bring that, and if we're lucky we'll get it this year.
I'll probably get only one Fury X, and wait for a second version with 8 GB before putting four of them in the case.
Photos here.


----------



## Aquinus (Jun 23, 2015)

Bytales said:


> The plan would have been to use two Furies stock, while waiting for custom waterblock, which will see the moment when they become single slot.
> GTX 980 is full chip, the GM204., as titan X is also the full chip.
> 
> 1)If im selling my ASUS ROG swift, I am either going 4k, which for me means 32 inch, or the other way around 144hz ips 2560x1440. To be honest im still pondering between 2560x1440@144Hz and 4k@60Hz. I had no luck attaining 100+ fps constant with my 750ti on my gsync monitor, but i got a bit of g-sync taste to see whats it made of.
> ...


You built a dual Xeon 2011 machine for gaming? If only that money could have been put to better use.


----------



## Bytales (Jun 23, 2015)

DX12 is supposed to put every available core to good use (24 cores / 48 threads), and I want to keep these CPUs for at least 10 years. 128 GB of RAM is also going to be enough for that timeframe.
Only when 32-core CPUs are mainstream, the way 8-core CPUs are mainstream on the desktop now, will I even start considering an upgrade.
I believe it's going to take a while...


----------



## HumanSmoke (Jun 23, 2015)

Ta-da!






More pics (and benchmarks) *at the source*


----------



## Fluffmeister (Jul 15, 2015)

Xzibit said:


> *MORE VOLTAGE LOCK AHEAD!!!*



Very apt.


----------

