
Radeon R9 Fury X Faster Than GeForce GTX 980 Ti at 4K: AMD

So it's faster in 4K? Well, the 13 people who game in 4K are probably happy to hear that.

Over 2.5 million 4K monitors have been sold to end-users so far.
 
face palm.....Apples to oranges..... Fiji to Maxwell.

wait until Pascal drops. Have your tissues ready.
 
Using vsync with FreeSync :wtf:

Use a frame limiter. It's even built in to a lot of newer games that I have.
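In case anyone is wondering what a frame limiter actually does: roughly this, just inside the engine or driver rather than a script. A minimal sketch, where render_frame and fps_cap are placeholders and not any real game's API:

```python
import time

def capped_loop(render_frame, fps_cap=60):
    """Render a frame, then sleep off whatever is left of that frame's time budget."""
    frame_budget = 1.0 / fps_cap
    while True:
        start = time.perf_counter()
        render_frame()
        # If the frame finished early, wait out the remainder so we never exceed the cap
        leftover = frame_budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)
```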
 
I can see the TPU benchmark forums starting to get interesting again. It's been Nvidia on top for ages now. This might be good for some red/green jolly japes.

This Hexus review shows the Zotac Amp Overclock (not the Amp Extreme version, mind).

Given the AMD slide puts the 980 Ti at about 33 fps (not far off Hexus's 31-36), it shows a decent lead of 40-44 fps for an overclocked 980 Ti (>20% faster than stock). Exciting times ahead - I hope the Fury X is as capable at overclocking as Maxwell. It makes my purchasing decisions harder though. :laugh:

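Quick sanity check on that ">20% faster than stock" parenthetical, using the figures quoted above (rough back-of-the-envelope only):

```python
stock_fps = 33                     # roughly where the AMD slide / Hexus put a stock 980 Ti
oc_fps_low, oc_fps_high = 40, 44   # Hexus result for the overclocked Zotac Amp

# Relative gain of the overclocked card over stock
print(f"{oc_fps_low / stock_fps - 1:.0%} to {oc_fps_high / stock_fps - 1:.0%}")   # 21% to 33%
```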
 
Why should they compete with a card that isn't even a gaming card (but that people buy as a gaming card), that is only 4% faster on average at 2160p than a 980 Ti, and that costs $450 more?

Didn't the Titan X get neutered for compute? Which makes it a $1000+ 12GB gaming card.
 
Over 2.5 million 4K monitors have been sold to end-users so far.

Living room TVs, or computer screens hooked up to gaming PCs?
 
They can do it if they want, which is a bit annoying: a full Titan core, 8-pin power, a higher TDP, AIB coolers and higher stock clocks would make a card 10-20% faster than the 980 Ti.
I'll hang on for Fury, but given the leaked AMD benches, I see a Classy 980ti on water being my next card, unless Fury has decent OC headroom (which AMD imply it has).
A very good day for enthusiasts. Not so much for those who buy rebrands.

An OC Classified 980 Ti on water cooling will be a beast indeed!
 
Living room TVs, or computer screens hooked up to gaming PCs?

I believe I used the word "monitor," so that excludes televisions.
 
I'm hoping it is faster than the 980 Ti and Titan X. It might be the ONLY way to get a fully enabled GM 200 with balls-to-the-wall voltage control and high clocks at a reasonable price.

Wasn't the titan x a fully enabled GM 200?
 
Wasn't the titan x a fully enabled GM 200?

He means if fury is better than titan x, then it could force nvidia to release a higher clocked, OC friendly version of titan x - titan xxx :laugh:
 
He means if fury is better than titan x, then it could force nvidia to release a higher clocked, OC friendly version of titan x - titan xxx :laugh:

You laugh, but you know what - they might bloody well just do something as 'guff' as calling it 'Triple X'.

But then Fury can do the exact same thing. Battle of the hardcore Pr0n.
 
http://www.maximumpc.com/ces-2015-amd-demonstrates-freesync-technology-and-nano-pc-video/
The scientist in the linked video shows you should have your settings high enough to be in the 45-60 fps range. I don't know where that slide came from, but it's not something they really like to spell out about how it works; as mentioned, a lot of games sync frames and have pretty good dynamic framerate handling, and they are releasing driver-based dynamic frame rate control soon enough.
I think the difference in fps we see from AMD may come down to the experience with Catalyst.
Why not settings like this if a game easily runs well over your refresh rate?
1080overkill.PNG

Or why not settings like this in a more balanced scenario?
balanced.PNG

What if an APU needs a little boost in performance?
boost.PNG
 
Why did the product slide produced earlier by AMD state 54 fps for Crysis, yet this graph shows around 45 fps?
That is pretty suspect, claiming one fps figure one day and then 20% less the next. AMD's marketing and tech sides haven't really been on the same page about anything for a while.
So it's faster in 4K? Well, the 13 people who game in 4K are probably happy to hear that.
Well, for someone like me who doesn't game in 4K but would like to in the future, getting a card that is capable of it now would make it possible to do incremental upgrades that don't break the bank.
It does seem, from the settings page, that the settings were tuned to keep memory usage under 4 GB.

Maybe not so much with a beast of a 512-bit bus, 8 GB of VRAM and compression.
The compression was a feature of GCN 1.2; the 390X is GCN 1.1, while the 380 is GCN 1.2. Even then, benchmarks of the 390X show only marginal performance boosts with a higher memory clock: 5 to 8% for what is a 20% memory overclock. The 380 is also only 256-bit.
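Rough scaling math on those 390X numbers (the 5-8% and 20% figures are the ones mentioned above):

```python
mem_overclock = 0.20                       # 20% memory clock bump
fps_gain_low, fps_gain_high = 0.05, 0.08   # observed 5-8% performance gain

# Fraction of the extra bandwidth that actually shows up as frames
print(round(fps_gain_low / mem_overclock, 2), round(fps_gain_high / mem_overclock, 2))   # 0.25 0.4
```

Only a quarter to 40% of the extra bandwidth turns into fps, which would suggest the card isn't heavily bandwidth-limited in the first place.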

I dare say that Sleeping Dogs used more DirectX 11 tech than some GameWorks titles from Ubi$oft. It's a beautiful and fun game.
Sleeping Dogs was a complete CRAP of a game. The graphics were crap, and the controls were worse than GTA4; it made GTA4 look like a well-running game.

Uhmmm, comparing similar models, I'd say a G-Sync screen is about $200 more than a FreeSync one.
FreeSync has complete frame rate control at 4K, and up to ninety-something fps (Hz) at 1440p, while syncing the refresh rate down to something like 9 fps (Hz).
FreeSync is practically perfect... it's G-Sync minus the one frame of latency.
I'll group your two posts together since both are talking about the same stuff. G-Sync does cost more because it's the difference between two techs: one that was worked on for years to perfect it, and one that was thrown together in a month to compete. FreeSync is perfect? Yeah, sure, if you don't mind ghosting, or tearing when the fps drops under 40. And before you try to blame the ghosting on the panel, it's not the panel's fault. Nvidia took the time and effort to test all types and models of panels to see which ones work well doing VRR and which ones suck, and made a list of the ones that are good to use for G-Sync. G-Sync only has a small fps loss, around 1% on Kepler cards, since part of the work had to be done in drivers because Kepler lacked the hardware for it. But that still leaves a ton more Nvidia cards supporting G-Sync than AMD has for FreeSync. It's been confirmed that the R7/R9 370(X) cards don't even support FreeSync, which is pretty sad. I would call that pretty unacceptable.
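On the "tearing when the fps drops under 40" point: below the panel's minimum refresh, a VRR implementation can repeat each frame so the effective refresh rate stays inside the supported window. A minimal sketch of that idea, with a made-up 40-144 Hz window rather than any particular monitor:

```python
def vrr_refresh(fps, panel_min_hz=40, panel_max_hz=144):
    """Repeat each frame enough times that the effective refresh rate
    lands inside the panel's VRR window. Illustrative numbers only."""
    repeats = 1
    while fps * repeats < panel_min_hz:
        repeats += 1
    return min(fps * repeats, panel_max_hz), repeats

# e.g. 25 fps on a 40-144 Hz panel: each frame is shown twice and the panel runs at 50 Hz
print(vrr_refresh(25))   # -> (50, 2)
```

Without something like this, a game dipping below the window has to fall back to v-sync or tearing, which is exactly the complaint above.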
 
I see a Classy 980ti on water being my next card, unless Fury has decent OC headroom (which AMD imply it has).
Word has it that AMD won't allow the memory to be overclocked, and AMD's own benchmarks show that while the core can be overclocked, the net gain isn't overly spectacular.
From the AMD press deck. Fury X overclocked by 100MHz (9.5% overclock)
amd-radeon-fury-overclock-100592120-orig-900x322.png


Seems in line with other current GPUs, but the 9.5% overclock margin isn't that impressive.
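For what it's worth, that 9.5% is just the 100 MHz bump measured against the Fury X's 1050 MHz reference clock:

```python
base_clock_mhz = 1050                    # Fury X reference engine clock
oc_clock_mhz = base_clock_mhz + 100      # the overclock shown in the press deck
print(f"{oc_clock_mhz / base_clock_mhz - 1:.1%}")   # 9.5%
```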
 
Arbiter's trolling skills are insane.
 
Word has it that AMD won't allow the memory to be overclocked, and AMD's own benchmarks show that while the core can be overclocked, the net gain isn't overly spectacular.
From the AMD press deck. Fury X overclocked by 100MHz (9.5% overclock)
amd-radeon-fury-overclock-100592120-orig-900x322.png


Seems in line with other current GPUs, but the 9.5% overclock margin isn't that impressive.

Do you know what the CPU and resolution were on that bench? If 4K... you know... it is good... 1080p ain't... and most importantly, I believe this card needs a hell of a CPU, clocked high... you know, it is an AMD.

ADD.

And overclocking memory... for more bandwidth... on HBM? Naah... I would've locked it too.
 
I won't argue how Adaptive-Sync allows for a wide array of OEM customization.
So G-Sync does currently show some advantage, and Nvidia does pull ahead at this point over adaptive sync, but for how long?
 
Word has it that AMD won't allow the memory to be overclocked, and AMD's own benchmarks show that while the core can be overclocked, the net gain isn't overly spectacular.
From the AMD press deck. Fury X overclocked by 100MHz (9.5% overclock)
amd-radeon-fury-overclock-100592120-orig-900x322.png


Seems in line with other current GPUs, but the 9.5% overclock margin isn't that impressive.

Do you know what the CPU and resolution were on that bench? If 4K... you know... it is good... 1080p ain't... and most importantly, I believe this card needs a hell of a CPU, clocked high... you know, it is an AMD.

ADD.

And overclocking memory... for more bandwidth... on HBM? Naah... I would've locked it too.

The CPU is a 5960X. I read the link to the source. If the source info is true, the overclock is quite feeble; a 980 Ti can go 20% over stock in performance...

Still, awaiting Wednesday.

@W1zzard - when you bench (or when you publish what you have benched), can you do an apples-to-apples, balls-to-the-wall overclock comparison on an intensive game, Fury X versus 980 Ti, both at max OC? Neutral, non-GameWorks, and ultra everything so VRAM use is high. This would be good to see.
 
Word has it that AMD won't allow the memory to be overclocked, and AMD's own benchmarks show that while the core can be overclocked, the net gain isn't overly spectacular.
From the AMD press deck. Fury X overclocked by 100MHz (9.5% overclock)
amd-radeon-fury-overclock-100592120-orig-900x322.png


Seems in line with other current GPUs, but the 9.5% overclock margin isn't that impressive.

Bad news for AMD then: the Gigabyte 980 Ti G1 Gaming is already 15% better than a stock 980 Ti, and has room for 14% more according to W1zz's review.
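If those two figures stack, rough math (assuming the gains compound multiplicatively) puts the card a fair way past reference:

```python
out_of_box = 1.15            # G1 Gaming vs. a reference 980 Ti, per the review
max_oc = out_of_box * 1.14   # another ~14% of headroom on top
print(f"{max_oc - 1:.0%} over a reference 980 Ti")   # ~31%
```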
 
Didn't the Titan X get neutered for compute? Which makes it a $1000+ 12GB gaming card.
Well... nobody (except some enthusiasts with more money than usual) considers a $1,000 single-GPU board for gaming... Nvidia played dirty with the neutered-compute maneuver :D
For me the Titan X is not a gaming card.

face palm.....Apples to oranges..... Fiji to Maxwell.

wait until Pascal drops. Have your tissues ready.
Funny one... Fiji is the Maxwell contender, not the two-year-old chips that still hold their own and populate the 3xx line...

The next gen after Fiji is the Pascal contender.
 
I'll group your two posts together since both are talking about the same stuff. G-Sync does cost more because it's the difference between two techs: one that was worked on for years to perfect it, and one that was thrown together in a month to compete. FreeSync is perfect? Yeah, sure, if you don't mind ghosting, or tearing when the fps drops under 40. And before you try to blame the ghosting on the panel, it's not the panel's fault. Nvidia took the time and effort to test all types and models of panels to see which ones work well doing VRR and which ones suck, and made a list of the ones that are good to use for G-Sync. G-Sync only has a small fps loss, around 1% on Kepler cards, since part of the work had to be done in drivers because Kepler lacked the hardware for it. But that still leaves a ton more Nvidia cards supporting G-Sync than AMD has for FreeSync. It's been confirmed that the R7/R9 370(X) cards don't even support FreeSync, which is pretty sad. I would call that pretty unacceptable.
OK, first of all, you need to actually research FreeSync and G-Sync before speaking about issues, instead of making up issues to try and make one sound significantly superior to the other... G-Sync has had plenty of complaints about flickering, ghosting, etc. as well, so don't act like G-Sync is this perfect entity. Also, you're really complaining that the R7 370 does not support FreeSync? While I don't like that it doesn't, how many people do you see running off to buy a 1440p 144 Hz FreeSync monitor and then grabbing an R7 370? It would be the same as seeing someone grab a GTX 750 Ti (or 760) and do the same thing...

face palm.....Apples to oranges..... Fiji to Maxwell.

wait until Pascal drops. Have your tissues ready.
Please explain how this is apples to oranges? These are this generation's contenders???

He means if fury is better than titan x, then it could force nvidia to release a higher clocked, OC friendly version of titan x - titan xxx :laugh:
IF they do that, I'll post it here and now: I will purchase three of them (call it the XXX Titan).

I want to see the overclocking performance of the card. That is what will matter in the end, along with whether there are any aftermarket variants for better overclocking (Lightning).
 