Friday, June 19th 2015

Radeon R9 Fury X Faster Than GeForce GTX 980 Ti at 4K: AMD

AMD, in its press documents, claimed that its upcoming flagship single-GPU graphics card, the Radeon R9 Fury X, will be faster than NVIDIA's recently launched GeForce GTX 980 Ti at 4K Ultra HD resolution. This puts to rest speculation that its 4 GB of video memory hampers performance against its competitor's 6 GB. According to the graph below, extracted from AMD's press material, the R9 Fury X is faster than the GTX 980 Ti even in the most memory-intensive games at 4K, including Far Cry 4, The Witcher 3: Wild Hunt, Crysis 3, Assassin's Creed Unity, and Battlefield 4, with bigger gains shown in other games. In every game tested, the R9 Fury X offers frame rates of at least 35 fps. The Radeon R9 Fury X will launch at $649.99 (the same price as the GTX 980 Ti) next week, with market availability within the following three weeks.
Source: The Tech Report

102 Comments on Radeon R9 Fury X Faster Than GeForce GTX 980 Ti at 4K: AMD

#26
mirakul
Don't know why you didn't include the settings
Posted on Reply
#27
xfia
mirakulDon't know why you didn't include the settings
Looks like the only goal was to keep it above 30-ish with those settings.. it's fine if you want that, but I think most people really just want 60 fps. FreeSync lets you pick your own standard for frame rates while being smoother and not locked into a fixed refresh rate. I like 50-60 fps (Hz).
Posted on Reply
#28
HumanSmoke
the54thvoidUnfortunately there is an old Norse saying, "Press decks for products are like axes in a window - sharp and shiny as they are, you can only trust them when rending limbs from bone"
In other words, 5 more days. But I have no doubt it'll be fast, probably faster, stock versus stock, though I doubt in all games. If it actually is faster, then that's good, because it means they've conquered Nvidia's Gamehurts.
But, remember the Norsemen.
I'm hoping it is faster than the 980 Ti and Titan X. It might be the ONLY way to get a fully enabled GM200 with balls-to-the-wall voltage control and high clocks at a reasonable price.
Posted on Reply
#29
the54thvoid
Super Intoxicated Moderator
HumanSmokeI'm hoping it is faster than the 980 Ti and Titan X. It might be the ONLY way to get a fully enabled GM200 with balls-to-the-wall voltage control and high clocks at a reasonable price.
Yeah. The fabled 980ti 'Metal'?

It'll be a race to core meltdown at 500 watts per card. Fury X-tremely hot versus Titan X-orbitant.
Posted on Reply
#30
Kaynar
Unfortunately, right now I'm totally stuck with G-Sync (in a good and a bad way): it's SO MUCH SUPERIOR to nothing, and apparently a little better than FreeSync, that I'm just forced to buy nVidia. So I just hope that nVidia's pricing on the 980 Ti will be something under 700 Euros because of the Fury X competition.

But the above graphs are only for 4K; what about 1080p and 1440p? Those are much less memory-intensive resolutions, so the advantage of AMD's new memory might not be as important there.
Posted on Reply
#31
ZoneDymo
xfiathey pretty well explain the performance segments enough to understand what *smallprint means. they honestly just lay it all out. they are more than kicking ass with the fury cards and the r9 nano crushes nvidia in performance per watt with 250% over the 290x while being more powerful.. excellent for lower power and smaller form factor machines.
I'm assuming English is not your first language.
That is quite a hard-to-read response you posted, I'm afraid.
I'm guessing you mean to say that "Ultra settings *" means "not really Ultra", or Ultra-ish... but as mirakul's link pointed out, they are running it with SMAA in that test, which is post-processing AA that does pretty much nothing to hurt performance.

So the claimed 54 fps versus the 45 fps shown on the slide in this article still stands, and so does my previous statement that AMD seems to have been rather generous with their earlier claims.

Also to respond to:
"looks like the only goal was to keep it above 30ish with the settings.. it is fine if you want to but i think most people really just like 60fps. freesync helps you with picking your standard for frame rates while being smoother and not locked into a refresh rate. i like 50-60fps(hz)"

FreeSync is something you want to use alongside Vsync, so yes, you are still locked in.
FreeSync alone can still introduce screen tearing when the framerate goes past the refresh rate (which, I don't have to explain, is something you do not want).
FreeSync makes the experience of sub-optimal fps a smoother, better one.
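To make the tearing point concrete, here is a minimal Python sketch of the behavior being described (illustrative only: the function, the range numbers, and the return strings are made up, and no actual driver works this way):

```python
# Toy model of when tearing can occur on an adaptive-sync panel.
# All names and numbers are hypothetical, for illustration only.

def present_behavior(fps, vrr_min, vrr_max, vsync_on):
    """Describe what happens at a given frame rate on a VRR panel."""
    if vrr_min <= fps <= vrr_max:
        # Inside the variable-refresh window, the panel follows the GPU.
        return "adaptive sync active: no tearing"
    if fps > vrr_max:
        # Above the window the panel can't refresh any faster, so
        # without Vsync a frame can arrive mid-scan and tear.
        return "vsync cap: no tearing" if vsync_on else "tearing possible"
    # Below the window, drivers fall back to fixed-refresh behavior.
    return "below VRR window: fixed-refresh fallback"

# Example: a hypothetical 40-144 Hz panel with a game running at 180 fps.
print(present_behavior(180, 40, 144, vsync_on=False))  # tearing possible
print(present_behavior(180, 40, 144, vsync_on=True))   # vsync cap: no tearing
```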
Posted on Reply
#32
ZoneDymo
KaynarUnfortunately, right now I'm totally stuck with G-Sync (in a good and a bad way): it's SO MUCH SUPERIOR to nothing, and apparently a little better than FreeSync, that I'm just forced to buy nVidia. So I just hope that nVidia's pricing on the 980 Ti will be something under 700 Euros because of the Fury X competition.

But the above graphs are only for 4K; what about 1080p and 1440p? Those are much less memory-intensive resolutions, so the advantage of AMD's new memory might not be as important there.
Idk man, from all I've seen it's equal to FreeSync visually; the only difference is that G-Sync costs 1 fps.
Posted on Reply
#33
mirakul
ZoneDymoIdk man, from all I've seen it's equal to FreeSync visually; the only difference is that G-Sync costs 1 fps.
G-Sync costs $100 more in most cases.
Posted on Reply
#34
ZoneDymo
mirakulG-Sync costs $100 more in most cases.
Right, right, that as well ;)
Posted on Reply
#35
Kaynar
mirakulG-Sync costs $100 more in most cases.
Uhmmm, comparing similar models, I'd say a G-Sync screen is about $200 more than a FreeSync one. :D
ZoneDymoIdk man, from all I've seen it's equal to FreeSync visually; the only difference is that G-Sync costs 1 fps.
Isn't FreeSync limited to around 30 fps min and 90 fps max, though?
Posted on Reply
#36
xfia
ZoneDymoI'm assuming English is not your first language.
That is quite a hard-to-read response you posted, I'm afraid.
I'm guessing you mean to say that "Ultra settings *" means "not really Ultra", or Ultra-ish... but as mirakul's link pointed out, they are running it with SMAA in that test, which is post-processing AA that does pretty much nothing to hurt performance.

So the claimed 54 fps versus the 45 fps shown on the slide in this article still stands, and so does my previous statement that AMD seems to have been rather generous with their earlier claims.

Also to respond to:
"looks like the only goal was to keep it above 30ish with the settings.. it is fine if you want to but i think most people really just like 60fps. freesync helps you with picking your standard for frame rates while being smoother and not locked into a refresh rate. i like 50-60fps(hz)"

FreeSync is something you want to use alongside Vsync, so yes, you are still locked in.
FreeSync alone can still introduce screen tearing when the framerate goes past the refresh rate (which, I don't have to explain, is something you do not want).
FreeSync makes the experience of sub-optimal fps a smoother, better one.
What? No, no, no.
FreeSync is the GPU tech, not the standard; Adaptive-Sync is the standard, and you don't use Vsync with it.
FreeSync has complete frame rate control for 4K, and up to ninety-something fps (Hz) at 1440p, while syncing the refresh rate down to as low as 9 fps (Hz).
FreeSync is practically perfect.. it's G-Sync minus the 1 frame of latency.
Edit: yeah, those are the full spec ranges, but displays are different.
Posted on Reply
#37
ZoneDymo
KaynarUhmmm, comparing similar models, I'd say a G-Sync screen is about $200 more than a FreeSync one. :D



Isn't FreeSync limited to around 30 fps min and 90 fps max, though?
According to this: wccftech.com/amd-freesync-nvidia-gsync-verdict/#ixzz3dVAAHDs0

"Another difference between FreeSync and G-Sync is in the flexibility of the effective refresh rate range. G-Sync is capable of refresh rates that range from 30Hz to 144Hz while the FreeSync spec is capable of refresh rates that range from 9Hz to 240Hz"

According to AMD's website: support.amd.com/en-us/search/faq/222

"Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz."
Posted on Reply
#38
ZoneDymo
xfiaWhat? No, no, no.
FreeSync is the GPU tech, not the standard; Adaptive-Sync is the standard, and you don't use Vsync with it.
FreeSync has complete frame rate control for 4K, and up to ninety-something fps (Hz) at 1440p, while syncing the refresh rate down to as low as 9 fps (Hz).
FreeSync is practically perfect.. it's G-Sync minus the 1 frame of latency.
Honestly, I'm beginning to think you barely understand what I'm writing.

1. FreeSync is GPU tech, yes; it's AMD's technology based on Adaptive-Sync built into DisplayPort.
2. You DO use Vsync with it, unless you want screen tearing (which you do not want).


3. Resolutions (4K, 1440p, etc.) have nothing to do with it at all.
FreeSync has an enormous range, but the monitors supporting it so far don't even come close, and it's those that determine how far FreeSync is usable (see the sketch below).

It's like buying a pump that can easily push 100 liters of water per minute: you still need a tube large enough to move the water through, and if the tube is too thin, you won't be pushing through the 100 liters.

4. FreeSync would be perfect if you did not need Vsync at all, if it meant smooth gameplay, no screen tearing, and no mouse latency.
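Point 3 as a minimal sketch (the 9-240 Hz spec range is from the figures quoted earlier in the thread; the panel range is a made-up example):

```python
# The usable variable-refresh window is the overlap between what the
# Adaptive-Sync spec allows and what the monitor actually supports.

def usable_vrr_window(spec_range, panel_range):
    """Intersect two (min_hz, max_hz) ranges; None if they don't overlap."""
    low = max(spec_range[0], panel_range[0])
    high = min(spec_range[1], panel_range[1])
    return (low, high) if low <= high else None

spec = (9, 240)    # full range the spec is quoted as being capable of
panel = (48, 75)   # hypothetical early FreeSync monitor
print(usable_vrr_window(spec, panel))  # (48, 75): the panel is the limit
```

In the pump-and-tube terms above, `spec` is the pump and `panel` is the tube.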
Posted on Reply
#39
xfia
It's possible to get screen tearing, but the monitors are so well specced on top of FreeSync that it's practically perfect..

You know tearing happens when you run above your refresh rate, right? If you're getting rapid frames like that, above your refresh rate, you could still cap your frames and get better latency.

The scenario shown is strange in itself because of how FreeSync works, so it would have to come down to how that display is specced.
Posted on Reply
#40
Supercrit
Implying that it is slower at all other resolutions? That's not a statement to reassure people with.
Posted on Reply
#41
ZoneDymo
xfiaIt's possible to get screen tearing, but the monitors are so well specced on top of FreeSync that it's practically perfect..

You know tearing happens when you run above your refresh rate, right? If you're getting rapid frames like that, above your refresh rate, you could still cap your frames and get better latency.

The scenario shown is strange in itself because of how FreeSync works, so it would have to come down to how that display is specced.
"you know tearing happens when you run above your refresh rate right?"
Yes....what do you think Vsync does? honestly....
"if your having rapid frames like that above your refresh rate you could still cap your frames and get better latency."
Yeah you cap it...WITH VSYNC, not a lot of other ways to cap framerate and even less then that actually work.
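For what "capping your frames" amounts to, here is a minimal sleep-based limiter in Python (illustrative only; real limiters live in the engine or driver and use far better timing):

```python
import time

def run_capped(render_frame, cap_fps, n_frames):
    """Render n_frames, sleeping off each frame's leftover time budget."""
    frame_budget = 1.0 / cap_fps
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # cap the frame rate here

# Example: cap at 141 fps so a 144 Hz panel never sees frames outrun it.
run_capped(lambda: None, cap_fps=141, n_frames=5)
```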
Posted on Reply
#42
nunyabuisness
Let's put a water-cooled Fury X against an air-cooled 980 Ti, with stock clocks.

We know that the EVGA 980 Ti HYBRID beats the Fury X in Fire Strike 4K, which is a much fairer comparison.
The air-cooled Fury will be about the same as, or slower than, a 980 Ti ACX 2.0 or a G1 Gaming!
Posted on Reply
#43
Aquinus
Resident Wat-man
I think I'm going to wait patiently for @W1zzard to do a review. :lovetpu: I know which source I can trust, and I suspect internal benchmarks will be a bit biased; that's human nature. Until we start hearing from third parties with benchmarks that are confirmed to be legitimate, it doesn't really mean a whole lot. I still remember AMD's "benchmarks" from the pre-Bulldozer roll-out, and I'm skeptical considering recent history. Not to say I don't want to believe in AMD; let's just say I don't have faith in their assessments of their own hardware.
techreport.comOf course, these numbers are supplied by AMD, and it's possible that they've been cherry-picked to present an overly positive picture. Even so, they seem to paint a winning picture for Fiji. We'll be verifying these results independently in our upcoming Fury X review.
Posted on Reply
#44
64K
HumanSmokeI'm hoping it is faster than the 980 Ti and Titan X. It might be the ONLY way to get a fully enabled GM200 with balls-to-the-wall voltage control and high clocks at a reasonable price.
I've been thinking Nvidia has planned to do that all along if Fury X is faster. They may do it anyway. Probably for $50 more, but not for a few months. Most of the non-reference 980 Tis haven't shown up for sale yet, except the EVGA. They need to unload their salvage GM200s first.
Posted on Reply
#45
mirakul
nunyabuisnessLet's put a water-cooled Fury X against an air-cooled 980 Ti, with stock clocks.

We know that the EVGA 980 Ti HYBRID beats the Fury X in Fire Strike 4K, which is a much fairer comparison.
The air-cooled Fury will be about the same as, or slower than, a 980 Ti ACX 2.0 or a G1 Gaming!
If they are at the same price point ($650), of course they will be put against each other. I doubt you could find any custom 980 Ti for $650, btw.
Posted on Reply
#46
rooivalk
nunyabuisnessLet's put a water-cooled Fury X against an air-cooled 980 Ti, with stock clocks.

We know that the EVGA 980 Ti HYBRID beats the Fury X in Fire Strike 4K, which is a much fairer comparison.
The air-cooled Fury will be about the same as, or slower than, a 980 Ti ACX 2.0 or a G1 Gaming!
As long as the price is the same, it's a fair comparison.

But we also know the custom air-cooled 980 Ti is already a beast (see TPU's own review of the Gigabyte 980 Ti G1). At about 17% faster than stock clocks on average, it's probably faster than the Fury X.

It'll be interesting to see what an AIB Fury would be capable of, and whether 4 GB of HBM could handle Shadow of Mordor with HQ textures.
Posted on Reply
#47
the54thvoid
Super Intoxicated Moderator
64KI've been thinking Nvidia has planned to do that all along if Fury X is faster. They may do it anyway. Probably for $50 more, but not for a few months. Most of the non-reference 980 Tis haven't shown up for sale yet, except the EVGA. They need to unload their salvage GM200s first.
They can do it if they want, which is a bit annoying. A full Titan core, 8-pin power, a higher TDP, AIB coolers, and higher stock clocks would make a card 10-20% faster than the 980 Ti.
I'll hang on for Fury, but given the leaked AMD benches, I see a Classy 980 Ti on water being my next card, unless Fury has decent OC headroom (which AMD implies it has).
A very good day for enthusiasts. Not so for those who buy rebrands.
Posted on Reply
#48
HisDivineOrder
I notice the two games I'd most like to see are absent on this benchmark list:

Dying Light
Grand Theft Auto V

Those are also the games that use the most VRAM.
Posted on Reply
#49
uuuaaaaaa
nunyabuisnessLet's put a water-cooled Fury X against an air-cooled 980 Ti, with stock clocks.

We know that the EVGA 980 Ti HYBRID beats the Fury X in Fire Strike 4K, which is a much fairer comparison.
The air-cooled Fury will be about the same as, or slower than, a 980 Ti ACX 2.0 or a G1 Gaming!
The reference Fury X will most likely beat the reference 980 Ti at the same price point. It will also be quieter and run cooler. That AIO is capable of handling a TDP of 500 W, and the reference board has a VRM capable of delivering around 400 amps, which would translate more or less into a TDP in the 400 W range. These are all hints that the Fury X might actually be quite a nice overclocker.
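As a rough sanity check on the amps-to-watts translation (the ~1.1 V core voltage is an assumption for illustration, not a figure from the source):

```python
# Rough VRM headroom estimate at the GPU core rail: P = V * I.
core_voltage = 1.1   # volts; assumed typical GPU core voltage
vrm_current = 400    # amps, per the figure quoted above
print(f"~{core_voltage * vrm_current:.0f} W deliverable")  # ~440 W
```

Which is indeed consistent with a TDP in the 400 W range.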
Posted on Reply
#50
btarunr
Editor & Senior Moderator
m6tzg6rSo it's faster in 4K? Well, the 13 people who game in 4K are probably happy to hear that.
Over 2.5 million 4K monitors have been sold to end-users so far.
Posted on Reply