Friday, June 19th 2015
Radeon R9 Fury X Faster Than GeForce GTX 980 Ti at 4K: AMD
AMD, in its press documents, claimed that its upcoming flagship single-GPU graphics card, the Radeon R9 Fury X, will be faster than NVIDIA's recently launched GeForce GTX 980 Ti at 4K Ultra HD resolution. This puts to rest speculation that its 4 GB of video memory hampers performance against a competitor with 6 GB of memory. According to the graph below, extracted from AMD's press material, the R9 Fury X is faster than the GTX 980 Ti even in the most memory-intensive games at 4K, including Far Cry 4, The Witcher 3: Wild Hunt, Crysis 3, Assassin's Creed: Unity, and Battlefield 4; bigger gains are shown in other games. In every game tested, the R9 Fury X offers frame-rates of at least 35 fps. The Radeon R9 Fury X will launch at $649.99 (the same price as the GTX 980 Ti) next week, with market availability within the following 3 weeks.
Source:
The TechReport
102 Comments on Radeon R9 Fury X Faster Than GeForce GTX 980 Ti at 4K: AMD
It'll be a race to core meltdown at 500 watts per card. Fury X-tremely hot versus Titan X-orbitant.
But the above graphs are only for 4K; what about 1080p and 1440p? Those are much less memory-intensive resolutions, so the advantage of AMD's new memory might not matter as much there.
That is quite a hard-to-read response you posted, I'm afraid.
I'm guessing you mean to say that "Ultra settings *" means "not really Ultra," or Ultra-ish... but as mirakul's link pointed out, they are running it with SMAA in that test, which is post-processing AA that does pretty much nothing to hurt performance.
So the claimed 54 fps versus the 45 fps shown in this article's slide still stands, and so does my previous statement that AMD seems to have been rather generous with their earlier claims.
Also to respond to:
"looks like the only goal was to keep it above 30ish with the settings.. it is fine if you want to but i think most people really just like 60fps. freesync helps you with picking your standard for frame rates while being smoother and not locked into a refresh rate. i like 50-60fps(hz)"
Freesync is something you want to use alongside Vsync, so yes, you are still locked in.
Freesync alone can still introduce screen tearing when the framerate goes past the refresh rate (which, I don't have to explain, is something you do not want).
Freesync makes the experience of sub-optimal fps a smoother, better one.
Freesync is the GPU tech, not the standard; Adaptive-Sync is the standard, and you don't use Vsync with it.
Freesync has complete frame-rate control at 4K, and up to 90-something fps (Hz) at 1440p, while syncing the refresh rate down to around 9 fps (Hz).
Freesync is practically perfect: it's G-Sync without the 1 frame of latency.
edit: yeah, those are the full spec ranges, but individual displays are different.
"Another difference between FreeSync and G-Sync is in the flexibility of the effective refresh rate range. G-Sync is capable of refresh rates that range from 30Hz to 144Hz while the FreeSync spec is capable of refresh rates that range from 9Hz to 240Hz"
According to AMD's website: support.amd.com/en-us/search/faq/222
"Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz."
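The quoted behavior (the card picks an effective min/max refresh window from what the display reports) can be sketched roughly like this. This is a hypothetical illustration of the negotiation logic, not AMD driver code; the function name and the 9-240 Hz constant are taken from the spec figures quoted above.

```python
# Hypothetical sketch: intersect the FreeSync spec range with the range the
# display reports over DisplayPort Adaptive-Sync to get the usable VRR window.
# Illustrative only; not AMD's actual driver logic.

FREESYNC_SPEC_RANGE = (9, 240)  # Hz, per the spec quoted above

def effective_range(display_min_hz, display_max_hz):
    """Clamp the display-reported refresh range to what the spec allows."""
    lo = max(FREESYNC_SPEC_RANGE[0], display_min_hz)
    hi = min(FREESYNC_SPEC_RANGE[1], display_max_hz)
    if lo >= hi:
        return None  # no overlap with the spec range: no variable refresh
    return (lo, hi)

# A monitor advertising 40-144 Hz yields a 40-144 Hz usable window,
# far narrower than the 9-240 Hz the spec itself permits.
print(effective_range(40, 144))
print(effective_range(36, 240))
```

This is why the spec's huge range and a given monitor's range are different things: the narrower of the two always wins.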
1. Freesync is GPU tech, yes; it's AMD technology based on Adaptive-Sync built into DisplayPort.
2. You DO use Vsync with it, unless you want screen tearing (which you do not want).
3. Resolution (4K, 1440p, etc.) has nothing to do with it at all.
Freesync has an enormous range, but the monitors supporting it so far don't even come close, and it's those that determine how far Freesync is usable.
It's like buying a water pump that can easily push through 100 liters of water per minute: you still need a tube large enough to move the water, and if the tube is too thin you won't be pushing through the 100 liters.
4. Freesync would be perfect if you did not need Vsync at all, if it meant smooth gameplay, no screen tearing, and no mouse latency.
You know tearing happens when you run above your refresh rate, right? If you're getting rapid frames like that, above your refresh rate, you could still cap your frames and get better latency.
The scenario shown is strange in itself, given how Freesync works, so it would have to come down to how that display is spec'd.
Yes... what do you think Vsync does? Honestly...
"if your having rapid frames like that above your refresh rate you could still cap your frames and get better latency."
Yeah, you cap it... WITH VSYNC. There are not a lot of other ways to cap framerate, and even fewer that actually work.
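For what it's worth, a software frame limiter (as opposed to Vsync) is conceptually simple: spend whatever is left of each frame's time budget sleeping. A minimal sketch, assuming a hypothetical `render_frame` callback; real limiters in drivers and games use higher-precision waits, but the idea is the same.

```python
import time

def run_capped(render_frame, target_fps=60, frames=3):
    """Render `frames` frames, never letting the loop exceed target_fps."""
    frame_time = 1.0 / target_fps
    for _ in range(frames):
        start = time.monotonic()
        render_frame()
        elapsed = time.monotonic() - start
        if elapsed < frame_time:
            # Wait out the rest of this frame's budget, capping the rate.
            time.sleep(frame_time - elapsed)
```

Capping below the display's maximum refresh keeps the framerate inside the variable-refresh window, which is the point being argued here: it avoids tearing from overshooting the refresh rate without Vsync's added input latency.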
We know that the EVGA 980 Ti HYBRID beats the Fury X in Fire Strike 4K, which is a much fairer comparison.
The air-cooled Fury will be about the same as, or slower than, a 980 Ti ACX 2.0 or a G1 Gaming!
But we also know a custom air-cooled 980 Ti is already a beast (see TPU's own review of the Gigabyte 980 Ti G1). At about 17% faster than stock clocks on average, it's probably faster than the Fury X.
It will be interesting to see what an AIB Fury is capable of, and whether 4 GB of HBM can handle Shadow of Mordor with HQ textures.
I'll hang on for Fury, but given the leaked AMD benches, I see a Classy 980 Ti on water being my next card, unless Fury has decent OC headroom (which AMD implies it has).
A very good day for enthusiasts. Not so for those that buy rebrands.
Dying Light
Grand Theft Auto V
Those are also among the games that use the most VRAM.