Friday, June 12th 2015

AMD Radeon R9 Fury X Confirmed SKU Name for "Fiji XT"

All bets are off: AMD's latest flagship graphics card will indeed get a fancy name, and it will be called the Radeon R9 Fury X. Korean tech site HardwareBattle leaked a product flyer with the SKU name and its catchphrase, "revolutionary, inside out." Based on the 28 nm "Fiji" silicon, the R9 Fury X is expected to feature 4,096 stream processors, 256 TMUs, 128 ROPs, and a 4096-bit wide HBM memory interface holding 4 GB of memory.

The reference-design Fury X will come with an AIO cooling solution, likely designed by Asetek, featuring a Cooler Master-made fan ventilating its 120 x 120 mm radiator. Just as the Radeon R9 290X did away with D-Sub (VGA) support (even via dongles), Fiji does away with the DVI connector. You still get three DisplayPort 1.2a ports and a single HDMI 2.0 connector. The card has been pictured on the web with two 8-pin PCIe power connectors.
Sources: HardwareBattle, VideoCardz

105 Comments on AMD Radeon R9 Fury X Confirmed SKU Name for "Fiji XT"

#76
GreiverBlade
BiggieShady: Damn, now it's removed by user
well ... yes ... I bet he got comments on the vids about how a 390X is not next-gen because it's a "rebranded" 290X, and thus the CFX thumbs have been gone for 2 years now

well ... it is next-gen ... but indeed the CFX thumbs have been gone for 2 years :roll:
As for the rebrand ... well, the 290 is still a pretty capable card, so why not? It's just like what happened with the 7XX line, so who really cares in the end. (and I owned a 770 ... so, not me)
buggalugs: I'm more interested in non-reference air-cooled designs, like those from Asus, the MSI Lightning, the Sapphire Vapor-X, etc., with high-end air coolers.

I suspect AMD and its partners will release different versions of this card that don't include water cooling.
I'm more interested in a reference card with a custom waterblock and a single-slot bracket ... or a custom air-cooled, non-OC model that takes a custom waterblock
(because factory OC sucks most of the time ... and you can easily do yourself what they make you pay a premium over reference stock for)

also ... 4GB is actually enough and standard (look at the 970, it also packs 4 ... wait ... forget it) :D
#77
BiggieShady
GreiverBlade: well ... yes ... I bet he got comments on the vids about how a 390X is not next-gen because it's a "rebranded" 290X, and thus the CFX thumbs have been gone for 2 years now

well ... it is next-gen ... but indeed the CFX thumbs have been gone for 2 years :roll:
well ... yes, that's why I posted it :) they're gonna piss off a lot of people confused by the naming scheme
#78
GreiverBlade
BiggieShady: well ... yes, that's why I posted it :) they're gonna piss off a lot of people confused by the naming scheme
not really...
before we had 250<260<270<280<290 and X variants
now we have 250<360<370<380<390<Fury and X variants

same hierarchy in terms of performance, just 2 cards added at the high end.
just like NV did during the 7XX era with the Titan + Titan Black, though with a shorter timeframe (aka: none) between the 2 top SKUs (well, not really ... I forgot the 750/Ti, 760 and 780/Ti, but ... oh well)
#79
Dany
I don't mind if Fury's power consumption is over 300 W as long as it's better than the GTX Titan X and $350-400 cheaper. I can't wait to see this GPU at work; no more waiting, that's for sure. I'd buy one if it's that powerful, no second thoughts there. Cheers!!
#80
Aquinus
Resident Wat-man
Breit: You must be proud to own a card that has 7.5GB RAM free. All the time. :D
It doesn't take much more than going 400MB over physical VRAM to lose substantial performance. I know, because it's happening to me right now with my 6870s. I play Elite Dangerous constantly with about 400MB over dedicated memory spilling into shared. Thanks to that, my GPUs are running at <50% most of the time because they're starved for VRAM. I'm also not using any AA, and everything is set to high (not even ultra). Simply turning AA on pushed usage up to 2GB (it will keep using shared...), and at that point it's unplayable.

The real point here is that as soon as you start dipping into shared system memory, you lose a substantial amount of performance, because the GPU starts spending more time waiting and less time working. Simply put, if I had 2GB on my 6870s, I wouldn't be considering an upgrade. Running out of VRAM is like driving a car with a huge engine but a tiny intake manifold (or a squirrel in your airbox, if you will). So if performance between any two GPUs were similar, I would take the one with more VRAM because it will last you longer. A lot of the time it's not the rendering itself in a game that gets more complex, it's the size of the textures, though that's not true for all games.
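The bandwidth gap behind that "more waiting, less working" point is easy to ballpark. A back-of-the-envelope sketch (assumed figures: the HD 6870's published 256-bit GDDR5 interface at 4.2 Gbps effective, and PCIe 2.0 x16's theoretical ~500 MB/s per lane), not a measurement:

```python
# Rough comparison of local VRAM bandwidth vs. the PCIe link that any
# data spilled into shared system memory must cross. Figures are the
# card's published specs / the PCIe 2.0 theoretical rate, not measured.

local_bw = (256 / 8) * 4.2  # GB/s: bus width in bytes * effective data rate
pcie_bw = 16 * 0.5          # GB/s: 16 lanes * ~0.5 GB/s each (PCIe 2.0)

print(f"local VRAM:   {local_bw:.1f} GB/s")
print(f"PCIe 2.0 x16: {pcie_bw:.1f} GB/s")
print(f"spilled data is roughly {local_bw / pcie_bw:.0f}x slower to reach")
```

On those assumed numbers, anything the GPU has to fetch over the bus arrives roughly 17x slower than a local read, which is why utilization drops the moment the working set no longer fits in VRAM.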

With that all said, CFX works most of the time for me. If it doesn't, a driver update usually takes care of it.
#81
Breit
Right. But spending $400+ on VRAM you'll probably never use?
#82
Aquinus
Resident Wat-man
Breit: Right. But spending $400+ on VRAM you'll probably never use?
Just because he's not using it now doesn't mean he won't be in a year or two. I bought my first 6870 six years ago and my second three years ago. If it weren't for VRAM, I would continue to use them. It's not a matter of using it now, it's a matter of using it later down the line. It's called planning ahead.

Don't confuse this with justifying the Titan's price. It's not. I'm just saying VRAM is important if you're planning long term (several years), which I am, since I'm in the market and don't intend to just buy another of the same GPU three years down the road like last time. Just saying.
#83
Breit
I get your point. Nevertheless, the question remains whether that $400+ wouldn't be better spent, say, 3+ years from now on a new (probably faster) card with more VRAM.
#84
Aquinus
Resident Wat-man
Breit: I get your point. Nevertheless, the question remains whether that $400+ wouldn't be better spent, say, 3+ years from now on a new (probably faster) card with more VRAM.
Sure, it's a good argument, but the counterpoint would be: how much would another of the card you already have cost? In 3 years, I bet you I'll be able to get a Fury or 980 Ti a lot cheaper than I can now (or will be able to soon). :p

When I bought my second 6870, it was significantly cheaper than it was on release day, that's for sure. And what did it get me? 7970 performance (sans VRAM) for half the cost.
#85
Breit
In the foreseeable future, with DX12 and split-frame rendering, you'll eventually get the second card's VRAM as well. :)
#86
Aquinus
Resident Wat-man
Breit: In the foreseeable future, with DX12 and split-frame rendering, you'll eventually get the second card's VRAM as well. :)
I have some serious doubts about how much DX12 will limit duplication of data, because a lot of it needs to be shared for both cards to render the same thing. I suspect it will cut back on duplication but not eliminate it. I try to reserve judgement for actual results, not what the PR tells us.
#87
HisDivineOrder
Breit: In the foreseeable future, with DX12 and split-frame rendering, you'll eventually get the second card's VRAM as well. :)
You seem to forget that even using DX12 is going to require particular engines, and using DX12 well (as in, well enough to do proper SLI/CF support) is going to require someone tailoring the game to support it.

You're overly optimistic, given an industry that usually can't even be bothered to support resolutions beyond 1280x720 and 1920x1080, forgets FOV sliders and customizable keys, and sometimes even forgets not to put "Press Start to Begin" on the first screen of the game.

You think these are the companies that will, by and large, invest lots of time into making sure games are built with DX12 to support the truly advanced feature set you think will make DX12 superior to DX11 for SLI/CF?

I doubt it. I sincerely doubt it. They'll treat DX12 the way they treated DX11 over DX9. It'll be a great way to add a few extra features, but largely the ports will be treated as DX9 with DX11 gloss. In this case, it'll be programmers doing things exactly as they remember from DX11, with a slim few improvements (probably done in a month, or pulled from something like GameWorks by an external hardware partner) added so they can call it "DirectX 12."

Without the proper effort to rethink the game as a fully DX12 title, it won't matter at all. If they start with an Xbox One base, it seems unlikely they'll ever move beyond the peculiar ESRAM buffer and the large pool of shared system RAM to truly customize it for gaming cards with less than, say, the ~5 GB of system RAM the current consoles usually dedicate to GPU-related tasks.
#88
newconroer
So, no idea what's happened in this thread, but the summary I managed to squeeze out was:

No DVI ports = bad
Fury = confusing name
16nm = happening someday
16nm = the only 'chance' AMD has to get serious (according to NVIDIA loyalists)
DX12 = false prophet?
#89
qubit
Overclocked quantum bit
That "Fury" name suggests a real NVIDIA beater. If it turns out to be another underperforming Bulldozer, then AMD will have seriously embarrassed itself and will lose all credibility. Let's hope this is not the case.
#90
jigar2speed
qubit: That "Fury" name suggests a real NVIDIA beater. If it turns out to be another underperforming Bulldozer, then AMD will have seriously embarrassed itself and will lose all credibility. Let's hope this is not the case.
This coming from you, should I be worried about the Fury?
#91
qubit
Overclocked quantum bit
jigar2speed: This coming from you, should I be worried about the Fury?
What is that supposed to mean?
#92
RealNeil
I'm glad that we'll know all about the Fury and Fury XT this week. Speculation about them, compared to the GTX 980 Ti, has been rampant and never-ending.
#93
Breit
RealNeil: I'm glad that we'll know all about the Fury and Fury XT this week. Speculation about them, compared to the GTX 980 Ti, has been rampant and never-ending.
Soooo exciting... ;)
#94
RealNeil
Breit: Soooo exciting... ;)
Not for me. No money to buy one, but I did just buy a GTX 980 Windforce.
#95
qubit
Overclocked quantum bit
RealNeil: I'm glad that we'll know all about the Fury and Fury XT this week. Speculation about them, compared to the GTX 980 Ti, has been rampant and never-ending.
Gets tiresome, doesn't it? One just wants it to end and to finally know the facts.
#97
ValenOne
newconroer: So, no idea what's happened in this thread, but the summary I managed to squeeze out was:

No DVI ports = bad
Fury = confusing name
16nm = happening someday
16nm = the only 'chance' AMD has to get serious (according to NVIDIA loyalists)
DX12 = false prophet?
AMD plans to use GoFlo's 14 nm (same tech as Samsung's 14 nm), not TSMC's 16 nm.
#98
Breit
rvalencia: GoFlo
:D:D
Nice typo.
#99
Wshlist
I expect a 1.1 version with 8GB will follow a short time later, making the people who bought the 4GB one for a great deal of money feel rather annoyed.
But maybe I'm wrong.
#100
GreiverBlade
Wshlist: I expect a 1.1 version with 8GB will follow a short time later, making the people who bought the 4GB one for a great deal of money feel rather annoyed.
But maybe I'm wrong.
NAAHHH, they're not nvidia ... they would never do a similar thing to the Titan to 780 Ti to Titan Black, or Titan X to 980 Ti ... would they? (light joke ... don't take it seriously :laugh:)
well ... at least owners of a 290/290X will have no worries waiting a bit for the 8GB version, if there ever is one, ahah ... (I suspect it's the HBM that limits the vRAM to 4GB, but no biggie ... 8GB is still not really common and 4K is also not common :D although most 4K gaming can be done with 4GB, and for those who are satisfied with a 1080p monitor, well, no need to explain :D)

lately I've been telling myself: "hey, I already have a 390 ... no need to upgrade, I can wait for the next next one ... or maybe a 390X."