Saturday, June 13th 2015

AMD Radeon R9 Fury X Pictured Some More

Here are some of the clearest pictures of AMD's next-generation flagship graphics card, the Radeon R9 Fury X. Much like the R9 295X2, this card features an AIO liquid cooling solution. With memory relocated from chips surrounding the GPU onto the GPU package as stacked HBM, the resulting PCB space savings translate into a very compact card. Under its hood is a full-coverage liquid cooling pump-block, which is plumbed to a thick 120 mm x 120 mm radiator. The card draws power from a pair of 8-pin PCIe power connectors. Display outputs include three DisplayPort 1.2a connectors and one HDMI 2.0 port.
Source: PC Perspective

98 Comments on AMD Radeon R9 Fury X Pictured Some More

#26
Prima.Vera
This is probably the best card for playing at 1440p. Not for 4K, or at least it's not future-proof for 4K.
#27
RejZoR
BiggieShady: I'd say cutting out the special double-precision CUDA cores was a worthwhile trade-off when you consider what GPU compute is used for and how much they gained overall with simplified instruction scheduling. Furthermore, the cache system went to a whole other level with Maxwell as opposed to Kepler/Fermi, so I'm not surprised the L1 cache size was adjusted. The same goes for the register file size. I think all these changes were for the better, even for compute; after all, these GPUs work in clusters for compute applications.

You may be onto something, since Tonga is GCN 1.2.
Then again, how many users use these cards for compute tasks? These are gaming cards, and while they can do compute, they are not meant for it. That's why both AMD and NVIDIA have the more expensive FireGL and Quadro lines, so they can sell dedicated cards that are great at compute and sort of suck at gaming...
Prima.Vera: This is probably the best card for playing at 1440p. Not for 4K, or at least it's not future-proof for 4K.
Based on what? VRAM capacity? While that may be true to some extent, 4 GB is still a lot, and with the enormous bandwidth it has, I don't think that's really a problem at the moment. And we're talking just raw bandwidth here; we aren't even taking into account framebuffer compression, which adds roughly another 40% of effective bandwidth.

Let's be realistic: the majority of people who buy Titan X and Fury X cards buy them every year. They want the best of the best, zero compromises. These kinds of people don't care how future-proof the card is; they just want it to perform great NOW. And the Fury X will do that without any doubt.

People like me and probably you, we buy them so that they remain useful for 2, maybe 3 years, and then we buy a new high-end (not enthusiast!) one. For us, buying such a card is a double-edged thing. Yes, it's expensive, but in the long run, maybe it's not that future-proof. Still, there are just a few games that really need that much VRAM; three quarters of the others are happy with only around 2-3 GB...
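
(For anyone who wants to sanity-check the "enormous bandwidth" claim, here's a quick back-of-envelope sketch. The 4096-bit/500 MHz HBM configuration is the widely rumored spec, not a confirmed one, and the ~40% compression gain is AMD's own marketing figure, so treat both numbers as assumptions.)

```python
# Back-of-envelope: raw vs. effective bandwidth for the rumored HBM config.
bus_width_bits = 4096        # assumed: 4 HBM stacks x 1024-bit interface each
memory_clock_mhz = 500       # assumed HBM clock; double data rate
transfers_per_clock = 2

raw_gb_s = bus_width_bits / 8 * memory_clock_mhz * 1e6 * transfers_per_clock / 1e9
effective_gb_s = raw_gb_s * 1.4  # assumed ~40% gain from color compression

print(f"raw: {raw_gb_s:.0f} GB/s, effective: ~{effective_gb_s:.0f} GB/s")
# raw: 512 GB/s, effective: ~717 GB/s
```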
#28
Prima.Vera
RejZoR: Based on what? VRAM capacity? While that may be true to some extent, 4 GB is still a lot, and with the enormous bandwidth it has, I don't think that's really a problem at the moment. And we're talking just raw bandwidth here; we aren't even taking into account framebuffer compression, which adds roughly another 40% of effective bandwidth.

Let's be realistic: the majority of people who buy Titan X and Fury X cards buy them every year. They want the best of the best, zero compromises. These kinds of people don't care how future-proof the card is; they just want it to perform great NOW. And the Fury X will do that without any doubt.

People like me and probably you, we buy them so that they remain useful for 2, maybe 3 years, and then we buy a new high-end (not enthusiast!) one. For us, buying such a card is a double-edged thing. Yes, it's expensive, but in the long run, maybe it's not that future-proof. Still, there are just a few games that really need that much VRAM; three quarters of the others are happy with only around 2-3 GB...
Yeah, I was talking about the VRAM capacity. Games like GTA V, COD, and Watch Dogs already go beyond 4 GB of usage at 4K, so the target for upcoming games is clear...
You are right; personally, I play at 1080p and I always upgrade my card every third generation. But I always buy the top card of the moment. ;)
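
(A quick illustrative calculation on why 4K alone doesn't explain those numbers; the render-target count below is a made-up example, and in practice it's high-resolution texture assets that push usage past 4 GB:)

```python
# Rough framebuffer math at 4K: the render targets themselves are small.
width, height, bytes_per_pixel = 3840, 2160, 4   # RGBA8
target_mb = width * height * bytes_per_pixel / 2**20
print(f"one 4K RGBA8 target: {target_mb:.1f} MB")  # ~31.6 MB

# Even a generous 12 targets (G-buffer, shadow maps, post FX) is only ~380 MB,
# so games blowing past 4 GB at 4K are doing it mostly with textures.
print(f"12 targets: {12 * target_mb:.0f} MB")
```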
#29
RejZoR
I prefer to stick with a certain resolution, because it's just the smart thing to do. In the past it was 1280x1024 on a 19-inch 5:4 display. Sure, it was a bit small and not widescreen, but it had superb latency and ran at 75 Hz, so it served me well for gaming in particular. The image was super sharp and responsive. Now I've gone to a 1080p 144 Hz ASUS and I'm planning to stick with it for quite a while. It's big enough, it's widescreen, and I'll be able to run all games at max possible settings with the same high-end card for a very long time before it actually becomes too slow.

I mean, my HD 7950 is only just now starting to show some age in maybe 2 or 3 games. The rest still run maxed out with FSAA! So if I buy a vanilla Fury, or maybe even just an R9 390X, I should be fine again for quite some time.
#30
R-T-B
RejZoR: I prefer to stick with a certain resolution, because it's just the smart thing to do. In the past it was 1280x1024 on a 19-inch 5:4 display. Sure, it was a bit small and not widescreen, but it had superb latency and ran at 75 Hz, so it served me well for gaming in particular. The image was super sharp and responsive. Now I've gone to a 1080p 144 Hz ASUS and I'm planning to stick with it for quite a while. It's big enough, it's widescreen, and I'll be able to run all games at max possible settings with the same high-end card for a very long time before it actually becomes too slow.

I mean, my HD 7950 is only just now starting to show some age in maybe 2 or 3 games. The rest still run maxed out with FSAA! So if I buy a vanilla Fury, or maybe even just an R9 390X, I should be fine again for quite some time.
As much as I want to buy a Fury to support AMD, RejZoR, I'm with you. I play at 1080p 60 Hz and don't plan to upgrade for some time... I have an R9 290X right now. That's probably future-proof for most games for a fair bit, which makes upgrading now kinda silly.

I'm actually even considering sidegrading to Tonga, but honestly it's not worth the effort for just a small energy saving.
#31
Assimilator
Gigabyte's GeForce 900 G1 Gaming series has 6 display outputs (Flex Display Technology). I feel like AMD missed a trick here by not having an add-on bracket that provides additional outputs, similar to the way some low-end HTPC cards come with a full-height bracket with 2 outputs and a half-height bracket with only 1. That way AMD could cater to people who want single-slot cards (standard bracket, as in the picture) while allowing partners to choose whether to offer SKUs with more than 4 outputs.

Granted, not that many people will be using more than 4 outputs anyway.
#32
john_
newtekie1: Two 8-pins? So we can expect the power draw to surpass the 300 W mark.
Or it will have plenty of room for overclocking. It is a water-cooled high-end card. Everyone seems to forget that.
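(Straight PCI-E spec arithmetic on what that connector layout allows; the actual draw depends on whatever power limit AMD sets in the BIOS:)

```python
# Spec ceiling for a card fed by the x16 slot plus two 8-pin connectors.
slot_w = 75          # PCI-E x16 slot, per spec
eight_pin_w = 150    # per 8-pin PCI-E connector, per spec
ceiling_w = slot_w + 2 * eight_pin_w
print(f"spec ceiling: {ceiling_w} W")  # 375 W, leaving headroom above 300 W
```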

#33
FreedomEclipse
~Technological Technocrat~
Aceman.au: Jesus, it's tiny.
That's what she said. ( ͡° ͜ʖ ͡°)
#34
PLAfiller
FreedomEclipse: That's what she said. ( ͡° ͜ʖ ͡°)
BUUUURN! :P
#35
RejZoR
What I wish for is BIOS editing tools. I've been using them on my HD 6950 and now my HD 7950, and they're stuff sent from heaven. All this overclocking software like MSI Afterburner delivers unstable overclocks; in my case it was constantly failing, and it also wasn't switching 2D/3D clocks properly.

But with a flashed BIOS, the fan curve, voltage, clocks, everything is rock solid at much higher values than I could ever use in Afterburner. And best of all, after a system format, on boot-up, or in another OS, it's ready for performance without any fiddling. Seeing how the R9 290X didn't have that makes me sad. It'll be hard to go back to crappy software overclocking once you've tasted the brilliance of a flashed BIOS...
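
(For anyone unfamiliar, a fan curve is just a temperature-to-duty-cycle mapping, like the illustrative sketch below; the set points here are made-up values, not anything from an actual BIOS:)

```python
# Illustrative fan curve: temperature (degrees C) -> fan duty cycle (%).
# Linear interpolation between made-up set points.
CURVE = [(30, 20), (50, 30), (70, 55), (85, 100)]

def fan_duty(temp_c: float) -> float:
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_duty(60))  # 42.5% duty at 60 degrees C
```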
#36
Devon68
I just have to leave this here.
JESUS CHRIST IT'S UGLY.
#37
xfia
RejZoR: Then again, how many users use these cards for compute tasks? These are gaming cards, and while they can do compute, they are not meant for it. That's why both AMD and NVIDIA have the more expensive FireGL and Quadro lines, so they can sell dedicated cards that are great at compute and sort of suck at gaming...

Based on what? VRAM capacity? While that may be true to some extent, 4 GB is still a lot, and with the enormous bandwidth it has, I don't think that's really a problem at the moment. And we're talking just raw bandwidth here; we aren't even taking into account framebuffer compression, which adds roughly another 40% of effective bandwidth.

Let's be realistic: the majority of people who buy Titan X and Fury X cards buy them every year. They want the best of the best, zero compromises. These kinds of people don't care how future-proof the card is; they just want it to perform great NOW. And the Fury X will do that without any doubt.

People like me and probably you, we buy them so that they remain useful for 2, maybe 3 years, and then we buy a new high-end (not enthusiast!) one. For us, buying such a card is a double-edged thing. Yes, it's expensive, but in the long run, maybe it's not that future-proof. Still, there are just a few games that really need that much VRAM; three quarters of the others are happy with only around 2-3 GB...
It's true that we haven't seen this much memory bandwidth before, along with the new architecture itself and the compression... the page file might be used a little differently, for all we know at the moment.
#38
BiggieShady
RejZoR: Then again, how many users use these cards for compute tasks? These are gaming cards, and while they can do compute, they are not meant for it. That's why both AMD and NVIDIA have the more expensive FireGL and Quadro lines, so they can sell dedicated cards that are great at compute and sort of suck at gaming...
The Quadro M6000 has the same GM200 GPU as the Titan X; the only difference is ECC memory, AFAIK. Those are used in production where no mistakes can happen (hence ECC), but I'm sure many GPGPU devs use mid-range gaming cards for development. Although I have a feeling that's nothing compared to the total number of PC gamers who play games on a Titan X :laugh:
#39
R-T-B
RejZoR: What I wish for is BIOS editing tools. I've been using them on my HD 6950 and now my HD 7950, and they're stuff sent from heaven. All this overclocking software like MSI Afterburner delivers unstable overclocks; in my case it was constantly failing, and it also wasn't switching 2D/3D clocks properly.

But with a flashed BIOS, the fan curve, voltage, clocks, everything is rock solid at much higher values than I could ever use in Afterburner. And best of all, after a system format, on boot-up, or in another OS, it's ready for performance without any fiddling. Seeing how the R9 290X didn't have that makes me sad. It'll be hard to go back to crappy software overclocking once you've tasted the brilliance of a flashed BIOS...
I feel your pain, man. I felt it very vividly when I dumped my old 7970... the BIOS editor tools for that were awesome.
#40
ensabrenoir
That 390X unboxing video is back up again.
#41
R-T-B
My only concern is that the compression GCN 1.2 added isn't going to do that much. It didn't from Tahiti to Tonga; just read the TPU review. 5% or less in most cases.

That said, it has so much bandwidth it will still kick ass, IMO.
#42
bug
Aceman.au: Jesus, it's tiny.
Only if you don't factor in that giant fan.
I'm curious what it can do (a few days to wait, I guess), but as an NVIDIA made man, it won't matter much to me till next year.
#43
Ja.KooLit
Man... I think I need to start courting my wife so I can buy 2 of these :)
#44
RejZoR
R-T-B: My only concern is that the compression GCN 1.2 added isn't going to do that much. It didn't from Tahiti to Tonga; just read the TPU review. 5% or less in most cases.

That said, it has so much bandwidth it will still kick ass, IMO.
Wait a second, you're comparing apples with oranges here! Tonga, compared to Tahiti, has fewer shaders, fewer TMUs, a narrower memory bus, and identical clocks. Where did you get that 5% when the card is significantly less powerful on paper but performs on par with its larger "brother"? It has to gain that performance from somewhere, and trust me, it ain't just 5%...
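
(Putting rough numbers on that, using the retail R9 280 and R9 285 memory configs as assumed reference points; the takeaway matters more than the exact percentage:)

```python
# Raw memory bandwidth: Tahiti (R9 280) vs. Tonga (R9 285), retail specs.
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps   # bytes per transfer x GT/s

tahiti = bandwidth_gb_s(384, 5.0)   # 240 GB/s
tonga = bandwidth_gb_s(256, 5.5)    # 176 GB/s
print(f"Tonga has {1 - tonga / tahiti:.0%} less raw bandwidth")  # ~27% less

# Matching Tahiti's performance despite a ~27% bandwidth deficit suggests the
# delta color compression recovers far more than 5% when bandwidth-bound.
```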
#45
R-T-B
RejZoR: Wait a second, you're comparing apples with oranges here! Tonga, compared to Tahiti, has fewer shaders, fewer TMUs, a narrower memory bus, and identical clocks. Where did you get that 5% when the card is significantly less powerful on paper but performs on par with its larger "brother"? It has to gain that performance from somewhere, and trust me, it ain't just 5%...
I just skimmed the TPU review, tbh; I assumed it to be a direct respin of the Tahiti silicon. My bad.
#46
RejZoR
Tonga wasn't a success; it was a tech preview, after all. But they proved the point with it: a card can be underpowered, and the optimizations in the GPU can make up the difference quite well. How that extrapolates to a card with twice the physical memory bus bandwidth and a lot more shaders, no one but AMD really knows. But it has to do something there as well, particularly at higher resolutions, where these cards should shine.
#47
OneMoar
There is Always Moar
Wouldn't use one if AMD gave one to me... no place to mount that rad, either...
#48
BiggieShady
OneMoar: Wouldn't use one if AMD gave one to me... no place to mount that rad, either...
Rubbish, you're not fooling anyone... in fact, no one here can say that and mean it.
Even with no place to mount the rad, we'd all learn to live with it hanging out of the open case :laugh:
#49
Loosenut
OneMoar: Wouldn't use one if AMD gave one to me... no place to mount that rad, either...
I'll PM you my address if it ever happens.
#50
RejZoR
I have a miniATX case with one AiO already in it, and I'm quite confident I could place a second one in there. If I've taught myself anything, it's that a case is never too small for anything. You should see my old box, a TT Lanbox with the same Core i7 920 in it. This Lian Li is luxury.

I'd have to move the CPU AiO to the exhaust port and the graphics card's AiO to the front...