# AMD Radeon R9 Fury X Confirmed SKU Name for "Fiji XT"



## btarunr (Jun 12, 2015)

All bets are off: AMD's latest flagship graphics card will indeed get a fancy name, and it will be called the Radeon R9 Fury X. Korean tech site HardwareBattle leaked a product flyer with the SKU name and its catchphrase, "revolutionary, inside out." Based on the 28 nm "Fiji" silicon, the R9 Fury X is expected to feature 4,096 stream processors, 256 TMUs, 128 ROPs, and a 4096-bit wide HBM memory interface, holding 4 GB of memory.

The reference-design Fury X will come with an AIO cooling solution, likely designed by Asetek, featuring a Cooler Master-made fan ventilating its 120 x 120 mm radiator. Just as the Radeon R9 290X did away with D-Sub (VGA) support (even with dongles), Fiji does away with the DVI connector. You still get three DisplayPort 1.2a ports, and a single HDMI 2.0 connector. The card has been pictured on the web featuring two 8-pin PCIe power connectors.



 

 



*View at TechPowerUp Main Site*


----------



## Cybrnook2002 (Jun 12, 2015)

Thanks


----------



## nickbaldwin86 (Jun 12, 2015)

No DVI ports!!! WOOT

I will take two! 

If these can be single-slot, that would be amazing (with a water block, of course)


----------



## dj-electric (Jun 12, 2015)

With the GTX 980 Ti's AIB OC results, the Fury X will probably have nothing to use against it but a much lower price.


----------



## 64K (Jun 12, 2015)

I'm expecting great performance from Fury. There will no doubt be people waiting in line to slam it for the electricity used, but for enthusiasts the extra 50 cents a month on the power bill is irrelevant unless you run it 24/7 at full load or you pay a ridiculously high rate per kWh.


----------



## the54thvoid (Jun 12, 2015)

Looking back, the 290x had 64 ROP's and 780ti had 48. This has 128 to the 96 on 980ti. So up by 100% versus 50% on Maxwell. Surely that should help it pull ahead?

Edit: I have been corrected. They're both 100%. Hey I typed it working out, I'm focussing on reps, not ROP's.


----------



## Luka KLLP (Jun 12, 2015)

The hype is real


----------



## ShurikN (Jun 12, 2015)

Dj-ElectriC said:


> With the GTX 980 Ti's AIB OC results, the Fury X will probably have nothing to use against it but a much lower price.


And you know that how?!


----------



## hyp36rmax (Jun 12, 2015)

Cooler Master fan?  I wonder if it's our new Silencio FP120... That's news, no one in our office knew about this... Source?  There have been "leaked" shots of the radiator with what looked like a Nidec Gentle Typhoon AP29/30/31 and even an AP15.


----------



## Disparia (Jun 12, 2015)

btarunr said:


> You still get three DisplayPort 1.2a ports, and a single HDMI 2.0 connector.



NICE.


----------



## GhostRyder (Jun 12, 2015)

Well, this design certainly is new and different, that's for sure. Can't wait to see it out in public!


----------



## ensabrenoir (Jun 12, 2015)

.......a few more days and the real Fury will begin.


----------



## RejZoR (Jun 12, 2015)

This is the Ferrari of graphics cards. You don't give a shit how many resources it consumes as long as it's fast. And makes no noise, in this case XD

If highest end is R9 Fury X, then I assume vanilla Fury will be R9 Fury (without the X).


----------



## the54thvoid (Jun 12, 2015)

Caring1 said:


> 48 - 96 is still 100% increase.



I had a mong moment.


----------



## Caring1 (Jun 12, 2015)

the54thvoid said:


> I had a mong moment.


It happens to everyone 
my previous comment is gone


----------



## ShurikN (Jun 12, 2015)

RejZoR said:


> If highest end is R9 Fury X, then I assume vanilla Fury will be R9 Fury (without the X).


And without water cooling, I presume.


----------



## RejZoR (Jun 12, 2015)

Supposedly, the vanilla Fury should arrive in AiO and air cooled version.


----------



## the54thvoid (Jun 12, 2015)

RejZoR said:


> Supposedly, the vanilla Fury should arrive in AiO and air cooled version.



I'd prefer air cooled, AiO is a waste for my needs. Either way it'll be custom water. Just need EKWB to start PR spamming for both the 980 Ti and Fury!


----------



## SirEpicWin (Jun 12, 2015)

ShurikN said:


> And you know that how?!


http://www.guru3d.com/articles-pages/gigabyte-geforce-gtx-980-ti-g1-gaming-soc-review,1.html

Runs cooler, runs faster, costs $40 more.
I was seriously considering switching to the red team but.............


----------



## SimpleTECH (Jun 12, 2015)

The question I have is, with the die being larger than previous GPUs, will it still have the same mounting? Looking to throw an AIO on the air-cooled version.


----------



## 64K (Jun 12, 2015)

SirEpicWin said:


> http://www.guru3d.com/articles-pages/gigabyte-geforce-gtx-980-ti-g1-gaming-soc-review,1.html
> 
> Runs cooler, runs faster, costs $40 more.
> I was seriously considering switching to the red team but.............



The Gigabyte 980 Ti Gaming is a great card, no doubt about that, but we don't know for certain yet what the Fury X performance is or the price.


----------



## Aquinus (Jun 12, 2015)

I've felt the PR hype before. While I want to believe, I'm reserving judgement for reviews and benchmarks.


----------



## RejZoR (Jun 12, 2015)

the54thvoid said:


> I'd prefer air cooled, AiO is a waste for my needs. Either way it'll be custom water. Just need EKWB to start PR spamming for both the 980 Ti and Fury!



But AiO will be much much quieter regardless of conditions. Also, OC headroom should be a lot bigger on AiO...


----------



## GAR (Jun 12, 2015)

4GB of RAM is a failure IMO; if you plan to keep this card for 3+ years, I wouldn't buy it. It might be good for today's games at 1440p and most at 4K, but many games even at 1080p can use all 4GB of memory; GTA 5 is one example. I have the Titan X and in some cases I've seen 4.5GB of memory being used with MSAA on at 1080p.


----------



## HisDivineOrder (Jun 12, 2015)

AMD?  Do you really want to dredge up a naming scheme from ATI Rage cards way back when?  Do you really think those of us who lived during those ancient times are going to go, "Wow, that name gives me warm and fuzzy feelings!"

Because it doesn't.  It reminds me of Rage Fury Pro and Rage Fury MAXX.  The latter card barely worked.  Imagine if nVidia resurrected the FX5800 series as the FX8500.  Or Intel brought back the Pentium IV and RDRAM because people loved it so much.

No, AMD.  Stop being silly.


----------



## xkche (Jun 12, 2015)

GAR said:


> 4GB of RAM is a failure IMO; if you plan to keep this card for 3+ years, I wouldn't buy it. It might be good for today's games at 1440p and most at 4K, but many games even at 1080p can use all 4GB of memory; GTA 5 is one example. I have the Titan X and in some cases I've seen 4.5GB of memory being used with MSAA on at 1080p.



In theory, DX12 can pool the RAM.


----------



## btarunr (Jun 12, 2015)

GAR said:


> 4GB of RAM is a failure IMO; if you plan to keep this card for 3+ years, I wouldn't buy it. It might be good for today's games at 1440p and most at 4K, but many games even at 1080p can use all 4GB of memory; GTA 5 is one example. I have the Titan X and in some cases I've seen 4.5GB of memory being used with MSAA on at 1080p.



Yeah, but GTA 5 doesn't use tiled resources. With GL mega-textures and D3D tiled resources, memory size growth in video cards will take a hit (or at least games will consume less video memory). Tiled resources will be as heavily implemented a feature of DX12 as tessellation was with DX11.

Memory bandwidth, more than memory size, will hold the key to this generation. AMD is getting a head start over NVIDIA. Your 980 Ti may look good with existing DX11 games, but come DX12, its memory implementation will begin to choke.


----------



## 2big2fail (Jun 12, 2015)

The Fury X is going to make some amazing mini-ITX builds.


----------



## dj-electric (Jun 12, 2015)

RejZoR said:


> But AiO will be much much quieter regardless of conditions. Also, OC headroom should be a lot bigger on AiO...



AIO isn't "much much" quieter at all. In fact, in most cases it is louder than air coolers because of pump noise and higher fan RPM due to the lack of enough cooling surface. The ASUS STRIX and MSI TF are prime examples from w1z's reviews: close to mere silence on hefty GPUs. Coolers like the D15 are the ultimate example, with a noise-to-performance ratio that utterly demolishes the best of AIOs.


----------



## RejZoR (Jun 12, 2015)

I have an Antec H2O 920 on my CPU and it's basically whisper quiet. The pump is inaudible and I've replaced the shoddy stock fans with Noiseblocker Multi-Frame fans that run at a fixed speed. Depending on the price, I might do the same for the GPU. The only thing that really makes any noise is the graphics card... But I think I'll be going with a WindForce 3X Fury...


----------



## hyp36rmax (Jun 12, 2015)

2big2fail said:


> The Fury X is going to make some amazing mini-ITX builds.



Agreed!


----------



## RejZoR (Jun 12, 2015)

The ATI Rage Fury MAXX was (if we exclude the 3dfx VSA-100) one of the rare high-end multiprocessor graphics cards. ATI also invented the AFR rendering mode for it. The reason it wasn't a success is that it was ahead of its time, like most multi-GPU units back then and even today...


----------



## happita (Jun 12, 2015)

Not putting in for another upgrade until the new 16nm cards come out. THEN there will be some interesting competition between Nvidia and AMD. The current cards just seem uninteresting with what's going on right now. It all depends on how much of a performance bump people can get if they consider a jump from, let's say, an R9 290/GTX 970.

BTW, from the charts I've seen as of late, aren't the R9 390 and R9 390X just rebrands of the 290 and 290X?


----------



## 2big2fail (Jun 12, 2015)

From the earlier rumors there was going to (eventually) be an 8GB version of the Fiji XT. Any news on that?


----------



## newconroer (Jun 12, 2015)

nickbaldwin86 said:


> No DVI ports!!! WOOT
> 
> I will take two!
> 
> if these can be single slot that would be amazing (with water block course)



The majority of connections for monitors over 96Hz are DVI-D and DisplayPort. However, the latter doesn't always play nice at high refresh rates, so DVI-D is a stable go-to.
And seeing as a lot of 'gamers' are moving to high-refresh-rate panels, I find this a bad move on their part.


----------



## SirEpicWin (Jun 12, 2015)

64K said:


> The Gigabyte 980 Ti Gaming is a great card, no doubt about that, but we don't know for certain yet what the Fury X performance is or the price.



I was referring to the reference  980 ti


----------



## Cool Vibrations (Jun 12, 2015)

HisDivineOrder said:


> AMD?  Do you really want to dredge up a naming scheme from ATI Rage cards way back when?  Do you really think those of us who lived during those ancient times are going to go, "Wow, that name gives me warm and fuzzy feelings!"
> 
> Because it doesn't.  It reminds me of Rage Fury Pro and Rage Fury MAXX.  The latter card barely worked.  Imagine if nVidia resurrected the FX5800 series as the FX8500.  Or Intel brought back the Pentium IV and RDRAM because people loved it so much.
> 
> No, AMD.  Stop being silly.



Speak for yourself. This card is an auto buy for me because of that same reason. Buying two right when it releases. 

Perform as well as a Titan X and cheaper? No complaints here.


----------



## Delta6326 (Jun 12, 2015)

the54thvoid said:


> I'd prefer air cooled, AiO is a waste for my needs. Either way it'll be custom water. Just need EKWB to start PR spamming for both the 980 Ti and Fury!



Titan X block works on 980 TI 

EK-FC Titan X is a high performance full-cover water block for nVidia reference (NVA-PG600) design GeForce GTX Titan X and GTX 980 Ti series graphics cards.


----------



## semantics (Jun 12, 2015)

btarunr said:


> Yeah, but GTA 5 doesn't use tiled resources. With GL mega-textures and D3D tiled resources, memory size growth in video cards will take a hit (or at least games will consume less video memory). Tiled resources will be as heavily implemented a feature of DX12 as tessellation was with DX11.
> 
> Memory bandwidth, more than memory size, will hold the key to this generation. AMD is getting a head start over NVIDIA. Your 980 Ti may look good with existing DX11 games, but come DX12, its memory implementation will begin to choke.


Isn't that a moot point when you buy a card for today's games and games soon coming out? While some games will have those features, others will not, which kind of sucks when you buy a flagship card and it just won't work like a flagship card in some games. Either way, we'll see in the reviews how the card holds up, especially at higher resolutions.


----------



## RejZoR (Jun 12, 2015)

Considering the MASSIVE difference in bandwidth, I don't think Fury X will be "just" on par with the Titan X...


----------



## Assimilator (Jun 12, 2015)

Patiently waiting for the 16th when the hype around this card will finally dissipate in silence.


----------



## dj-electric (Jun 12, 2015)

RejZoR said:


> Considering the MASSIVE difference in bandwidth, I don't think Fury X will be "just" on par with the Titan X...



Let's see...
R9 290X - 320 GB/s
GTX 980 - 224 GB/s
~43% higher VRAM bandwidth for the R9 290X, while being beaten by 38% at 1440p (according to TPU).

AMD R9 Fury X - 512 GB/s
GTX 980 Ti - 337 GB/s
~52% additional bandwidth for the Fury X.

You do the math. Numbers, numbers; wish they held water.
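For anyone who wants to do the math themselves, here's a quick sketch of where those percentages come from. The GB/s figures are the rumored/reference specs quoted above, not measurements:

```python
# Bandwidth deltas between the cards listed above.
# Figures are rumored/reference specs in GB/s, not measured numbers.

def pct_more(a: float, b: float) -> float:
    """Percentage by which bandwidth a exceeds bandwidth b."""
    return (a - b) / b * 100

r9_290x, gtx_980 = 320, 224
fury_x, gtx_980ti = 512, 337

print(f"290X over 980:      {pct_more(r9_290x, gtx_980):.1f}%")   # ~42.9%
print(f"Fury X over 980 Ti: {pct_more(fury_x, gtx_980ti):.1f}%")  # ~51.9%
```

Rounded, that's roughly the 43% and 52% gaps cited above.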


----------



## heydan83 (Jun 12, 2015)

This card and the asus MG279Q will be the perfect combination.. looking forward to both of them.


----------



## Slizzo (Jun 12, 2015)

Dj-ElectriC said:


> Let's see...
> R9 290X - 320 GB/s
> GTX 980 - 224 GB/s
> ~43% higher VRAM bandwidth for the R9 290X, while being beaten by 38% at 1440p (according to TPU).
> ...



But, for the aforementioned reason where DX12 will be more memory bandwidth aware the AMD card should definitely pull ahead once those games start to hit the market.


----------



## ZoneDymo (Jun 12, 2015)

semantics said:


> Isn't that a moot point when you buy a card for today's games and games soon coming out? While some games will have those features, others will not, which kind of sucks when you buy a flagship card and it just won't work like a flagship card in some games. Either way, we'll see in the reviews how the card holds up, especially at higher resolutions.



His was a response to a comment about this card, with its 4GB, holding up in 3 years.
And from there you go back to the present... okay then.
At present, 4GB is more than enough for 99% of the games out now and in the foreseeable future.
After that, DX12 becomes the norm, 4GB will not matter as much as bandwidth anymore, and so it still won't pose a problem later.

That's the point here.


----------



## Frick (Jun 12, 2015)

Dj-ElectriC said:


> Let's see...
> R9 290X - 320 GB/s
> GTX 980 - 224 GB/s
> ~43% higher VRAM bandwidth for the R9 290X, while being beaten by 38% at 1440p (according to TPU).
> ...



Isn't fury a new chip and the 390's 290's?


----------



## Slizzo (Jun 12, 2015)

Frick said:


> Isn't fury a new chip and the 390's 290's?



For the most part. But where do you see him referencing the 390?


----------



## RejZoR (Jun 12, 2015)

Dj-ElectriC said:


> Let's see...
> R9 290X - 320 GB/s
> GTX 980 - 224 GB/s
> ~43% higher VRAM bandwidth for the R9 290X, while being beaten by 38% at 1440p (according to TPU).
> ...



Erm, you do realize Fury X is not a 290X, right? 2816 shaders vs 4096, not to mention more of everything else, including the fact that it has BETTER shaders to begin with.


----------



## arbiter (Jun 12, 2015)

RejZoR said:


> This is Ferrari of graphic cards. You don't give a shit how much resources it consumes in the process for as long as it's fast. And makes no noise. In this case XD
> 
> If highest end is R9 Fury X, then I assume vanilla Fury will be R9 Fury (without the X).



It's the Ferrari of cards that probably costs as much as a Ferrari; the 980 Ti is a Nissan GTR: just as fast and a lot cheaper. Ferraris can be very loud too.



the54thvoid said:


> Looking back, the 290x had 64 ROP's and 780ti had 48. This has 128 to the 96 on 980ti. So up by 100% versus 50% on Maxwell. Surely that should help it pull ahead?
> Edit: I have been corrected. They're both 100%. Hey I typed it working out, I'm focussing on reps, not ROP's.



Well, if specs meant anything anymore, it would. But the days when the number of cores, ROPs, and memory bandwidth meant something died many years ago.



happita said:


> Not putting in for another upgrade until the new 16nm cards come out. THEN there will be some interesting competition between both Nvidia and AMD. They just seem uninteresting with what's going on right now. It all depends on how much of a performance bump people can get if they consider a jump from let's say R9 290/GTX 970.BTW, from the charts that I've seen as of late, isn't the R9 390 and R9 390X just rebrands of the 290 and 290X?


Nvidia can deal with a small loss or even tying until 16nm is ready. The 390(X) will be based on the Hawaii chip, but it's looking like they will be upgraded to GCN 1.2, which is what the R9 285 used.



2big2fail said:


> From the earlier rumors there was going to (eventually) be an 8GB version of the Fiji XT. Any news on that?


Yeah, but from what I heard it won't be until 2-3 months after; I don't think we'll see them until around Nov/Dec, maybe Oct. I expect a lot of people will be in for some sticker shock when they come out.



RejZoR said:


> Considering the MASSIVE difference in bandwidth, I don't think Fury X will be "just" on par with the Titan X...





Dj-ElectriC said:


> Let's see...
> R9 290X - 320 GB/s
> GTX 980 - 224 GB/s
> ~43% higher VRAM bandwidth for the R9 290X, while being beaten by 38% at 1440p (according to TPU).
> ...





RejZoR said:


> Erm, you do realize Fury X is not a 290X right? 2816 shaders vs 4096, not to mention more of everything else including the fact it has BETTER shaders to begin with.



Yeah, memory bandwidth has really been the only reason AMD GPUs remain competitive. I wonder what would have happened if the 980 had a 384-bit or even 512-bit bus from day one; what would the performance difference be?

We do realize all that, but we also see what history has proven. Just because AMD has had more bandwidth hasn't meant it has been a ton faster.


----------



## BiggieShady (Jun 12, 2015)




----------



## GreiverBlade (Jun 12, 2015)

64K said:


> I'm expecting great performance from Fury. There will no doubt be people waiting in line to slam it for electricity used but for enthusiasts the extra 50 cents a month on the power bill are irrelevant unless you run it 24/7 at full load or you pay a ridiculously high rate per kWh.


So do I... and thanks for pointing out that the extra cost over a month is... ridiculous at best.


----------



## Fluffmeister (Jun 12, 2015)

BiggieShady said:


>



Lol.


----------



## The Von Matrices (Jun 12, 2015)

Remember how back in 2007, ATI touted that they were introducing a new, straightforward naming scheme for their graphics cards?  Yeah, it makes me nostalgic too.


----------



## GreiverBlade (Jun 12, 2015)

BiggieShady said:


>


For the CFX comment ... it made me laugh hard ... come on ... has he never seen a 290/290X before? No way he shouldn't know AMD already removed the CFX bridge with the 290/290X.


----------



## N3M3515 (Jun 12, 2015)

Dj-ElectriC said:


> while being beaten by 38% on 1440P (according to TPU).



1440p - TPU


 
Where did you get that 38%?


----------



## $ReaPeR$ (Jun 12, 2015)

N3M3515 said:


> 1440p - TPU
> View attachment 65703
> Where did you get that 38%?



LOL mate.. fanboys will always be fanboys.. 

to the point: very excited! waiting for benchmarks. no point in speculating..


----------



## dj-electric (Jun 12, 2015)

BiggieShady said:


>



I cringed.



N3M3515 said:


> 1440p - TPU
> View attachment 65703
> Where did you get that 38%?



Typo, meant 980 TI


----------



## $ReaPeR$ (Jun 12, 2015)

again it isnt 38 mate.. it is 28% and 27% respectively..


----------



## FordGT90Concept (Jun 12, 2015)

nickbaldwin86 said:


> No DVI ports!!! WOOT


No, NO, |\|( )!1!!111!!


----------



## Breit (Jun 12, 2015)

GAR said:


> 4GB of ram is a failure IMO, if you plan to keep this card for 3+ years, I wouldnt buy it, It might be good for todays games at 1440P and most at 4K, but most games even at 1080P can use all 4GB of memory, GTA 5 is one example, I have the titan X and in some cases Ive seen 4.5gb of ram being used with msaa on at 1080P.



You must be proud to own a card that has 7.5GB of RAM free. All the time. 

I'd say that 4GB can be enough with a properly coded game, even at 4K (see The Witcher 3, which never goes above 3GB, even at max settings). I guess for anything more demanding you need a faster card anyway (or two, for that matter).
DX12 has resource binding and split-frame rendering, which will help with VRAM usage and allows for combining VRAM if you choose to use more than one card.


----------



## the54thvoid (Jun 12, 2015)

$ReaPeR$ said:


> again it isnt 38 mate.. it is 28% and 27% respectively..
> 
> View attachment 65706 View attachment 65707



It's what you want it to be, but if you take the percentile scores as points:

290X = 72
980 Ti = 100
diff = 28
28/72 ≈ 38%, so the 980 Ti is about 38% faster than the 290X.

Or in other parlance, the 290X is only 72% as fast as the 980 Ti. Percentages make it difficult to analyse this way (IMO). A review's total average fps would give a more arithmetically pleasing outcome to the argument.


----------



## MxPhenom 216 (Jun 12, 2015)

RejZoR said:


> This is Ferrari of graphic cards. You don't give a shit how much resources it consumes in the process for as long as it's fast. And makes no noise. In this case XD
> 
> If highest end is R9 Fury X, then I assume vanilla Fury will be R9 Fury (without the X).


At least Ferraris make a beautiful sound all at the same time.


----------



## happita (Jun 13, 2015)

IMO this comparison between chips doesn't make a lot of sense, since the R7/R9 2xx series was designed to compete with Nvidia's GTX 700 cards, not their 900s. If the 3xx series is just going to be a small upgrade (or rebrand), it looks like it will just barely compete with Nvidia, which means prices aren't going to change much in the mid/high and high-end spectrum for the foreseeable future, which is a bummer.


----------



## MrGenius (Jun 13, 2015)

No DVI? Seriously?

Repeat after me AMD: "I...am...sofa...king...re...tod...did"


----------



## Xzibit (Jun 13, 2015)

*PCPERSPECTIVE*


----------



## Solaris17 (Jun 13, 2015)

holyshit is that rad thick.


----------



## Arjai (Jun 13, 2015)

arbiter said:


> Its ferrari of cards that probably cost as much as a ferrari, 980ti is nissan GTR. just as fast a and a lot cheaper. Ferrari's can be very loud too.


BTW, this reference sucks.

A Nissan GTR should not ever be spoken of in the same sentence as... ANY Ferrari. That is a period at the end of that sentence; it is significant.

Also, there is not a Nissan, EVER, including racing vehicles, that is faster than a Ferrari, unless you take an old Daytona versus a Nissan F1, Red Bull car. Drivers not included.

Straight up, one-mile acceleration test, against the clock, same driver, Ferrari wins. Hands down.

Not to mention consumer vehicles, where price vs. quality makes all the difference. The new GTR? Junk. Ferrari LaFerrari, or even the new 488 Turbo? 

If you cannot tell the difference, this point is moot.

Back to the POINT: if the Fury is a Ferrari, calling the 980 Ti a Nissan is an insult. So, you must then be a Radeon FanBoi trying to disguise yourself? Nah, prolly just a Nvidian with no sense.

FUD on. Nobody, at this point, is even close to correct about anything BUT conjecture.

Although, this crap can be, a little, entertaining.


----------



## buggalugs (Jun 13, 2015)

I'm more interested in non-reference air-cooled designs, like from ASUS, MSI Lightning, Sapphire Vapor-X, etc., with a high-end air cooler.

I suspect AMD and partners will release versions of this card that don't include water cooling.


----------



## arbiter (Jun 13, 2015)

$ReaPeR$ said:


> LOL mate.. fanboys will always be fanboys..
> 
> to the point: very excited! waiting for benchmarks. no point in speculating..


Yeah, speaking of AMD fanboys: look how hard they hype a card with no proof it's even as good as they think.



Breit said:


> You must be proud to own a card that has 7.5GB RAM free. All the time.
> I'd say that 4GB can be enough, considering a proper coded game, even at 4K (see Witcher 3, which never goes above 3GB, even at max. settings). I guess for anything more demanding, you need a faster card anyways (or two for that matter).
> DX12 has resource binding and split frame rendering which will help with VRAM usage and allows for combining VRAM if you choose to use more than one card.


Yeah, 7 months ago AMD fanboys were ALL over Nvidia for only putting 4GB on the GTX 980, saying it wasn't enough. Amazing how when AMD does it, it is enough 7 months later. Funny how the hypocrisy works, isn't it?



Arjai said:


> A Nissan GTR should not ever be spoken in the same sentence as...ANY Ferrari.



My point was the GTR will give the Ferrari a run for its money while costing a lot less.


----------



## Solaris17 (Jun 13, 2015)

I'm worried that air-cooling solutions, even from third-party vendors, might not be up to snuff given the reduced amount of PCB real estate they have available for our copper goddesses.


----------



## Arjai (Jun 13, 2015)

arbiter said:


> My point was the GTR will give the Ferrari a run for its money while costing a lot less.



*Cough*Bullshit*Cough*


----------



## arbiter (Jun 13, 2015)

Solaris17 said:


> I'm worried that air-cooling solutions, even from third-party vendors, might not be up to snuff given the reduced amount of PCB real estate they have available for our copper goddesses.


With that size PCB? No, there is NO way to cool this thing on that tiny PCB with an air cooler. It will need a full-size card at least.



Arjai said:


> *Cough*Bullshit*Cough*



Standard European point of view. Just because a car costs 200k+ doesn't mean a car that costs 100k can't beat it.


----------



## ThE_MaD_ShOt (Jun 13, 2015)

Arjai said:


> *Cough*Bullshit*Cough*


@Arjai buddy, pal, you can't really say that. If a "farm truck" can whip the crap out of Porsches, Z06s, Lambos and whatever else we're forgetting, then a GTR with the right setup can smoke a Ferrari. Actually, I know of a couple of 800+ HP GTRs.


----------



## RejZoR (Jun 13, 2015)

$ReaPeR$ said:


> again it isnt 38 mate.. it is 28% and 27% respectively..
> 
> View attachment 65706 View attachment 65707



And then you convert those lovely percentages into actual framerate and it ends up being a 5 fps difference. Talk framerate, not %... Not because I'm a fanboy or anything, but anyone who follows graphics cards seriously knows that percentages always sound so glorious, and then you check the actual framerate difference and it's negligible. And then you take into account that one card is a year old and the other absolutely brand new, and it makes you wonder wtf NVIDIA was "improving"... Again, not a fanboy, just being realistic...


----------



## Frick (Jun 13, 2015)

Cars != computers, mates. It never ever works to liken them.


----------



## BiggieShady (Jun 13, 2015)

Fluffmeister said:


> Lol.





GreiverBlade said:


> for the CFX comment ... it made me laugh hard ... come on ... did he never saw a 290/290X before? no way he shouldn't know AMD already removed the CFX bridge since the 290/290X





Dj-ElectriC said:


> I cringed.


Damn, now it's removed by user


----------



## GreiverBlade (Jun 13, 2015)

BiggieShady said:


> Damn, now it's removed by user


Well ... yes ... I bet he got some comments on the video about how a 390X is not next gen because it's a "rebranded" 290X, and thus the CFX thumbs have been gone for 2 years now.

Well ... it's next gen... but indeed the CFX thumbs have been gone for 2 years. 
About the rebrand ... well, the 290 is still a pretty capable card, so why not; it's just like what happened with the 7xx line, so who really cares in the end (and I owned a 770 ... so, not me).



buggalugs said:


> I'm more interested in non-reference air cooled designs like from Asus, msi lightning, sapphire vapourX etc with highend air cooler.
> 
> AMD and partners will release different versions of this card I suspect that doesn't include water cooling.


I am more interested in a reference card with a custom waterblock and a single-slot bracket ... or a custom air-cooled non-OC model, but with a custom waterblock
(because factory OC sucks most of the time ... and you can easily do yourself what they do to make you pay a premium over reference stock).

Also ... 4GB is actually enough and standard (look at the 970, it also packs 4... wait ... forget it)


----------



## BiggieShady (Jun 13, 2015)

GreiverBlade said:


> well ... yes ... i bet he got some comment on the vids about : how a 390X is not next gen because it's a "rebranded" 290X and thus the CFX thumbs are away since 2 years now
> 
> well ... it's a next gen... but indeed the CFX thumbs are away since 2 yrs


Well ... yes, that's why I posted it. They're gonna piss off a lot of people confused by the naming scheme.


----------



## GreiverBlade (Jun 13, 2015)

BiggieShady said:


> Well ... yes, that's why I posted it. They're gonna piss off a lot of people confused by the naming scheme.


Not really...
Before we had 250 < 260 < 270 < 280 < 290 and X variants;
now we have 250 < 260 < 270 < 280 < 290 < Fury and X variants.

Same hierarchy in terms of performance, just two cards added at the high end,
just like NV did during the 7xx era with the Titan + Titan Black, but with a shorter timeframe (aka none) between the two top SKUs (well, not really ... I forgot the 750/Ti, 760 and 780/Ti, but ... oh well)


----------



## Dany (Jun 13, 2015)

I don't mind if Fury's power consumption is over 300W as long as it's better than the GTX Titan X and $350-400 cheaper. I can't wait to see this GPU at work; no more waiting, that's for sure. I'd buy one if it's that powerful, no second thoughts there. Cheers!!


----------



## Aquinus (Jun 13, 2015)

Breit said:


> You must be proud to own a card that has 7.5GB RAM free. All the time.


It doesn't take much more than going 400MB over physical VRAM to lose substantial performance. I know, it's happening right now to me with my 6870s. I play Elite Dangerous constantly with about 400MB over dedicated going into shared. Thanks to that, my GPUs most of the time are running <50% because they're starved for more VRAM. I'm also not using any AA and everything is set to high (not even ultra.) Simply turning AA on pushed it up to 2GB (it will keep using shared...) and at that point it's unplayable.

The real point here is that so long as you have to start dipping into shared system memory, you lose a substantial amount of performance because the GPU starts spending more time waiting and less time working. Simply put, if I had 2GB on my 6870s, I wouldn't be considering an upgrade. Running out of VRAM is like driving a car with a huge engine but a tiny intake manifold (or a squirrel in your airbox if you will.) So if performance between any two GPUs were similar, I would take the one with more VRAM because it will last you longer. A lot of times it not the rendering itself in a game that gets more complex, it's the size of the textures, however that's not true for all games.

With that all said, CFX works most of the time for me. If it doesn't, a driver update usually takes care of it.


----------



## Breit (Jun 13, 2015)

Right. But spending 400$+ for VRAM you'll probably never use?


----------



## Aquinus (Jun 13, 2015)

Breit said:


> Right. But spending $400+ for VRAM you'll probably never use?


Just because he's not using it now doesn't mean he won't be in a year or two. I bought my first 6870 six years ago and my second three years ago. If it weren't for VRAM, I would continue to use them. It's not a matter of using it now, it's a matter of using it later down the line. It's called planning ahead.

Don't confuse this with justifying the Titan's price. It's not. I'm just saying VRAM is important if you're planning long term (several years), which I am, since I'm in the market and don't intend to do so much as buy another of the same GPU three years down the road like last time. Just saying.


----------



## Breit (Jun 13, 2015)

I got your point. Nevertheless, the question remains whether those $400+ aren't better spent, say, 3+ years from now on a new (probably faster) card with more VRAM.


----------



## Aquinus (Jun 13, 2015)

Breit said:


> I got your point. Nevertheless, the question remains whether those $400+ aren't better spent, say, 3+ years from now on a new (probably faster) card with more VRAM.


Sure, it's a good argument, but the counterpoint would be: how much would getting another one of the card you already have cost? In three years, I bet you I will be able to get a Fury or 980 Ti a lot cheaper than I can now (or will be able to soon).

When I bought my second 6870, it was significantly cheaper than it was on release day, that's for sure. What did it get me? 7970 performance (sans VRAM) for half the cost.


----------



## Breit (Jun 13, 2015)

In the foreseeable future, with DX12 and split-frame rendering, you'll eventually get the second card's VRAM as well.


----------



## Aquinus (Jun 13, 2015)

Breit said:


> In the foreseeable future, with DX12 and split-frame rendering, you'll eventually get the second card's VRAM as well.


I have some serious doubts about how much DX12 will limit duplication of data, because a lot of it needs to be shared for both cards to render the same thing. I suspect it will cut back on duplication but not eliminate it. I try to reserve judgement for actual results, not what the PR tells us.


----------



## HisDivineOrder (Jun 13, 2015)

Breit said:


> In the foreseeable future, with DX12 and split-frame rendering, you'll eventually get the second card's VRAM as well.




You seem to forget that even using DX12 is going to require using particular engines and using DX12 well (as in well enough to do proper SLI/CF support) is going to require someone tailoring the game to support it.

You're overly optimistic, given an industry that usually can't even be bothered to support resolutions beyond 1280x720 and 1920x1080, forgets FOV sliders and customizable keys, and sometimes even manages to put "Press Start to Begin" on the first screen of a PC game.

You think these are the companies that are going to by and large invest lots of time into making sure games are made with DX12 to support the truly advanced feature set you think will make DX12 superior to DX11 for SLI/CF?

I doubt it. I sincerely doubt it. They'll treat DX12 the way they treated DX11 over DX9. It'll be a great way to add a few extra features, but by and large the ports will be treated as DX9 with DX11 gloss. In this case, it'll be programmers doing things exactly as they remember from DX11, with a slim few improvements (probably done in a month, or pulled from something like GameWorks by an external hardware partner) added so they can call it "DirectX 12."

Without the proper effort to rethink the game as a fully DX12 title, it won't matter at all. If they start from an Xbox One base, it seems unlikely they'll ever move beyond the peculiar ESRAM buffer and large amount of shared system RAM to truly customize it for gaming cards with less than, say, the ~5 GB of system RAM the current consoles usually dedicate to GPU-related tasks.


----------



## newconroer (Jun 13, 2015)

So, no idea what's happened in this thread, but the summary I managed to squeeze out was:

No DVI ports  = bad
Fury = confusing name
16nm = happening someday
16nm = only 'chance' AMD has to get serious (according to Nvidia loyalists)
DX12 = false prophet?


----------



## qubit (Jun 14, 2015)

That "Fury" name suggests a real NVIDIA beater. If it turns out to be another underperforming Bulldozer, then AMD will have seriously embarrassed itself and will lose all credibility. Let's hope this is not the case.


----------



## jigar2speed (Jun 14, 2015)

qubit said:


> That "Fury" name suggests a real NVIDIA beater. If it turns out to be another underperforming Bulldozer, then AMD will have seriously embarrassed itself and will lose all credibility. Let's hope this is not the case.



This coming from you, should I be worried about the Fury?


----------



## qubit (Jun 14, 2015)

jigar2speed said:


> This coming from you, should I be worried about the Fury?


What is that supposed to mean?


----------



## RealNeil (Jun 14, 2015)

I'm glad that we'll know all about Fury and Fury X this week. Speculation comparing it to the GTX 980 Ti has been rampant and never-ending.


----------



## Breit (Jun 14, 2015)

RealNeil said:


> I'm glad that we'll know all about Fury and Fury X this week. Speculation comparing it to the GTX 980 Ti has been rampant and never-ending.


Soooo exciting...


----------



## RealNeil (Jun 14, 2015)

Breit said:


> Soooo exciting...



Not for me. No money to buy one, but I did just buy a GTX 980 Windforce.


----------



## qubit (Jun 15, 2015)

RealNeil said:


> I'm glad that we'll know all about Fury and Fury X this week. Speculation comparing it to the GTX 980 Ti has been rampant and never-ending.


Gets tiresome, doesn't it? One just wants it to end and to finally know the facts.


----------



## RealNeil (Jun 15, 2015)

Yes


----------



## rvalencia (Jun 15, 2015)

newconroer said:


> So no idea what's happened in this thread but the summary I managed to squeeze out was :
> 
> No DVI ports  = bad
> Fury = confusing name
> ...


AMD plans to use 14 nm from GoFlo (same tech as Samsung's 14 nm), not TSMC's 16 nm.


----------



## Breit (Jun 15, 2015)

rvalencia said:


> GoFlo


 
Nice typo.


----------



## Wshlist (Jun 15, 2015)

I expect a 1.1 version with 8 GB will follow a short time later, making the people who bought the 4 GB one for a great deal of money feel rather annoyed.
But maybe I'm wrong.


----------



## GreiverBlade (Jun 15, 2015)

Wshlist said:


> I expect a 1.1 version with 8 GB will follow a short time later, making the people who bought the 4 GB one for a great deal of money feel rather annoyed.
> But maybe I'm wrong.


NAAHHH, they're not NVIDIA ... they would never do a similar thing to the Titan → 780 Ti → Titan Black, or Titan X → 980 Ti ... would they? (light joke ... don't take that seriously  )
Well ... at least owners of a 290/290X will have no worries waiting a bit for the 8 GB version, if there ever is one, ahah ... (I suspect it's the HBM that limits the vRAM to 4 GB, but no biggie ... 8 GB is still not really common, and 4K is also not common, although most 4K gaming can be done with 4 GB, and for those who are satisfied with a 1080p monitor, well, no need to explain  )

Lately I've been telling myself: "Hey, I already have a 390 ... no need to upgrade, I can wait for the next next one ... or maybe a 390X."


----------



## Breit (Jun 15, 2015)

GreiverBlade said:


> at least owners of a 290/290X will have no worries waiting a bit for the 8 GB version, if there ever is one, ahah ...



We've had 8 GB versions of the R9 290X for some time now.
For example, this one: http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2548&psn=&lid=1&leg=0


----------



## $ReaPeR$ (Jun 15, 2015)

the54thvoid said:


> It's what you want it to be but if you take the percentile scores as points:
> 
> 290x = 72
> 980ti = 100
> ...




I didn't think of it that way. You are right!






arbiter said:


> Yea speaking of AMD fanboyz how bad they hype the hell outta a card that have no proof its even as good as they think.
> Yea 7 months ago all AMD fan boyz were ALL over nvidia only putting 4gb on gtx980 saying it wasn't enough. Amazing how AMD does it and it is enough 7 months later. Funny how the hypocrisy works isn't it?
> My point was gtr will give the ferrari a run for its money. while costing a lot less.



Spewing negativity about a product that hasn't been tested yet is pointless, to say the least. Let's just wait for the benchmarks, OK?







RejZoR said:


> And then you convert those lovely % into actual framerate and it ends up being a 5fps difference. Talk framerate, not %... Not because I'm a fanboy or anything, but anyone who follows graphic cards seriously knows that % always sounds so glorious and then you check actual framerate difference and it's negligible difference. And then you take into an account that one card is 1 year old and another absolutely brand new and makes you wonder wtf was NVIDIA "improving"... Again, not a fanboy, just being realistic...



+1 to that! If, though, one wants the absolute top in performance and does not care about the cost, the Titan is the product to buy. On the other hand, for 80% of people, anything above $300 is overkill, IMO.


----------



## RealNeil (Jun 15, 2015)

Breit said:


> We've had 8 GB versions of the R9 290X for some time now.
> For example, this one: http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2548&psn=&lid=1&leg=0



I like those Sapphire 8GB GPUs. I have a friend with three of them for sale. I'm trying to wrangle a deal on them for my X99 build.

Right now I have the Toxic version of the R9-280X with the same cooler and fans. It's an excellent GPU and it stays nice and cool.


----------



## GreiverBlade (Jun 15, 2015)

Breit said:


> We've had 8 GB versions of the R9 290X for some time now.
> For example, this one: http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2548&psn=&lid=1&leg=0


I was talking about the Fury ...


----------



## Wshlist (Jun 16, 2015)

GreiverBlade said:


> ... (I suspect it's the HBM that limits the vRAM to 4 GB, but no biggie ... 8 GB is still not really common, and 4K is also not common, although most 4K gaming can be done with 4 GB, and for those who are satisfied with a 1080p monitor, well, no need to explain  ) ...



Yes, it's the HBM, but I think the reason it's 4 GB is the newness and cost of the technology right now, and I expect the manufacturing process to evolve quickly enough to make 8 GB possible at reasonable cost and reliability.
I'm just guessing of course, but we've seen such things before in cutting-edge tech.
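For what it's worth, the 4 GB ceiling falls straight out of the first-generation HBM stack arithmetic (a rough sketch; the per-die and per-stack figures are the commonly cited HBM1 numbers, so treat them as assumptions rather than spec quotes):

```python
# Commonly cited first-generation HBM figures (assumptions):
GBIT_PER_DIE = 2     # 2 Gb per DRAM die
DIES_PER_STACK = 4   # 4-high stack -> 8 Gb = 1 GB per stack
STACKS_ON_FIJI = 4   # four stacks fit on the Fiji interposer

total_gb = STACKS_ON_FIJI * DIES_PER_STACK * GBIT_PER_DIE / 8
print(total_gb)  # → 4.0
```

So more capacity means either more stacks (interposer area) or denser stacks, which is plausibly why an 8 GB part would wait on a later HBM generation rather than a quick respin.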


----------



## GreiverBlade (Jun 16, 2015)

Wshlist said:


> Yes, it's the HBM, but I think the reason it's 4 GB is the newness and cost of the technology right now


Why add a "but"? That's technically what I meant by writing "I suspect it's the HBM that limits the vRAM ... etc.": new tech and cost, yep ... not a physical or electrical limitation.

For me, your guess is right.


----------

