# Radeon Fury X Outperforms GeForce GTX Titan X, Fury to GTX 980 Ti: 3DMark Bench



## btarunr (Jun 17, 2015)

AMD's upcoming $650 Radeon R9 Fury X could have what it takes to beat NVIDIA's $999 GeForce GTX Titan X, while the $550 Radeon Fury (non-X) performs close to the $650 GeForce GTX 980 Ti, according to leaked 3DMark 11 and 3DMark (2013) benches by Korean tech publication ITCM.co.kr. The benches see the R9 Fury X score higher than the GTX Titan X in all three tests, while the R9 Fury is almost as fast as the GTX 980 Ti. The cards maintain their winning streak over NVIDIA even in memory-intensive tests such as 3DMark Fire Strike Ultra (4K), but buckle at 5K. These two cards, which are bound for the market within the next 30 days, were tested alongside the R9 390X in the same graphs; it is not too far behind the GTX 980. The R9 Nano, however, hasn't been circulated among industry partners yet. It could still launch in Summer 2015.



 



*View at TechPowerUp Main Site*


----------



## haswrong (Jun 17, 2015)

factory clocks?


----------



## ironcerealbox (Jun 17, 2015)

If this is accurate (still too early to tell until more benchmarks are released), then Nvidia might look like this:


----------



## HumanSmoke (Jun 17, 2015)

Encouraging, but I think I'd still like to see the actual screenshots of the benchmark runs.


ironcerealbox said:


> If this is accurate (still too early to tell until more benchmarks are released), then Nvidia might look like this:


Aye, and if they turn out to be the usual pre-release FUD, then Nvidia might be all...





/Only wanted an excuse to post another Werewolf Rob Ryan gif, so +1 drug-fuelled Tijuana binge to you @ironcerealbox.


----------



## Deleted member 67555 (Jun 17, 2015)

Looks like Nvidia is still kicking ass in resolutions that 12 people use....


----------



## Lou007 (Jun 17, 2015)

So is that 4 GB of RAM kicking 12 GB of RAM's butt? It also looks like (if these are to be believed) the R9 390X is more than just a rebrand.


----------



## Basard (Jun 17, 2015)

jmcslob said:


> Looks like Nvidia is still kicking ass in resolutions that 12 people use....



Lol... I'm still at 1680x1050.... I'll probably upgrade to 1440 in a few years.  But, yeah, I'm sure plenty of people are running 4k monitors, but the majority are still happily using 1080....  

I'll maybe pick up a Nano after their prices drop a bit... after that (years from now) I'll buy a better monitor.... then after my Nano gets old, I'll upgrade to another card that doesn't suck 200+ watts of power. 

The only reason I'm using this RIDICULOUS Asus 5870 is because a friend gave it to me for free.  It's a fucking nice card, but shit, it was expensive when new and it uses a ton of power.


----------



## haswrong (Jun 17, 2015)

jmcslob said:


> Looks like Nvidia is still kicking ass in resolutions that 12 people use....



I wonder if the dual-chip 2x 4 GB card would behave as 8 GB with no choke point, with full throughput in the 5K and 8K Fire Strike benchmarks in a DX12 environment. That'd be a nice future-proofing feat.


----------



## ironcerealbox (Jun 17, 2015)

HumanSmoke said:


> Encouraging, but I think I'd still like to see the actual screenshots of the benchmark runs.
> 
> Aye, and if they turn out to be the usual pre-release FUD, then Nvidia might be all...
> 
> ...



Aye, very true and good ol' Ryan brothers...always get a laugh out of those two.


----------



## v12dock (Jun 17, 2015)

I just wanna see some benchmarks... It's significantly cheaper than expected; I'm almost considering getting two.


----------



## HumanSmoke (Jun 17, 2015)

Lou007 said:


> So is that 4 GB of RAM kicking 12 GB of RAM's butt? It also looks like (if these are to be believed) the R9 390X is more than just a rebrand.


These benchmark "leaks" are by the same DG Lee who, this time last year, leaked the info that the Hawaii die used in the 290X actually had 3072 cores / 192 TMUs / 48 CUs, when everyone else on the planet, including AMD, was (and still is) under the impression that the die contains 2816 / 176 / 44.
The guy has an intense love of working out performance mathematically from known data points, which is why I wanted to see screenshots, to allay suspicions that these are mathematical extrapolations rather than actual benchmark runs.


----------



## haswrong (Jun 17, 2015)

v12dock said:


> I just wanna see some benchmarks... It's significantly cheaper than expected; I'm almost considering getting two.


I wanna see a new poll too.


----------



## xenocide (Jun 17, 2015)

haswrong said:


> I wonder if the dual-chip 2x 4 GB card would behave as 8 GB with no choke point, with full throughput in the 5K and 8K Fire Strike benchmarks in a DX12 environment. That'd be a nice future-proofing feat.


 
In DX12 it would, but I wouldn't hold your breath for an abundance of DX12 games.  Developers probably won't release games that fully support DX12 until 2018.


----------



## fullinfusion (Jun 17, 2015)

$650, hmm.

Well, that's going to be $849-$879 here in Canada...

Oh well, ya gotta pay to play.


----------



## Deleted member 67555 (Jun 17, 2015)

LOL..I normally avoid these types of threads because they are poop...LOL
And I have a hard time not posting poop on threads like this....


----------



## qubit (Jun 17, 2015)

Real competition, that's what I like to see.  Let's hope these benchmarks are accurate.

I can just see NVIDIA having to release something like a "GTX 980 Ti+" with the full, ungimped GPU and with higher clocks in order to get back on top. Oh and maybe a price cut too.


----------



## Black (Jun 17, 2015)

We will see, but I think AMD is finally on the right road.


----------



## HM_Actua1 (Jun 17, 2015)

Apples to oranges... Maxwell is not Pascal. Wait until Pascal drops. As for this E3 hype, I'll believe it after real testers get their hands on these cards and run some real-world game testing.


----------



## Mathragh (Jun 17, 2015)

HumanSmoke said:


> These benchmark "leaks" are by the same DG Lee who, this time last year, leaked the info that the Hawaii die used in the 290X actually had 3072 cores / 192 TMUs / 48 CUs, when everyone else on the planet, including AMD, was (and still is) under the impression that the die contains 2816 / 176 / 44.
> The guy has an intense love of working out performance mathematically from known data points, which is why I wanted to see screenshots, to allay suspicions that these are mathematical extrapolations rather than actual benchmark runs.


Ironically, the specs he quoted would actually cause Hawaii to perform roughly as it does in its 390X form in those leaked benchies...

Slightly off-topic: with the official launch being the 24th, it at least makes a bit more sense that reviewers didn't get any cards last week (Ryan Shrout, anyone?). I bet they were handed out/sent to reviewers after this event today.


----------



## xvi (Jun 17, 2015)

Too bad the Nano isn't in the list too.


----------



## Xzibit (Jun 17, 2015)

Instead of spending $999 for single-digit-FPS 8K gaming, I can get it for $429. SOLD!!!!

LG, where is that 8K monitor you talked about?

/sarcasm


----------



## Batou1986 (Jun 17, 2015)

If this is real, how did they get hold of an 8 GB HBM card that AMD says is still in development?

Never mind, for a second there I forgot the 390X is a 290X with 8 GB of RAM.


----------



## btarunr (Jun 17, 2015)

Lou007 said:


> So is that 4gb of ram kicking 12gb of rams butt  It also looks like (If these are to be believed) that the R9 390X is more than just a rebrand.



Would you choose 4 GB DDR3 over 1 GB GDDR5? The choice between HBM and GDDR5 will be similar.
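For what it's worth, the published spec-sheet numbers put the bandwidth side of that analogy in perspective. A rough back-of-the-envelope sketch, using the commonly cited bus widths and per-pin data rates (not measured figures):

```python
# Peak theoretical bandwidth from bus width and per-pin data rate.
# Spec-sheet figures: Fury X's HBM1 runs a 4096-bit bus at 500 MHz
# double data rate (1 Gbps per pin); Titan X's GDDR5 runs a 384-bit
# bus at 7 Gbps effective per pin.

def peak_bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s: pins * rate per pin / 8 bits per byte."""
    return bus_width_bits * gbps_per_pin / 8

hbm1 = peak_bandwidth_gb_s(4096, 1.0)   # -> 512.0 GB/s
gddr5 = peak_bandwidth_gb_s(384, 7.0)   # -> 336.0 GB/s

print(f"HBM1 (Fury X):   {hbm1:.0f} GB/s")
print(f"GDDR5 (Titan X): {gddr5:.0f} GB/s")
```

Capacity is a separate axis entirely, of course, which is exactly the point of the DDR3 vs. GDDR5 analogy.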



xvi said:


> Too bad the Nano isn't in the list too.



R9 Nano will launch towards the end of summer (mid/late August?). Hence nobody even has engineering samples.


----------



## Hood (Jun 17, 2015)

If true, I'm still not buying AMD, just happy that 980 Ti prices will probably drop $50-$100.


----------



## bpgt64 (Jun 17, 2015)

As a long-standing member of Green Team, very cool... love it when AMD punches Nvidia in the balls on price/perf.


----------



## nickbaldwin86 (Jun 17, 2015)

4 GB, really? Oh well.

I hope this kicks down the prices of the 980 Ti... I want two.


----------



## xenocide (Jun 17, 2015)

I wouldn't get too excited until the card is actually reviewed.  AMD's marketing team isn't known for delivering exactly as they claim.  The last time I saw slides like this, they had Bulldozer handily beating SB i7s.  We know how that turned out.


----------



## RealNeil (Jun 17, 2015)

Price wars anyone?


----------



## xenocide (Jun 17, 2015)

RealNeil said:


> Price wars anyone?


 
That's a game Nvidia wins.  The Fury X is an expensive piece of hardware with still pricey HBM, a pretty complicated manufacturing process, and that AIO liquid cooler.  I'd be amazed if AMD was making anywhere near what Nvidia is making per sale with the 980 Ti.  If Nvidia drops the 980 Ti even to $600 it hurts AMD.  Dropping it to something like $550 and knocking $50 off all their cards on the way down makes AMD irrelevant at most price points...


----------



## neko77025 (Jun 17, 2015)

That's it... I'm renaming the Fury to Perseus... because it just slayed a Titan!!!!
(/dumps fuel on the fire)

From now on, the Fury X will be known to all as... *PERSEUS*


----------



## dwade (Jun 17, 2015)

But the 980 Ti already slayed the Titan X. And it still has more VRAM than the Fury X.


----------



## the54thvoid (Jun 17, 2015)

As @HumanSmoke says, DG Lee produces an inordinate amount of poo.

Real reviews required, not extrapolated benchmarks.


----------



## bogami (Jun 17, 2015)

This will bring AMD every honor it deserves. NVIDIA's wings should fall off, since until now it had no competition, and the only disadvantage is the 4 GB of RAM on the Fury X. Turn off AA and all will be well. There is certainly some performance in reserve, since the driver is new. Nvidia will now be forced to lower prices. The devil (Nvidia) must get a lesson about greed for once.


----------



## SIGSEGV (Jun 17, 2015)

If these bench scores are true, then it's indeed a very good move for AMD.
I still wanna get their upcoming HBM2 GPU on a 14 nm process node next year, as I have no plan to buy a 4K monitor this year. I really wanna know what scores the Fury X would get in GimpWorks titles at 4K res.


----------



## ZoneDymo (Jun 17, 2015)

Nobody going to mention that the R9 390X seems to perform quite a bit better than the R9 290X?


----------



## rruff (Jun 17, 2015)

btarunr said:


> Would you choose 4 GB DDR3 over 1 GB GDDR5? The choice between HBM and GDDR5 will be similar.



Doesn't seem like a good comparison. The *quantity* of VRAM just needs to be enough to hold the data needed to render a frame. Beyond that, if the VRAM is *slow* it can limit your fps, and if the processor is *slow* then it will limit your fps. Ideally all factors need to be balanced. 

Currently, cards like a GTX 970 or R9 290X are pretty well balanced with 4 GB of GDDR5. Processor speed, VRAM speed, and VRAM quantity will all hit their limits at about the same time. 

But bring out a card with a much faster processor and much faster VRAM, and 4 GB is no longer sufficient. It becomes the limiting factor, because you have the horsepower to run high-res detail, except the VRAM *quantity* isn't enough to buffer it.
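To put some rough numbers on the capacity side of that argument: the frame buffers themselves are tiny next to 4 GB; it's textures and other assets that push past the limit. A minimal sketch, assuming 32-bit pixels and a purely hypothetical five full-screen render targets:

```python
# One full-screen buffer at 4 bytes per pixel, in MiB.
def buffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / 2**20

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "4K": (3840, 2160), "5K": (5120, 2880)}

for name, (w, h) in resolutions.items():
    one = buffer_mib(w, h)
    # Hypothetical deferred renderer with 5 full-screen targets.
    print(f"{name}: {one:6.1f} MiB per buffer, {5 * one:7.1f} MiB for five")
```

Even at 5K, five full-screen targets total well under 300 MiB; when a tool like Afterburner shows nearly 4 GB in use, the bulk of it is asset data, which is why a faster memory bus alone can't substitute for capacity.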


----------



## nem (Jun 17, 2015)

so beautiful  *o*


----------



## dwade (Jun 17, 2015)

Wow.


----------



## xorbe (Jun 17, 2015)

Beats?  Maybe pips.  But I hope it opens a can of whoop-ass, we need progress!


----------



## Ebo (Jun 17, 2015)

I think some of our members have been spreading FUD no matter what.

AMD has launched a new lineup based on completely new tech, with new memory and a bandwidth Nvidia can only dream of.

One member in particular has been *very negative* right from the start. He wants maximum performance for under 400 euros, which is his max... well, that's *NOT* going to happen, and even within his price range he is still unhappy, while his machine is a 3-year-old setup. I just don't understand it at all.

If you want the newest and best, there's a price to pay, and still AMD delivers even this time around, at a lower price than Nvidia as of now. 
As for how some have been bashing AMD for not delivering stable drivers: just look at Nvidia. Their last three drivers haven't been stable and caused a lot of problems for a lot of people, until the last one, or so my friend tells me.

I have made my mind up: I'm going for the Fury X or the Fury. It all comes down to which hits the shelves first, and I really don't care about the price. It can be 500, 600, or 700 euros, I don't care; all I know is that I want it.


----------



## DarthJedi (Jun 17, 2015)

dwade said:


> But 980 ti already slayed the Titan X. And it still has more VRAM than Fury X.



No it does not, LOL. The 980 Ti is a cut-down GM200/Titan X. Some factory-overclocked 980 Tis are faster than a Titan X at stock clocks, but most samples of the Titan X can overclock by 30-60% on stock air and water.


----------



## RCoon (Jun 17, 2015)

the54thvoid said:


> DG Lee produces an inordinate amount of poo.



I am siggin' that, because it makes me chuckle every time I read it.


----------



## Xzibit (Jun 17, 2015)

dwade said:


> Wow.



Those are AMD-slide-favorable numbers, but here's how they compare if they hold true:

*Tom's Hardware*
Titan X
Avg = 39
Min = 33

980 Ti
Avg = 38
Min = 32

R9 290X
Avg = 32
Min = 26


----------



## petedread (Jun 17, 2015)

And I was just about to go back to Nvidia after 5 years by getting a 980 Ti. But if all this is true, then I have to give AMD some much-needed cash/market share.


----------



## samljer (Jun 17, 2015)

The card is not relevant for its given performance...

I have a card now that I use at 1440p that has 4 GB of RAM, and guess what:
it locks in at 72 FPS (vsync, 72 Hz monitor), and as soon as I start turning on the eye candy,
the RAM runs out (long before the GPU can't handle it). GTA V is bad for this for me.

As soon as I see MSI Afterburner show 3998 MB in use, the frame rate drops into the toilet.
It's not the GPU...

So when something this powerful has only 4 GB, it's irrelevant. 
A GPU this powerful can push content using 6 GB framebuffers.
AMD fucked up this card by putting on only 4.

I really wanted this card too, I waited so long; now I'm going to end up with the 980 Ti.


Take my advice, guys... 4 GB is not enough for this GPU.
And if you're gaming at 1080, then you don't need this GPU anyway.
My wife still uses a GTX 680 that pushes 1080 at 200+ FPS in most titles,
with the new stuff still hitting 60-65 without breaking a sweat.


----------



## [XC] Oj101 (Jun 17, 2015)

Lou007 said:


> It also looks like (if these are to be believed) the R9 390X is more than just a rebrand.



What makes you say that? The performance is just above the level of the 290X (according to these graphs).


----------



## DarthJedi (Jun 17, 2015)

samljer said:


> Card not relevant for its given performance....
> 
> I have a card now that i use at 1440p that has 4GB ram and guess what,
> It locks in to 72FPS (vsync 72hz monitor) and as soon as i start turning on the eye candy
> ...



Actually, no.

1. You do need this card at 1080 and 1440, because even then some games won't be playable at 60 or 144 FPS. As long as you don't hold 60 FPS as a minimum, you're not experiencing perfect fluidity; you should have at least 60 FPS on average for smooth gameplay. Then there are 144 Hz monitors at 1080 and 1440 that provide a great experience and need the power.
So yes, you NEED this card.

2. Unfortunately, yes, 4 GB might not be enough for some 4K games, but for some it will be. You don't need AA at 4K anyway, so most games will work fine.

3. However, the games that need more, like GTA V at 5K, don't stop at 6 GB anyway. You need more than that, and the 980 Ti can't help you. If you play at 5K, yes, for some games only the Titan X works, but in most cases even at 4K the Fury X will be fine.


----------



## samljer (Jun 17, 2015)

naxeem said:


> Actually, no.
> 
> 1. You do need this card at 1080 and 1440 because not even then some games will be playable in 60 or 144 FPS. For as long as you don't have 60 FPS as minimum, you're not experiencing the best and perfect fluidity. At least on average you should have 60 FPS for smooth gameplay. Then, we have 144Hz monitors on 1080 and 1440 that provide great experience and need power.
> So yes, you NEED this card.
> ...




1) Frame rate doesn't increase RAM need, sorry. 
2) 4 GB isn't enough for most games at 1440p; this has been my experience for the last 2 years. Where did I say 4K?
3) 6 GB is enough for GTA V with a SINGLE GPU @ 1440p. I suggest you watch the LinusTechTips episode where he proved it.

Thanks for playing, but you sound very abrasive for no reason. I won't come back to this.


----------



## HumanSmoke (Jun 17, 2015)

ZoneDymo said:


> nobody going to mention the R9 390x seems to preform quite a bit better then the r9 290x?


Does it? I wasn't aware the card had been reviewed yet.
The only benchmarks I've seen that have some validity are from people with cards in hand. Performance looks pretty similar.
Of course, if DG Lee is actually benching the cards, I'd assume he must have had the results on screen at some stage - why not post screencaps if that's the case?


----------



## DarthJedi (Jun 17, 2015)

samljer said:


> 1) Frame rate doesn't increase RAM need, sorry.
> 2) 4 GB isn't enough for most games at 1440p; this has been my experience for the last 2 years. Where did I say 4K?
> 3) 6 GB is enough for GTA V with a SINGLE GPU @ 1440p. I suggest you watch the LinusTechTips episode where he proved it.
> 
> Thanks for playing, but you sound very abrasive for no reason. I won't come back to this.



You seem to be very aggressive and trying to hide it. Please hold your horses and avoid making suggestions to people based on false information, or lying to them.

1. Not sure what you are replying to here, since there is no mention of a correlation between VRAM and frame rate anywhere in my post.

2. Your experience might be like that if you specifically play only games that force high VRAM usage. Contrary to your experience, and fortunately for gamers out there, most games do not use more than 3, let alone 4 GB of VRAM at 1440, or even 2160.

3. That was never said. Read it again. Yet another "topic" you reply to with a completely unrelated answer. Intentionally, obviously.



http://www.tweaktown.com/tweakipedia/90/much-vram-need-1080p-1440p-4k-aa-enabled/index.html
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/33.html

You intentionally misinterpret, read what is not written, and reply with answers completely irrelevant and unrelated to what you pretend to be replying to. Then you also make up information and present it as fact.

You seem to be a troll and should be banned ASAP. Hopefully the moderators will remove you and your posts for good.


----------



## Naito (Jun 17, 2015)

This is good news. If Nvidia drop their prices (unlikely, as the products will probably _conveniently_ slot between each other in the price vs. performance metric), I might just buy a GTX 980 Ti...

As always, will wait for W1zzard's brilliant reviews.


----------



## buggalugs (Jun 17, 2015)

AMD did a good job to get this performance from 28 nm, and we get new technology with HBM.

It's not over yet; there should be some interesting non-reference overclocked versions, like an ASUS DCUII or MSI Lightning, etc.

There was a rumour about versions of Fury with GDDR5 to allow for higher VRAM, but either way there should be lots of interesting options based on this GPU. Anyway, 4 GB is enough for the vast majority of gamers.

It's going to be a long wait for Pascal next year.


----------



## ZoneDymo (Jun 17, 2015)

HumanSmoke said:


> Does it? I wasn't aware the card had been reviewed yet?
> The only benchmarks I've seen that have some validity, are from people with cards in hand. Performance looks pretty similar.
> Of course if DG Lee is actually benching the cards, I'd assume he must have had the result on screen at some stage - why not post screencaps if this is the case?



I'm talking about the slides shown in this news post. The title reads
*"Radeon Fury X Outperforms GeForce GTX Titan X, Fury to GTX 980 Ti: 3DMark Bench"*,

and we see in the first slide that the Fury X does indeed outperform the Titan X by... well, about 30 points, which can hardly be called outperforming.

Now, a bigger jump can be seen at the bottom of the slide: the 390X scores about 1,500 points higher than the 290X.
If that is just the clocks, then it makes the "outperforming" of the Titan X by the Fury X even less relevant; a 5 MHz bump should easily level the two out.


----------



## RCoon (Jun 17, 2015)

ZoneDymo said:


> I'm talking about the slides shown in this news post. The title reads
> *"Radeon Fury X Outperforms GeForce GTX Titan X, Fury to GTX 980 Ti: 3DMark Bench"*,
> 
> and we see in the first slide that the Fury X does indeed outperform the Titan X by... well, about 30 points, which can hardly be called outperforming.
> ...



Those slides are made-up benchmarks using educated guesses. They are not actual benchmarks run on the actual hardware. It's entirely guesstimation.


----------



## ZoneDymo (Jun 17, 2015)

RCoon said:


> Those slides are made-up benchmarks using educated guesses. They are not actual benchmarks run on the actual hardware. It's entirely guesstimation.



"according to leaked 3DMark 11 and 3DMark (2013) benches by Korean tech publication ITCM.co.kr."

Maybe not the most credible source, but it goes a bit far to just call them "made up", right?


----------



## Vayra86 (Jun 17, 2015)

ZoneDymo said:


> "according to leaked 3DMark 11 and 3DMark (2013) benches by Korean tech publication ITCM.co.kr."
> 
> Maybe not the most credible source but it goes a bit far to just call them "made up" right?



Come on, it's being done every single day. Ignore this stuff and wait for real-world performance; it's much safer and much closer to the actual facts. We don't even have bench specifics, clocks, etc., all of which you would expect if these benchmarks were real and not a wild guess.

In my opinion, AMD's slideshow falls in the exact same category.


----------



## rruff (Jun 17, 2015)

naxeem said:


> Actually, no.
> 
> 1. You do need this card at 1080 and 1440 because not even then some games will be playable in 60 or 144 FPS. For as long as you don't have 60 FPS as minimum, you're not experiencing the best and perfect fluidity. At least on average you should have 60 FPS for smooth gameplay. Then, we have 144Hz monitors on 1080 and 1440 that provide great experience and need power.
> So yes, you NEED this card.
> ...



You are actually agreeing with him, and he makes a good point. *In some games, even now at 1440p, 4 GB of VRAM limits eye candy.* That's running at 72 fps locked, and he doesn't say what the card is, but it's not one of the latest. Sure, in most games it won't be an issue, and maybe it isn't a big deal to tone down the settings a little. But for some people it is a little disappointing to have a new $600 card with this limitation.


----------



## haswrong (Jun 17, 2015)

xenocide said:


> In DX12 it would, but I wouldn't hold your breath for an abundance of DX12 games.  Developers probably won't release games that fully support DX12 until 2018.


That's OK, I'm not primarily interested in games. I'm interested in the functionality (I was reacting to the drop in performance in the 5K/8K scenarios of Fire Strike, probably due to the 4 GB memory buffer filling up). And if I'm able to use both GPUs from a programmatic point of view, without any special enabling profiles from the AMD driver team, I'd be most satisfied. Buying a dual-GPU card only to be able to use one of the GPUs would be quite an inefficient endeavour.


----------



## GhostRyder (Jun 17, 2015)

Still waiting for reviews from across the web instead of leaks.  I want to see them from everyone, because then we can get a good idea of where the real performance is!

Either way, if this holds true, then it's a great day!


----------



## haswrong (Jun 17, 2015)

Ebo said:


> I think some of our members have been spreading FUD no matter what.
> 
> AMD have launched a new lineup, based on completely new tech with new menory, a bandwith Nvidia can only dream of.
> 
> ...


Look, there is a distant (or not too distant) possibility that AMD rented a technology from Nvidia (who may have developed it themselves or rented it elsewhere) to produce the improved price/performance ratio, just to keep the Nvidia-AMD tandem going. We may be witnessing a perfect cartel without realizing it; you simply can't tell. So 400 bucks for a top-notch card is as possible as 650 bucks, but each under different economic conditions. Does AMD deserve to have their product bought? I don't think there's a natural deserve-system in this world. If you think they deserve it, then buy it at their price and then donate on top of it. There are people in this world for whom it's not easy to spend 400 on one brand-new component. Do they deserve to only buy used or old ones? Your call...


----------



## moproblems99 (Jun 17, 2015)

haswrong said:


> Look, there is a distant (or not too distant) possibility that AMD rented a technology from Nvidia (who may have developed it themselves or rented it elsewhere) to produce the improved price/performance ratio, just to keep the Nvidia-AMD tandem going. We may be witnessing a perfect cartel without realizing it; you simply can't tell. So 400 bucks for a top-notch card is as possible as 650 bucks, but each under different economic conditions. Does AMD deserve to have their product bought? I don't think there's a natural deserve-system in this world. If you think they deserve it, then buy it at their price and then donate on top of it. There are people in this world for whom it's not easy to spend 400 on one brand-new component. Do they deserve to only buy used or old ones? Your call...



Yes, because they are not owed a $600 graphics card for $400 either.  You either have the money or you don't.  I don't have $600, so I won't buy it.  Will I complain that it is $600 and not $400?  No.


----------



## kaligon (Jun 17, 2015)

Damn... you internet "experts" are funny as hell! I still can't figure out how nobody around here realizes you're comparing GDDR5 technology to HBM technology, which is something brand new and much more efficient in terms of data transfer speed. We don't even know how far its performance extends, nor what it can do with those 4 GBs. Stop misinforming people, because some might even believe this bullcrap. Read up on how HBM works and stop making stupid comparisons. The only thing you can count on is the upcoming benchmarks; aside from that, nothing should be taken into consideration.


----------



## rruff (Jun 17, 2015)

kaligon said:


> Damn... you internet "experts" are funny as hell! I still can't figure out how nobody around here realizes you're comparing GDDR5 technology to HBM technology, which is something brand new and much more efficient in terms of data transfer speed. We don't even know how far its performance extends, nor what it can do with those 4 GBs.



Not talking about speed here, but rather capacity. 4 GB of HBM is still 4 GB of digital memory, and when the game needs more than 4 GB to store the information it needs, you will have issues.


----------



## HABO (Jun 17, 2015)

kaligon said:


> Damn... you internet "experts" are funny as hell! I still can't figure out how nobody around here realizes you're comparing GDDR5 technology to HBM technology, which is something brand new and much more efficient in terms of data transfer speed. We don't even know how far its performance extends, nor what it can do with those 4 GBs. Stop misinforming people, because some might even believe this bullcrap. Read up on how HBM works and stop making stupid comparisons. The only thing you can count on is the upcoming benchmarks; aside from that, nothing should be taken into consideration.



ROFL, you are the internet expert... When your cerebral capacity is small, it's irrelevant how fast you can speak; you are still dumb, and that's the problem... you know what I mean. That's why there is a significant performance drop in these "fake" performance tests at 8K resolution...


----------



## Vayra86 (Jun 17, 2015)

Let's just all call each other dumb and be happy about it.

Makes life a lot more enjoyable


----------



## Haytch (Jun 17, 2015)

Wow, so much anger going on over guesstimated benchmarks from an unreliable source.
Allow me to lay down the facts to help resolve the arguing:
1.  The more RAM, the better, regardless of the game, the resolution, etc.
2.  The faster the RAM, the better!
3.  The faster the GPU, the better.
I am sure you get my point.

If you have concerns regarding power draw, or consider these 'highest-end available' consumer products overkill, then perhaps you should not be considering the purchase of the 'highest-end available' model, and go for something aimed at your budget/requirements.

Personally, I play at 4K, except when I am using my racing simulator, which runs at 10440 x 2160 resolution.
Next year I plan to get myself another three 4K screens and mount them, which would result in a 10440 x 4320 resolution.
Why would I do this?  Because I can!
Is this reasonable thinking?  It depends on whether 17 PSI of turbo boost is reasonable in my car, with my Milltek exhaust, clutch upgrade, RS500 brake pads, slotted & vented front/back discs, big-assed intercooler, Dreamscience cold air intake, modded sway bars, Quaife LSD, and so on.

I am proud of AMD R&D.  HBM is a huge leap and will improve with time. I can really see HBM taking cards to 64 GB and maybe beyond.  GPU speed might be a limiting factor, but that's another matter.  It is more a case of whether we can or not.
I applaud AMD for being able to continue to survive, and not only that, for bringing us high-end quality products that make the opposition cry and even bitch.  Nvidia has a much larger market share, a much larger R&D fund, and a much larger net worth, and yet AMD is either on top, even, or slightly behind - not the defeated little company in the corner of the market that Nvidia wants them to be.

Don't get me wrong, I love my Titans and I am not some AMD fanboy. In fact, I think both companies are playing us, and I hate that about both of them, but I can't do anything about it. I can only give credit where it is due.

The last time I was this proud of either AMD or Nvidia was when Nvidia released the 8800 GTX.


----------



## Jax2Elite (Jun 17, 2015)

Let's not forget this is AMD's next generation, and it is only just - I will repeat, "just" - ahead of what is now a nearly 12-month-old Nvidia architecture. If we look at Maxwell I (GTX 750 Ti), then that is actually 16 months old.

I am super happy AMD is finally releasing something that is not going to set my house on fire, and well done to them. Unfortunately, the performance gap is certainly not big enough to suffer drivers that are as bad as Creative Sound Blaster's, with a UI design on par with Windows 95.

Performance is only one part of a much bigger picture.


----------



## Nate00 (Jun 17, 2015)

ironcerealbox said:


> If this is accurate (still too early to tell until more benchmarks are released), then Nvidia might look like this:



LMAO,   that is funny.  Good one.


----------



## Haytch (Jun 17, 2015)

Jax2Elite said:


> Let's not forget this is AMD's next generation, and it is only just - I will repeat, "just" - ahead of what is now a nearly 12-month-old Nvidia architecture. If we look at Maxwell I (GTX 750 Ti), then that is actually 16 months old.
> 
> I am super happy AMD is finally releasing something that is not going to set my house on fire, and well done to them. Unfortunately, the performance gap is certainly not big enough to suffer drivers that are as bad as Creative Sound Blaster's, with a UI design on par with Windows 95.
> 
> Performance is only one part of a much bigger picture.


That's a damn good first post.  Welcome to TechPowerUp!
I do not have a problem with the heat coming from my AMD GPUs, because they seem to have awesome factory fans capable of cooling the cards appropriately. The problem is, they sound like an aircraft landing on your desktop!


----------



## moproblems99 (Jun 17, 2015)

Haytch said:


> Is this reasonable thinking ? Depends if 17PSi Turbo is reasonable in my car with my miltek exhaust, clutch upgrade, RS500 brakepads, Slotted & Vented front/back discs, big-assed intercooler, dreamscience cold air intake, modded swaybars, Quaife LSD and so on.



What kind of car do you have?


----------



## erocker (Jun 17, 2015)

Keep on topic folks. I won't ask again.

Thank you.


----------



## RealNeil (Jun 17, 2015)

Ebo said:


> I think some of our members have been spreading FUD no matter what.
> 
> AMD have launched a new lineup, based on completely new tech with new memory, a bandwidth Nvidia can only dream of.
> 
> ...




The Fury looks good to me too. How and why they're getting such good numbers out of Fury intrigues me no end. Is 4GB of memory somehow relevant again?





samljer said:


> Card not relevant for its given performance....
> 
> AMD FUKED up this card putting only 4.
> 
> ...



Maybe a totally redesigned memory interface and redesigned memory have combined to enhance the performance of that measly 4GB?
Is it not possible that what we know as fact could be changing because of it? Could 4GB of a brand new design change the way we game on our PCs?


----------



## Flazza (Jun 17, 2015)

I am fairly certain you cannot compare 4 GB of GDDR5 with 4 GB of HBM on capacity alone, simply because HBM, being clocked at a much lower speed, doesn't suffer all the re-transmitted bus memory errors you normally get with GDDR5.

Someone will hopefully correct me if I am wrong; I am just going from what I remember of an old white paper on some of the technologies built into GDDR5 (error detection and re-transmission).


----------



## GAR (Jun 17, 2015)

FAKE, FAKE, AND FAKE. Why report BS? Just wait one more week.


----------



## Flazza (Jun 17, 2015)

GAR said:


> FAKE, FAKE, AND FAKE. Why report BS? Just wait one more week.



Very true.

Human nature, I suppose: when little information is presented, we tend to fill in the blanks ourselves.

Not that I will be buying any cards any time soon... I think my dual R9 290s will last a fair bit longer under DX12.


----------



## buildzoid (Jun 17, 2015)

ZoneDymo said:


> Nobody going to mention that the R9 390X seems to perform quite a bit better than the R9 290X?



5% core clock increase and 20% mem clock increase. 10% overall increase. Looks reasonable and not unexpected IMO.
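That back-of-envelope math can be sketched in a few lines of Python. The 70/30 weighting of core versus memory clock gains is purely an assumption for illustration, not a measured figure, but it lands in the same ~10% ballpark:

```python
# Rough estimate of the 390X uplift over the 290X, assuming performance
# scales with a weighted mix of core and memory clock gains.
# The 70/30 core/memory weighting is an assumption, not a measured figure.

def estimated_uplift(core_gain, mem_gain, core_weight=0.7):
    """Blend core and memory clock gains into one rough performance figure."""
    return core_weight * core_gain + (1 - core_weight) * mem_gain

# 290X -> 390X reference clocks: 1000 -> 1050 MHz core, 5.0 -> 6.0 Gbps memory
core_gain = 1050 / 1000 - 1   # +5%
mem_gain = 6.0 / 5.0 - 1      # +20%

print(f"{estimated_uplift(core_gain, mem_gain):.1%}")  # prints 9.5%
```

Obviously games that are bandwidth-bound will see more of the 20% and shader-bound games more of the 5%, so the blend shifts per title.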


----------



## Black (Jun 17, 2015)

haswrong said:


> Look, there is a distant or not-too-distant possibility that AMD rented a technology from Nvidia (who may have developed it themselves or rented it elsewhere) to produce the improved price/performance ratio, just to keep the Nvidia-AMD tandem going. We may be witnessing a perfect cartel without realizing it. You simply can't tell. So $400 for a top-notch card is as possible as $650, but each under different economic conditions. Do AMD deserve to have their product bought? I don't think there's a natural deserve-system in this world. If you think they deserve it, then buy it at their price and donate on top of it. There are people in this world for whom it's not easy to spend $400 on one brand-new component. Do they deserve to only buy used or old ones? Your call..




AMD rented a technology from Nvidia?


----------



## LightningJR (Jun 17, 2015)

If you extrapolate the numbers based on the shader counts of the 290X and the Fury X, you will notice the Fury X beating the Titan X and 980 Ti in some games and losing to them in others. Keep in mind that the 980 Ti and the Titan X perform very closely, so losing to the Titan X or beating the 980 Ti in one title probably also means losing to the 980 Ti or beating the Titan X in another. Still following?

Based on the shader increase alone, the Fury X should be competitive with Nvidia's highest-end cards.

I do think 4GB is not really enough; objectively, 4GB for 4K with future games is not realistic. I understand that AMD may not have the R&D down for 8GB of HBM, so they may not have a choice. I hope DX12 will help future-proof the 4GB of HBM; only time will tell.

I have a 1080p monitor, and getting a minimum framerate of 60 fps in every game at max graphics is my goal, so I don't mind the 4GB of VRAM. But that's just me.
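The extrapolation described above can be written out explicitly. The shader counts and clocks are the published reference specs (2816 shaders at 1000 MHz for the 290X, 4096 at 1050 MHz for the Fury X); everything else about this sketch, starting with the assumption of linear scaling, is napkin math and should be read as an upper bound, not a prediction:

```python
# Naive extrapolation of Fury X performance from 290X shader count and
# clock speed. Real performance never scales linearly with shader count
# (bandwidth, front-end and driver limits intervene), so this is a ceiling.

R9_290X = {"shaders": 2816, "clock_mhz": 1000}   # published reference specs
FURY_X  = {"shaders": 4096, "clock_mhz": 1050}   # published reference specs

def throughput(gpu):
    """Theoretical shader throughput: shader count x core clock."""
    return gpu["shaders"] * gpu["clock_mhz"]

scaling = throughput(FURY_X) / throughput(R9_290X)
print(f"Fury X theoretical scaling over 290X: {scaling:.2f}x")  # prints 1.53x
```

Whether that theoretical ~1.5x is enough to pass a Titan X depends entirely on how far ahead of the 290X the Titan X sits in a given game, which is why the extrapolation has it winning some titles and losing others.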


----------



## xenocide (Jun 17, 2015)

I get the sneaking suspicion the Fury X won't overclock that well either.  I don't think the AIO liquid cooler was an aesthetic choice...


----------



## Xzibit (Jun 17, 2015)




----------



## the54thvoid (Jun 17, 2015)

Xzibit said:


>



Compared to the awful 290X cooler implementation, nobody can say this Fury X design is anything other than spectacular.
For all the criticism of the Hawaii release, AMD appear to have put a shitload of work into this design.  On that alone, we should all applaud AMD for their work.
As for performance, I'll wait for the reviews and not pander to rumour so close to release.  But c'mon people, the Fury X design brief is top fucking notch.  Pardon my French.


----------



## Wshlist (Jun 17, 2015)

Ebo said:


> I think some of our members have been spreading FUD no matter what.
> 
> AMD have launched a new lineup, based on completely new tech with new memory, a bandwidth Nvidia can only dream of.
> 
> ...



Paying top price at the introduction of a top-of-the-line card is a fool's game; you will find a better, cheaper, and fixed product out a few months later, and your money is gone.
But if you have a lot of spare cash and don't care, all the better if you buy it, since that helps the companies and it helps the people who are more sensible with their money.


----------



## Wshlist (Jun 18, 2015)

LightningJR said:


> If you extrapolate the numbers based on shader count of the 290X and the Fury X you will notice the Fury X beating the TitanX and 980ti in some games and losing to them in others. For one, you have to realize that the performance numbers of the 980ti and the TitanX are very close and losing to the Titan X or beating the 980ti will probably also have you losing to the 980ti or beating the TitanX, still following?
> 
> Based on the shader increase only the Fury X should be competitive to NVidias highest end cards.
> 
> ...



I don't think it's about 4K monitors but about VR headsets; those start at around 2.5K, but next-generation offerings might up it a bit.
As for the need for more RAM, they do have direct system-RAM mapping with fast transfers, streaming textures, and their newish texture compression techniques, which are supposed to reduce the need for a lot of RAM in newer games. Supposed to, I say, but will it work out?
It's a tricky thing, because MS and Sony also thought they were all right with their consoles, yet even at E3, in fancy demo videos, I regularly notice poor-resolution textures. That illustrates that whatever the experts predict is perfectly fine might not be quite that fine.
But as you say, that only matters if you actually use 4K displays or future VR headsets and are a bit nitpicky.

As for 8GB of HBM, I would guess it's too tricky (low yields? heat? cost?) to start with. But it does feel odd that my 2011 graphics card has 3GB while 4GB is supposedly enough now. Plus I didn't even touch on the non-game uses graphics cards have these days through DirectCompute, OpenCL and such. But oh well.


----------



## darkangel0504 (Jun 18, 2015)

Calm down ....


----------



## DeadSkull (Jun 18, 2015)

GG, It's over. Nvidia is finished.


----------



## badtaylorx (Jun 18, 2015)

This sucks.  I use 5400x1080 (5K portrait Eyefinity) for my sim racing. I really wanted the new Fury cards, but I may have to go with a couple of 390Xs instead...


----------



## darkangel0504 (Jun 18, 2015)

DeadSkull said:


> GG, It's over. Nvidia is finished.



Nope.


----------



## Yorgos (Jun 18, 2015)

xenocide said:


> That's a game Nvidia wins.  The Fury X is an expensive piece of hardware with still pricey HBM, a pretty complicated manufacturing process, and that AIO liquid cooler.  I'd be amazed if AMD was making anywhere near what Nvidia is making per sale with the 980 Ti.  If Nvidia drops the 980 Ti even to $600 it hurts AMD.  Dropping it to something like $550 and knocking $50 off all their cards on the way down makes AMD irrelevant at most price points...


It's nice to see people wanting something so much that they end up believing it's true.

The deal here is that they cut costs on buying memory chips, and cut costs on the complex, larger PCB, etc.
That's why an 8-core ARM SoC costs something like $5, while an 8-core ARM CPU plus a discrete GPU, a discrete modem, a discrete Wi-Fi module, etc. would cost $5 each, plus the cost of designing the PCB and putting them all together.
Companies are integrating RAM controllers, southbridges, northbridges, networking stuff, accelerators and many other things into one piece of silicon... because it's dirt cheap.
Sorry to burst your bubble; live with it.


----------



## HavocNME (Jun 18, 2015)

This is going to sound stupid but they had multiple memory chips on these boards in the last series, so why can't they put two 4GB HBM stacks on one card to make 8GB of memory until the 8GB HBM stack is perfected?


----------



## [XC] Oj101 (Jun 18, 2015)

HavocNME said:


> This is going to sound stupid but they had multiple memory chips on these boards in the last series, so why can't they put two 4GB HBM stacks on one card to make 8GB of memory until the 8GB HBM stack is perfected?



That would make for a much bigger GPU package, and the consequences of that may include lower yields and higher failure rates. Because the memory is integrated into the GPU package, if the memory fails you bin the whole lot; there's no salvaging the GPU.


----------



## 64K (Jun 18, 2015)

Leaked slide shows the Fury X beating the 980 Ti in every game at 4K

http://videocardz.com/56711/amd-radeon-r9-fury-x-official-benchmarks-leaked


----------



## LightningJR (Jun 18, 2015)

HavocNME said:


> This is going to sound stupid but they had multiple memory chips on these boards in the last series, so why can't they put two 4GB HBM stacks on one card to make 8GB of memory until the 8GB HBM stack is perfected?




It's not a stupid question, but it is difficult to answer. AMD seems to have four stacks of 1GB (8Gbit) HBM on the substrate. Answering why they can't just use 2GB (16Gbit) stacks, or some other mixture, would require digging up technical papers on HBM or having an engineer working in the field give us the answer. I'll admit I didn't search for the answer myself, but if I were to speculate, going from the top of the pyramid: it's either too expensive, too difficult, or just not ready (R&D). A multitude of reasons could lie behind each of those, but they would all be speculation, long-winded, and take a lot of time to type out.


----------



## Crap Daddy (Jun 18, 2015)

64K said:


> Leaked slide shows the Fury X beating the 980 Ti in every game at 4K
> 
> http://videocardz.com/56711/amd-radeon-r9-fury-x-official-benchmarks-leaked



AMD slides, AMD-favored games, AMD in-game settings, and the difference, bar a few games, is insignificant. Not too good at $650. Let's see how they OC. Custom OCed 980 Tis like the Gigabyte monster, for $40 more, should be well above the Fury X.


----------



## Haytch (Jun 18, 2015)

moproblems99 said:


> What kind of car do you have?


Ford Focus XR5 Turbo 2012.  It is referred to as the Ford Focus ST in Europe.  I reside in Australia.

I never knew that Nvidia was planning a mini version of the Titan.  Is that even true?
I also believe that AMD put a lot of time and effort into the new Fury; looking forward to its release and some benchmarks.


----------



## Steevo (Jun 18, 2015)

Crap Daddy said:


> AMD slides, AMD favored games, AMD in game settings and the difference bar a few games is insignificant. Not too good at $650. Let's see how they OC. The custom OCed 980TIs like the Gigabyte monster for 40$ more should be well above Fury X.




Wow.

Batman, AC:U, FC4, and others are Nvidia titles. 

http://www.geforce.com/whats-new/articles/far-cry-4-nvidia-gameworks-trailer

http://www.geforce.com/whats-new/guides/assassins-creed-unity-graphics-and-performance-guide

http://www.geforce.com/whats-new/guides/batman-arkham-origins-graphics-and-performance-guide

But believe whatever you like so you can sleep better at night.


----------



## [XC] Oj101 (Jun 18, 2015)

Haytch said:


> Ford Focus XR5 Turbo 2012.  It is referred to as the Ford Focus ST in Europe.  I reside in Australia.
> 
> I never knew that Nvidia was planning on a mini version of the Titan.  Is that even true ?
> I also believe that AMD put a lot of time and effort into the new Fury, looking forward to its release and some benchmarks.



Look at the date of the Titan Mini article


----------



## HTC (Jun 19, 2015)

64K said:


> Leaked slide shows the Fury X beating the 980 Ti in every game at 4K
> 
> http://videocardz.com/56711/amd-radeon-r9-fury-x-official-benchmarks-leaked



That FRTC part interests me quite a bit: good way to save power.


----------



## Haytch (Jun 19, 2015)

[XC] Oj101 said:


> Look at the date of the Titan Mini article


Ah, I just realized it's an April Fools' joke.  Sorry mate, I didn't notice because I don't care much for April Fools' Day.
If you ask me, in the technology world it seems like every day is April Fools' Day.  Drivers every April, and everyone is a fool.


----------



## Vlada011 (Jun 20, 2015)

I have only one objection, and it's nothing to do with the graphics card itself.
They really should have used normal, simple tubes, like on the Corsair H100i for example, instead of this bad, cheap-looking sleeve. Everything else is great.
I even think the reference R9 390X 8GB from the pictures is the best-designed AMD reference card ever,
much better than last year's and the HD 7970.
ASUS will probably decide to offer something ROG-branded based on the Fury X and R9 390X 8GB,
probably Poseidon models, and maybe they will even improve the AIO design and invest a little more money in it.
But OK, AMD wants to offer for $550-650 what the Titan X offers at $1,000, so they save a little money.
The radiator could have been a thicker one like the CoolStream PE, or some similar type, with normal tubes, but that would probably make the cards more expensive.
Anyway, I think many people will decide to buy the air-cooled model, the Fury for $550.
At the very least the card will offer excellent gaming at 1080p and 1440p. Then, when it becomes possible, AMD will launch a Fury X 8GB immediately, same as with the R9 290X,
for the same price, only a little later.

That's a somewhat different policy from Nvidia's. Nvidia has completely trained its fans; now these poor people even justify the sabotage of a one-year-old premium card.
They say "Nvidia needs to earn money somehow"... Poor Nvidia, they will probably go bankrupt if a big part of the people who bought the 780 Ti decide to keep their cards longer.
Because of that, they need to sabotage fps on the 780 Ti, and for these people that's OK. As if only Nvidia wants profit; AMD and Intel need money too, but they didn't do something like that.

It's not always a question of who makes the better chip. In theory the GTX 780 Ti, I mean the full GK110, is a better chip than Hawaii.
But Nvidia spoils its own chips to push people forward... Hawaii now has both 8GB and higher fps than the GTX 780 Ti... And what, AMD doesn't need money?
Rumors say they will go bankrupt by 2020-2021 at the latest if something doesn't change. A much worse financial situation, much less hype, a much harder time earning money, and they still didn't do that.
Everything AMD earns is from hard work, while a big part of Nvidia's earnings is hype, and rich gamers let Nvidia experiment with Shield, Tegra, etc. with their money.

The situation now: the GTX 980 Ti should cost $50 less than the Fury X, and custom GTX 980 Ti models can't be more expensive than custom Fury X models.
The Titan X is worth maybe $100-150 more than the Fury X. But only if the benchmark tests are real; we still don't know.
Within 24 hours of an Nvidia presentation you could watch ten unboxing videos of different models, with detailed performance for the most important ones, and pictures of custom cards available soon after.
Now, four days on, there is still very little about the whole series.


----------



## Wshlist (Jun 22, 2015)

I said earlier that €500 is my limit for a graphics card, but when I think about it, you normally pay a good 100 for a watercooler, so I guess one should take that into account when determining limits, and when comparing to other cards too.


----------



## Fluffmeister (Jul 15, 2015)

Just revisited this thread, good read.


----------

