
AMD Radeon R9 Fury X 4 GB

Joined
Apr 25, 2013
Messages
127 (0.03/day)
Umm, that's the same story as this review. In some tests the Fury came out on top of the 980 Ti, but when it did it was by 1-2%. In the tests where the 980 Ti came out on top, the margin was larger, so the overall average favours the 980 Ti: even in the games where the Fury X is faster, the 980 Ti delivers almost exactly the same gaming experience. The reverse isn't true, though; the Fury X is significantly slower in several tests, so much so that it might affect detail level or even playability in some cases.
If you count those Gameworks titles, I don't think we would have a nice discussion.
On another note, here is the latest driver for the Fury X
http://www2.ati.com/drivers/amd-catalyst-15.15.1004-software-suite-win8.1-win7-64bit-june20.exe
Notice the june20 part.
 
Joined
Oct 15, 2010
Messages
951 (0.18/day)
System Name Little Boy / New Guy
Processor AMD Ryzen 9 5900X / Intel Core I5 10400F
Motherboard Asrock X470 Taichi Ultimate / Asus H410M Prime
Cooling ARCTIC Liquid Freezer II 280 A-RGB / ARCTIC Freezer 34 eSports DUO
Memory TeamGroup Zeus 2x16GB 3200Mhz CL16 / Teamgroup 1x16GB 3000Mhz CL18
Video Card(s) Asrock Phantom RX 6800 XT 16GB / Asus RTX 3060 Ti 8GB DUAL Mini V2
Storage Patriot Viper VPN100 Nvme 1TB / OCZ Vertex 4 256GB Sata / Ultrastar 2TB / IronWolf 4TB / WD Red 8TB
Display(s) Compumax MF32C 144Hz QHD / ViewSonic OMNI 27 144Hz QHD
Case Phanteks Eclipse P400A / Montech X3 Mesh
Power Supply Aresgame 850W 80+ Gold / Aerocool 850W Plus bronze
Mouse Gigabyte Force M7 Thor
Keyboard Gigabyte Aivia K8100
Software Windows 10 Pro 64 Bits
Joined
Nov 4, 2005
Messages
11,984 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,051 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
If you count those Gameworks titles, I don't think we would have a nice discussion.
On another note, here is the latest driver for the Fury X
http://www2.ati.com/drivers/amd-catalyst-15.15.1004-software-suite-win8.1-win7-64bit-june20.exe
Notice the june20 part.

No doubt the drivers will help performance but you're really quite 'graspy' when it comes to propping up AMD. They don't need your help - the card's good, just not what (people like you) hyped it up to be.

EDIT:

Your link doesn't work.
 
Joined
Apr 26, 2009
Messages
517 (0.09/day)
Location
You are here.
System Name Prometheus
Processor Intel i7 14700K
Motherboard ASUS ROG STRIX B760-I
Cooling Noctua NH-D12L
Memory Corsair 32GB DDR5-7200
Video Card(s) MSI RTX 4070Ti Ventus 3X OC 12GB
Storage WD Black SN850 1TB
Display(s) DELL U4320Q 4K
Case SSUPD Meshroom D Fossil Gray
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Corsair SF750 Platinum SFX
Mouse Razer Orochi V2
Keyboard Nuphy Air75 V2 White
Software Windows 11 Pro x64
Someone at AMD doesn't know the difference between GB (gigabyte) and Gb (gigabit)...
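For anyone double-checking the units, the difference is just a factor of eight; a minimal sketch (the function name is mine, purely illustrative):

```python
# Gigabyte (GB) vs gigabit (Gb): 1 byte = 8 bits, so a capacity in
# gigabytes is eight times the same number expressed in gigabits.
def gb_to_gbit(gigabytes: float) -> float:
    """Convert gigabytes (GB) to gigabits (Gb)."""
    return gigabytes * 8

print(gb_to_gbit(4))  # the Fury X's 4 GB of HBM is 32 Gb, not "4 Gb"
```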

 
Last edited:

64K

Joined
Mar 13, 2014
Messages
6,773 (1.73/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10

Not sure what the situation is there. Warner Brothers has pulled the PC version of the game for the time being. I don't think at this point that they're just having a hissy fit because they got called out for releasing a buggy game, but it's not as if they didn't know from beta testing what the issues with the game were to begin with. Hopefully they will do their job and fix the game.

http://arstechnica.com/gaming/2015/...-pulled-from-steam-and-retailers-due-to-bugs/
 
Joined
Sep 7, 2011
Messages
2,785 (0.58/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Technical question based on statement:
If AMD designed it to work with a water AIO from the start (and not air cooling), this must mean the chip produces a lot of heat? On such a small PCB, would this create heat-dissipation problems?
I think that about sums it up. The big problem, it seems, is that the HBM stacks sit in close proximity to the GPU (and under the same heatsink), while the PCB acts as a heatsink itself for the VRMs. As someone else noted, the high localised temperature is the same scenario, albeit not as drastic, as found on the 295X2: heat from all sides, minimal internal airflow, and reliance on a cold plate on one side to heatsink the entire card. The heat buildup is, by my reckoning, largely behind the decision to voltage-lock the GPU and clock-lock the HBM at Hynix's default settings. We may never know unless PowerColor or some other vendor gets a full-cover waterblock version of the card out, with sanction from AMD to relax the voltage lock (BTW: wasn't there a huge furore when Nvidia voltage-locked their cards?).
I genuinely think the Fury (non-X) is air-cooled at slower clocks because it will struggle thermally.
I concur. AMD's GPUs have about the same power requirement as Nvidia's, but have over the last few generations had greater issues with thermal dissipation (maybe the higher transistor density?). The ideal solution would be (TECs aside) a heatsink fed directly to a vapour chamber, with the HBM stacks cooled by fan air and ramsinks, but I think the HBM stacks' proximity to the GPU makes that a tricky assembly job, as well as adding some unwelcome (for the vendors) expense in machining tolerances.
I really don't think there is any headroom for air cooling, and even then, the water cooling isn't allowing higher clocks. It could be an HBM integration issue, with heat affecting performance? Who knows.
A bad IC on a GDDR5 card means the RMA involves removal and replacing. RMA of a Fury X means the whole package probably gets binned and the GPU salvaged for assembly into another interposer package by Amkor. Warranty returns mean bad PR in general, but the physical cost on a small volume halo product card like the Fury X could also be prohibitive.
But isn't that more a built-in of Nvidia GameWorks' ShadowWorks? Honestly, I personally find Very High looks very... unnatural.
Doesn't matter in the greater scheme of things. How many AMD users vehemently deride, ignore, and refuse to buy any Nvidia sponsored title? Yet the titles still arrive, and if they are AAA titles, sell. If they sell, tech sites are bound by common sense and page views if nothing else to do performance reviews of the titles (and if popular enough include them in their benchmark suites). Benchmarking involves highest image quality at playable settings for the most part, and highest game i.q. in general.
Now, bearing that in mind, and seeing the Dying Light numbers, what do you suppose Nvidia are likely to shoehorn into any of their upcoming sponsored games for maximum quality settings?
Medium is about the top limit where you have "defined but softened shadows" that are more true to life. Even the differences between low and medium are hardly noticeable.
In practical terms, no it doesn't. In marketing terms?...well, that's an entirely different matter. DiRT Showdown's advanced lighting enabled lens flares to warm the heart of JJ Abrams. Minimal advancement in gaming enjoyment (less if overdone lens flare is a distraction), but the default testing scenario.
There have been comparisons to the Gigabyte GeForce GTX 980 Ti G1 review W1zzard did prior to this. Looking at that on the subject of "power" under gaming, it's revealing.
The G1 measured 23% higher in its average/peak than the reference 980 Ti. Avg: 211 vs. 259W; Peak: 238 vs. 293W.
The Fury numbers... Avg 246W; Peak: 280W
One is a non-reference card sporting a very high 20% overclock out of the box; one is a reference card. A 20% overclock equates to 23% more power usage (and 27% higher performance than the Fury X at 4K). What kind of wattage do you suppose a Fury X would consume at 1260 MHz? Now, this observation aside, what the hell does your long-winded power-usage diatribe have to do with my quote that you used to launch into it?
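For what it's worth, that rhetorical question can be put to numbers with the figures above, under the crude assumption that power scales in proportion to clock (real overclocks usually need extra voltage, so actual draw would land higher):

```python
# Figures quoted above (watts, gaming average/peak).
ref_980ti_avg, ref_980ti_peak = 211, 238   # reference GTX 980 Ti
g1_980ti_avg,  g1_980ti_peak  = 259, 293   # Gigabyte G1, ~20% factory OC
fury_x_avg,    fury_x_peak    = 246, 280   # reference Fury X

# The G1's ~20% clock bump cost ~23% more power.
scale = g1_980ti_avg / ref_980ti_avg       # ~1.23, matching the 23% quoted

# Apply the same ratio to the Fury X as a rough lower bound for a
# hypothetical ~1260 MHz (+20%) Fury X.
fury_oc_avg  = fury_x_avg  * scale
fury_oc_peak = fury_x_peak * scale
print(round(scale, 2), round(fury_oc_avg), round(fury_oc_peak))  # 1.23 302 344
```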
 
Last edited:
Joined
Apr 19, 2011
Messages
2,198 (0.44/day)
Location
So. Cal.
what do you suppose Nvidia are likely to shoehorn into any of their upcoming sponsored games for maximum quality settings?
IDK, I suppose we need to see what Warner Bros. can do to fix the PC version of Batman: Arkham Knight?

In practical terms, no it doesn't.
Like belly buttons, we all get an opinion. Mine was that, to me, running the very-high shadow map adds no visual realism.

and 27% higher performance than the Fury X at 4K
Odd, given a 980 Ti and Fury X are close at 4K (~2%)... W1zzard's review of that G1 appears to indicate something like 15% over a reference 980 Ti.

Now, this observation aside, what the hell does your long winded power usage diatribe have to do with my quote that you used to launch into it?
Nothing; that was a new paragraph and a new topic, not directed at you, just comparing published data. And that's the part you "decry" as long-winded?
 
Joined
Jun 11, 2013
Messages
353 (0.08/day)
Processor Core i5-3350P @3.5GHz
Motherboard MSI Z77MA-G45 (uATX)
Cooling Stock Intel
Memory 2x4GB Crucial Ballistix Tactical DDR3 1600
Video Card(s) |Ξ \/ G /\ GeForce GTX 670 FTW+ 4GB w/Backplate, Part Number: 04G-P4-3673-KR, ASIC 68.5%
Storage some cheap seagates and one wd green
Display(s) Dell UltraSharp U2412M
Case some cheap old eurocase, black, with integrated cheap lit lcd for basic monitoring
Audio Device(s) Realtek ALC892
Power Supply Enermax Triathlor 550W ETA550AWT bronze, non-modular, airflow audible over 300W power draw
Mouse PMSG1G
Keyboard oldschool membrane Keytronic 104 Key PS/2 (big enter, right part of right shift broken into "\" key)
Oh. Oh my. This is beautiful. It's exactly what I expected: a card that underperformed, didn't OC for crap, and was overpriced because of the CLC. I can't believe people really expected that a card which shipped stock with an AIO water-cooling solution would OC well. Of course it gets good temps; the cooler is like $100 tacked onto the price tag. The power consumption is still garbage, and the performance at sub-4K resolutions is pretty bad. If this were priced around $500 or even $549.99, it would be a decent product, but at $650 I see literally no reason to buy this thing.



TPU already kind of debunked the idea of games using 4GB+ at higher resolutions. For some reason people refused to believe that games wouldn't quickly climb to using 6GB or 8GB. Then, almost immediately after that article was published here, they reviewed The Witcher 3, which used ~2GB of VRAM at 4K, which kind of solidified the idea that VRAM saturation wasn't a real problem; lazy optimization was. A lot of games can use 6-8GB of VRAM, but most of them don't need to. There are plenty of games that adapt to the amount of VRAM available, too. I remember seeing reviews of BF3 and BF4 where, on cards with more than 4GB of VRAM, the game would use 3.5GB give or take, while on 3GB and 2GB cards it would use nearly all of it, with no real performance loss.

But you are considering only stupid old polygonal approximative graphics, without even raytracing. Next-gen graphics with a lot of recursion is where you can start needing quite an unpretty amount of memory, and that makes hardware with a large amount of it more future-proof. You may have noticed that Nvidia kind of made the first step towards monetizing voxelized graphics with their VXGI 2-bounce illumination model (with a shipload of pep talk trying to suggest that Nvidia actually invented everything that spells graphics). What I wanted to say is that waiting for HBM v2 is just another wait and more lost time, and Fury is a product one year late. It's been at least two years of prolonging, rebranding old stuff, and losing time from both camps (and Intel comfortably waits in hiding with their shoddy iGPUs integrated into CPUs, mostly increasing their prices with little use). What strikes me, though, is that many people jumped on the train of "competition is a needed thing". In fact, it is only needed by a minority of creatures (or groups thereof) of a predatory nature, who use its effect to eliminate or control a subject that poses a threat, because it's basically a war: whoever runs out of resources first loses, or at least isn't allowed the originally intended share of the outcome by the will of the stronger entity. On the other hand, there's a much greater power called synergy, which is usually beneficial for all partakers thanks to unity and coherency. I wonder when this world will finally realize that and start spelling the word. Probably not before there are fewer than half a billion people left after the mutual wars we are heading towards, as members of society keep preying on each other because of competition, and because we do what politicians backed by enforcers say through media outlets, not what we would stand for ourselves with our free but scattered, disunited, incoherent minds. Whatever...
The price is unbearable for me anyway, regardless of the hardware's performance... see ya in the next discussion. :ohwell:

Compassion and Science are the only things that can save mankind.
true
 
Last edited:
Joined
Sep 7, 2011
Messages
2,785 (0.58/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Odd, given a 980 Ti and Fury X are close at 4K (~2%)... W1zzard's review of that G1 appears to indicate something like 15% over a reference 980 Ti.
My bad on using a wrong baseline figure, but your 15% is wrong.


100 / 85 = 1.1765, or 17.65% higher. To place the G1's percentage in the Fury X chart, you have to normalize it against the 102% shown for the reference card: 120% for the G1 divided by 102% for the reference gives the same 17.65%. And 120% against the Fury X's 100% baseline makes the G1 20% faster than the Fury X.
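Spelled out, with the chart figures quoted in the post (the helper name is mine, purely illustrative):

```python
def pct_faster(card_pct: float, baseline_pct: float) -> float:
    """How much faster card_pct is than baseline_pct, in percent.
    Chart bars are index values, so ratios matter, not differences."""
    return (card_pct / baseline_pct - 1) * 100

# G1 review chart: one card at 100%, the other at 85%.
print(round(pct_faster(100, 85), 2))   # 17.65
# Fury X review chart: Fury X = 100%, reference 980 Ti = 102%, G1 = 120%.
print(round(pct_faster(120, 102), 2))  # 17.65 -- same gap, different baseline
print(round(pct_faster(120, 100), 1))  # 20.0  -- G1 vs the Fury X itself
```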
Nothing that was a new paragraph, and new topic not directed at you, just comparing published data. While, that’s the part you "decry" as long winded?
Just seemed a bit defensive, and it hinged largely on the one review that had the Fury X pulling less power than the 980 Ti/Titan X (although compute is a totally different story).
While the differences are marginal in a lot of cases, the consensus is that the card (and system, which would also bring AMD's driver overhead / CPU load into the reckoning) isn't more frugal than the Titan X/980 Ti overall, although in certain game titles that may be the case. Hardware France, true to Damien's exacting test protocol, sourced a second Fury X in addition to the AMD-provided review sample. The differences are quite marked.


I might also add the sites that measured the Fury X's power demand as lower than the 980 Ti/Titan X:
Tom's Hardware, Hardware Heaven (355W 980Ti, 350W Fury X)
And the sites that measured the Fury X's power demand as greater than the 980 Ti/Titan X:
TechPowerUp, Hardware France, Guru3D, bit-tech, Hot Hardware, ComputerBase, PC Perspective, Tech Report, HardOCP, Digital Storm, Tweaktown, Hexus, HardwareBG, SweClockers, Legit Reviews, Hardware.info, Eurogamer, Overclock3D, Forbes, Hardware Canucks, PC World, Hardwareluxx, and PCGH (who also test power consumption with two games)
 
Last edited:
Joined
Apr 19, 2011
Messages
2,198 (0.44/day)
Location
So. Cal.
but your 15% is wrong.
When you started with 27%.... I'm less wrong!

Just seemed a bit defensive
Nope, not me... I'm not someone who's saying the Fury X is ever going to be seen as "pulling less power", but it is an improvement (yes, some or a good portion of that is HBM) over Hawaii. Not sure why you seem to put me in that group?

Hardwareluxx has in the past done some sophisticated power testing, and I would concur with their finding of 13% above a 980 Ti. That's a lot better than W1zzard's figure of 18%, though if I had to put a round number on it I'd say 15%. Looking at Hardwareluxx and their OC, they got a really good 1185MHz (13%) and saw a 9-10% bump in FPS in several titles, though power went up 17%. So yes, Fiji should not have had an executive shooting his mouth off about OC'ing.

Like I said earlier, at Tom's I can't find that they tested a reference 980 Ti, so I'm not sure how they arrived at that.
Day off tomorrow (so my Friday night), and a pleasant Southern California evening... done.
 
Joined
Apr 25, 2013
Messages
127 (0.03/day)
Given that Furmark was still used in this bench, what would you expect here? Do you guys even realize that the nVidia card consumes less power in Furmark than in games? Does that really qualify as a maximum-consumption test?

And even in the game test, what is the point of comparing peak power across different architectures? It should be the average consumption, as at Tom's. It's similar to max fps and avg fps; the latter is what matters.

Then people proceed to hail nVidia and boo AMD based on those flawed numbers. What a joke.
 
Joined
Apr 3, 2012
Messages
4,370 (0.95/day)
Location
St. Paul, MN
System Name Bay2- Lowerbay/ HP 3770/T3500-2+T3500-3+T3500-4/ Opti-Con/Orange/White/Grey
Processor i3 2120's/ i7 3770/ x5670's/ i5 2400/Ryzen 2700/Ryzen 2700/R7 3700x
Motherboard HP UltraSlim's/ HP mid size/ Dell T3500 workstation's/ Dell 390/B450 AorusM/B450 AorusM/B550 AorusM
Cooling All stock coolers/Grey has an H-60
Memory 2GB/ 4GB/ 12 GB 3 chan/ 4GB sammy/T-Force 16GB 3200/XPG 16GB 3000/Ballistic 3600 16GB
Video Card(s) HD2000's/ HD 2000/ 1 MSI GT710,2x MSI R7 240's/ HD4000/ Red Dragon 580/Sapphire 580/Sapphire 580
Storage ?HDD's/ 500 GB-er's/ 500 GB/2.5 Samsung 500GB HDD+WD Black 1TB/ WD Black 500GB M.2/Corsair MP600 M.2
Display(s) 1920x1080/ ViewSonic VX24568 between the rest/1080p TV-Grey
Case HP 8200 UltraSlim's/ HP 8200 mid tower/Dell T3500's/ Dell 390/SilverStone Kublai KL06/NZXT H510 W x2
Audio Device(s) Sonic Master/ onboard's/ Beeper's!
Power Supply 19.5 volt bricks/ Dell PSU/ 525W sumptin/ same/Seasonic 750 80+Gold/EVGA 500 80+/Antec 650 80+Gold
Mouse cheap GigaWire930, CMStorm Havoc + Logitech M510 wireless/iGear usb x2/MX 900 wireless kit 4 Grey
Keyboard Dynex, 2 no name, SYX and a Logitech. All full sized and USB. MX900 kit for Grey
Software Mint 18 Sylvia/ Opti-Con Mint KDE/ T3500's on Kubuntu/HP 3770 is Win 10/Win 10 Pro/Win 10 Pro/Win10
Benchmark Scores World Community Grid is my benchmark!!
1st World issues, why do they amuse me?

:confused:

:lovetpu:
 
Joined
Nov 3, 2011
Messages
695 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H150i Elite LCD XT White
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K FreeSync/GSync DP, LG 32UL950 32in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
Isn't it possible that you are the one who suffers from fanboyism and is spreading FUD? That table can't be right... Maxwell 2 supports it for sure; more than that, IIRC it's mandatory for every DX12 GPU to support it.


Yes, AMD cards will gain a bigger speed increase from DX12, but we don't know how that will actually translate into real-life gaming performance, and I'm so tired of this benchmark being linked on every tech site. Do you even realize it's an API OVERHEAD(!) test, or do you just see the bigger number and the longer bar?
3DMark's API overhead benchmark also tests the GPU's command I/O ports and pathways.
 
Joined
Sep 22, 2012
Messages
1,010 (0.23/day)
Location
Belgrade, Serbia
System Name Intel® X99 Wellsburg
Processor Intel® Core™ i7-5820K - 4.5GHz
Motherboard ASUS Rampage V E10 (1801)
Cooling EK RGB Monoblock + EK XRES D5 Revo Glass PWM
Memory CMD16GX4M4A2666C15
Video Card(s) ASUS GTX1080Ti Poseidon
Storage Samsung 970 EVO PLUS 1TB /850 EVO 1TB / WD Black 2TB
Display(s) Samsung P2450H
Case Lian Li PC-O11 WXC
Audio Device(s) CREATIVE Sound Blaster ZxR
Power Supply EVGA 1200 P2 Platinum
Mouse Logitech G900 / SS QCK
Keyboard Deck 87 Francium Pro
Software Windows 10 Pro x64
AMD loses to NVIDIA again, and is again weaker than two high-end GeForces.
I'm an NVIDIA fan, but I'm a little sad for AMD. Somehow, no matter what they try, they can't offer customers the same performance as NVIDIA.
Even several months after NVIDIA, they are unable to deliver the same performance.
I would really like AMD to at least sell enough graphics cards to cover production and earn something to stay in existence.
NVIDIA's dominance is huge... and they know everything. The whole time, they knew AMD couldn't match even the slower GM200.
One more thing is very bad: customers love to OC; cards are popular largely because of overclocking. The Fury X doesn't need 8+8 pins; it could work with 6+6.
Anyway, customers can't OC it more than 5-6%, maybe 10%, and probably less in games. Meanwhile a GTX 980 Ti goes up 200-250MHz, and the difference between them grows and grows...
Let's not even talk about an overclocked TITAN X vs an overclocked Fury X... I'm afraid that when everything settles down, the Fury X will sit exactly midway between the GTX 980 and GTX 980 Ti.
It's 30-40% stronger than the R9 290X. People expect drivers to bring a total of 10% improvement in every situation... that's impossible even in NVIDIA's best moments, let alone AMD's.
Customers expected one clear win, at least 2-3% more powerful than the TITAN X, or maybe the same, just so nobody could say "yes, the TITAN X is stronger"... I mean, because NVIDIA is better and because I love EVGA, I will continue with GeForce; I said that even when I thought AMD would win. But there's no reason for anybody to wish bad things on AMD; I remember they served me well for 10 years, until Cayman.
I still have good memories of Radeon from before they started losing to NVIDIA and GeForce became the better option for gaming, no doubt about that.
 
Last edited:
Joined
Mar 13, 2012
Messages
396 (0.09/day)
Location
USA
AMD loses to NVIDIA again, and is again weaker than two high-end GeForces.
I'm an NVIDIA fan, but I'm a little sad for AMD. Somehow, no matter what they try, they can't offer customers the same performance as NVIDIA.
Even several months after NVIDIA, they are unable to deliver the same performance.
I would really like AMD to at least sell enough graphics cards to cover production and earn something to stay in existence.
NVIDIA's dominance is huge... and they know everything. The whole time, they knew AMD couldn't match even the slower GM200.
One more thing is very bad: customers love to OC; cards are popular largely because of overclocking. The Fury X doesn't need 8+8 pins; it could work with 6+6.
Anyway, customers can't OC it more than 5-6%, maybe 10%, and probably less in games. Meanwhile a GTX 980 Ti goes up 200-250MHz, and the difference between them grows and grows...
Let's not even talk about an overclocked TITAN X vs an overclocked Fury X... I'm afraid that when everything settles down, the Fury X will sit exactly midway between the GTX 980 and GTX 980 Ti.
It's 30-40% stronger than the R9 290X. People expect drivers to bring a total of 10% improvement in every situation... that's impossible even in NVIDIA's best moments, let alone AMD's.
Customers expected one clear win, at least 2-3% more powerful than the TITAN X, or maybe the same, just so nobody could say "yes, the TITAN X is stronger"... I mean, because NVIDIA is better and because I love EVGA, I will continue with GeForce; I said that even when I thought AMD would win. But there's no reason for anybody to wish bad things on AMD; I remember they served me well for 10 years, until Cayman.
I still have good memories of Radeon from before they started losing to NVIDIA and GeForce became the better option for gaming, no doubt about that.


Nvidia has what, 20+ times the budget AMD has? And AMD has to divide their budget across more things than Nvidia, with fewer engineers?

I'd say AMD is doing well to offer what they do, all things considered.
 
Joined
Nov 3, 2011
Messages
695 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H150i Elite LCD XT White
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K FreeSync/GSync DP, LG 32UL950 32in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
AMD loses to NVIDIA again, and is again weaker than two high-end GeForces.
I'm an NVIDIA fan, but I'm a little sad for AMD. Somehow, no matter what they try, they can't offer customers the same performance as NVIDIA.
Even several months after NVIDIA, they are unable to deliver the same performance.
I would really like AMD to at least sell enough graphics cards to cover production and earn something to stay in existence.
NVIDIA's dominance is huge... and they know everything. The whole time, they knew AMD couldn't match even the slower GM200.
One more thing is very bad: customers love to OC; cards are popular largely because of overclocking. The Fury X doesn't need 8+8 pins; it could work with 6+6.
Anyway, customers can't OC it more than 5-6%, maybe 10%, and probably less in games. Meanwhile a GTX 980 Ti goes up 200-250MHz, and the difference between them grows and grows...
Let's not even talk about an overclocked TITAN X vs an overclocked Fury X... I'm afraid that when everything settles down, the Fury X will sit exactly midway between the GTX 980 and GTX 980 Ti.
It's 30-40% stronger than the R9 290X. People expect drivers to bring a total of 10% improvement in every situation... that's impossible even in NVIDIA's best moments, let alone AMD's.
Customers expected one clear win, at least 2-3% more powerful than the TITAN X, or maybe the same, just so nobody could say "yes, the TITAN X is stronger"... I mean, because NVIDIA is better and because I love EVGA, I will continue with GeForce; I said that even when I thought AMD would win. But there's no reason for anybody to wish bad things on AMD; I remember they served me well for 10 years, until Cayman.
I still have good memories of Radeon from before they started losing to NVIDIA and GeForce became the better option for gaming, no doubt about that.


The Fury X is competitive against the 980 Ti at very high resolutions. Remember, AMD hasn't enabled DX11 MT drivers on Windows 8.1.

I'd rather see benchmarks done on Windows 10, e.g. Project Cars shows a frame-rate uplift on the R9-280 under Windows 10. Windows 10 forces DX11 MT.

 
Last edited:
Joined
Sep 7, 2011
Messages
2,785 (0.58/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Nvidia has what, 20+ times the budget AMD has? And AMD has to divide their budget for more things than Nvidia
No and Yes.
No, Nvidia doesn't have 20+ times the budget. Nvidia's R&D for the past year amounted to $1.36bn; AMD's R&D came to $1.04bn.
Yes, the lion's share of AMD's R&D should be going into K12 and Zen (especially the latter) and the platform as a whole, but the split wouldn't be anywhere close to 10:1 in favour of processors.
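For scale, those R&D figures put the actual ratio at roughly 1.3:1, nowhere near 20:1; a quick check:

```python
# Trailing-year R&D spend quoted above, in USD.
nvidia_rd = 1.36e9
amd_rd    = 1.04e9

# Ratio of the two budgets.
print(round(nvidia_rd / amd_rd, 2))  # 1.31 -- about 1.3x, not 20x
```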
I'd say AMD is doing well to offer what they do, all things considered.
Too early to say. Fiji isn't going to make or break the company. Sales volumes for $650 cards aren't particularly high, and I'm estimating that it costs AMD a hell of a lot more to build a Fury X than it does Nvidia and its partners to churn out GTX 980 Tis. With the Fury X missing the halo of "world's fastest GPU", they will need to get a dual-GPU card up and running very quickly, but that again will be a double-edged sword: two expensive high-performance GPUs will need at least a 240 radiator, which all adds to the bill of materials. Somehow I don't see the company turning their market share around much unless they sacrifice the average selling prices of the rest of the Fury line, and the product stack under it.
If the company gets Zen out the door in a timely manner, and the platform lives up to AMD's PR (not a given, based on recent history), they should/could be OK. If Zen flops and/or is late out of the gate, the AMD Financial Analyst Day in 2017 might look like this
 
Joined
Nov 3, 2011
Messages
695 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H150i Elite LCD XT White
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K FreeSync/GSync DP, LG 32UL950 32in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
No and Yes.
No, Nvidia doesn't have 20+ times the budget. Nvidia's R&D for the past year amounted to $1.36bn. AMD's R&D came to $1.04bn
Yes, the lions share of AMD's R&D should be going into K12 and Zen (especially the latter) and the platform as a whole, but the split wouldn't be anywhere close to 10:1 in favour of processors.

Too early to say. Fiji isn't going to make or break the company. Sales volumes for $650 cards aren't particularly high, and I'm estimating that it costs AMD a hell of a lot more to build a Fury X than it does for Nvidia and its partners to churn out GTX 980 Ti's. With Fury X missing the halo of "worlds fastest GPU", they will need to get a dual-GPU up and running very quickly - but that again will be a double-edged sword. Two expensive high performance GPUs will need a 240 radiator at least - all adds to the bill of materials. Somehow I don't see the company turning their market share around too much unless they sacrifice average selling prices of the rest of the Fury line- and the product stack under it.
If the company gets Zen out the door in a timely manner, and the platform lives up to AMD's PR (not a given based on recent history), they should/could be OK. If Zen flops and/or is late out of the gate, AMD's Financial Analyst Day in 2017 might look like this
According to http://www.streetwisereport.com/adv...t-intel-nvidia-corporation-nasdaqnvda/120113/

Qualcomm is expected to make an M&A (merger or acquisition) offer for AMD. Qualcomm already pushed NVIDIA out of mobile phones.
 
Joined
Sep 7, 2011
Messages
2,785 (0.58/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
According to http://www.streetwisereport.com/adv...t-intel-nvidia-corporation-nasdaqnvda/120113/

Qualcomm is expected to make an M&A (merger or acquisition) offer for AMD. Qualcomm already pushed NVIDIA out of mobile phones.
Qualcomm could fund the buyout from pocket change, but I'll believe it when I see it. Seems like every bad financial quarter brings a rumour of someone buying AMD (Samsung, BLX, Xilinx, and the list goes on), and AMD's stock price then magically firms up just before the earnings call.
 
Joined
Oct 15, 2010
Messages
208 (0.04/day)
Lots of negative comments here, I might add.
What I haven't seen are any DX12 benchmarks, because I have a feeling this little pinky is going to blow everything out of the water. In the end, DX12 is the future, not DX11 (which is what these benchmarks measure).
Although, to be honest, I might add that we must wait at least a generation before DX12 becomes mainstream.
 
Joined
Nov 26, 2013
Messages
816 (0.20/day)
Location
South Africa
System Name Mroofie / Mroofie
Processor Intel Core i5 4460 3.2GHz, Turbo Boost 3.4GHz
Motherboard Gigabyte B85M-HD3
Cooling Stock Cooling
Memory Apacer DDR3 1333mhz (4GB) / Adata DDR3 1600Mhz(8GB) CL11
Video Card(s) Gigabyte Gtx 960 WF
Storage Seagate 1TB / Seagate 80GB / Seagate 1TB (another one)
Display(s) Philips LED 24 Inch 1080p 60Hz
Case Zalman T4
Audio Device(s) Meh
Power Supply Antec Truepower Classic 750W 80 Plus Gold
Mouse Meh
Keyboard Meh
VR HMD Meh
Software Windows 10
Benchmark Scores Meh
Given that Furmark was still used in this bench, what would you expect? Do you guys even realize that NVIDIA cards consume less power in Furmark than in games? Is that really suitable as a maximum-consumption test?

And even in the game test, what is the point of comparing peak power across different architectures? It should be average consumption, as in Tom's. It's like max FPS versus average FPS; the latter is what matters.

Then people proceed to hail NVIDIA and boo AMD based on those flawed numbers, what a joke.
whaaat ?? o_Oo_Oo_O
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,842 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
And even in the game test, what is the point of comparing peak power across different architectures? It should be average consumption, as in Tom's. It's like max FPS versus average FPS; the latter is what matters.
You do realize that we have "average" gaming power draw numbers in our reviews? They are clearly marked as "average", and we provide additional data points for interested readers.
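The peak-versus-average distinction being argued above can be shown with a toy calculation. This is a minimal sketch with made-up wattage figures, not measurements from either card: it just illustrates how a card with a higher momentary peak can still have a lower average draw over a run.

```python
def peak_and_average(samples):
    """Return (peak, average) of a list of power-draw samples in watts."""
    return max(samples), sum(samples) / len(samples)

# Hypothetical 1-second power samples (watts) for two cards during a game run.
card_a = [180, 320, 190, 200, 185]  # brief spike to 320 W, otherwise modest draw
card_b = [250, 255, 260, 250, 245]  # steady draw, lower peak but higher average

peak_a, avg_a = peak_and_average(card_a)
peak_b, avg_b = peak_and_average(card_b)
print(peak_a, avg_a)  # 320 215.0
print(peak_b, avg_b)  # 260 252.0
```

Here card A "loses" on peak power yet "wins" on average, which is why a review reporting both numbers (as W1zzard notes) gives a fuller picture than either figure alone.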
 
Joined
Jun 11, 2013
Messages
353 (0.08/day)
Processor Core i5-3350P @3.5GHz
Motherboard MSI Z77MA-G45 (uATX)
Cooling Stock Intel
Memory 2x4GB Crucial Ballistix Tactical DDR3 1600
Video Card(s) |Ξ \/ G /\ GeForce GTX 670 FTW+ 4GB w/Backplate, Part Number: 04G-P4-3673-KR, ASIC 68.5%
Storage some cheap seagates and one wd green
Display(s) Dell UltraSharp U2412M
Case some cheap old eurocase, black, with integrated cheap lit lcd for basic monitoring
Audio Device(s) Realtek ALC892
Power Supply Enermax Triathlor 550W ETA550AWT bronze, non-modular, airflow audible over 300W power draw
Mouse PMSG1G
Keyboard oldschool membrane Keytronic 104 Key PS/2 (big enter, right part of right shift broken into "\" key)
Lots of negative comments here, I might add.
What I haven't seen are any DX12 benchmarks, because I have a feeling this little pinky is going to blow everything out of the water. In the end, DX12 is the future, not DX11 (which is what these benchmarks measure).
Although, to be honest, I might add that we must wait at least a generation before DX12 becomes mainstream.
It's probably the lousy 64 compute units (when as many as twice that were expected by the crowd). *shrug*
AMD ends up creating average console hardware where PC enthusiast hardware is craved. :cry:
So for true technology-progress enthusiasts, it's a bit of a letdown after what Nvidia pulled off with their flexible overclocking, and after so much waiting for AMD to deliver on their heap of promises.
 
Last edited:

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,578 (2.86/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
AMD ends up creating average console hardware where PC enthusiast hardware is craved. :cry:

Fury X is not average console hardware, if that's what you mean.
 