
AMD Radeon R9 Nano Nears Launch, 50% Higher Performance per Watt over Fury X

Don't believe me? Read this explanation by a retired AMD GPU engineer. This guy REALLY knows what he's talking about; he's no fanboy, he's an actual, bona fide expert. His posts (go ahead and read through the thread and be prepared to be blown away) should silence the idiotic Nvidia fanboyism on here:
I cannot at this juncture, because their offerings, false marketing, and lies left a bitter taste in my mouth. Never mind that they have failed to keep high-end SKUs in stock locally.

I made many AMD GPU purchases based on tech they touted as new, but not once did they actually deliver on it until many, many months later, if at all. Crossfire shipped with a broken cursor. Eyefinity's first showing was Crossfire running Crysis on three Dell 30-inch screens; it never did work at native res, although they had live demos running at LANs, and when I was there they wouldn't let me see the back of the monitors or the PC. Then there were the frame-time problems I complained about for years before they acknowledged them, and they only did so when they had ZERO choice. CTM never made it to prime time (a precursor to DX12), and Mantle, the new CTM backed by better code, is still used by no more than ten or so games right now.

It's not that I disfavor AMD; it's that they fail to deliver, while for me personally, and my own purchases, Nvidia HAS delivered. Nvidia touts features... and they work! What a novel idea!
@cadaveca said pretty much what has been the truth about AMD for the last three years: they make claims about their products but don't live up to them, and they announce "new" GPUs that are in fact rebrands (the 300 series, for example). In terms of pushing the industry forward with innovative things like ShadowPlay and G-Sync, AMD has been two steps behind, buying in another company to do their recording and throwing together tech to compete with G-Sync in six months, shipping it way before it was ready.

Both AMD and Nvidia are deceitful at times, so I would rather wait until some DX12 games drop before deciding.
That said, AMD has been the more deceitful over the last 3-4 years, with claims of a given level of performance while usually delivering a bit less.

So what if AMD seems to have higher performance in a game that was originally built around their third-party, locked-up API? That performance transferred over to DX12, where Nvidia is starting from square one. AMD had what, almost a year's head start working with it, where I doubt Nvidia had more than a month or two. And please don't say Nvidia had access to the source for a year; they had access to the DX11 path, but the DX12 path is likely a much different story.
Last fact: this is ONE game, in ALPHA. It doesn't mean very much.
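To put the "square one" point in perspective: under DX11, the driver schedules and validates work behind a single immediate context, so years of vendor driver tuning do a lot of the lifting; under DX12, the engine records and submits command lists itself. Here's a rough C++ sketch of the difference against the real D3D APIs. Setup (device, queue, allocator, PSO) is omitted, so treat it as an illustration, not working engine code:

    #include <d3d11.h>
    #include <d3d12.h>

    // D3D11: one call into the driver, which validates, reorders and
    // submits behind the scenes. Vendor tuning lives in that black box.
    void DrawD3D11(ID3D11DeviceContext* ctx, UINT indexCount)
    {
        ctx->DrawIndexed(indexCount, 0, 0);
    }

    // D3D12: the engine records, transitions and submits explicitly,
    // so driver-side DX11 tricks carry over far less.
    void DrawD3D12(ID3D12GraphicsCommandList* cl, ID3D12CommandQueue* queue,
                   ID3D12CommandAllocator* alloc, ID3D12PipelineState* pso,
                   UINT indexCount)
    {
        alloc->Reset();
        cl->Reset(alloc, pso);   // engine-managed recording
        // ...root signature, descriptor heaps, resource barriers go here...
        cl->DrawIndexedInstanced(indexCount, 1, 0, 0, 0);
        cl->Close();
        ID3D12CommandList* lists[] = { cl };
        queue->ExecuteCommandLists(1, lists);   // explicit submission
    }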
 
So what if AMD seems to have higher performance in a game that was originally built around their third-party, locked-up API? That performance transferred over to DX12, where Nvidia is starting from square one. AMD had what, almost a year's head start working with it, where I doubt Nvidia had more than a month or two. And please don't say Nvidia had access to the source for a year; they had access to the DX11 path, but the DX12 path is likely a much different story.
Last fact: this is ONE game, in ALPHA. It doesn't mean very much.

So your rationale is that Microsoft, which consults with Intel/AMD/Nvidia to come up with the DX12 standard, somehow excluded Nvidia from the process?

Nvidia also holds the presidency of the Khronos consortium, which adopted Mantle into Vulkan.

Nvidia was the first to team up with Microsoft for a DX12 demo, well over a year ago now.

Nvidia is well placed to know where things are going well in advance.

And don't forget how bad that same DX12 test makes AMD look in DX11.

Apples-to-Apples:

[benchmark graphs]
*I know it's just one game with HairWorks.

Ashes of the Singularity wasn't so much about AMD pulling ahead as about Nvidia falling behind.
 
I almost bought the 380; that last graph makes me so happy I didn't.
 
So your rationale is that Microsoft, which consults with Intel/AMD/Nvidia to come up with the DX12 standard, somehow excluded Nvidia from the process?

Nvidia also holds the presidency of the Khronos consortium, which adopted Mantle into Vulkan.

Nvidia was the first to team up with Microsoft for a DX12 demo, well over a year ago now.
Nvidia is well placed to know where things are going well in advance.
And don't forget how bad that same DX12 test makes AMD look in DX11.
Apples-to-Apples
*I know it's just one game with HairWorks.
Ashes of the Singularity wasn't so much about AMD pulling ahead as about Nvidia falling behind.
In The Witcher, is it really apples-to-apples, or does AMD use the setting in their control panel to DUMB down tessellation to a lower level? It wouldn't shock me if that's what they did, and I'm sure if most people knew about that little setting they would wonder as well.
That's one thing you have to wonder about, but even Nvidia having worked on DX12 for a year-plus doesn't mean that much when it comes to a single game.
 
In The Witcher, is it really apples-to-apples, or does AMD use the setting in their control panel to DUMB down tessellation to a lower level? It wouldn't shock me if that's what they did, and I'm sure if most people knew about that little setting they would wonder as well.
That's one thing you have to wonder about, but even Nvidia having worked on DX12 for a year-plus doesn't mean that much when it comes to a single game.

That same logic can be applied to anything.

We won't know if AMD is dumbing anything down. You like to point to tessellation; others point to Nvidia and DirectCompute. Unless both suddenly become transparent about their drivers, there won't be an answer.
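For what it's worth, the override being argued about isn't magic in either direction: conceptually, it just clamps whatever tessellation factor the game requests to a ceiling the user picks in the control panel. A minimal C++ sketch of the idea, my own illustration rather than anything from AMD's actual driver:

    #include <algorithm>

    // Hypothetical driver-side tessellation override: the game requests a
    // tessellation factor per patch edge, and the driver clamps it to the
    // user-selected cap before it reaches the tessellator.
    float ApplyTessellationOverride(float requestedFactor, float userCap)
    {
        // e.g. game asks for 64x, user cap is 16x -> tessellator sees 16x;
        // geometry detail drops, frame rate rises.
        return std::min(requestedFactor, userCap);
    }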
 
I almost bought the 380; that last graph makes me so happy I didn't.
Which part... that neither offers playable Witcher with AA on, 16x AF, High post-processing with HBAO+, Ultra graphics, and default HairWorks (normally at 8x) for around $200?

So what do you intend? To pay more, rather than setting HairWorks to a lower setting (4x) to get it playable? The Witcher's canned benchmark is impractical for judging whether a $200 card is worthy of, or practical for, an immersive graphics experience. Nvidia wants you to think you must go to a $300 card to make the experience enjoyable, and that's just disingenuous. There are plenty of "tweaks" for foliage, grass, and shadows, all maxed out in the benchmark, that in my opinion don't meaningfully enhance the immersive experience.

Have a look at the difference there between hair, foliage, grass, and shadows. Some, like shadows at 8x (the default), look too razor-sharp, not like the naturally soft shadows that appear in the real world (go outside and look). For HairWorks, their comparison has you looking at the guy's head in a way I don't believe the game ever gets that close, and at that distance the hair at 8x still looks horrible to me; between 8x and 4x, perhaps the most improved area is the sideburns. So are you saying you're okay paying $100 more for "in your face" sideburns, and that doesn't burn you?

http://www.geforce.com/whats-new/gu...king-guide#nvidia-hairworks-config-file-tweak
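If anyone wants to try the config-file route from that guide: as I recall, it comes down to lowering the HairWorks MSAA level in the game's rendering config instead of touching the driver. The file path and key name below are from memory and may differ between game versions, so treat them as assumptions:

    ; The Witcher 3 - bin\config\base\rendering.ini (path and key from memory)
    ; Default is 8; dropping to 4 or 2 trades a little hair-edge smoothness
    ; for a noticeable frame-rate gain with HairWorks enabled.
    HairWorksAALevel=4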
 
No, it meant that it wasn't going to play any game I have at anywhere near the level of detail I wanted.
 