
AMD GPUs See Lesser Performance Drop on "Deus Ex: Mankind Divided" DirectX 12

Joined
Jan 31, 2011
Messages
2,210 (0.44/day)
System Name Ultima
Processor AMD Ryzen 7 5800X
Motherboard MSI Mag B550M Mortar
Cooling Arctic Liquid Freezer II 240 rev4 w/ Ryzen offset mount
Memory G.Skill Ripjaws V 2x16GB DDR4 3600
Video Card(s) Palit GeForce RTX 4070 12GB Dual
Storage WD Black SN850X 2TB Gen4, Samsung 970 Evo Plus 500GB, 1TB Crucial MX500 SATA SSD
Display(s) ASUS TUF VG249Q3A 24" 1080p 165-180Hz VRR
Case DarkFlash DLM21 Mesh
Audio Device(s) Onboard Realtek ALC1200 Audio/Nvidia HD Audio
Power Supply Corsair RM650
Mouse ROG Strix Impact 3 Wireless | Wacom Intuos CTH-480
Keyboard A4Tech B314 Keyboard
Software Windows 10 Pro
Both the frame time and average fps results in the TechReport review are different from what Hilbert measured at Guru3D. Perhaps TechPowerUp can run these tests and get us the definitive results.
TechReport and ComputerBase used actual gameplay scenarios in their testing, while Guru3D used the built-in benchmark.
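For anyone comparing the two sets of numbers, it helps to remember that average fps and frame-time percentiles measure different things. A minimal sketch with made-up frame times, just to show how the metrics can diverge:

```cpp
// Why average fps and frame-time results can tell different stories:
// one 50 ms hitch barely moves the average but dominates the 99th percentile.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

int main()
{
    // Hypothetical frame times in milliseconds from one benchmark pass.
    std::vector<double> ft = {16.7, 16.7, 16.7, 16.7, 16.7, 16.7, 16.7, 50.0};

    double total = 0.0;
    for (double t : ft) total += t;
    double avgFps = 1000.0 * ft.size() / total;   // ~47.9 fps - looks fine

    // Nearest-rank 99th-percentile frame time exposes the hitch.
    std::sort(ft.begin(), ft.end());
    size_t idx = static_cast<size_t>(std::ceil(0.99 * ft.size())) - 1;

    printf("Average fps         : %.1f\n", avgFps);
    printf("99th-pct frame time : %.1f ms\n", ft[idx]);  // 50.0 ms
}
```

Two runs (or two sites) can report near-identical averages and still produce very different frame-time plots, especially when one tests actual gameplay and the other a canned benchmark.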
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,046 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power 12 M 850W Gold (ATX 3.0)
Software W10
TechReport and ComputerBase used actual gameplay scenarios in their testing, while Guru3D used the built-in benchmark.

Benchmark might be, dare I say it, favourable to one side.
 
Joined
Nov 3, 2011
Messages
695 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H150i Elite LCD XT White
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate IronWolf 4TB HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K FreeSync/GSync DP, LG 32UL950 32in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
I'm pretty sure AMD's poor DX11 performance is due to the lack of hardware multi-threading, a DX11 feature Nvidia GPUs do have. It has little to do with the drivers.

Nvidia's claims of DX12 support are definitely shaky; it's similar to the 3.5 GB situation on the 970. Nvidia "has" DX12 support, but it's mostly software-based, not hardware. They support very few DX12 features in hardware.
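For what it's worth, the D3D12 runtime will report which of those features the hardware actually exposes. A minimal sketch, assuming a valid ID3D12Device* has already been created; the point is that "DX12 support" is a set of tiers and caps, not a yes/no checkbox:

```cpp
// Query what the GPU/driver actually report for core DX12 features;
// the tiers (not the "DX12" marketing checkbox) are where vendors differ.
#include <d3d12.h>
#include <cstdio>

void PrintDx12FeatureTiers(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &opts, sizeof(opts))))
    {
        printf("Resource binding tier : %d\n", opts.ResourceBindingTier);
        printf("Tiled resources tier  : %d\n", opts.TiledResourcesTier);
        printf("Conservative raster   : %d\n", opts.ConservativeRasterizationTier);
        printf("ROVs supported        : %s\n", opts.ROVsSupported ? "yes" : "no");
    }
}
```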
From https://developer.nvidia.com/dx12-dos-and-donts

On DX11 the driver does farm off asynchronous tasks to driver worker threads where possible.


In other words, even under DX11, the NVIDIA driver already farms asynchronous tasks out to multiple worker threads where possible.




https://developer.nvidia.com/unlocking-gpu-intrinsics-hlsl

None of the intrinsics are possible in standard DirectX or OpenGL. But they have been supported and well-documented in CUDA for years. A mechanism to support them in DirectX has been available for a while but not widely documented. I happen to have an old NVAPI version 343 on my system from October 2014 and the intrinsics are supported in DirectX by that version and probably earlier versions. This blog explains the mechanism for using them in DirectX.

Unlike OpenGL or Vulkan, DirectX unfortunately doesn't have a native mechanism for vendor-specific extensions. But there is still a way to make all this functionality available in DirectX 11 or 12 through custom intrinsics. That mechanism is implemented in our graphics driver and accessible through the NVAPI library.



NVIDIA has had hit-the-metal intrinsics and this kind of Direct3D API kitbashing for a long time, while AMD only gained a similar capability with Doom's Vulkan path.
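The mechanism described in that blog post boils down to "sacrificing" a UAV slot that the driver watches for specially encoded operations. A rough sketch of the wiring, assuming the NVAPI SDK headers are available (the HLSL fragment is illustrative, using intrinsic names from nvHLSLExtns.h):

```cpp
// Sketch of NVAPI's custom-intrinsic mechanism for D3D11, per the blog above.
// Assumes the NVAPI SDK is installed; error handling trimmed for brevity.
#include <d3d11.h>
#include <nvapi.h>

// C++ side: tell the driver which UAV slot the shaders give up so the
// compiler can route intrinsic calls through it (slot u7 here).
bool EnableNvIntrinsics(ID3D11Device* device)
{
    if (NvAPI_Initialize() != NVAPI_OK)
        return false;
    // The slot number must match NV_SHADER_EXTN_SLOT on the HLSL side.
    return NvAPI_D3D11_SetNvShaderExtnSlot(device, 7) == NVAPI_OK;
}

/* HLSL side (compiled separately):
       #define NV_SHADER_EXTN_SLOT u7
       #include "nvHLSLExtns.h"
       ...
       uint lane = NvGetLaneId();           // warp-lane ID intrinsic
       float v   = NvShfl(value, srcLane);  // warp shuffle, like CUDA __shfl()
*/
```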
 
Joined
Jul 14, 2008
Messages
872 (0.15/day)
Location
Copenhagen, Denmark
System Name Ryzen/Laptop/htpc
Processor R9 3900X/i7 6700HQ/i7 2600
Motherboard AsRock X470 Taichi/Acer/ Gigabyte H77M
Cooling Corsair H115i pro with 2 Noctua NF-A14 chromax/OEM/Noctua NH-L12i
Memory G.Skill Trident Z 32GB @3200/16GB DDR4 2666 HyperX impact/24GB
Video Card(s) TUL Red Dragon Vega 56/Intel HD 530 - GTX 950m/ 970 GTX
Storage 970pro NVMe 512GB,Samsung 860evo 1TB, 3x4TB WD gold/Transcend 830s, 1TB Toshiba/Adata 256GB + 1TB WD
Display(s) Philips FTV 32 inch + Dell 2407WFP-HC/OEM/Sony KDL-42W828B
Case Phanteks Enthoo Luxe/Acer Barebone/Enermax
Audio Device(s) SoundBlasterX AE-5 (Dell A525)(HyperX Cloud Alpha)/mojo/soundblaster xfi gamer
Power Supply Seasonic focus+ 850 platinum (SSR-850PX)/165 Watt power brick/Enermax 650W
Mouse G502 Hero/M705 Marathon/G305 Hero Lightspeed
Keyboard G19/oem/Steelseries Apex 300
Software Win10 pro 64bit
While that's of course a possibility, I wouldn't put any money/stock/holding of breath in it happening. Call me cynical if you must, but Nvidia won't lose significant market share in the next two generations even if they deserve to, for the same reason Apple doesn't lose market share even though they deserve to: they have a refined and sadly effective hype and marketing machine that constantly builds and stokes the consumer norm that they are the superior choice.

Though I hope you're right for the price drops that need to happen for all consumers, in the age of Likes, Views and Trending, mindshare is the real metric that maintains a marketshare's status quo and Nvidia is throwing too much TWIMTBP money around for that to change any time soon.
I mostly agree, but the thing is that the market can swing either way sometimes. Who expected ARM to dominate the smartphone/device market, for example? But it did, and now the giant Intel is feeling the "heat" in that market segment.
 
Joined
Dec 29, 2010
Messages
3,808 (0.75/day)
Processor AMD 5900x
Motherboard Asus x570 Strix-E
Cooling Hardware Labs
Memory G.Skill 4000c17 2x16gb
Video Card(s) RTX 3090
Storage Sabrent
Display(s) Samsung G9
Case Phanteks 719
Audio Device(s) Fiio K5 Pro
Power Supply EVGA 1000 P2
Mouse Logitech G600
Keyboard Corsair K95
Because in Germany, ComputerBase is called "NvidiaBase", similar to the Polish PC Lab (called "PC LOL") and a few other sites around the world where the bias can't simply be explained by an error in the benchmarks, reviews, etc.

Truth.
 
Joined
Jul 10, 2011
Messages
797 (0.16/day)
Processor Intel
Motherboard MSI
Cooling Cooler Master
Memory Corsair
Video Card(s) Nvidia
Storage Western Digital/Kingston
Display(s) Samsung
Case Thermaltake
Audio Device(s) On Board
Power Supply Seasonic
Mouse Glorious
Keyboard UniKey
Software Windows 10 x64
Because in Germany, ComputerBase is called "NvidiaBase", similar to the Polish PC Lab (called "PC LOL") and a few other sites around the world where the bias can't simply be explained by an error in the benchmarks, reviews, etc.

But you didn't call them biased when they showed AMD's Doom Vulkan results.

GameGPU, TechReport, and ComputerBase showed completely different results from Guru3D, and somehow they are the ones who are wrong, not Guru3D.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Another DX12 thread, another bunch of willy-waving.

Can we maybe rather wait until a non-AMD and non-NVIDIA-affiliated company actually writes a game that uses DX12 from the ground up, and then compare performance? Because every DX12 renderer until this point has been a half-assed second thought at best and terrible at worst.

Mankind Divided is a particularly poor example: a beta DX11 renderer port of a console port. Like, seriously.
 
Joined
Sep 17, 2014
Messages
22,424 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Another DX12 thread, another bunch of willy-waving.

Can we maybe rather wait until a non-AMD and non-NVIDIA-affiliated company actually writes a game that uses DX12 from the ground up, and then compare performance? Because every DX12 renderer until this point has been a half-assed second thought at best and terrible at worst.

Mankind Divided is a particularly poor example: a beta DX11 renderer port of a console port. Like, seriously.

Games aren't built for APIs; they're built because there is a business case and an economic incentive. They're built from new dev kits and applications, and those are here, have been here, for a while now. You know, engines too - like CryEngine, which has been in use for over a decade, showing us 'true next gen' before anyone could run it, and it is only NOW that we start seeing console games catch up to that.

Consoles have had close-to-the-metal programming for a long time.
PCs have had massive amounts of processing power (compared to consoles) for a long time.

The irony is that PC gaming doesn't NEED DX12 to progress; the consoles need it. VR and mobile need it. Systems with weak, low-power CPUs need it. And at the top end, PC gaming needs it to keep surpassing everything else. I suppose that last little bit is what you're looking for, but let's face it, we won't see another Crysis for years.

You know how those high end GPUs are sold today? With inferior technology that really has no merit in real gaming, like VR, like 4K on crappy panels, like HDR with high latencies. These techs are all about moving hardware, not about a great experience, and they're all cash cows right now. None of them are mature.

In the end, the only thing that truly carries a platform and technological progress is an honest investment in stuff we can do with it. That doesn't mean more hardware, it means good, lovingly crafted software. Apple and Blizzard are some of the very few companies that understood this. MS has a love-hate relationship with it. A few smaller game developers also have it.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,042 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB, 2x Samsung 850 Evo 500GB RAID 0, 1TB WD Blue, Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34" 3440x1440 144Hz 10-bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX-850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
I have to quote you both because I agree with points in both...

Another DX12 thread, another bunch of willy-waving.

Can we maybe rather wait until a non-AMD and non-NVIDIA-affiliated company actually writes a game that uses DX12 from the ground up, and then compare performance? Because every DX12 renderer until this point has been a half-assed second thought at best and terrible at worst.

Mankind Divided is a particularly poor example: a beta DX11 renderer port of a console port. Like, seriously.

This is my take on DX12 right now too. There has been no "standard" as far as which parts of the DX12 feature set devs are picking and choosing, so there is just no consistency for anybody or any game so far. It just ends up like this game, with a lot of "WTF?" and "What's the point?" One game uses this feature and the next game uses another one, and we end up with these oddball bench results and wacky performance...

Games aren't built for APIs. They're built from new dev kits and applications, and those are here, have been here, for a while now. You know, engines too - like CryEngine, which has been in use for over a decade, showing us 'true next gen' before anyone could run it, and it is only NOW that we start seeing console games catch up to that.
Consoles have had close-to-the-metal programming for a long time.
PCs have had massive amounts of processing power (compared to consoles) for a long time.

The irony is that PC gaming doesn't NEED DX12 to progress; the consoles need it. VR and mobile need it. Systems with weak, low-power CPUs need it. And at the top end, PC gaming needs it to keep surpassing everything else. I suppose that last little bit is what you're looking for, but let's face it, we won't see another Crysis for years.

You know how those high end GPUs are sold today? With inferior technology that really has no merit in real gaming, like VR, like 4K on crappy panels, like HDR with high latencies. These techs are all about moving hardware, not about a great experience, and they're all cash cows right now. None of them are mature.

In the end, the only thing that truly carries a platform and technological progress is an honest investment in stuff we can do with it. That doesn't mean more hardware, it means good, lovingly crafted software. Apple and Blizzard are some of the very few companies that understood this. MS has a love-hate relationship with it. A few smaller game developers also have it.

CryEngine is/was a great example of pushing the limits, and sadly the only recent thing that comes to mind as coming close to that would be The Witcher 3, which lets you turn everything up to "ludicrous" and use pretty much every bell and whistle in the toolbox if you have the hardware to handle it.

I also agree about the pushing of "immature" tech and the selling of overpriced cards to try to brute-force through it; sadly, it's still selling the hardware...
 
Joined
Mar 6, 2012
Messages
569 (0.12/day)
Processor i5 4670K @ 4.8GHz
Motherboard MSI Z87 G43
Cooling Thermalright Ultra-120 *(Modded to fit on this motherboard)
Memory 16GB 2400MHz
Video Card(s) HD7970 GHZ edition Sapphire
Storage Samsung 120GB 850 EVO & 4X 2TB HDD (Seagate)
Display(s) 42" Panasonic LED TV @ 120Hz
Case Corsair 200R
Audio Device(s) Xfi Xtreme Music with Hyper X Core
Power Supply Cooler Master 700 Watts
I don't understand - the article says AMD sees a lesser performance drop in Deus Ex: MD on DX12 and references Guru3D, but when you click on the link, the AMD cards are actually gaining performance.

How is gaining performance = seeing a lesser performance drop?
 
Joined
Sep 5, 2004
Messages
1,958 (0.27/day)
Location
The Kingdom of Norway
Processor Ryzen 5900X
Motherboard Gigabyte B550I AORUS PRO AX 1.1
Cooling Noctua NH-U12A
Memory 2x 32GB Fury DDR4 3200mhz
Video Card(s) PowerColor Radeon 7800 XT Hellhound
Storage Kingston FURY Renegade 2TB PCIe 4.0
Display(s) 2x Dell U2412M
Case Phanteks P400A
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850 (from 2012)
Software Windows 10?
Well, is it just me, or is there no beta tag on these articles? Deus Ex's DX12 mode is in BETA, aka unfinished.
You have to switch to a beta branch to get DX12 enabled... proper DX12 is in next week's patch, on September 19th-ish?
 
Joined
Mar 24, 2012
Messages
533 (0.12/day)
I don't understand - the article says AMD sees a lesser performance drop in Deus Ex: MD on DX12 and references Guru3D, but when you click on the link, the AMD cards are actually gaining performance.

How is gaining performance = seeing a lesser performance drop?

Try looking for more benchmarks; from what I can see, the results are all over the place. But yes, not all AMD cards gain performance; some even see a decrease. And it seems that Guru3D only used the built-in benchmark tool for their testing, while some other sites used real gameplay instead.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,989 (2.35/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Well, is it just me, or is there no beta tag on these articles? Deus Ex's DX12 mode is in BETA, aka unfinished.
You have to switch to a beta branch to get DX12 enabled... proper DX12 is in next week's patch, on September 19th-ish?

This is true, and it has been pointed out in this thread.
 

NGreediaOrAMSlow

New Member
Joined
Sep 11, 2016
Messages
19 (0.01/day)
Actually, currently it does. While AMD's current architecture is better placed for, and better optimized for, DX12, Nvidia dominates in terms of raw clocks and max TFLOPS.

And no matter how well you optimize, there is a limit to how much the hardware can do. Much like with cars, those numbers can probably be taken with a grain of salt, in the sense that manufacturers may inflate them; but if you look at the given numbers, Nvidia's are higher... period.

Oops, a typo. What I actually meant is that it doesn't beat Nvidia at the high end if you take maximum theoretical throughput (TFLOPS) into consideration.
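For reference, the max-theoretical-TFLOPS figure everyone quotes is just shader count x clock x 2, since a fused multiply-add counts as two FLOPs per cycle. A quick sketch using the 2016 reference specs:

```cpp
// Back-of-the-envelope FP32 throughput: shaders x clock x 2 FLOPs per cycle
// (one fused multiply-add counts as two floating-point operations).
#include <cstdio>

double Tflops(int shaders, double clockGHz)
{
    return shaders * clockGHz * 2.0 / 1000.0;
}

int main()
{
    printf("GTX 1080 : %.2f TFLOPS\n", Tflops(2560, 1.733)); // ~8.87
    printf("Fury X   : %.2f TFLOPS\n", Tflops(4096, 1.050)); // ~8.60
    printf("RX 480   : %.2f TFLOPS\n", Tflops(2304, 1.266)); // ~5.83
    printf("GTX 1060 : %.2f TFLOPS\n", Tflops(1280, 1.708)); // ~4.37
}
```

Note that the Fury X and GTX 1080 land within about 3% of each other on paper while being nowhere near each other in actual games, which is a good illustration of how rough this metric is.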
 
Joined
Sep 17, 2014
Messages
22,424 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Oops, a typo. What I actually meant is that it doesn't beat Nvidia at the high end if you take maximum theoretical throughput (TFLOPS) into consideration.

Obviously, because AMD hasn't put out its high-end range yet.

TFLOPS are way too rough an indicator to be of any use in these kinds of comparisons. They're an indicator, nothing more, and I don't see how they are ever relevant in any GPU discussion. It's a shortcut to avoid getting into details, that is all.

Just because something appears on a marketing slide doesn't make it informative.
 