
NVIDIA Announces DX12 Gameworks Support

Joined
Aug 20, 2007
Messages
21,539 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
NVIDIA has announced DX12 support for its proprietary GameWorks SDK, including some new exclusive effects such as "Flex" and "Flow." Most interestingly, NVIDIA claims that simulation effects get a massive boost from Async Compute, nearly doubling performance on a GTX 1080 for that style of effect. Async Compute is, of course, a DX12-exclusive technology. Performance gains in an area where NVIDIA is normally perceived not to do so well are encouraging, even if confined to their exclusive ecosystem. Whether GCN-powered cards will see similar gains when running GameWorks titles remains to be seen.
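For readers unfamiliar with the term, "Async Compute" means submitting compute work on a queue separate from the graphics queue, so the GPU can overlap simulation with rendering. Below is a minimal D3D12 sketch of the queue setup involved; this is our own illustration of the general technique, not code from the GameWorks SDK.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Create a direct (graphics) queue plus a separate compute queue. Work
// submitted to the compute queue can execute concurrently with graphics
// on hardware that supports async compute.
HRESULT CreateQueues(ID3D12Device* device,
                     ComPtr<ID3D12CommandQueue>& graphicsQueue,
                     ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;  // graphics + compute + copy
    HRESULT hr = device->CreateCommandQueue(&desc, IID_PPV_ARGS(&graphicsQueue));
    if (FAILED(hr)) return hr;

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute + copy only
    return device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));
}
```

Synchronization between the two queues is then handled with ID3D12Fence objects; whether the hardware actually overlaps the work is up to the GPU and driver, which is exactly where the vendor differences debated below come in.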



 
Joined
Sep 26, 2012
Messages
871 (0.19/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling ASUS ROG Ryujin III 360, 13 x Lian Li P28
Memory 2x32GB Trident Z RGB 6000 MHz CL30
Video Card(s) ASUS 4090 STRIX
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Acer X38S, Wacom Cintiq Pro 15
Case Lian Li O11 Dynamic EVO
Audio Device(s) Topping DX9, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply Seasonic PRIME TX-1600
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + Universal Blue
I desperately hope that developers avoid this shit this generation. No gamer wants this crap, and most examples end up tanking performance on everything but the latest high-end cards.
 
Joined
Oct 16, 2012
Messages
734 (0.16/day)
Location
Malaysia
System Name Cypher-C4
Processor Ryzen 9 5900X
Motherboard MSI B450 Tomahawk Max
Cooling Deepcool AK620
Memory 32GB Crucial Ballistix Elite 3200MHz
Video Card(s) Palit RTX 3060 Ti Dual
Storage 250GB Crucial MX500 + 2TB Crucial MX500 + 2TB WD Black + 4TB Toshiba + 1TB Samsung F3
Display(s) Acer XV272UP
Case Corsair Obsidian 750D
Audio Device(s) Behringer UMC202HD Audio Interface + Mackie HM-4 + Sennheiser HD 280 Pro + Shure SM58
Power Supply Corsair HX750i
Mouse Steelseries Rival 310
Keyboard Keychron K8 + Kailh BOX Crystal Jade switches + Ducky Good in Blue keycaps
Software Windows 10 Pro 64-bit
I desperately hope that developers avoid this shit this generation. No gamer wants this crap, and most examples end up tanking performance on everything but the latest high-end cards.

:confused: o_O :wtf:
 
Joined
Dec 29, 2010
Messages
3,809 (0.75/day)
Processor AMD 5900x
Motherboard Asus x570 Strix-E
Cooling Hardware Labs
Memory G.Skill 4000c17 2x16gb
Video Card(s) RTX 3090
Storage Sabrent
Display(s) Samsung G9
Case Phanteks 719
Audio Device(s) Fiio K5 Pro
Power Supply EVGA 1000 P2
Mouse Logitech G600
Keyboard Corsair K95
That's the whole point: tank your system so you have to buy the new Ti. On a serious note, wasn't one of the points of DirectX 12 to free ourselves from crap like GameWorks?
 
Joined
Jul 13, 2016
Messages
3,328 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
I desperately hope that developers avoid this shit this generation. No gamer wants this crap, and most examples end up tanking performance on everything but the latest high-end cards.

Couldn't agree more; it hasn't earned the title Gimpworks for nothing. It's essentially a black box that doesn't let AMD optimize their drivers for the code, and it tanks performance on last-gen Nvidia and all AMD cards.

On the other hand, you have AMD technology like TressFX that works well on both AMD and Nvidia.

GameWorks is equivalent to that one time Intel "asked" software devs to compile code using a specially provided compiler that gimped AMD CPUs. The only difference is that Nvidia GameWorks is long-running, and Nvidia fanboys don't seem to care about the damage it is doing to the industry.
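For anyone who doesn't remember that episode: the dispatching was keyed off the CPU vendor string rather than actual feature flags. Here's a hypothetical sketch of the kind of check involved; this is my own illustration, not Intel's actual compiler output.

```cpp
#include <intrin.h>   // MSVC; GCC/Clang expose the same data via <cpuid.h>
#include <cstring>
#include <cstdio>

// Read the 12-byte CPU vendor string from CPUID leaf 0.
// The characters come back in EBX, EDX, ECX, in that order.
bool IsGenuineIntel()
{
    int regs[4] = {};                      // EAX, EBX, ECX, EDX
    __cpuid(regs, 0);
    char vendor[13] = {};
    std::memcpy(vendor + 0, &regs[1], 4);  // EBX
    std::memcpy(vendor + 4, &regs[3], 4);  // EDX
    std::memcpy(vendor + 8, &regs[2], 4);  // ECX
    return std::strcmp(vendor, "GenuineIntel") == 0;
}

int main()
{
    // A fair dispatcher would test feature bits (SSE2, AVX, ...) instead,
    // so any CPU with the feature gets the fast path.
    std::printf("%s\n", IsGenuineIntel() ? "fast path" : "generic path");
}
```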

"Whether GCN powered cards will see similar gains when running GameWorks titles remains to be seen"

Let me answer that for you, no. I can answer with 100% confidence that Nvidia has never and will never let AMD get a boost from GameWorks.
 
Joined
Jul 13, 2016
Messages
3,328 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
That's the whole point: tank your system so you have to buy the new Ti. On a serious note, wasn't one of the points of DirectX 12 to free ourselves from crap like GameWorks?

It is, but Nvidia wants to make its own version of everything. By including DX12 support in GameWorks, it can send its engineers to game devs to implement Nvidia-favoring DX12 code in the game. When Nvidia does this, the game dev has to sign a contract not to share Nvidia's proprietary code with anyone else, meaning AMD is unable to optimize around anything Nvidia's engineers do. Often this means AMD cannot optimize for very important parts of the code. Thus Nvidia can cover for its hardware's weaknesses by essentially paying game devs off with free engineering.
 
Joined
Aug 20, 2007
Messages
21,539 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
TIL "GameWorks" is called an ecosystem. This is fucking shame for the PC master race.

It is an ecosystem of sorts. A commercial, proprietary one, yes (and in my personal opinion a buggy one too), but a system of products all the same.
 
Joined
Jul 31, 2010
Messages
232 (0.04/day)
Processor AMD R5 3600
Motherboard ASUS X370 Prime Pro
Memory 32GB (4x8GB) Crucial Ballistix DDR4 3000 MHz C15 @ 3600 MHz C16
Video Card(s) Nvidia RTX 2060 6GB GDDR6 FE
Storage 4x 1TB SSD
Display(s) Benq XL2411Z 144Hz + Asus VS24H
Case Corsair 270R
Audio Device(s) Yamaha AG03 + Rode Procaster
Power Supply Corsair AX 650W 90+ Gold
Mouse Logitech G Pro Superlight
Keyboard KBD67 Lite R3 (Gazzew Boba U4 + AKKO Midnight)
VR HMD Valve Index
Software Windows 10 Pro 64bit
I'm surprised by the sheer number of dumb comments on here; it's like people just fall for whatever turds and speculation the rumor mills throw out.
The reason some GameWorks features impact other vendors' hardware more than Nvidia's is that features like HairWorks, FurWorks, volumetric lighting, etc. make heavy use of tessellation, which is something Nvidia GPUs excel at.
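To put numbers on the tessellation point: the cost is set by the tessellation factors the hull shader emits (up to 64 per edge), and triangle count grows roughly quadratically with the factor. Here's a minimal sketch of the distance-adaptive factor selection engines typically use, with the kind of cap AMD's driver override applies; my own illustration of the general technique, not GameWorks code.

```cpp
#include <algorithm>
#include <cmath>

// Pick a tessellation factor for a patch based on camera distance, then
// clamp it. Driving every patch near the hardware maximum of 64 is what
// makes tessellation-heavy effects so expensive; halving the cap cuts
// worst-case triangle load by roughly 4x.
float TessFactor(float distanceToCamera, float maxFactor /* e.g. 16.0f */)
{
    const float nearDist = 1.0f, farDist = 100.0f;
    float t = std::clamp((distanceToCamera - nearDist) / (farDist - nearDist),
                         0.0f, 1.0f);
    float factor = std::exp2((1.0f - t) * 6.0f);  // 64 up close, 1 far away
    return std::min(factor, maxFactor);
}
```

That cap is exactly the lever AMD users pulled via the driver's tessellation override during the Witcher 3 HairWorks saga.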

Let me answer that for you: no. I can answer with 100% confidence that Nvidia has never and will never let AMD get a boost from GameWorks.

One example: HBAO+ has a lower performance impact on AMD's Fury X than it does on a 980 Ti.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Whether GCN-powered cards will see similar gains when running GameWorks titles remains to be seen.
Psssh! More like CPU. PhysX (a component of GameWorks) won't run at all on GCN.

This is why I like AMD more. They have their own stuff as well, but it's always open, meaning anyone can fiddle with it and optimize it.
Especially NVIDIA and Intel. But NVIDIA, you know, has to sell you that bridge to nowhere.


Maybe NVIDIA will do an about-face, because NVIDIA has to pony up to get any developers to use it the way they want them to, but I certainly won't be holding my breath for that. It's at least plausible, because Direct3D 12 is a standard API, so GameWorks may use standard compute calls to operate on the GPU instead of CUDA-specific code. I mean, they are touting that Direct3D 12 did something right, aren't they? They never claimed a 200% increase in compute + graphics before, did they? Maybe that's incentive enough to let Intel and AMD accelerate it.
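If they really do route it through standard compute calls, the dispatch side looks identical on every vendor's hardware. A bare-bones sketch of a vendor-neutral D3D12 compute dispatch (illustrative only; it assumes the pipeline state, root signature, and particle buffer were created elsewhere, and that root parameter 0 is a UAV root descriptor):

```cpp
#include <d3d12.h>

// Record a vendor-neutral compute dispatch. The same command list runs on
// NVIDIA, AMD, or Intel hardware; each driver compiles the HLSL compute
// shader baked into 'pso' for its own architecture.
void RecordSimulationStep(ID3D12GraphicsCommandList* cmdList,
                          ID3D12PipelineState* pso,
                          ID3D12RootSignature* rootSig,
                          D3D12_GPU_VIRTUAL_ADDRESS particleBuffer,
                          unsigned particleCount)
{
    cmdList->SetPipelineState(pso);
    cmdList->SetComputeRootSignature(rootSig);
    cmdList->SetComputeRootUnorderedAccessView(0, particleBuffer);
    // One thread per particle, 256 threads per group (matching the shader).
    cmdList->Dispatch((particleCount + 255) / 256, 1, 1);
}
```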
 
Joined
Jul 13, 2016
Messages
3,328 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
I'm surprised by the sheer number of dumb comments on here; it's like people just fall for whatever turds and speculation the rumor mills throw out.
The reason some GameWorks features impact other vendors' hardware more than Nvidia's is that features like HairWorks, FurWorks, volumetric lighting, etc. make heavy use of tessellation, which is something Nvidia GPUs excel at.



One example: HBAO+ has a lower performance impact on AMD's Fury X than it does on a 980 Ti.

There's nothing Nvidia can do about AMD being better at HBAO+; AMD cards are just better at the calculations required for most methods of SSAO, including Nvidia's own. Vega is likely to take that up another notch.
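For anyone wondering what "the calculations" boil down to: every screen-space AO variant samples nearby depth values and accumulates an occlusion term per pixel, so raw shader throughput dominates. A simplified scalar sketch of a generic SSAO inner loop (for illustration only; not NVIDIA's HBAO+ source):

```cpp
#include <cmath>
#include <cstddef>

// Simplified, scalar version of a screen-space AO inner loop: compare the
// shaded pixel's depth against n nearby samples and accumulate how much
// they occlude it. Real kernels run this per pixel, per frame.
float AmbientOcclusion(const float* sampleDepths, std::size_t n,
                       float pixelDepth, float radius)
{
    float occlusion = 0.0f;
    for (std::size_t i = 0; i < n; ++i) {
        float d = pixelDepth - sampleDepths[i];  // positive = occluder in front
        if (d > 0.0f) {
            // Attenuate with distance so far-away geometry doesn't occlude.
            occlusion += 1.0f / (1.0f + (d / radius) * (d / radius));
        }
    }
    return 1.0f - occlusion / static_cast<float>(n);  // 1.0 = fully lit
}
```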
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,957 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
This is why I like AMD more. They have their own stuff as well, but it's always open, meaning anyone can fiddle with it and optimize it.
Lots of NVIDIA GameWorks stuff has been open-sourced for this GDC

 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
About-face it is then, good. :D

I hope some games (like Witcher 3) and game engines (UE3/UE4) get retroactively patched for GCN support.
 
Joined
Sep 15, 2011
Messages
6,760 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
There's nothing Nvidia can do about AMD being better at HBAO+; AMD cards are just better at the calculations required for most methods of SSAO, including Nvidia's own. Vega is likely to take that up another notch.
Tbh, I could never tell which one is better, HBAO+ or SSAO. They look different in every single game, and I cannot tell which is more realistic. HBAO+ also performs a little worse.
 
Joined
Mar 1, 2017
Messages
2 (0.00/day)
Couldn't agree more; it hasn't earned the title Gimpworks for nothing. It's essentially a black box that doesn't let AMD optimize their drivers for the code, and it tanks performance on last-gen Nvidia and all AMD cards.

There's nothing Nvidia can do about AMD being better at HBAO+; AMD cards are just better at the calculations required for most methods of SSAO, including Nvidia's own. Vega is likely to take that up another notch.

So much for a "black box", lol.

If AMD had no power over what runs on their cards, Nvidia could easily check the GPU vendor and run whatever inefficient SSAO method they could come up with.

Luckily, your story is 100% wrong and ignores the entire purpose of a GPU driver.
 
Joined
Dec 15, 2009
Messages
233 (0.04/day)
Location
Austria
Does PhysX 3.x make use of the CPU's SSE/SSE2 instruction set now? Or is NV still crippling PhysX CPU performance with the stone-age x87 instruction set?
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,116 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX 3.0)
Software W10
Did nobody read yesterday's DX12 post? It requires more coding than DX11 and still needs to be tuned for each vendor.
The level of instant Nvidia hate is amusing.

As W1zzard has said, GW has been opened up to a degree, but even before that they'd released SDKs for their ecosystem.
AMD have played the open-source card because of their position, not because they love everyone. I don't fully trust their partnership with Bethesda to optimise games fully for AMD, either.
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
What this means, in a nutshell: we won't see games performing better with current PhysX; they'll just cram more objects into games, so we'll see zero gains while not really seeing any visual improvement either. What difference does it make between 100 and 200 shards of shattered glass? Visually, not much. But calculation-wise, you're taxing it twice as hard. The times of 4 shards of glass vs 20 are long gone. When the numbers are this high, you don't have to go stupid with them just because you can; you should leave the gain for players to enjoy as higher framerate. But silly me, always thinking backwards: removing polygons instead of adding them with tessellation, and wanting PhysX to perform better while looking the same instead of cramming a gazillion of everything in to make it more real than reality. Ugh...
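To put rough numbers on that, here's a back-of-the-envelope sketch. A naive broadphase tests every shard against every other shard; a real engine culls most of those pairs, but solver cost still grows at least linearly with contacts, so doubling the debris never comes free.

```cpp
#include <cstdio>

int main()
{
    // Candidate collision pairs under a naive broadphase: n * (n - 1) / 2.
    for (unsigned n : {20u, 100u, 200u})
        std::printf("%3u shards -> %5u candidate pairs\n", n, n * (n - 1) / 2);
    // Prints: 20 -> 190, 100 -> 4950, 200 -> 19900
}
```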
 
Joined
Sep 26, 2012
Messages
871 (0.19/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling ASUS ROG Ryujin III 360, 13 x Lian Li P28
Memory 2x32GB Trident Z RGB 6000 MHz CL30
Video Card(s) ASUS 4090 STRIX
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Acer X38S, Wacom Cintiq Pro 15
Case Lian Li O11 Dynamic EVO
Audio Device(s) Topping DX9, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply Seasonic PRIME TX-1600
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + Universal Blue
Lots of NVIDIA GameWorks stuff has been open-sourced for this GDC


The last time I looked at this, the GitHub repo still held blobs/object code that, according to the license, you are not able to disassemble. If it's fully open now, I have no problem with vendor APIs for devs to use. Contrary to the above posts calling it Nvidia hatred, the broader point is that GameWorks was incredibly detrimental to the PC ecosystem and to games as a whole while it was in favour. I for one do not want to see a repeat.
 
Joined
Mar 24, 2012
Messages
533 (0.11/day)
Does PhysX 3.x make use of the CPU's SSE/SSE2 instruction set now? Or is NV still crippling PhysX CPU performance with the stone-age x87 instruction set?

PhysX 3 (the CPU-based solver) is actually quite good, to the point that Havok had to make noise and remind everyone that they are still the best third-party physics solution for games. And the funny thing is, despite people always complaining about Nvidia's proprietary tendencies, PhysX is now more open than Havok: it is still not open source, but you can look at the source code without paying Nvidia for access, unlike Havok.
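For those wondering what the x87 complaint was about: x87 code works on one float at a time, while SSE handles four per instruction, which is why recompiling the CPU solver mattered so much. A minimal sketch of the difference (my own illustration, not PhysX source):

```cpp
#include <xmmintrin.h>  // SSE intrinsics
#include <cstddef>

// Scalar version: one float per iteration (what x87 code effectively does).
void AddScalar(const float* a, const float* b, float* out, std::size_t n)
{
    for (std::size_t i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}

// SSE version: four floats per iteration. Assumes n is a multiple of 4 and
// the pointers are 16-byte aligned -- a sketch, not production code.
void AddSSE(const float* a, const float* b, float* out, std::size_t n)
{
    for (std::size_t i = 0; i < n; i += 4) {
        __m128 va = _mm_load_ps(a + i);
        __m128 vb = _mm_load_ps(b + i);
        _mm_store_ps(out + i, _mm_add_ps(va, vb));
    }
}
```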
 
Joined
Jun 1, 2007
Messages
150 (0.02/day)
Location
new jersey usa
You guys kidding me?
There has been shit-all, nada, nothing with new or even just better visuals in DX12.
Go get a console, you will love the performance and looks.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Joined
Apr 29, 2014
Messages
4,304 (1.11/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (onboard)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
I desperately hope that developers avoid this shit this generation. No gamer wants this crap, and most examples end up tanking performance on everything but the latest high-end cards.
I agree; I've never been a big fan of GameWorks and never will be.

If AMD had no power over what runs on their cards, Nvidia could easily check the GPU vendor and run whatever inefficient SSAO method they could come up with.
The difference is that if they did that, people would find out and it would be a PR nightmare. At least this way they can keep up the argument "It's better because Nvidia is better" rather than "We're not even going to let you try".

Either way, this just means we're moving closer and closer to DX12 replacing DX11 as the main DirectX in use, which is also a good thing.
 