
NVIDIA Announces DX12 Gameworks Support

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
What this means, in a nutshell: we won't see games performing better with current PhysX; they'll just cram more objects into games, and we'll see ZERO gains while also not really seeing any visual improvement either. What difference does it make between 100 and 200 shards of shattered glass? Visually, not much. But calculation-wise, you're taxing the hardware twice as much. The days of 4 shards of glass versus 20 are long gone. When numbers are this high, you don't have to go overboard just because you can; you should leave the headroom for players to enjoy as higher framerates. But silly me, always thinking backwards: removing polygons instead of adding them with tessellation, and wanting PhysX to perform better while looking the same instead of just cramming a gazillion of everything in to make it "more real than reality". Ugh...
If NVIDIA did open-source it so AMD and Intel could GPU-accelerate it, then GameWorks could be used for important things in games, like destroying buildings, instead of just cosmetic things like shattering glass, litter flying around, fancy hair/fur, and realistic capes. Because GameWorks wasn't vendor agnostic, developers could only use it for visuals.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
If NVIDIA did open-source it so AMD and Intel could GPU-accelerate it, then GameWorks could be used for important things in games, like destroying buildings, instead of just cosmetic things like shattering glass, litter flying around, fancy hair/fur, and realistic capes. Because GameWorks wasn't vendor agnostic, developers could only use it for visuals.

Microsoft bought Havok (2015) and Donya Labs (2017). Who knows what Microsoft's intentions are with these recent purchases, but they could easily implement physics in DirectX itself, or nudge developers in that direction.

This might be a counter to such a move, or it could be an attempt to play nice with developers/publishers around the Nintendo Switch and make cross-platform (Switch/PC) development more appealing.
 

FordGT90Concept

I did not know Microsoft bought Havok off of Intel. It makes sense: Intel only grabbed Havok in pursuit of Knights Corner/Xeon Phi, and when they abandoned turning that project into a GPU, Intel pretty much forgot about Havok. I definitely sense Havok integration into the DirectX API, in DirectX 12.1 or DirectX 13, running predominantly in the GPU compute queue.

NVIDIA must sense this is going to happen soon (likely through Direct3D work), so they unshackled GameWorks, because Microsoft would destroy PhysX on Windows, where the bulk of gamers are. PhysX is really the only API in GameWorks that has gained a lot of traction, mainly because it was first to market via Ageia. As far as I know, it is still the only GPU-accelerated physics API that is reasonable for developers to use. Most who use Havok just run it on the CPU.

I really wish Microsoft would have acquired Havok sooner so the PhysX nonsense would have ended quicker.
 
Joined
Sep 26, 2012
Messages
871 (0.19/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling ASUS ROG Ryujin III 360, 13 x Lian Li P28
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 STRIX
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Acer X38S, Wacom Cintiq Pro 15
Case Lian Li O11 Dynamic EVO
Audio Device(s) Topping DX9, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply Seasonic PRIME TX-1600
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + Universal Blue
I did not know Microsoft bought Havok off of Intel. It makes sense: Intel only grabbed Havok in pursuit of Knights Corner/Xeon Phi, and when they abandoned turning that project into a GPU, Intel pretty much forgot about Havok. I definitely sense Havok integration into the DirectX API, in DirectX 12.1 or DirectX 13, running predominantly in the GPU compute queue.

NVIDIA must sense this is going to happen soon (likely through Direct3D work), so they unshackled GameWorks, because Microsoft would destroy PhysX on Windows, where the bulk of gamers are. PhysX is really the only API in GameWorks that has gained a lot of traction, mainly because it was first to market via Ageia. As far as I know, it is still the only GPU-accelerated physics API that is reasonable for developers to use. Most who use Havok just run it on the CPU.

I really wish Microsoft would have acquired Havok sooner so the PhysX nonsense would have ended quicker.

The thing about physics simulation is that it doesn't actually benefit that much from being run on a GPU, and especially not NVIDIA's implementation, which is still hobbled even on NVIDIA hardware compared to competing solutions like Havok.
 

FordGT90Concept

It does, especially for fluid simulations.
 
Joined
Oct 1, 2013
Messages
250 (0.06/day)
The thing about physics simulation is that it doesn't actually benefit that much from being run on a GPU. And especially not Nvidia's implementation which is still crippled, even on Nvidia hardware compared to competing solutions like Havok.
GPU physics is superior for particles and fluid simulations, which are basically lots of small, independent pipelines. For other things like destruction and collision, the CPU is better though.
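The split described above comes down to data parallelism. A minimal toy sketch (illustrative only: the 1-D `integrate` and `resolve_ground` helpers and their numbers are made up for this example, not taken from any real physics engine) shows why particle updates suit GPUs while collision response tends to stay on the CPU:

```python
# Toy sketch: particle integration vs. collision response.

DT = 1.0 / 60.0   # fixed 60 Hz timestep
GRAVITY = -9.81   # m/s^2, vertical axis only for brevity

def integrate(particles):
    """One Euler step per (height, velocity) particle.

    Every particle is updated from its own state alone -- no branching,
    no cross-particle reads -- so this is an embarrassingly parallel map
    that a GPU can run as one wide kernel launch over millions of items.
    """
    return [
        (y + (vy + GRAVITY * DT) * DT, vy + GRAVITY * DT)
        for (y, vy) in particles
    ]

def resolve_ground(particles, restitution=0.5):
    """Bounce particles that fell below the floor (y < 0).

    Collision response is data-dependent: only some particles take the
    branch, and real rigid-body solvers resolve contacts sequentially
    (one contact changes the inputs to the next), which maps poorly onto
    wide GPU warps and is why this side of physics favors the CPU.
    """
    out = []
    for (y, vy) in particles:
        if y < 0.0:                          # divergent branch
            out.append((0.0, -vy * restitution))
        else:
            out.append((y, vy))
    return out
```

In the first function, doubling the particle count just widens the map; in the second, the branch divergence and contact ordering are what make destruction and rigid-body collision hard to accelerate.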
 
Joined
Mar 1, 2017
Messages
2 (0.00/day)
The difference is that if they did do that, people would find out and it would be a PR nightmare. At least this way they can keep up the argument of "it's better because NVIDIA is better" rather than "we're not even going to let you try".

Either way, this just means we're moving closer and closer to DX12 replacing DX11 as the main DirectX version in use, which is also a good thing.

How is that different? You said GameWorks was a black box, implying AMD couldn't see or modify the code running on their GPUs. But the HBAO+ code they can see? Your claims aren't consistent.
 