
Recommended PhysX card for 5xxx series? [Is vRAM relevant?]

Joined
Dec 25, 2020
Messages
7,789 (5.08/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
Mantle was never going to be the final product and as such was really only AMD's experiment, which in the end turned into Vulkan
What nGreedia are doing is completely removing GPU-accelerated PhysX support on the newest GPUs, with no actual plans to replace it with a 64-bit version that would allow it to run on a GPU
They're two completely different things

In my point of view, the demise of both was just the natural evolution of things. Mantle was superseded by Vulkan; PhysX was superseded by Havok and many other vendor-agnostic, generally superior and lighter-weight physics engines that you can run on any vendor's hardware. When Mantle was pulled, the games which supported it didn't become unplayable: they all had DirectX 11 versions made alongside them, which simply ran worse on AMD hardware. It's the same here. With the deprecation of 32-bit CUDA, PhysX's removal just regresses it a little, back to the same state it has always been in on the competition's hardware.

I was never too fond of it. Sure, some of the effects were awesome for the games of the time, but I've always seen it as a point of contention, usually on the side of dismissal: "CUDA isn't important", "PhysX is a gimmick", "GimpWorks", all points I've seen thrown around to steer people away from GeForce... and now we get this outrage? That's my point. The big reason I'm not fond of PhysX is that it was always locked to Nvidia hardware, even more than the performance issues.
 
Joined
Apr 30, 2020
Messages
1,086 (0.61/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Capellix
Memory Corsair Vengeance Pro RGB 3200 MHz 32 GB
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western Digital SATA 6.0 SSD 500 GB + Fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Corsair K55 Pro RGB
Because it's a new method that almost completely reinvents the wheel, raising the bar far higher than rasterization could ever take graphics towards photorealism. Its biggest drawback is the vast amount of computing power required to achieve the feat. PhysX, on the other hand, is just an obsolete, proprietary physics engine that has long outlived its usefulness and suffered from optimization problems through most of its service life.


Path tracing is exactly the same as PhysX. Photorealistic my ass, it isn't even close to photorealistic. It's noisy garbage, and if rasterization weren't needed, why the hell are we even using it alongside anything with RT??? Everything in that whole statement is completely contradictory to the facts of how things are playing out: RTX and "path tracing" are becoming proprietary garbage. Like it or not, that's the truth.
 
Last edited by a moderator:
Joined
Jan 31, 2010
Messages
5,611 (1.02/day)
Location
Gougeland (NZ)
System Name Cumquat 2021
Processor AMD RyZen R7 7800X3D
Motherboard Asus Strix X670E - E Gaming WIFI
Cooling Deep Cool LT720 + CM MasterGel Pro TP + Lian Li Uni Fan V2
Memory 32GB GSkill Trident Z5 Neo 6000
Video Card(s) PowerColor HellHound RX7800XT 2550cclk/2450mclk
Storage 1x Adata SX8200PRO NVMe 1TB gen3 x4 1X Samsung 980 Pro NVMe Gen 4 x4 1TB, 12TB of HDD Storage
Display(s) AOC 24G2 IPS 144Hz FreeSync Premium 1920x1080p
Case Lian Li O11D XL ROG edition
Audio Device(s) RX7800XT via HDMI + Pioneer VSX-531 amp Technics 100W 5.1 Speaker set
Power Supply EVGA 1000W G5 Gold
Mouse Logitech G502 Proteus Core Wired
Keyboard Logitech G915 Wireless
Software Windows 11 X64 PRO (build 24H2)
Benchmark Scores it sucks even more less now ;)
PhysX was superseded by Havok and many other vendor-agnostic, generally superior and lighter-weight physics engines
Skyrim used the Havok physics engine and it was trash if you ran over 60 fps: shit would fly off everywhere when entering a room unless you diddled around in the configs to adjust its timings, or used mods that adjusted for faster than 60 fps
 
Joined
Dec 25, 2020
Messages
7,789 (5.08/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
Path tracing is exactly the same as PhysX. Photorealistic my ass, it isn't even close to photorealistic. It's noisy garbage, and if rasterization weren't needed, why the hell are we even using it alongside anything with RT??? Everything in that whole statement is completely contradictory to the facts of how things are playing out: RTX and "path tracing" are becoming proprietary garbage. Like it or not, that's the truth.

I think this thread is the ultimate proof that I am not full of crap. Low-end Nvidia and Radeon buyers have spent years upon years relentlessly dogging on GameWorks, and now that the feature that's arguably its poster child doesn't work anymore, it's being weaponized as if it were absolutely essential. I don't necessarily believe Nvidia handled this the correct way, but I'm not going to lie: I don't miss it either. PhysX, or the lack thereof, has never stopped me from enjoying a video game.

Why does RT have noise? Simple: because the hardware is anemic and can't trace every ray. Denoising algorithms basically attempt to hide the fact that rays are sampled at random (which is what generates the noise) by approximating pixels as well as the algorithm can. The fewer rays your engine can trace (or the lower the resolution), the noisier the image will be. Nvidia's RTX denoiser has always had a very good level of accuracy, and it was still dismissed as "fancy reflections", again, largely by the same type of folks who can't really run an RT game well on their hardware.
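To put rough numbers on "fewer rays means more noise", here's a toy Monte Carlo sketch in plain Python. It isn't renderer or driver code; the pixel whose true brightness is 0.3 is made up purely for illustration. The spread of the estimate (the grain a denoiser has to paper over) shrinks as the per-pixel ray count grows:

```python
import random
import statistics

def trace_random_ray():
    # Hypothetical stand-in for a traced ray: it either reaches the light (1.0)
    # or it doesn't (0.0). The pixel's true brightness is 0.3 by construction.
    return 1.0 if random.random() < 0.3 else 0.0

def render_pixel(rays_per_pixel):
    # A path tracer estimates the pixel by averaging its random ray samples.
    return sum(trace_random_ray() for _ in range(rays_per_pixel)) / rays_per_pixel

for spp in (1, 4, 64, 1024):
    # Render the same pixel 1000 times to measure how noisy the estimate is.
    estimates = [render_pixel(spp) for _ in range(1000)]
    print(f"{spp:5d} rays/pixel -> noise (std dev of estimate) ~ {statistics.stdev(estimates):.3f}")
```

Run it and the spread drops roughly with the square root of the ray count, which is exactly why anemic hardware tracing few rays per pixel leaves so much for the denoiser to clean up.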

Bookmark this post and tell me if people won't be all too happy once affordable hardware from any brand you pick can run it well ;)

Skyrim used the Havok physics engine and it was trash if you ran over 60 fps: shit would fly off everywhere when entering a room unless you diddled around in the configs to adjust its timings, or used mods that adjusted for faster than 60 fps

That's because the time scale is bound to fps, and the Creation Engine always assumes it's running at 60 fps or below (iPresentInterval=1 at 60 Hz; that ini parameter controls the presentation interval as a division factor of the screen's refresh rate, essentially a "vblank every X frames" setting). It broke a lot more than just physics: Papyrus (the scripting engine) also ran faster, and everything glitched as a result. It's not the fault of the physics engine itself; Wikipedia places Havok at roughly 360 games, and that's not an exhaustive list.
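Here's a toy sketch of that failure mode (nothing to do with actual Creation Engine or Havok code; the fixed 1/60 s tick and the falling object are assumptions made purely for illustration). The engine advances its simulation by one hardcoded 60 Hz tick per rendered frame, so rendering faster than 60 fps fast-forwards the physics, and the script engine along with it:

```python
# Assumed behaviour: one fixed-length "60 Hz" physics tick per rendered frame.
ASSUMED_TIMESTEP = 1.0 / 60.0

def fall_distance_after_one_second(actual_fps):
    # Advance a falling object for one real second of rendering at actual_fps,
    # stepping the simulation as if 1/60 s passed every frame.
    velocity, distance = 0.0, 0.0
    for _ in range(actual_fps):
        velocity += 9.81 * ASSUMED_TIMESTEP
        distance += velocity * ASSUMED_TIMESTEP
    return distance

for fps in (60, 144, 240):
    print(f"{fps:3d} fps -> object falls ~{fall_distance_after_one_second(fps):5.1f} m in one real second")
```

At 60 fps the object falls about 5 m in a real second, as it should; at 144 fps it falls roughly 28 m, because 2.4 seconds of simulated time passed. That's the "stuff flying everywhere" effect in a nutshell.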

Even Fallout 4 is still affected to a degree. Fallout 76 isn't, because the scripting engine runs server-side, and I think the 64-bit version of Skyrim (SE) was finally fixed. Starfield is and has always been OK, as was Oblivion for some reason. Oblivion even supports ultrawide natively.
 
Last edited by a moderator:
Joined
Apr 30, 2020
Messages
1,086 (0.61/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Capellix
Memory Corsair Vengeance Pro RGB 3200 MHz 32 GB
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western Digital SATA 6.0 SSD 500 GB + Fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Corsair K55 Pro RGB
I think this thread is the ultimate proof that I am not full of crap. Low-end Nvidia and Radeon buyers have spent years upon years relentlessly dogging on GameWorks, and now that the feature that's arguably its poster child doesn't work anymore, it's being weaponized as if it were absolutely essential. I don't necessarily believe Nvidia handled this the correct way, but I'm not going to lie: I don't miss it either. PhysX, or the lack thereof, has never stopped me from enjoying a video game.

Why does RT have noise? Simple: because the hardware is anemic and can't trace every ray. Denoising algorithms basically attempt to hide the fact that rays are sampled at random (which is what generates the noise) by approximating pixels as well as the algorithm can. The fewer rays your engine can trace (or the lower the resolution), the noisier the image will be. Nvidia's RTX denoiser has always had a very good level of accuracy, and it was still dismissed as "fancy reflections", again, largely by the same type of folks who can't really run an RT game well on their hardware.

Nice way to dodge what I said about rasterization being included with raytracing.
 
Joined
Nov 22, 2023
Messages
445 (0.95/day)
Why though?

-Because you can't build an entire Gen around features and then simultaneously depreciate features.

It's an uncomfortable reality that all the reasons to buy your Nvidia card can disappear randomly at some point.

How do you dunk on AMiDiots at that point? How, I ask you?!
 
Joined
Dec 25, 2020
Messages
7,789 (5.08/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
Nice way to dodge what I said about rasterization being included with raytracing.

Indeed, it looks like I actually turned your question back at you without realizing it. You are right about that, but my point stands: would you turn down fully path-traced games if you could run them with the exact same performance, without generative AI crutches, reduced resolution, or noisy output? Most newer features such as image upscalers, frame generation, mesh shaders, DirectStorage/GPU data decompression, etc. are largely quality-of-life and performance improvements, with the biggest advancement in fidelity being ray tracing, which is still too bloody heavy to run on currently existing hardware, which is why rasterization is still "necessary". The extra memory and compute power of modern GPUs is largely being squandered on unoptimized, extremely high-resolution assets that don't even look good in practice, and this is coming from a dude who was insane enough to order an RTX 5090 for gaming.

The thing is, rasterization hasn't advanced in some time. I think the last really impressive raster techniques introduced were soft shadows and subsurface scattering; at that point you pretty much got atmospherics and light simulation as accurate as they'll get on traditional raster graphics. I'm of the opinion that raster was fully mastered around the Pascal era, so much so that the GTX 1080 Ti is still widely considered to hold up to this day, and indeed still fully serves the needs of many. AMD took a while to catch up, but despite being down-level hardware with none of the newer DirectX 12 Ultimate features, a lot of people still defend the RX 5700 XT for the same reason: no GPUs released since that generation have really yielded any tangible improvements in fidelity, and they'll run DirectX 11 games just fine. After all, that is still the prevalent graphics API today.

It's an uncomfortable reality that all the reasons to buy your Nvidia card can disappear randomly at some point.

Can you name any exclusive middleware, with any degree of relevance to game engines, that was developed by ATI in the mid-2000s (or by AMD shortly after the business was acquired) and is still supported and maintained on current-generation AMD hardware today?
 
Joined
Feb 18, 2005
Messages
6,203 (0.85/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) Dell S3221QS(A) (32" 38x21 60Hz) + 2x AOC Q32E2N (32" 25x14 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G604
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
TBH I really think NVIDIA should've built a 32-to-64-bit translation layer for PhysX. I understand why they didn't do it (doesn't make economic sense) but it would've been a much-appreciated gesture.
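For what it's worth, one plausible shape for such a layer is an out-of-process bridge: the 32-bit game keeps loading a small 32-bit stub, and the stub forwards each call to a 64-bit worker process that runs the real simulation. The sketch below is purely illustrative Python (the message names and the "simulation" are invented, and a real bridge would need proper marshalling of the actual PhysX API), but it shows the call-forwarding idea:

```python
# Toy sketch of an out-of-process physics bridge (not NVIDIA code): the "game"
# side sends serialized calls over a pipe, the worker owns the actual state.
import multiprocessing as mp

def physics_worker(conn):
    """Stands in for a 64-bit host process owning the real physics engine."""
    positions = {}
    while True:
        msg = conn.recv()
        if msg[0] == "create_body":
            _, body_id, pos = msg
            positions[body_id] = pos
            conn.send(("ok", body_id))
        elif msg[0] == "step":
            _, dt = msg
            # Trivial stand-in "simulation": everything falls under gravity.
            for body_id, (x, y, z) in list(positions.items()):
                positions[body_id] = (x, y - 9.81 * dt, z)
            conn.send(("stepped", dict(positions)))
        elif msg[0] == "shutdown":
            conn.send(("bye",))
            break

if __name__ == "__main__":
    game_side, worker_side = mp.Pipe()
    worker = mp.Process(target=physics_worker, args=(worker_side,))
    worker.start()

    # "Game" side: every PhysX-style call becomes a message plus a reply.
    game_side.send(("create_body", 1, (0.0, 10.0, 0.0)))
    print(game_side.recv())
    game_side.send(("step", 1.0 / 60.0))
    print(game_side.recv())
    game_side.send(("shutdown",))
    print(game_side.recv())
    worker.join()
```

The catch, of course, is the per-call marshalling and synchronization overhead, which is probably part of why it never got built.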

Skyrim used the Havok physics engine and it was trash if you ran over 60 fps: shit would fly off everywhere when entering a room unless you diddled around in the configs to adjust its timings, or used mods that adjusted for faster than 60 fps
As the Genshin Doctor mentions, that is Bethesda and the Creation Engine being crap - nothing to do with Havok.

-Because you can't build an entire Gen around features and then simultaneously depreciate features.
Sorry for calling you out but this is a massive pet peeve of mine: it's DEPRECATE, not depreciate.
 
Joined
Aug 9, 2024
Messages
209 (1.00/day)
TBH I really think NVIDIA should've built a 32-to-64-bit translation layer for PhysX. I understand why they didn't do it (doesn't make economic sense) but it would've been a much-appreciated gesture.

I don't understand why they couldn't have, with that "army of software engineers" they claim to have working on drivers.

Mantle was never going to be the final product

I wish that you could have been a forum moderator 10-12 years ago and had to deal with all of the obnoxious shills who felt otherwise.
 
Joined
Nov 22, 2023
Messages
445 (0.95/day)
Sorry for calling you out but this is a massive pet peeve of mine: it's DEPRECATE, not depreciate.

- "Deprecated" refers to software that is no longer recommended, while "depreciated" refers to a decrease in value over time.

I believe both make sense here. For PhysX specifically, the feature is deprecated, but in the context of Nvidia's features as a whole, they have depreciated.

But I'm not gonna die on this hill.

Can you name any exclusive middleware, with any degree of relevance to game engines, that was developed by ATI in the mid-2000s (or by AMD shortly after the business was acquired) and is still supported and maintained on current-generation AMD hardware today?

- Why even bring up ATI/AMD? Nvidia is the one selling everyone on the features of their cards. ATI/AMD are irrelevant has-beens of the GPU market.

With a 90%+ market share, I think developments like this should absolutely give NV purchasers pause.
 
Joined
Jul 8, 2022
Messages
273 (0.28/day)
Location
USA
Processor i9-11900K
Motherboard Asus ROG Maximus XIII Hero
Cooling Arctic Liquid Freezer II 360
Memory 4x8GB DDR4
Video Card(s) Alienware RTX 3090 OEM
Storage OEM Kioxia 2tb NVMe (OS), 4TB WD Blue HDD (games)
Display(s) LG 27GN950-B
Case Lian Li Lancool II Mesh Performance (black)
Audio Device(s) Logitech Pro X Wireless
Power Supply Corsair RM1000x
Keyboard HyperX Alloy Elite 2
I am assuming performance is still bad with a dedicated card because PhysX stalls the entire rendering pipeline, plus the overhead of copying the results back across PCIe.
NVIDIA should do something about this. A solution to link cards separate from the PCIe bus… A bridge, if you will…
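Back-of-envelope on the copy side alone (all of these numbers are assumptions, not measurements): even before any pipeline stall, shuttling simulation state to and from a second card eats a slice of the frame budget, and it scales with how much physics data has to cross the bus each frame.

```python
# Rough, assumption-only estimate of per-frame PCIe copy cost for offloaded physics.
PCIE3_X16_BYTES_PER_S = 12e9   # assumed ~12 GB/s practical throughput, PCIe 3.0 x16
SYNC_LATENCY_S = 50e-6         # assumed per-transfer submission/sync overhead

def copy_cost_ms(num_particles, bytes_per_particle=32, transfers_per_frame=2):
    # One upload of inputs and one readback of results per frame (assumed).
    payload_bytes = num_particles * bytes_per_particle
    per_transfer_s = SYNC_LATENCY_S + payload_bytes / PCIE3_X16_BYTES_PER_S
    return transfers_per_frame * per_transfer_s * 1e3

frame_budget_ms = 1000.0 / 60.0
for particles in (10_000, 100_000, 1_000_000):
    print(f"{particles:>9,} particles -> ~{copy_cost_ms(particles):.2f} ms "
          f"of a {frame_budget_ms:.1f} ms frame at 60 fps")
```

The raw copy isn't huge until the scene gets big; the real killer is usually the synchronization, i.e. the render pipeline sitting idle waiting for those results, which this toy doesn't model.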
 
Joined
Dec 25, 2020
Messages
7,789 (5.08/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
- Why even bring up ATI/AMD? Nvidia is the one selling everyone on the features of their cards. ATI/AMD are irrelevant has-beens of the GPU market.

With a 90%+ market share, I think developments like this should absolutely give NV purchasers pause.

It's really because they're the only company that has been around long enough to be a historical competitor to Nvidia, and well, you saw the thread: (likely empty) threats to "switch sides" are being thrown around, and there's condemnation of "Nvidia removing things" that people spent years upon years shunning. Go figure.
 
Joined
Nov 22, 2023
Messages
445 (0.95/day)
It's really because they're the only company that has been around long enough to be a historical competitor to Nvidia, and well, you saw the thread: (likely empty) threats to "switch sides" are being thrown around, and there's condemnation of "Nvidia removing things" that people spent years upon years shunning. Go figure.

- Yeah no one is going anywhere over this. People will whine and then go crawling right back to the only game in town.

If AMD had been consistently competing across all fronts, maybe they would have caught a few people jumping ship, but they're a CPU company that almost disdainfully deals in GPUs because of decisions made by prior management, and they'll catch no one.
 
Joined
May 17, 2021
Messages
3,647 (2.62/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Cards at 2,000 USD (or even mid-range approaching or surpassing 1,000 USD), and you have to buy another one to play old (really not that old, some of them) games. If PC gaming isn't a meme right now, I'm the Queen of England.
This is looking more and more like the horse we could buy with real money in a game; I mean, what can go wrong from this point forward?
:rolleyes:

Pepperidge Farm remembers when one of the selling points of PC over consoles was backwards compatibility :D

And there are people defending and rationalizing this. Coping lvl 100
 
Joined
Nov 16, 2023
Messages
2,018 (4.24/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) 1080P 144hz
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razer Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce Max Freq 14700K 7.0ghz DryIce Max all time Freq FX-8300 7685mhz LN2
Here's an old recording from when I was doing PhysX vs. no PhysX with CellFactor: Revolution. You had to own an Ageia PhysX PPU to play some of these maps. I never did get an Nvidia card to run PhysX in this game, though.

 
Joined
Feb 1, 2019
Messages
3,876 (1.74/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
It was a heavily promoted feature




I will probably pick up a used 2060 for playing older games like that.
To be fair, looking at the first clip (didn't check the rest), it is a big improvement. Happy they didn't drop it sooner, so I still have it on my 4080.

With the attention it has gotten from the media and community, I wouldn't be surprised now if Nvidia did some 32-bit PhysX binaries or emulation on 64-bit CUDA.
 