
Keep a 4080s or take a 5070ti?

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,306 (2.03/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga/S25U-1TB
Processor Ryzen 9800X3D @ 5.575ghz all core 1.24 V, Thermal Grizzly AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 64 GB Dominator Titanium White 6000 MT, 130 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 White
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF1000 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White & Pulsar Supergrip tape, Razer Atlas, Razer Strider Chroma
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
If the difference is so small you would hardly notice it, then I don't think it really matters. Power efficiency simply doesn't matter for this comparison, in my opinion. Just because the 7900 XTX is a smidge faster than the 4080S in raster (usually only by 1% to 4%) didn't suddenly mean people were recommending the 7900 XTX over the 4080S. I think the same logic applies here.

Not to mention, OP mentioned overclocking a potential 5070 Ti to beat a 4080 (which it already matches as is), which opens up another can of worms for power efficiency; at that point, if you're going to overclock, power efficiency probably isn't your concern anymore. And even in this regard, an overclocked 5070 Ti, from what I'm reading, is still well within the 5% difference range. The cooler matters too, but results seem pretty consistent across all the charts I saw for different models (avoiding the FE for comparison, since there is no FE card for the 5070 Ti).

TL;DR: power efficiency, and by extension other things such as raster, really don't matter in this argument. I don't see a point in bringing them up. Your choice is going to hinge entirely on the features (and price, if you can get a good price, of course).

Go 5070 Ti if it's at a price similar to a 4080S (which the cheapest I've seen are). If you already own a 4080S, you should pass unless you value the features exclusive to the 5070 Ti and can actually get one. If you don't, and were considering getting a 4080S, then in that specific instance go ahead. Even more so if you can buy a 5070 Ti at its actual MSRP. If you're not American, that's a factor too, due to things like VAT, but that's getting into a whole other can of worms.
You're still missing the point.

7900 XTX vs 4080 had pros/cons. The 7900 XTX was ~5% faster in raster, had a bit more VRAM (debatable whether that's a real positive beyond "bigger number better", since there's no CUDA for professional applications and games don't use 24 GB), was slightly better on Linux (at the time), 40% slower in RT, no DLSS, more power draw, etc.

5070 Ti vs 4080: the 5070 Ti wins or draws in every category. OK, it's only 5% more efficient than the previously most efficient card available. But again, it doesn't matter that the win is small; there's no downside to going with the 5070 Ti for the same or less money.

OP is clearly OK with the hassle aspect; he's asking which is the better card, and the answer is obvious.

I would say to get it if you can at MSRP. It's worth it for MFG, IMO, and you can likely sell your 4080S for the same price or more.
He doesn't need to sell it; it's within the return window.
 
Joined
Dec 9, 2024
Messages
165 (2.12/day)
Location
Missouri
System Name Don't do thermal paste, kids
Processor Ryzen 7 5800X
Motherboard ASUS PRIME B550-PLUS AC-HES
Cooling Thermalright Peerless Assassin 120 SE
Memory Silicon Power 32GB (2 x 16GB) DDR4 3200
Video Card(s) GTX 1060 3GB (temporarily)
Display(s) Gigabyte G27Q (1440p / 170hz DP)
Case SAMA SV01
Power Supply Firehazard in the making
Mouse Corsair Nightsword
Keyboard Steelseries Apex Pro
Have a 4080S? Keep it, unless:
Value DLSS4 and MFG? Get a 5070 Ti.

Return window open? If you have a guaranteed way of getting a 5070 Ti, get the 5070 Ti. But do NOT pay more than 4080S MSRP. Ideally, pay less.
Don't have one but wanted one, and now the 5070 Ti is out? Get the 5070 Ti.

This is basically what I would recommend to OP. The 50 series launch has been rough; just don't be an idiot when buying a 5070 Ti and you should be okay. OC, power efficiency, raster, etc. are not what you should worry about. Just focus on the features, or the other small things the 5070 Ti has over the 4080, if they benefit you. Or on whether you can get one for the same or a lower price.


You're still missing the point.

7900 XTX vs 4080 had pros/cons. The 7900 XTX was ~5% faster in raster, had a bit more VRAM (debatable whether that's a real positive beyond "bigger number better", since there's no CUDA for professional applications and games don't use 24 GB), was slightly better on Linux (at the time), 40% slower in RT, no DLSS, more power draw, etc.
You are exactly highlighting more of what I'm talking about. Nobody bought a 7900 XTX because it's faster in raster; they bought it for other reasons. This comparison is no different, except it's actually a very close one. So why should power efficiency even matter? Just because the 7900 XTX wins in raster on paper doesn't mean it actually matters. You can still value that small, microscopic win that nobody will realistically care about or even notice, but the comparison is so close here that it doesn't matter.

I think we're both saying the same thing, just applying it in different ways. And for me, a microscopic win might as well be a draw.
 
Joined
Sep 10, 2005
Messages
31 (0.00/day)
Have a 4080S? Keep it, unless:
Value DLSS4 and MFG? Get a 5070 Ti.

Return window open? If you have a guaranteed way of getting a 5070 Ti, get the 5070 Ti. But do NOT pay more than 4080S MSRP. Ideally, pay less.
Don't have one but wanted one, and now the 5070 Ti is out? Get the 5070 Ti.

This is basically what I would recommend to OP. The 50 series launch has been rough; just don't be an idiot when buying a 5070 Ti and you should be okay. OC, power efficiency, raster, etc. are not what you should worry about. Just focus on the features, or the other small things the 5070 Ti has over the 4080, if they benefit you. Or on whether you can get one for the same or a lower price.





You are exactly highlighting more of what I'm talking about. Nobody bought a 7900 XTX because it's faster in raster; they bought it for other reasons. This comparison is no different, except it's actually a very close one. So why should power efficiency even matter? Just because the 7900 XTX wins in raster on paper doesn't mean it actually matters. You can still value that small, microscopic win that nobody will realistically care about or even notice, but the comparison is so close here that it doesn't matter.

I think we're both saying the same thing, just applying it in different ways. And for me, a microscopic win might as well be a draw.
Well, DLSS4 is also on 4000-series cards, so what are the other features? MFG? I don't know if I'll ever use it; I have a 120 Hz OLED TV for gaming.
And a feature I will miss is PhysX.
 

steamrick

New Member
Joined
Feb 10, 2025
Messages
4 (0.29/day)
If you're still within your return window and are *certain* you can get a 5070 Ti for less money than the 4080S cost you, I guess you can consider switching them.

Advantages of the 5070Ti:
- barely more efficient
- more headroom for overclocking or undervolting due to 5nm process increment
- more memory bandwidth (if you're looking to run local LLMs and don't have the budget for 24GB+ VRAM)
- MFG, if you care about that
- supports newer features that may or may not become relevant in the future (neural rendering and textures)

Advantages of the 4080S:
- more mature drivers, proven hardware design
- very minor performance advantage on average, and a slightly less minor advantage in raytracing-heavy workloads (in spite of the older-gen RT units)
- lower chance of a 'dud' that has more units deactivated than it should
- you already have it in hand
- faster in compute-heavy workloads such as Stable Diffusion


Value DLSS4 and MFG? Get 5070Ti.
DLSS4 works just fine on a 4080S
 
Joined
Jun 14, 2020
Messages
4,490 (2.62/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
"condoning such actions"

My dude it's 32 bit software for which a direct replacement has existed for more than 15 years, and this is 2025. Should hardware manufacturers keep supporting all standards for infinity? Seems like a waste of die space.

Besides, it's not like the game is unplayable without PhysX, or that it will magically make a 15 year old game look contemporary.

Seems like a very similar issue to how modern, fast computers have problems running very old games: framerates tied to physics and the sheer speed of modern hardware cause stuff to break. If turning on PhysX in a game from the 2000s is critical, I'm sure there are millions of old GPUs/systems on eBay, or in your garage gathering dust.

Could NVIDIA write a translation layer similar to what Intel did for old DX games with their discrete GPU release? Maybe. Is it worth it? Unlikely.

My bet is this whole "issue" was the first time most people even remembered PhysX existed.

I'm guessing NVIDIA made this move to 64-bit-only CUDA/PhysX for the same reason Apple went 64-bit only, Intel tried to, and Android is in the process of doing: it simplifies things and allows more focus on what currently matters.

Correction, the newest ARM processors for Android already only support 64 bit code.

Started with the Pixel 7. I didn't hear much outcry then.


But hey, NVIDIA bad right?
If I had ever dared mention in these forums, at any point in the last five years, that I prefer NVIDIA over AMD cards for PhysX support, people would have nailed me to the cross and suggested I was just trying to find excuses not to buy AMD. Now that NVIDIA has removed support for it, it's become a newsworthy issue.
 
Joined
Dec 9, 2024
Messages
165 (2.12/day)
DLSS4 works just fine on a 4080S
well dlss4 is also on 4000 cards

To be fair, I've heard back-and-forths on that; it could be because MFG is often labeled as part of DLSS4 when it really isn't, so... yeah. My fault there. In that case, there's even less reason to trade out a 4080S unless you like MFG.
 
Joined
Sep 17, 2014
Messages
23,348 (6.12/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
It pretty much is, same perf, more features, better power efficiency. If he can get it cheaper too and return the 4080 what exactly is your problem?
Yes, and the featureset is slightly changed, you might miss some ROPs, and the driver branch is exhibiting major problems.

It's completely pointless to sidestep to a 5070 Ti at this point. Same perf: that alone already seals the deal. Early-adopting hardware is never great, and this gen is exceptionally rough at it.

"condoning such actions"

My dude it's 32 bit software for which a direct replacement has existed for more than 15 years, and this is 2025. Should hardware manufacturers keep supporting all standards for infinity? Seems like a waste of die space.

Besides, it's not like the game is unplayable without PhysX, or that it will magically make a 15 year old game look contemporary.

Seems like a very similar issue to how modern, fast computers have problems running very old games: framerates tied to physics and the sheer speed of modern hardware cause stuff to break. If turning on PhysX in a game from the 2000s is critical, I'm sure there are millions of old GPUs/systems on eBay, or in your garage gathering dust.

Could NVIDIA write a translation layer similar to what Intel did for old DX games with their discrete GPU release? Maybe. Is it worth it? Unlikely.

My bet is this whole "issue" was the first time most people even remembered PhysX existed.

I'm guessing NVIDIA made this move to 64-bit-only CUDA/PhysX for the same reason Apple went 64-bit only, Intel tried to, and Android is in the process of doing: it simplifies things and allows more focus on what currently matters.

Correction, the newest ARM processors for Android already only support 64 bit code.

Started with the Pixel 7. I didn't hear much outcry then.


But hey, NVIDIA bad right?
You're comparing PC gaming to an Android phone. You don't get it at all.

It's the same reason Windows still has backwards compatibility.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,306 (2.03/day)
Well, DLSS4 is also on 4000-series cards, so what are the other features? MFG? I don't know if I'll ever use it; I have a 120 Hz OLED TV for gaming.
And a feature I will miss is PhysX.
To be clear, it still has PhysX, just not the 32 bit version that is only used in ~15 year old games.

You're comparing PC gaming to an Android phone. You don't get it at all.

It's the same reason Windows still has backwards compatibility.
Lol, I'm simply giving an example of why NVIDIA did this and how this kind of thing is completely normal. The card generation still has backwards compatibility, just not for a niche, toggleable feature in a small set of games almost two decades old, which still run just fine without it and have a CPU fallback, or processing on a second GPU if it's really that important to you (doubt.jpg).

If I had ever dared mention in these forums, at any point in the last five years, that I prefer NVIDIA over AMD cards for PhysX support, people would have nailed me to the cross and suggested I was just trying to find excuses not to buy AMD. Now that NVIDIA has removed support for it, it's become a newsworthy issue.
Yes, when everyone remembered it existed since it's a vector to attack NVIDIA.
 
Joined
May 30, 2015
Messages
1,979 (0.56/day)
Location
Seattle, WA
To be clear, it still has PhysX, just not the 32 bit version that is only used in ~15 year old games.

You keep mentioning the age of the games as if old games should not be playable due to their age, which I think is hilarious, because some of the most played games in the world are 10-20 years old: WoW, RuneScape, Dota 2, GTA V, etc.

Borderlands 2 is the worst-case scenario here, as it still gets around 5,000 players each day on Steam and is quite frequently sold in the trilogy pack to new buyers who will probably want to play it. On the 50 series, without a dedicated card for 32-bit PhysX, it tanks into the single-digit FPS range during combat. Not fun, and an 18-year-old PhysX accelerator is faster and smoother. Mirror's Edge without PhysX is also an entirely different game: there's an entire physical interaction system with glass impacting the player which gets turned off entirely. But very few people still play that game, and it is still playable in 'PhysX off' mode, just without the extra gameplay variety.

Yes, when everyone remembered it existed since it's a vector to attack NVIDIA.

It's objectively a bad move to remove compatibility features without notice and without alternatives. Even the most hated company on this forum, Apple, made an effort with Rosetta to bring legacy functionality forward both times they dropped an entire architecture. NVIDIA could have, didn't, and it shows where they stand.

On topic for OP: keep the 4080 Super. It's a solid, well-understood, and extremely compatible card. MFG is a meme and not worth worrying about.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,306 (2.03/day)
You keep mentioning the age of the games as if old games should not be playable due to their age, which I think is hilarious, because some of the most played games in the world are 10-20 years old: WoW, RuneScape, Dota 2, GTA V, etc.

Borderlands 2 is the worst-case scenario here, as it still gets around 5,000 players each day on Steam and is quite frequently sold in the trilogy pack to new buyers who will probably want to play it. On the 50 series, without a dedicated card for 32-bit PhysX, it tanks into the single-digit FPS range during combat. Not fun, and an 18-year-old PhysX accelerator is faster and smoother. Mirror's Edge without PhysX is also an entirely different game: there's an entire physical interaction system with glass impacting the player which gets turned off entirely. But very few people still play that game, and it is still playable in 'PhysX off' mode, just without the extra gameplay variety.
So do any of these "most played" games in the world use 32-bit PhysX? Or have they perhaps been updated, and no longer use the same engine they did 20 years ago, making the argument that they're "20-year-old" games a bit... forced? To my knowledge it's mainly singleplayer games that implemented PhysX, and I can't think of many singleplayer games from 15-20 years ago that have consistently high player counts.

Are the games truly "not playable" when you can play them just fine without PhysX, as all AMD and Intel GPU users have to anyway?

You mentioned Borderlands 2 because it's probably the single game that has daily player counts in the thousands. So I guess if you're one of those 5,000 people, don't buy an RTX 50-series card. For the rest, it's realistically a complete non-issue.

It's objectively a bad move to remove compatibility features without notice and without alternatives. Even the most hated company on this forum, Apple, made an effort with Rosetta to bring legacy functionality forward both times they dropped an entire architecture. NVIDIA could have, didn't, and it shows where they stand.
An optional feature most devs didn't even bother to implement due to it being vendor-locked, and which was completely replaced 15 years ago by a 64-bit version, is not comparable to a complete OS transition to a new architecture, which requires a translation layer without which literally no legacy apps would work at all.

NVIDIA not bothering to write a translation layer isn't ideal, but it's far from equivalent to Apple not making Rosetta. It's more surprising considering NVIDIA software support is typically the benchmark.

Returning the 4080S he bought, presumably getting at least $1,000 back if he bought an MSRP model, perhaps more, would allow him to buy a premium AIB 5070 Ti or, if he's lucky, a 5080, if one can be found at MSRP.
 
Joined
Sep 17, 2014
Messages
23,348 (6.12/day)
To be clear, it still has PhysX, just not the 32 bit version that is only used in ~15 year old games.


Lol, I'm simply giving an example of why NVIDIA did this and how this kind of thing is completely normal. The card generation still has backwards compatibility, just not for a niche, toggleable feature in a small set of games almost two decades old, which still run just fine without it and have a CPU fallback, or processing on a second GPU if it's really that important to you (doubt.jpg).


Yes, when everyone remembered it existed since it's a vector to attack NVIDIA.
Understood, and I don't think the comparison flies quite so well, because legacy gaming is a thing, and keeping feature sets available for gaming is also a thing. So I view this as a loss, too. And no, a lot of older games do not get remade or updated to newer engines, and people don't have that desire either. The precedent here is a dangerous one. It's crazy that there is no backwards compatibility for something so simple.

This argument weighs more heavily when you consider that the supposed 'progress' is not really that desirable either. What do we gain from NVIDIA removing this? Tangibly? I don't see it, so it's a net loss, in a world where GPUs get vastly more expensive. Bad.
 
Joined
Jul 13, 2016
Messages
3,516 (1.12/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
An optional feature most devs didn't even bother to implement due to it being vendor-locked, and which was completely replaced 15 years ago by a 64-bit version, is not comparable to a complete OS transition to a new architecture, which requires a translation layer without which literally no legacy apps would work at all.

NVIDIA not bothering to write a translation layer isn't ideal, but it's far from equivalent to Apple not making Rosetta. It's more surprising considering NVIDIA software support is typically the benchmark.

Yeah, they aren't the same: translating 32-bit PhysX calls to 64-bit would be VASTLY easier than ensuring an entire prior app ecosystem is compatible with, and translated to, a new architecture. They don't need it to be that performant either, given the power of modern graphics cards; they just need it to work.

Nice of you to point that out; it only makes NVIDIA look even lazier. They pretty much only had to spit out some unoptimized code, which modern hardware could brute-force, that translates 32-bit PhysX to 64-bit, and it would still be 60x faster than the CPU fallback. They couldn't even do that.
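For anyone wondering whether a given game in their library is even affected: the cutoff hinges on whether the game ships a 32-bit executable. As a rough illustration (nothing NVIDIA ships, just a sketch of how you could check yourself), a Windows .exe's PE header records its machine type:

```python
import struct

# Machine type values from the PE/COFF specification
IMAGE_FILE_MACHINE_I386  = 0x014C  # 32-bit x86: hit by the 32-bit PhysX cutoff
IMAGE_FILE_MACHINE_AMD64 = 0x8664  # x86-64: 64-bit CUDA/PhysX still supported

def pe_machine(path):
    """Return the COFF machine type field of a Windows PE executable."""
    with open(path, "rb") as f:
        if f.read(2) != b"MZ":
            raise ValueError("not an MZ/PE executable")
        f.seek(0x3C)                                 # e_lfanew: offset of the PE header
        (pe_offset,) = struct.unpack("<I", f.read(4))
        f.seek(pe_offset)
        if f.read(4) != b"PE\0\0":
            raise ValueError("missing PE signature")
        (machine,) = struct.unpack("<H", f.read(2))  # first COFF header field
    return machine

def is_32bit(path):
    return pe_machine(path) == IMAGE_FILE_MACHINE_I386
```

Roughly speaking, older PhysX titles whose main .exe reports 0x014C are the ones that lose GPU acceleration; 64-bit binaries are unaffected.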
 
Joined
Dec 31, 2020
Messages
1,185 (0.78/day)
System Name Dust Collector Mower 50
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case MATREXX 50
Power Supply SF450
But I don't understand how the 5070 Ti, which does 43 TFLOPS, gets the same FPS as, or even does better than, the 4080S, which does 52 TFLOPS.
In VR, can the 5070 Ti's faster memory make the difference, or rather the 4080S's extra CUDA cores?
True that. 24.4% faster memory can sometimes make the difference.
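For reference, peak memory bandwidth follows directly from the per-pin data rate and bus width. The numbers below are the commonly listed launch specs (28 Gbps GDDR7 for the 5070 Ti and 23 Gbps GDDR6X for the 4080 Super, both on 256-bit buses); treat them as assumptions and check TPU's GPU database for exact figures. With these rates the gap works out to roughly 22%; the exact percentage depends on which 4080 variant and clocks you compare against.

```python
def mem_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbit/s)
    times bus width (bits), divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

bw_5070ti = mem_bandwidth_gbs(28, 256)  # 896.0 GB/s
bw_4080s  = mem_bandwidth_gbs(23, 256)  # 736.0 GB/s
print(f"5070 Ti: {bw_5070ti:.0f} GB/s, 4080S: {bw_4080s:.0f} GB/s, "
      f"delta: {bw_5070ti / bw_4080s - 1:+.1%}")
```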
 
Joined
Jul 13, 2016
Messages
3,516 (1.12/day)
But I don't understand how the 5070 Ti, which does 43 TFLOPS, gets the same FPS as, or even does better than, the 4080S, which does 52 TFLOPS.
In this review, the 5070 Ti seems to be much better: https://wccftech.com/review/colorfu...b-gpu-review-enthusiast-performance-at-749/5/
but in many other reviews it is never better.
I can't explain these discrepancies.

Games that are bottle-necked by memory speed will favor the 5070 Ti while games that benefit from the additional cores will favor the 4080s.

Ultimately it really just depends on the game.
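That trade-off can be sketched with a crude roofline-style model: a game's effective throughput is capped by whichever runs out first, compute or memory, so which card wins flips with the workload's arithmetic intensity (FLOPs per byte). The TFLOPS and bandwidth figures below are rough published specs; the intensity values are purely illustrative assumptions.

```python
def effective_tflops(peak_tflops, bandwidth_gbs, flops_per_byte):
    """Crude roofline: throughput is the lesser of the compute roof and
    the memory roof (bandwidth in GB/s * FLOP/byte / 1000 = TFLOPS)."""
    return min(peak_tflops, bandwidth_gbs * flops_per_byte / 1000)

# Rough published specs: (peak TFLOPS, memory bandwidth in GB/s)
rtx_5070_ti = (44, 896)
rtx_4080s   = (52, 736)

for intensity in (40, 100):  # illustrative FLOP/byte values
    a = effective_tflops(*rtx_5070_ti, intensity)
    b = effective_tflops(*rtx_4080s, intensity)
    winner = "5070 Ti" if a > b else "4080S"
    print(f"{intensity:>3} FLOP/byte -> 5070 Ti {a:.1f} vs 4080S {b:.1f} TFLOPS: {winner}")
```

Bandwidth-hungry settings lean toward the 5070 Ti's faster memory, shader-bound ones toward the 4080S's extra cores; real games mix both, hence the per-title swings between reviews.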
 
Joined
Sep 10, 2018
Messages
7,752 (3.28/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
If I had ever dared mention in these forums, at any point in the last five years, that I prefer NVIDIA over AMD cards for PhysX support, people would have nailed me to the cross and suggested I was just trying to find excuses not to buy AMD. Now that NVIDIA has removed support for it, it's become a newsworthy issue.

It's still kind of crappy that they removed it; some of the games that support it do cool things even modern games don't. And while it wouldn't dissuade me from purchasing a GPU that was clearly better overall than what I currently own, I definitely wouldn't swap for a similar-performing one. MFG, as the only real standout feature, is just frame gen with all the benefits and downsides of current frame gen amplified, on top of reducing performance at each step.


Is faster memory or more CUDA cores better for 4K 120 Hz gaming and VR gaming on a Quest 3?

It really comes down to which one is cheaper, honestly; they are close enough otherwise that it doesn't matter, and both offer something the other GPU can't. I hate adding latency to my games, so frame generation is irrelevant to me, but if you love it I'd lean slightly toward the 50 series. The other issue right now is that the 50 series is half baked; who knows how long before NVIDIA sorts it all out. It's embarrassing for a trillion-dollar company to have these sorts of issues; it would be like Apple releasing a phone that sometimes couldn't make phone calls, lmao.
 
Joined
Jul 13, 2016
Messages
3,516 (1.12/day)
Is faster memory or more CUDA cores better for 4K 120 Hz gaming and VR gaming on a Quest 3?

I think it's a wash between the two. The 4080 Super doesn't really lose any performance compared to the 5070 Ti at higher resolutions; it is 3% faster at 1440p and 2% faster at 4K relative to the 5070 Ti.

There's not much data on VR but given the 5070 Ti isn't really gaining at higher resolutions I suspect it'd be much the same.
 
Joined
Dec 28, 2012
Messages
4,206 (0.95/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
"condoning such actions"

My dude it's 32 bit software for which a direct replacement has existed for more than 15 years, and this is 2025. Should hardware manufacturers keep supporting all standards for infinity? Seems like a waste of die space.
My dude, NVIDIA is a multi-trillion-dollar corp. Pretty sure they could figure out a software emulation solution for legacy software. But, you know, why bother when people will meatshield your defective chips and make excuses while your cards burst into flames?

I mean, should we drop DX9? It's old, who needs that? Hell, get rid of DX10 while we're at it, and all forms of OpenGL. Who uses that old software?

If I wanted a 10-year lifespan for my software, I'd buy Apple.
Besides, it's not like the game is unplayable without PhysX, or that it will magically make a 15 year old game look contemporary.

Seems like a very similar issue to how modern, fast computers have problems running very old games: framerates tied to physics and the sheer speed of modern hardware cause stuff to break. If turning on PhysX in a game from the 2000s is critical, I'm sure there are millions of old GPUs/systems on eBay, or in your garage gathering dust.

Could NVIDIA write a translation layer similar to what Intel did for old DX games with their discrete GPU release? Maybe. Is it worth it? Unlikely.

My bet is this whole "issue" was the first time most people even remembered PhysX existed.

I'm guessing NVIDIA made this move to 64-bit-only CUDA/PhysX for the same reason Apple went 64-bit only, Intel tried to, and Android is in the process of doing so: it simplifies things and allows more focus on currently important things.

Correction: the newest ARM processors for Android already only support 64-bit code.

Started with the Pixel 7. I didn't hear much outcry then.


But hey, NVIDIA bad right?
You clearly were not paying attention, forums for various older apps that are not actively supported were apoplectic over the change.
Not sure how many people buy $1000 2025 GPUs to run games from 2010... but OK.

As for the defect chip thing it's 100% returnable and a fraction of a percent incidence rate, so I wouldn't be inclined to stay up all night worrying.
I don't know many people who buy $1000 GPUs to LOSE performance and features over their previous card either. But then again, who wants to play Metro Exodus or Batman: Arkham Asylum when you have such modern gems as Forspoken and, uh, Concord?
 

dgianstefani

TPU Proofreader
Staff member
I mean, should we drop DX9? It's old, who needs that? Hell, get rid of 10 while we're at it, and all forms of OpenGL. Who uses that old software?
Literally what Intel did. Why? Because it simplifies design moving forward.

League recently released a patch that removes the DX9 renderer, so if you're still using hardware that only supports DX9, you can't play the game, despite it being an incredibly lightweight title.

At some point you have to move on and stop clutching onto ancient technology, because otherwise it holds back current development.

The days of 32-bit software were XP and older.

I will say a 32-to-64-bit translation layer could have alleviated the concerns of the several people who want to play with 32-bit PhysX turned on.
 
Joined
Sep 19, 2014
Messages
175 (0.05/day)
If you found yourself choosing between keeping a 4080 Super (still within the Amazon return period) and buying a 5070 Ti at a lower price than the 4080 Super, what would you do? I play at 4K, but I also love VR gaming. In VR, would the faster memory of the 5070 Ti make the difference, or rather the extra CUDA cores of the 4080 Super?
Also, considering a 5070 Ti overclocks well, could it overcome a 4080?
For the same money the 5070 Ti is better: better DLSS and more OC potential.

Too many problems with the 50 series; some 5070 Tis are potentially missing ROPs, meaning lower performance.
You'd need to be really unlucky to get one.

If the difference is so little you would hardly notice it, then I don't think it really matters. Power efficiency simply doesn't matter for this comparison, in my opinion. The 7900 XTX being a smidge faster than the 4080 S in raster (usually only by 1% to 4%) didn't suddenly mean people were recommending the 7900 XTX over the 4080 S. I think the same logic applies here.

Not to mention, OP mentioned overclocking a potential 5070 Ti to beat a 4080 (which it already matches as is), which opens up another can of worms for power efficiency, and at that point, if you're gonna overclock, power efficiency probably isn't your concern anymore. Even then, an overclocked 5070 Ti, from what I'm reading, is still well within a 5% difference. The cooler matters too in that regard, but results seem pretty consistent across all the charts I saw for different models (avoiding the FE for comparison, since there's no FE card for the 5070 Ti).

TL;DR: power efficiency, and by extension other stuff such as raster, really doesn't matter in this argument. I don't see a point in bringing it up. Your choice is gonna hinge entirely on the features (and price, if you can get a good price, of course).

If the 5070 Ti is at a price similar to a 4080 S (which the cheapest ones I've seen are) and you already own a 4080 S, you should pass unless you value the features exclusive to the 5070 Ti and can actually get one. If you don't own one and were considering a 4080 S, then, in that specific instance, go ahead. Even more so if you can buy a 5070 Ti at its actual MSRP. If you're not American, that's a factor too, due to stuff like VAT, but that's getting into a whole other can of worms.
4080s is faster than 7900XTX tho

[Chart: relative performance at 2560×1440]
 

dgianstefani

TPU Proofreader
Staff member
Joined
Sep 10, 2018
Messages
7,752 (3.28/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
It is now, wasn't at launch. Something something fine wine.

That was only a thing when AMD consistently gave 30-50% more VRAM than NVIDIA did, like the GTX 680 vs 7970 and 290X vs GTX 780.

Although some of that was just meh launch drivers and bringing up performance in games they weren't performing all that well in. My 7970 way outlived my 680, as did the 290X, which was still being used till like 2020/21 I think.

We are seeing the opposite now because games are implementing a version of RT by default and AMD is way behind.
 