
Is 24 GB of video RAM worth it for gaming nowadays?

  • Yes

    Votes: 66 41.5%
  • No

    Votes: 93 58.5%

  • Total voters
    159
Joined
Apr 30, 2020
Messages
999 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
I think that when A Plague Tale at 4K ultra uses around 4.5 to 5.5 GB of VRAM, the whole VRAM discussion is pointless. For comparison, TLOU requires 9.5 GB at 720p, and it looks much worse than A Plague Tale. So the only thing having more VRAM will achieve is games hogging more and more, because devs don't optimize crap. You won't get better visuals; you'll just pay more for a card with more VRAM that also uses more power. Great, isn't it?
Yes, but does it stutter?
Because I've been seeing a lot of complaints about stuttering in new games.
 
Joined
Jan 14, 2019
Messages
12,567 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
No "added" input lag; the key word here is "added". A game with vsync at 60 FPS is typically triple buffered, which means two frames' worth of delay, so roughly 30 ms.
No added input lag compared to the uncapped situation. What more do you want? Zero input lag doesn't exist, you know.

Say you cap at 60 with no vsync; still, every frame costs 16 ms, so the input lag is at the very least 16 ms.
That is factually wrong because input lag does not equal frame time (because of the rendering pipeline you mentioned below), but for simplicity's sake, let's say you're right. But then...

Say you don't cap and the game runs at 120 average; that's 8 ms per frame at the very least now. This isn't entirely accurate because a typical rendering pipeline has more frames rendered ahead of time, but you get the point (hopefully): the less time you spend on a frame, the lower the input lag.
If that frame ever gets displayed on your screen, which it doesn't. It's wasted effort.
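The frame-time arithmetic being argued over can be sketched in a few lines. This is only a lower-bound illustration: real input lag also includes display, OS and peripheral delays, none of which are counted here.

```python
# Rough frame-time arithmetic from the argument above.
# Frame time (ms) is just 1000 ms divided by the frame rate; it sets
# a lower bound on render-side latency, nothing more.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(round(frame_time_ms(60), 2))   # 16.67 ms per frame at 60 FPS
print(round(frame_time_ms(120), 2))  # 8.33 ms per frame at 120 FPS

# Triple-buffered vsync at 60 FPS queues roughly two frames ahead:
print(round(2 * frame_time_ms(60), 2))  # 33.33 ms of queue delay
```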

Why you insist on believing that getting tearing (which, by the way, is less noticeable at higher framerates, i.e. uncapped) and not getting the lowest input lag possible is the superior choice is beyond me.
Where did I say that?

Enhanced/Fast sync eliminates tearing, and so does a frame rate cap at the refresh rate of your monitor.

By the way, tearing isn't less noticeable at higher framerates.

But to each their own. For the record, I never run uncapped; I don't care about input lag. It's always vsync or VRR for me.
So you've been arguing about nothing for the last 5 or so pages. Congratulations, well done! You've just wasted half a day of everyone's time. :banghead:
 
Joined
Mar 29, 2023
Messages
1,045 (1.65/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
What's crazy and a little sad at the same time is that the 4060/4060 Ti will both likely come with 8 GB of VRAM, which is just pathetic... Even entry-level cards had 8 GB like 7 years ago, smh.

As much as I don't like the 7900 XT/7900 XTX, at least they come with adequate VRAM.

Yeah, even Nvidia offered 8 GB on the midrange 1070... 7 years ago! But I'm quite sure Nvidia regrets making the 1000 series as good as it was...

And Nvidia is deffo guilty of stagnating its midrange products in recent years, especially when it comes to VRAM. People who buy a 4060 Ti expecting to use it for AAA gaming for the next 3+ years will be sorely disappointed with their experience.

But ever since the Kepler days, Nvidia has done stuff like this. The 960 2 GB was more or less unusable for anything other than esports games, even at release. The 780 Ti was also more or less a bad experience at release due to its 3 GB of VRAM... especially if you ran them in SLI at high resolutions. Remember Watch Dogs stuttering so badly due to running out of VRAM on 780 Tis, while the 6 GB versions of the 780 provided a much better experience despite being 20% slower GPUs.
 
Joined
Jan 14, 2019
Messages
12,567 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
40-60 FPS feels pretty terrible on most monitors; G-Sync/FreeSync isn't going to fix that.
I'm usually fine with 40-60 FPS. It's around 30 and below where I start making faces.
 
Joined
Sep 10, 2018
Messages
6,965 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I'm usually fine with 40-60 FPS. It's around 30 and below where I start making faces.

I think for me, it comes from gaming at high framerates for over a decade now... I can do 60 on an OLED, likely due to the vastly superior pixel response, but anything lower? Hell no.
 
Joined
Mar 29, 2023
Messages
1,045 (1.65/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
I think for me, it comes from gaming at high framerates for over a decade now... I can do 60 on an OLED, likely due to the vastly superior pixel response, but anything lower? Hell no.

I think it's also highly subjective. I deffo prefer playing at 8K 60 Hz vs 4K 144 Hz (I have both). The image clarity and detail from the higher resolution make a much bigger difference to me than anything above 60 FPS (58 actually, as I cap FPS there) does :)
 
Joined
Sep 10, 2018
Messages
6,965 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I think it's also highly subjective. I deffo prefer playing at 8K 60 Hz vs 4K 144 Hz (I have both). The image clarity and detail from the higher resolution make a much bigger difference to me than anything above 60 FPS (58 actually, as I cap FPS there) does :)

I'm the opposite; I'd rather game at 1440p 240 Hz than 8K 60.

That's the beauty of PC hardware, though: everyone (assuming they can afford it) can game how they'd prefer.
 
Joined
Mar 29, 2023
Messages
1,045 (1.65/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
I'm the opposite; I'd rather game at 1440p 240 Hz than 8K 60.

That's the beauty of PC hardware, though: everyone (assuming they can afford it) can game how they'd prefer.

Yeah, as I said, it's highly subjective. I suppose I'm in the minority with my preference... but hot dang, does it look good in 8K... :p
 
Joined
Jul 14, 2018
Messages
473 (0.20/day)
Location
Jakarta, Indonesia
System Name PC-GX1
Processor i9 10900 non K (stock) TDP 65w
Motherboard asrock b560 steel legend | Realtek ALC897
Cooling cooler master hyper 2x12 LED turbo argb | 5x12cm fan rgb intake | 3x12cm fan rgb exhaust
Memory corsair vengeance LPX 2x32gb ddr4 3600mhz
Video Card(s) MSI RTX 3080 10GB Gaming Z Trio LHR TDP 370w| 566.36 WHQL | MSI AB v4.65 | RTSS v7.36
Storage NVME 2+2TB gen3| SSD 4TB sata3 | 1+2TB 7200rpm sata3| 4+4+5TB USB3 (optional)
Display(s) AOC U34P2C (IPS panel, 3440x1440 75hz) + speaker 5W*2 | APC BX1100CI MS (660w)
Case lianli lancool 2 mesh RGB windows - white edition | 1x dvd-RW usb 3.0 (optional)
Audio Device(s) Nakamichi soundstation8w 2.1 100W RMS | Simbadda CST 9000N+ 2.1 352W RMS
Power Supply seasonic focus gx-850w 80+ gold - white edition 2021 | APC BX2200MI MS (1200w)
Mouse steelseries sensei ten | logitech g440
Keyboard steelseries apex 5 | steelseries QCK prism cloth XL | steelseries arctis 5
VR HMD -
Software dvd win 10 home 64bit oem + full update 22H2
Benchmark Scores -
Excellent delivery and perfection in proving!

Just check if any game consumes 20+ GB whilst staying playable (1% lows strictly above 30 FPS and an average strictly above 45 FPS) with a 4090.
...there's no such game.

At 2160p, still without every graphics option maxed, using 3 HD-texture DLCs...

VRAM usage:

warplanes = 14 GB
wartanks = 16 GB
warships = 18 GB

Yes, there is.

At 1080p (16:9) = recommended 12 GB VRAM
At 1440p (16:9) = recommended 16 GB VRAM
At 2160p (16:9) = recommended 20 GB VRAM

IMO.
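The recommendations above are one poster's rule of thumb, not a measured requirement, but they can be expressed as a simple lookup to make the comparison concrete (resolutions and thresholds taken from the list above):

```python
# Recommended VRAM (GB) by resolution, per the rule of thumb above.
# These values are one forum user's opinion, not vendor guidance.
RECOMMENDED_VRAM_GB = {
    (1920, 1080): 12,
    (2560, 1440): 16,
    (3840, 2160): 20,
}

def enough_vram(resolution: tuple, card_vram_gb: int) -> bool:
    """Check a card against the table; unknown resolutions are rejected."""
    needed = RECOMMENDED_VRAM_GB.get(resolution)
    return needed is not None and card_vram_gb >= needed

print(enough_vram((3840, 2160), 24))  # True: 24 GB clears the 20 GB bar
print(enough_vram((2560, 1440), 12))  # False: below the 16 GB recommendation
```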
 
Joined
Jan 14, 2019
Messages
12,567 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
@aus and ferret,

I don't buy on a whim, and I rarely buy full price, so please, don't try to put me in a box just because I didn't listen to the haters. I have said COUNTLESS times that this game is an exception with its lifted 2-hour refund limit. I just didn't see any reason to refund it, and I still don't after well over two playthroughs now. In fact, I'm only liking it more on Survivor mode, as it ramps up the challenge with no see-through-walls "Hearing" feature and half as many available resources to scavenge. But hey, if you're only into games for the atmosphere and exploration, I can see why you'd only value it at 10 GBP. I happen to want MUCH more than that out of my games, so I get why you don't understand my rationale. :rolleyes:

Furthermore, if you guys were 65 like me, with ever-fading memory, eyesight, reflexes, and dexterity, maybe you'd get that waiting a long time for price drops makes me wonder whether I'll still be able to play with the challenges I put on myself by the time it hits YOUR idea of an acceptable price. I am by no means a casual gamer, but I know playing on the harder modes will one day become quite the chore. I also only play select genres of games, so I don't buy nearly as many as some do, which makes shopping at or near full price once in a while very manageable for me.

It seems like you guys judge players the same way a lot of the skeptics do games: without really knowing them. And it's clear you guys don't get how business works. You can say all you want that boycotting or waiting for bargain-bin prices will improve game quality, but the practical man knows it's the early months of a release, at or near full price, that keep developers stable enough to even AFFORD to put out good games. They'd all go broke and stop making games if every player listened to your way of thinking.
Don't get me wrong, I don't judge you. If you feel like the game is worth its price to you, that's fine. :) It's only that it isn't worth it to me.

I was only explaining my own reasons for not yet buying it, or in fact, for not buying any remake at normal game prices. I never expected you to agree with me or follow me in my thoughts. I apologise if my comment seemed that way.

I know how much full-price sales mean to devs staying afloat - I just don't feel like they deserve that full price for a remake / platform port, nor am I a charity organisation dedicated to supporting game devs even when their actions / products / prices don't fully resonate with me. But this is, again, personal. If you feel like they do deserve your support, and you're enjoying the game, that's all that matters, and I wish you lots of fun with it. :)

Our ways of thinking may be different, but in the end, our love of games is what counts. :toast:
 
Last edited:
Joined
Jan 8, 2017
Messages
9,503 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
No added input lag compared to the uncapped situation. What more do you want? Zero input lag doesn't exist, you know.
But uncapped is better, that's the point.

That is factually wrong
It literally isn't; 16 ms is the absolute lowest input lag achievable in that case. In an ideal situation, when there are no other frames in the queue, that's the lowest number you can expect. Who knows what the final input lag is in absolute terms, but the time it takes to finish a frame is obviously the biggest factor, and it also sets the lower bound.

If that frame ever gets displayed on your screen, which it doesn't. It's wasted effort.
It's wasted effort in terms of what? You are getting the benefit of lower input lag; what a strange statement.

By the way, tearing isn't less noticeable at higher framerates.

It most definitely is: the more frames are being sent to the display, the higher the chance that tearing will be less jarring, because there won't be as big a difference between the frame currently being displayed and the one that just arrived.

Enhanced/Fast sync eliminates tearing, and so does a frame rate cap at the refresh rate of your monitor.

I just explained that capping the frame rate to that of your monitor does not eliminate tearing. You only get rid of tearing if the frames are synchronized with the monitor (vsync or VRR on); just having the same number of frames being rendered is not enough. That's not how this works. And if you enable Enhanced/Fast sync, then you are truly getting nothing in return, because now you've added that framebuffer lag back and the game is capped. I suppose it would be better than plain vsync, but you'll get a ton of stutter; Enhanced/Fast sync is intended to work well when the framerate is many times that of your monitor.

So you've been arguing about nothing in the last 5 or so pages

What a bizarre thing to say. Just because I don't use something doesn't mean I can't argue over what is true and what isn't.
 
Last edited:
Joined
Dec 12, 2020
Messages
1,755 (1.19/day)
Wasn't Metro Exodus performance mostly a problem with your OC?
Yes, my only option now is to reduce graphics settings. My overclock, like you said, obviously isn't stable in Metro: Exodus, but this is the only game giving me this issue.
 
Joined
Apr 18, 2019
Messages
2,396 (1.15/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
Yes, my only option now is to reduce graphics settings. My overclock, like you said, obviously isn't stable in Metro: Exodus, but this is the only game giving me this issue.
There's some irony here:
STALKER's X-Ray engine has been a better stress test for my OCs than many utilities, yet according to the metrics, it's not even really loading my hardware.
The GSC Game World folks, when the studio first closed, migrated to 4A Games and helped build the engine the Metro games run on.
 
Joined
Dec 12, 2020
Messages
1,755 (1.19/day)
@LabRat 891
That's an old engine too. I can't get the STALKER benchmark to run on my Windows 10 box. I do remember the STALKER benchmark fully loading all cores of my CPU, though.
 
Joined
Apr 18, 2019
Messages
2,396 (1.15/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
@LabRat 891
That's an old engine too. I can't get the STALKER benchmark to run on my Windows 10 box. I do remember the STALKER benchmark fully loading all cores of my CPU, though.
Odd. I was just running the Call of Pripyat bench a few months ago, comparing a Windows 7 R5 3600 + RX 580 8 GB vs. my (at the time) Windows 10 R5 5600 + 6500 XT 4 GB.

The benchmark would've been a great 'if it runs this, you're stable' kind of tool, if the X-Ray engine didn't act like The Zone itself. Even the enormously improved 64-bit recompile the community did has a LOT of "moments".
From what I recall reading in the dev(b)logs, the X-Ray engine is basically some kind of Slavic black magic, in that it work(ed)(s) at all.
 
Joined
Jan 14, 2019
Messages
12,567 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
But uncapped is better, that's the point.
Still not proven.

It literally isn't; 16 ms is the absolute lowest input lag achievable in that case. In an ideal situation, when there are no other frames in the queue, that's the lowest number you can expect. Who knows what the final input lag is in absolute terms, but the time it takes to finish a frame is obviously the biggest factor, and it also sets the lower bound.


It's wasted effort in terms of what? You are getting the benefit of lower input lag; what a strange statement.
If "16 ms is the absolute minimum", then how will you "get the benefit of lower input lag"? You're contradicting yourself.

It most definitely is: the more frames are being sent to the display, the higher the chance that tearing will be less jarring, because there won't be as big a difference between the frame currently being displayed and the one that just arrived.
That's not how it works. Screen tearing occurs any time your GPU's output and the monitor's refresh cycle are not synchronised. Higher frame rates do not eliminate it.
Read about it, seriously.
I just explained that capping the frame rate to that of your monitor does not eliminate tearing. You only get rid of tearing if the frames are synchronized with the monitor (vsync or VRR on).
And I just explained that I also use Enhanced/Fast sync.

And if you enable Enhanced/Fast sync, then you are truly getting nothing in return, because now you've added that framebuffer lag back and the game is capped. I suppose it would be better than plain vsync, but you'll get a ton of stutter; Enhanced/Fast sync is intended to work well when the framerate is many times that of your monitor.
Again, that's not how it works. Enhanced/Fast sync is essentially the same as the uncapped situation, except that the frames your monitor can't display due to its refresh rate are dropped. There is no more framebuffer lag than there would be in an uncapped situation. You have far more lag in the traditional vsync situation, where your frames are queued to be displayed whenever your monitor is ready.
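As a toy illustration of that frame-dropping behaviour, here is an idealised model (hypothetical numbers, not a real driver implementation): the renderer runs uncapped, and at each refresh tick the display simply shows the newest completed frame, discarding older ones.

```python
def frames_shown(render_fps: int, refresh_hz: int, seconds: int = 1):
    """Idealised fast-sync model: at each refresh tick, the display shows
    the newest completed frame; earlier finished frames are dropped."""
    frame_times = [i / render_fps for i in range(render_fps * seconds)]
    displayed = []
    for tick in range(refresh_hz * seconds):
        t = tick / refresh_hz
        # all frames completed at or before this refresh tick
        done = [f for f in frame_times if f <= t]
        if done and (not displayed or displayed[-1] != done[-1]):
            displayed.append(done[-1])
    return len(displayed), len(frame_times) - len(displayed)

# Rendering at 180 FPS into a 60 Hz panel: 60 frames shown, 120 discarded,
# but each displayed frame is fresher than queued vsync would deliver.
print(frames_shown(180, 60))  # (60, 120)
```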

What a bizarre thing to say. Just because I don't use something doesn't mean I can't argue over what is true and what isn't.
Sure, you can always argue about stuff that you have no first hand experience with. It'll just make you look like an idiot. No big deal. ;)
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Why is this still going? It's obvious he is wrong on so many things and is not going to admit it, even if you put his feet to the fire. And, apparently, Nvidia themselves are clueless on the subject, since there is an option in the control panel that lets you choose the number of frames the CPU pre-renders. What a clueless bunch, these Nvidia engineers; they should hire the local troll.
 
Joined
Jan 5, 2006
Messages
18,584 (2.68/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
Easy, you'll just have to deal with it for now....
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
A picture is worth a thousand words, I guess; Optimum Tech tested exactly that. If you don't run any of Nvidia's latency boost tech (which, in a nutshell, basically caps your framerate), these are the results. Lower FPS, lower latency. "How is that possible?" :roll:

1680948591616.png
 
Joined
Jan 14, 2019
Messages
12,567 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
A picture is worth a thousand words, I guess; Optimum Tech tested exactly that. If you don't run any of Nvidia's latency boost tech (which, in a nutshell, basically caps your framerate), these are the results. Lower FPS, lower latency. "How is that possible?" :roll:

View attachment 290766
I think I've just written the most concise, most logical post that I could explaining Enhanced/Fast sync and screen tearing. And now you've got this. There's really nothing more we can do now.
 
Joined
Nov 9, 2010
Messages
5,689 (1.10/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
Yes, my only option now is to reduce graphics settings. My overclock, like you said, obviously isn't stable in Metro: Exodus, but this is the only game giving me this issue.
Then you probably should have said it was your OC scenario, instead of your GPU-limited scenario. As I said before, I ran Metro Exodus on my GTX 1080 just fine, which, I don't need to tell you, is less capable than what you have.
 
Joined
Dec 12, 2020
Messages
1,755 (1.19/day)
@Frag_Maniac
Did you try the DLC Sam's Story without reducing graphical detail/fidelity? That's the Metro: Exodus map that's dragging my FPS down to 40.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
That's categorically not true; the higher the framerate, the lower the average latency, so by capping your framerate you are most certainly not lowering your latency. That makes no sense.


And you are also increasing the average frametimes as a result. I don't know what use this could have to anyone. I also don't know where you guys got this idea that a maxed-out CPU/GPU = more even frametimes or less latency; none of that is true.



If you can't be bothered to actually explain yourself and you just post a link I am going to assume that, as in many other things, you didn't have anything intelligent to say.
When multiple people tell you that you misunderstood, explain how and why, and then post the evidence to back it up, you say they're wrong because they posted a link to the definitive source of information on that topic?
No man, that's all you. You do not understand this topic or how a conversation works. Make a claim and back it up with proof - or don't make that claim.


GPU bottlenecks result in uneven frametimes and render latency, as the CPU renders ahead while it's waiting. Frametimes and FPS are the same thing expressed in different ways: 1,000 ms in a second, divided by the FPS... that's a frametime. This is not the same as render time.

CPU bottlenecks result in microstutters at worst, or a lower ceiling cap to max FPS. That said, you're going to have lower input latency in this situation.

Variable refresh rates allow the system to display new information in a shorter period. If you had a 240 Hz display with a VRR range of 1 Hz to 240 Hz, a 1 FPS signal is still updated at the speed the 240 Hz panel is - meaning a new frame can be displayed at any of those 240 incremental steps, as soon as it's ready. This allows much faster recovery from any stutters, and offers the reduced input latency of the maximum refresh rate even if the frame rate is lower.


The only reason people use uncapped framerates is that some older game engines like CS gave a competitive advantage to high frame rates as an engine bug, combined with reducing the input latency of a maxed-out GPU. VRR provides that reduced latency with vsync on, and an FPS cap gives you all the benefits without needing anything.

You'd know all this - if you weren't too lazy and arrogant to click a damned link.

It takes 10 seconds to fire up Nvidia's monitoring stats and see this in real time; it's not rocket science.

These are render latency values, reported by Nvidia at the GPU stage of the pipeline only, before the monitor is involved.
Every ms of render latency is a delay in you seeing the image, which means you are responding to an older image. It's not input latency from your input device, but it IS an added delay to you reacting to what is actually happening.



60 Hz, vsync on. Oooh yay, I love 31 ms of input lag!
1681209424779.png



Exactly the same but with a 60FPS cap that prevents the CPU from rendering ahead of the GPU. (2 frames in DX12, so 60 +2 rendered ahead by the CPU, just like if the GPU was maxed out)
5.6ms.

Let's not use an FPS cap and enjoy 5.6x more render latency.
(There is some beauty in that it's 5.6ms, and that the Vsync figure was 5.6 times higher)
1681209464540.png
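As a back-of-envelope sanity check on those two screenshots (my own arithmetic, assuming a ~2-frame render-ahead queue as described above):

```python
# With Vsync at 60Hz and no cap, the CPU queues ~2 frames ahead, so
# render latency approaches 2 x 16.6ms. With a 60FPS cap the queue
# stays empty and latency is roughly the time to draw a single frame.
frametime = 1000 / 60              # 16.6ms per frame at 60FPS
print(round(2 * frametime, 1))     # ~33.3ms - in the ballpark of the 31ms measured
```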



"BUT I WANT VSYNC OFF AND MAXING THE GPU IS BETTER BECAUSE THATS HOW I BENCHMARK"
1681209606488.png


Sure, we go from 59FPS to 151 on those 99% lows but... oh wait, the input latency doubled.

Letting my GPU max out its clocks to get that maximum performance managed to... still be worse.
1681209756734.png




Freesync/Vsync is a whole other bag of fun on top of this, because Freesync behaves differently between AMD and Nvidia, and Gsync is Nvidia exclusive and different again.

The main key is that when they work properly (Vsync off and maxed-out GPUs is not properly), they can update at divisions of the full refresh rate.
Using Samsung's 240Hz displays as an example here, they have a minimum of 80Hz because it's 1/3 the max refresh rate.

At 1/2 or 1/3 of a supported refresh rate, the highest rate is used - and the request for a new frame is sent at the start of the final repetition, not at the end. So 80Hz is 12.5ms, 160Hz is 6.25ms and 240Hz is 4.166ms.

80FPS at 240Hz VRR would give you better input latency and response times because it asks for the new frame at the start of the final duplicate - 4.166/4.166/4.166.
In a GPU-limited situation you lose that benefit: frames rendered ahead put you right back at the higher latency values anyway.

Running a 235FPS cap would give you 4.255ms input latency. If your GPU was maxed out and the CPU had to render ahead to cover the shortfall, you'll end up at 8.51ms or 12.765ms, despite being at the higher framerate.
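A quick Python sketch of the division math above (the 240Hz panel, its 1/2 and 1/3 divisions, and the 235FPS numbers are from the post; the function name is mine):

```python
# Refresh interval in ms for a given rate: 1000ms divided by the Hz.
def refresh_interval_ms(hz: float) -> float:
    return 1000.0 / hz

# A 240Hz panel and its 1/2 and 1/3 frame-duplication divisions:
for hz in (240, 160, 80):
    print(f"{hz}Hz -> {refresh_interval_ms(hz):.3f}ms")

# A 235FPS cap on that panel, and the doubled/tripled latency you land
# on if the GPU is maxed out and a frame arrives late:
base = refresh_interval_ms(235)
print(round(base, 3), round(base * 2, 3), round(base * 3, 3))
```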

Unlimited framerates are not a positive. Maxing your GPU is not a positive. They need spare performance ready to render the next frame when it's needed, because if they're even 0.000001 milliseconds late, you're getting doubled or tripled input latency and that's the most common microstutter there is, as input latency values go crazy.

Doing stupid things like running the GPU at 100% all the time forces you into that higher latency state all the time and people mistake that consistent latency for being the best they can get.
 
Last edited:

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,081 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape, Razer Atlas
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
When multiple people tell you that you misunderstood and explain how and why, then post the evidence to back it up say that they're wrong because they posted a link to the definitive source of information on that topic?
No man, that's all you. You do not understand this topic or how a conversation works. Make a claim, and back it up with proof - or don't make that claim.


Mah man :D :D :D

Couldn't have said it better myself.

237 FPS locked...

1681213977861.png
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Oh i forgot:

Vsync without VRR adds a latency you can't measure in software.

Using 60Hz here as an example: 1000ms (one second) divided by 60FPS, you get 16.6ms.
If your GPU can have a render time under 16.6ms, you're stutter free. See my 60Hz with a 60FPS cap example above.


However, if you're at 2/3 that value - let's say 40FPS - things go bad at the monitor level.

40FPS is 25ms exactly, but the monitor can only display every 16.6ms - some frames line up better than others, and every second frame ends up with higher display latency than the previous one.
Putting a star on the refreshes where a frame arriving every 25ms actually lands shows the issue:

16.66ms -> 33.32ms* -> 49.98ms -> 66.64ms* -> 83.3ms*

I'm too tired to do the damned math on this one for you, but you can see the pattern - some frames are delayed one, two, or zero frames resulting in what visually would appear as skipping and jumping, with erratic latency. This is the situation people run Vsync off to fix, as tearing would be far better than this level of mess.
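The 40FPS-on-60Hz mismatch above can be simulated in a few lines (a rough sketch of my own; I use the post's truncated 16.66ms refresh interval, so a frame finished at exactly 50ms just misses the 49.98ms scanout and waits a whole extra refresh):

```python
import math

# Which 60Hz refresh slot each 40FPS frame lands on with Vsync forced:
# frames become ready every 25ms, the display scans out every 16.66ms.
refresh = 16.66      # 60Hz refresh interval (truncated, as in the post)
frame = 25.0         # 40FPS frametime

for n in range(1, 4):
    ready = n * frame                               # frame done rendering
    shown = math.ceil(ready / refresh) * refresh    # next scanout that can show it
    print(f"frame {n}: ready {ready:.2f}ms -> shown {shown:.2f}ms "
          f"(+{shown - ready:.2f}ms)")
# frame 1: ready 25.00ms -> shown 33.32ms (+8.32ms)
# frame 2: ready 50.00ms -> shown 66.64ms (+16.64ms)
# frame 3: ready 75.00ms -> shown 83.30ms (+8.30ms)
```

The wait alternates between roughly zero and most of a refresh, which is exactly the erratic skipping-and-jumping latency pattern described above.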

Combine that with the render-ahead issues of a maxed-out GPU, and you can have random lag spikes into the 100ms+ range at any time from the moment the CPU renders the initial frame to it being visible on a user's display, with wild variance frame to frame - and it can't all be measured in software.


The monitor tech matters. The settings matter.
Not all of it can be seen or measured in software, and sometimes something as simple as lowering a refresh rate or an FPS cap is the answer - why run 144Hz with stutters or tearing, when 120Hz would have neither?
 