
Are game requirements and VRAM usage a joke today?

Joined
Nov 11, 2016
Messages
3,145 (1.14/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software win11
From AMD's perspective it's probably "why not?" given how cheap VRAM is wholesale right now. Given how VRAM-hungry modern games are, I would be doing the same. The nice thing about high texture resolution is that it's practically free on performance: it needs VRAM but doesn't really need rendering power.

The most popular games are online competitive games where extra VRAM is useless. I play PUBG and it uses like 5 GB at 4K on competitive settings.
Using a silicon cost calculator, for an extra 60 USD AMD could make a much bigger GPU, maybe 30% faster than the current RX 7600, but the tape-out cost for a new chip is huge, so...
 
Joined
Feb 1, 2019
Messages
2,746 (1.41/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
The most popular games are online competitive games where extra VRAM is useless. I play PUBG and it uses like 5 GB at 4K on competitive settings.
Using a silicon cost calculator, for an extra 60 USD AMD could make a much bigger GPU, maybe 30% faster than the current RX 7600, but the tape-out cost for a new chip is huge, so...

Yeah, those games are designed to be played at crazy-high frame rates with less focus on visuals, but some of us play RPGs that are made to be pretty at low frame rates. AMD is catering to guys like me; it's just the lack of SGSSAA keeping me away.
 
Joined
Jun 27, 2019
Messages
1,909 (1.06/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75 Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
I can only laugh when I see people cherry-picking numbers to force cards like the 3070 into having VRAM issues, by running the most demanding games today at 4K/UHD, sometimes with RT on top, just to run the VRAM dry. Not a real-world scenario at all. The GPU itself wouldn't even be able to run those settings if it had 16 GB of VRAM. GPU power is the problem, not VRAM.

In most cases the GPU is the limiting factor, not the VRAM. Besides, 95% of PC gamers are using 1440p or less (Steam HW Survey), so native 4K/UHD fps numbers mean little for the majority of PC gamers, and 96% of people on the Steam HW Survey have 12 GB of VRAM or less. Developers make games to earn money, and you don't sell many games by making them for a 4-5% market share.

Besides, most games today look almost identical on the high and ultra presets, while high uses a lot less VRAM. Motion blur and DoF are often pushed up on ultra presets and use more memory while looking worse to the end user. Even the medium preset looks great in most new games, and sometimes even "cleaner" than high and especially ultra, which is filled with blur, DoF and all kinds of features that tank performance but don't necessarily make the visuals better.

This is something I can agree with, cause it's also what I'm personally experiencing with a 3060 Ti, aka a lowly, shitty 8 GB card according to some ppl around here. 'and yes, I did pick this card over a 6700 XT..'
Ppl keep bringing up the 4K argument even though it's still not a common res to game at; maybe on forums/tech sites like TPU it is, but in general nope, and it's also a moot point for me playing on a 2560x1080 monitor, which has fewer pixels than a 1440p monitor.

In newer games I run out of GPU power way before running out of VRAM, especially with Unreal Engine 5 games, and those are gonna be more and more common, just like the UE 3-4 games before. 'sure, devs could use very high-res textures in the future, but the engine itself is very GPU-heavy if Nanite and Lumen are used, and that's gonna be the limiting factor first'
Immortals of Aveum was seriously choking my GPU on high settings even with DLSS, but it had no VRAM-related issues. Same with A Plague Tale: Requiem 'in-house engine, but still', an amazing-looking game with great textures: it had zero VRAM issues with RT off 'it didn't even have RT when I was playing it at release', but it was heavily limited by my GPU.

Older or lighter games are also a moot point, cause those are easy to run anyway, even maxed out.

Sure, there's always that edge case of badly optimized games at launch, but even then it's not a big deal to turn settings down a notch or wait for a fix, like with The Last of Us, which is perfectly fine by now even on 8 GB cards at 1440p ~high settings, so I'm all good whenever I decide to play/buy it.

While I do agree this VRAM issue is a real thing, in my opinion it's way overblown, with ppl always finding a reason for it or pushing their cards where they shouldn't even be performing in the first place.

Personally I would have no problem buying a 12 GB card like a 4070/Super if I could afford it, cause I'm sure it would easily last me my usual ~3 years if not more. 'this console generation at least + I'm not planning on upgrading my resolution/monitor either'
 
Joined
Feb 1, 2019
Messages
2,746 (1.41/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
For reference, FF15 is pushing my VRAM to 9.5 GB utilisation (out of 10 GB). I am playing at 4K rendering, and the GPU itself is barely at 30% utilisation with reflections set to high (that's a GPU killer in FF15). The 4K texture pack is of course pushing the VRAM usage up.

We have to bear in mind that claims of rendering resources being exhausted first depend on the games you play and the settings used in those games. I cannot remember when I last played a game where my GPU was sustained over 50% utilisation. FF7 Remake had moments: if I customised the engine settings to add particles etc., it would occasionally, on things like limit break activations, briefly hit the 90s.

The reason it's blown up a lot is that VRAM is not easy to upgrade; the whole GPU has to be replaced. If it were a modular system, where it could be expanded the same way as DRAM, it wouldn't be such a big deal. Some of us are sensitive to substandard textures, pop-in and so forth.

Another ironic thing is that Nvidia are pushing RT, which is itself a VRAM guzzler.

For me, the bottlenecks in anything recent I've played (last 3 years) are probably in this order:

CPU -> VRAM -> GPU rasterization. Which is funny, as the dominant opinion puts the GPU first. However, I don't play at high frame rates, so my GPU is routinely at low to medium utilisation. My CPU platform upgrade easily felt more impactful on my games than my last GPU upgrade.
 
Joined
Jun 27, 2019
Messages
1,909 (1.06/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75 Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
For reference, FF15 is pushing my VRAM to 9.5 GB utilisation (out of 10 GB). I am playing at 4K rendering, and the GPU itself is barely at 30% utilisation with reflections set to high (that's a GPU killer in FF15). The 4K texture pack is of course pushing the VRAM usage up.

We have to bear in mind that claims of rendering resources being exhausted first depend on the games you play and the settings used in those games. I cannot remember when I last played a game where my GPU was sustained over 50% utilisation. FF7 Remake had moments: if I customised the engine settings to add particles etc., it would occasionally, on things like limit break activations, briefly hit the 90s.

The reason it's blown up a lot is that VRAM is not easy to upgrade; the whole GPU has to be replaced. If it were a modular system, where it could be expanded the same way as DRAM, it wouldn't be such a big deal. Some of us are sensitive to substandard textures, pop-in and so forth.

Another ironic thing is that Nvidia are pushing RT, which is itself a VRAM guzzler.

For me, the bottlenecks in anything recent I've played (last 3 years) are probably in this order:

CPU -> VRAM -> GPU rasterization. Which is funny, as the dominant opinion puts the GPU first. However, I don't play at high frame rates, so my GPU is routinely at low to medium utilisation. My CPU platform upgrade easily felt more impactful on my games than my last GPU upgrade.

For me it's GPU > CPU > VRAM. I don't play competitive games or simulators/heavy strategy games, but pretty much everything else goes. 'currently playing Far Cry 6 maxed out and it does max out my GPU during heavy fights/scenes'
At my resolution and 75 Hz refresh rate I find myself running out of GPU power first in more demanding or newer games; I guess that's normal with weaker cards like mine. 'a 3060 Ti is still stronger than what most ppl have outside of forums/sites like this, supposedly'

Also, not every game is sensitive to maxed-out VRAM usage; some games won't even stutter if you go over the limit. It really depends on the game, and some games will stutter no matter what, cause they're just crap like that. 'say hi to the infamous UE4 stutters..'
 
Joined
Jan 14, 2019
Messages
10,181 (5.17/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
If the game ever goes on sale I will take a peek myself.
It's on a 25% sale on Epic right now.

I also took a look with a 1660 Ti, because my 2070 seems to be half dead at this point. Besides the loss in average performance, the game stutters quite a bit more, and the LOD quality drop is way more noticeable. In conclusion, I'd say that despite the game being able to allocate 11-13 GB of VRAM, 8 GB is what it really needs. 6 GB gives you stutters and object pop-in that you don't want.
 
Joined
Aug 15, 2016
Messages
486 (0.17/day)
Processor Intel i7 4770k
Motherboard ASUS Sabertooth Z87
Cooling BeQuiet! Shadow Rock 3
Memory Patriot Viper 3 RedD 16 GB @ 1866 MHz
Video Card(s) XFX RX 480 GTR 8GB
Storage 1x SSD Samsung EVO 250 GB 1x HDD Seagate Barracuda 3 TB 1x HDD Seagate Barracuda 4 TB
Display(s) AOC Q27G2U QHD, Dell S2415H FHD
Case Cooler Master HAF XM
Audio Device(s) Magnat LZR 980, Razer BlackShark V2, Altec Lansing 251
Power Supply Corsair AX860
Mouse Razer DeathAdder V2
Keyboard Razer Huntsman Tournament Edition
Software Windows 10 Pro x64
I'd say that despite the game being able to allocate 11-13 GB of VRAM, 8 GB is what it really needs. 6 GB gives you stutters and object pop-in that you don't want.
Yeah, the GPU has to resort to (much slower) system RAM when it runs out of available VRAM, resulting in increased latency, stuttering and FPS drops.
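
If anyone wants to watch this happen on their own card, one option (assuming an NVIDIA GPU, since `nvidia-smi` ships with the driver; AMD users would need a different tool) is to poll dedicated VRAM usage while the game runs. Once `used` sits pinned at `total` and frame times spike, the overflow is almost certainly living in system RAM. Rough Python sketch, nothing more:

```python
import subprocess

def parse_meminfo(csv_line: str) -> tuple[int, int]:
    """Parse one 'used, total' line (values in MiB) from
    nvidia-smi's --format=csv,noheader,nounits output."""
    used, total = (int(x) for x in csv_line.strip().split(", "))
    return used, total

def vram_usage() -> tuple[int, int]:
    """Query (used, total) dedicated VRAM in MiB via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True)
    return parse_meminfo(out.splitlines()[0])
```

Call `vram_usage()` in a loop (say once a second) on a second monitor while playing and log the numbers next to your frame times.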
 
Joined
Feb 1, 2019
Messages
2,746 (1.41/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
For me it's GPU > CPU > VRAM. I don't play competitive games or simulators/heavy strategy games, but pretty much everything else goes. 'currently playing Far Cry 6 maxed out and it does max out my GPU during heavy fights/scenes'
At my resolution and 75 Hz refresh rate I find myself running out of GPU power first in more demanding or newer games; I guess that's normal with weaker cards like mine. 'a 3060 Ti is still stronger than what most ppl have outside of forums/sites like this, supposedly'

Also, not every game is sensitive to maxed-out VRAM usage; some games won't even stutter if you go over the limit. It really depends on the game, and some games will stutter no matter what, cause they're just crap like that. 'say hi to the infamous UE4 stutters..'
Yes, we play different games so different observations.
 
Joined
Feb 24, 2023
Messages
2,307 (4.92/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / RX 480 8 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / Viewsonic VX3276-MHD-2
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / FSP Epsilon 700 W / Corsair CX650M [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
And I am a thorough nutter, so I elaborately and extensively tweak (and freak) Cyberpunk 2077 over and over and over again. At 4K + FSR Performance, playability with RT Reflections enabled is a given if your GPU is something like an RTX 4070 Super.

But I've got another dozen-gigabyter. The RX 6700 XT wasn't made for RT whatsoever, let alone 4K + RT. But this card + tweaked configuration files + modded UHD textures = I finally made use of 11.5 of the 12 GB I have onboard. The game itself is borderline, with FPS hovering around 43 give or take 5, and occasional dips towards 30 or swings up to 60 FPS. But with double the scaling, RT Reflections look great even though I'm using 1080p internal rendering.

Anyway, realistically, an average Joe runs out of VRAM... way after he hits a stutterfest due to the CPU/GPU/RAM being too sluggish, at which point no additional VRAM can help. Of course, some games are much heavier on VRAM than on computing power, but those are outliers anyway.
 
Joined
Jan 14, 2019
Messages
10,181 (5.17/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Yeah, the GPU has to resort to system RAM (much slower) when it runs out of available VRAM, resulting in increased latency, stuttering, FPS drops.
Absolutely. The big question is: do you notice when the game touches system RAM to make up for the lack of VRAM? Based on what I saw, I'd say that with 8 GB of VRAM you don't, but with 6 GB you do.
 
Joined
Sep 20, 2021
Messages
280 (0.28/day)
Processor Ryzen 7 7900x
Motherboard Asrock B650E PG Riptide WiFi
Cooling Underfloor CPU cooling
Memory 2x32GB 6600
Video Card(s) RX 7900 XT OC Edition
Storage Kingston Fury Renegade 1TB, Seagate Exos 12TB
Display(s) MSI Optix MAG301RF 2560x1080@200Hz
Case Phanteks Enthoo Pro
Power Supply NZXT C850 850W Gold
Mouse Bloody W95 Max Naraka
Everything can be tested very easily: is your VRAM enough or not?
Start Superposition in this mode, choose how much VRAM it should allocate (as you can see, it can be quite a lot), minimize it, start your game/bench/whatever, and test.

I tested it with Shadow of the Tomb Raider, but because I don't have a swap file and everything sits in RAM/cached programs, the frame drop was really small and there was no stuttering; on your systems it will be interesting :D
When I start TR, my allocated RAM is 32 GB; at the end of the bench it goes to 38.5 GB, so up to 6.5 GB of data was transferred from VRAM to RAM.
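
That before/after RAM delta can be wrapped in a tiny helper. This is just back-of-the-envelope logic, not a proper measurement, since the game would grow RAM usage somewhat even without spilling:

```python
def estimated_spill_gb(ram_before_gb: float, ram_after_gb: float,
                       baseline_growth_gb: float = 0.0) -> float:
    """Rough upper bound on data pushed out of VRAM into system RAM:
    growth in committed RAM over the run, minus whatever growth you
    guess the game would have caused anyway (baseline_growth_gb)."""
    return max(0.0, (ram_after_gb - ram_before_gb) - baseline_growth_gb)

# The Tomb Raider run above: 32 GB before, 38.5 GB after
print(estimated_spill_gb(32.0, 38.5))  # 6.5
```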

If you're looking into whether a game really needs all this VRAM: as we know, everything comes down to game optimization, which depends on the studio that developed the game. Whether, and how much, AMD/Nvidia really paid for these "optimizations" we can only speculate about, nothing more.




 
Joined
Sep 3, 2019
Messages
3,035 (1.74/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 160W PPT limit, 75C temp limit, CO -9~14
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F37h, AGESA V2 1.2.0.B
Cooling Arctic Liquid Freezer II 420mm Rev7 with off center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3600MHz 1.42V CL16-16-16-16-32-48 1T, tRFC:288, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~465W (390W current) PowerLimit, 1060mV, Adrenalin v24.5.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v23H2, OSB 22631.3155)
I'm playing Far Cry 6 these days.
My monitor is 3440x1440 (4.95 MP), but I've set the render scale to 1.5x.
That is 5160x2160 (11.15 MP). Everything else is maxed out, with HD textures on.
For comparison, 4K is 8.3 MP.
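
Quick sanity check on the pixel math, if anyone wants to redo it at other scales (throwaway snippet, my numbers only):

```python
def megapixels(width: int, height: int) -> float:
    """Pixel count in megapixels."""
    return width * height / 1e6

# Native 3440x1440 ultrawide, the 1.5x-per-axis render scale, and 4K UHD
print(round(megapixels(3440, 1440), 2))   # 4.95
print(round(megapixels(5160, 2160), 2))   # 11.15
print(round(megapixels(3840, 2160), 1))   # 8.3
```

So the 1.5x scale renders roughly 34% more pixels than native 4K.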

Average VRAM usage is between 14-15 GB, peaking at 16+ GB.
System RAM load from the game is no more than 8-9 GB (after 2 hours).
Avg FPS is around 95, with lows around 75.
Gameplay is very smooth.

I should note here that the monitor works great with VRR enabled from Adrenalin, even though it's a G-Sync certified one. That contributes to the smoothness.

I'm very curious about this though:
(screenshot: Untitled_59.png)


Does anyone know what "GPU Memory Usage" stands for?
 