
Are game requirements and VRAM usage a joke today?

Joined
Jul 13, 2016
Messages
3,391 (1.09/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
I didn't say my numbers were better. I'm saying any specific value is not necessarily the highest amount it will ever use. If a game usually uses 11 GB but sometimes uses 12.5, that's still an issue for the GPU. What matters is the highest usage, not the average or the lowest.

As I pointed out earlier, your numbers are measuring allocation, not actual usage. And again, going over the VRAM size doesn't immediately cause an issue. The 3070, for example, is completely fine using 10GB in RE4 because the actual hot data set the game frequently accesses is smaller than the total VRAM usage. You are saying DLSS is solving a VRAM issue, but you have yet to prove that. You theorize that DLSS can, but the fact that we can't find this phantom scenario makes it either very rare or non-existent, which only proves my point.
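The "hot data set" point can be sketched with a toy LRU cache in Python. All block sizes and GB figures below are made-up illustrations for the sake of argument, not measurements from any game:

```python
from collections import OrderedDict

def simulate(vram_blocks, total_blocks, hot_blocks, frames=200):
    """Toy LRU model of VRAM: assets are fixed-size blocks; each frame
    touches every 'hot' block plus one cold block streamed in.
    Returns the fraction of accesses served from 'VRAM'."""
    cache = OrderedDict()                 # block id -> None, recency ordered
    hits = total = 0
    cold = range(hot_blocks, total_blocks)
    for frame in range(frames):
        touched = list(range(hot_blocks)) + [cold[frame % len(cold)]]
        for block in touched:
            total += 1
            if block in cache:
                cache.move_to_end(block)  # refresh recency
                hits += 1
            else:
                cache[block] = None
                if len(cache) > vram_blocks:
                    cache.popitem(last=False)   # evict least recently used
    return hits / total

# "8 GB" card in 16 MB blocks (512): game touches "10 GB" total (640 blocks)
# but only "6 GB" (384 blocks) every frame -> almost every access is a hit.
print(round(simulate(512, 640, 384), 3))
# Hot set (576 blocks) bigger than VRAM -> LRU thrashes on every frame,
# which on a real card would show up as severe stutter.
print(round(simulate(512, 640, 576), 3))
```

When the per-frame working set fits in VRAM, the hit rate stays near 1 even though total usage exceeds capacity; once the hot set itself outgrows VRAM, nearly every access spills to system memory.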

Wat?! Says who? :wtf:

DLSS/FSR/XeSS and the like absolutely help VRAM-strapped cards. As someone who was on "Team 8GB" up until last week, upscaling is what let me play games like Far Cry 6 at 4K ultra + RT without everything turning into a slideshow. Rendering at a lower internal res dropped my VRAM usage by ~2GB.

It boosted your frame-rate but 8GB cards don't have a VRAM issue in that game at ultra RT 4K:

[attached chart: FC6 4K ultra + RT benchmark results]


As you can see in the chart above, 8GB cards perform as they should at those settings, including 1% lows. Again, DLSS is not solving a VRAM issue here, because no VRAM issue exists for 8GB cards at these settings.
 
Joined
Jun 19, 2023
Messages
115 (0.20/day)
System Name EnvyPC
Processor Ryzen 7 7800X3D
Motherboard Asus ROG Strix B650E-I Gaming WiFi
Cooling Thermalright Peerless Assassin 120 White / be quiet! Silent Wings Pro 4 120mm (x2), 140mm (x3)
Memory 64GB Klevv Cras V RGB White DDR5-6000 (Hynix A) 30-36-36-76 1.35v
Video Card(s) MSI GeForce RTX 4080 16GB Gaming X Trio White Edition
Storage SK Hynix Platinum P41 2TB NVME
Display(s) HP Omen 4K 144Hz
Case Lian Li Dan Case A3 White
Audio Device(s) Creative Sound BlasterX G6
Power Supply Corsair SF750 Platinum
Mouse Logitech G600 White
Software Windows 11 Pro
It boosted your frame-rate but 8GB cards don't have a VRAM issue in that game at ultra RT 4K:

View attachment 317580

As you can see in the chart above, 8GB cards perform as they should at those settings, including 1% lows. Again, DLSS is not solving a VRAM issue here, because no VRAM issue exists for 8GB cards at these settings.

It "boosted my framerate" by lowering the VRAM requirement. :laugh:

8GB cards choke at native 4K using RT in FC6 even without the HD textures (I know, I played it on one), so those results look super sus. The game was completely unplayable at those settings with my 3070 Ti, just like in TPU's benchmarks.



It's not just bad 1% lows, it's a literal slideshow.

Using FSR Quality (1440p internal render) got the VRAM requirements just under the 8GB limit and let me play the entire game comfortably north of 60fps (minus the HD texture pack).
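The part of that saving which comes from screen-sized buffers is easy to estimate. A rough sketch in Python; the bytes-per-pixel and buffer-count figures are round-number assumptions, not FC6's actual G-buffer layout:

```python
def screen_buffers_mb(width, height, bytes_per_pixel=4, num_buffers=8):
    """Combined size of screen-sized render targets (G-buffer, depth,
    HDR color, etc.). Per-pixel cost and buffer count are made-up
    round numbers; real engines vary a lot."""
    return width * height * bytes_per_pixel * num_buffers / 2**20

native = screen_buffers_mb(3840, 2160)      # native 4K
fsr_q  = screen_buffers_mb(2560, 1440)      # FSR Quality internal res
print(f"native 4K:   {native:.0f} MB")
print(f"FSR Quality: {fsr_q:.0f} MB")
print(f"saved:       {native - fsr_q:.0f} MB")
```

Screen-sized buffers alone only account for a few hundred MB; the rest of the savings typically comes from streaming pools and other allocations that engines scale with the render resolution.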
 
Joined
Oct 14, 2007
Messages
663 (0.11/day)
Location
Auburn Hills, MI
Processor Intel i9-13900KF
Motherboard ASUS Z790M-Plus
Memory 64 GB DDR5 @ 6000 G. Skill Trident Z
Video Card(s) ASUS TUF Gaming RTX 4090
Storage 2 TB SN850X + 2x 4 TB Lexar NM790
Display(s) 32" 4K/240 Hz W-OLED w/ 1080P/480Hz Mode + 39" 3440x1440 240 Hz W-OLED
Case Lian Li O11 Mini
Audio Device(s) Kali LP-UNF + Audeze Maxwell
Power Supply Corsair RM1000x
Mouse Logitech G502 X Plus
Keyboard Keychron Q6 Pro
I'm sorry if you love the game, but geometry and textures are way more important for realism than lighting, and UE5 achieves both. If a game still uses geometry like this, RT won't help, and this is from the path-tracing version of the game:
View attachment 317551
Control has the same issue with worse textures. With what UE5 achieves my expectations are higher.

Is that the highest resolution texture I've ever seen? No. That's not the point, gross hyperbole doesn't strengthen arguments, it weakens them by trashing the credibility of the person making the claim. And, you know what that image doesn't look like?

PS1 games. If you think it does, you weren't alive long enough ago to have actually played PS1 games.

Saying some parts look like an early PS4 game, might be valid. Saying it looks like a PS1 game, that's just utterly laughable.

I've also barely ever seen a game where you couldn't cherrypick really low res texture screenshots from various spots of the game.
 
Last edited:
Joined
Jan 18, 2020
Messages
867 (0.48/day)
Yes they are.

You'll be able to tweak a few settings and get acceptable performance on old hardware.

A lot of the software industry is actively engaged in trying to flog newer hardware that isn't needed for many use cases...
 
Joined
May 13, 2008
Messages
762 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
It "boosted my framerate" by lowering the VRAM requirement. :laugh:

8GB cards choke at native 4K using RT in FC6 even without the HD textures (I know, I played it on one), so those results look super sus. The game was completely unplayable at those settings with my 3070 Ti, just like in TPU's benchmarks.



It's not just bad 1% lows, it's a literal slideshow.

Using FSR Quality (1440p internal render) got the VRAM requirements just under the 8GB limit and let me play the entire game comfortably north of 60fps (minus the HD texture pack).

I had not seen this.

4060 ti: 10.7
3070 ti: 11.4
2080 ti: 50.2

Not to rub salt in your wound, but what a perfect example of why any nVIDIA card under $800 not called 2080 Ti is a total farce.
 
Joined
Sep 17, 2014
Messages
22,840 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Eh, games already push 16 GB, frequently, at 4K. If the 4090 had 16 GB I'd genuinely consider that unacceptable for a card that expensive. It might not need 24 GB today, but it makes the card a lot more future-proof.
Irony has it that many Knights of the Low VRAM Table totally disagree with you: they say cards with lots of VRAM are a waste because future-proofing doesn't exist!

Almost sounds like they are penny-wise, pound-foolish! Strange indeed, huh :)


I had not seen this.

4060 ti: 10.7
3070 ti: 11.4
2080 ti: 50.2

Not to rub salt in your wound, but what a perfect example of why any nVIDIA card under $800 not called 2080 Ti is a total farce.
Now you're just cherry-picking, this can't be VRAM! This is probably AMD's fault too, and otherwise it's Ubisoft!!!one

/s (in case anyone missed it)
 
Joined
Apr 13, 2017
Messages
150 (0.05/day)
System Name AMD System
Processor Ryzen 7900 at 180Watts 5650 MHz, vdroop from 1.37V to 1.24V
Motherboard MSI MAG x670 Tomahawk Wifi
Cooling AIO240 for CPU, Wraith Prism's Fan for RAM but suspended above it without touching anything in case.
Memory 32GB dual channel Gskill DDR6000CL30 tuned for CL28, at 1.42Volts
Video Card(s) Msi Ventus 2x Rtx 4070 and Gigabyte Gaming Oc Rtx 4060 ti
Storage Samsung Evo 970
Display(s) Old 1080p 60FPS Samsung
Case Normal atx
Audio Device(s) Dunno
Power Supply 1200Watts
Mouse wireless & quiet
Keyboard wireless & quiet
VR HMD No
Software Windows 11
Benchmark Scores 1750 points in cinebench 2024 42k 43k gpu cpu points in timespy 50+ teraflops total compute power.
Maybe it's time for game makers to start compressing that data.
 
Joined
Sep 17, 2014
Messages
22,840 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Maybe it's time for game makers to start compressing that data.
Nah, it's time for Ada and Ampere owners to upgrade now or in the next year. ;)
 
Joined
Jul 13, 2016
Messages
3,391 (1.09/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
It "boosted my framerate" by lowering the VRAM requirement. :laugh:

8GB cards choke at native 4K using RT in FC6 even without the HD textures (I know, I played it on one), so those results look super sus. The game was completely unplayable at those settings with my 3070 Ti, just like in TPU's benchmarks.



It's not just bad 1% lows, it's a literal slideshow.

Using FSR Quality (1440p internal render) got the VRAM requirements just under the 8GB limit and let me play the entire game comfortably north of 60fps (minus the HD texture pack).

Tom's Hardware and HotHardware align with TechSpot's results (considering, of course, that TechSpot's figures also include the HD texture pack):

[attached chart: Tom's Hardware FC6 benchmark]


[attached chart: HotHardware FC6 benchmark]


The graph you provided seems to be the odd man out. I believe this large difference may be down to the test setup used. TechPowerUp tends to use lower-specced memory, whereas TechSpot (aka HWUB, who write their GPU articles) and other publications tend to use higher-frequency and/or lower-latency RAM. Far Cry games are usually very memory sensitive, and pushing your VRAM limit while spilling over to main system memory may combine into what we see in the TechPowerUp graph. What you experienced may have been a combination of using subpar memory while pushing your GPU's VRAM limit.

EDIT ** Replaced a webp image that wasn't displaying with a png. God I hate webp

Maybe it's time for game makers to start compressing that data.

Game data is already compressed both on disk and in memory.
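For instance, textures sit in VRAM in hardware block-compressed formats (BC1-BC7) that the GPU samples directly. A quick sketch of the standard ratios; the 4096x4096 texture is just a hypothetical example, while the bits-per-texel figures are the standard ones for these formats:

```python
def texture_mb(size, bits_per_texel):
    """VRAM for a square texture including a full mip chain (~4/3x)."""
    base_bytes = size * size * bits_per_texel / 8
    return base_bytes * 4 / 3 / 2**20

# A hypothetical 4096x4096 color texture at common GPU formats:
print(f"RGBA8 uncompressed: {texture_mb(4096, 32):.1f} MB")   # 32 bpp
print(f"BC7 compressed:     {texture_mb(4096, 8):.1f} MB")    # 8 bpp
print(f"BC1 compressed:     {texture_mb(4096, 4):.1f} MB")    # 4 bpp
```

So even "uncompressed-looking" ultra textures are usually already 4-8x smaller in VRAM than their raw pixel data would be.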
 
Joined
Sep 17, 2014
Messages
22,840 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Tom's Hardware and HotHardware align with TechSpot's results (considering, of course, that TechSpot's figures also include the HD texture pack):

View attachment 317589

View attachment 317595

The graph you provided seems to be the odd man out. I believe this large difference may be down to the test setup used. TechPowerUp tends to use lower-specced memory, whereas TechSpot (aka HWUB, who write their GPU articles) and other publications tend to use higher-frequency and/or lower-latency RAM. Far Cry games are usually very memory sensitive, and pushing your VRAM limit while spilling over to main system memory may combine into what we see in the TechPowerUp graph. What you experienced may have been a combination of using subpar memory while pushing your GPU's VRAM limit.

EDIT ** Replaced a webp image that wasn't displaying with a png. God I hate webp



Game data is already compressed both on disk and in memory.
While the numbers differ, the gist and the practical situation are similar: go over the limit and you get an unplayable affair. And those are averages, which is being pretty kind about any sort of stuttering, too.

It echoes my experience on an 8GB card.
 
Joined
Apr 18, 2019
Messages
2,425 (1.16/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
Nah, it's time for Ada and Ampere owners to upgrade now or in the next year. ;)
I don't think it's deniable at this point:
This is on purpose.
It's not some grand conspiracy, it's just become the norm.

Personally,
Other than 2 upcoming titles, I have 0 interest in playing the industry's games any longer.
I'm fairly uncaring about RT (in games), and have many games of bygone eras to check out and revisit
+ indie titles that are either 'tweakable' or were intended to run on a potato.

IMHO, either
Pay-up ($800+ - $1k+) for a top-VRAM top-membuswidth card for 'new games'
or
Learn what you have, and how to love it, and work w/in its limitations.

I've done option A.
with the 4890, 7970, and 780Ti.
Not doing that again.
 
Joined
Sep 17, 2014
Messages
22,840 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I don't think it's deniable at this point:
This is on purpose.
It's not some grand conspiracy, it's just become the norm.

Personally,
Other than 2 upcoming titles, I have 0 interest in playing the industry's games any longer.
I'm fairly uncaring about RT (in games), and have many games of bygone eras to check out and revisit
+ indie titles that are either 'tweakable' or were intended to run on a potato.

IMHO, either
Pay-up ($800+ - $1k+) for a top-VRAM top-membuswidth card for 'new games'
or
Learn what you have, and how to love it, and work w/in its limitations.

I've done option A.
with the 4890, 7970, and 780Ti.
Not doing that again.
I kinda do both. I explore gaming at my own pace; when my wallet lets me break another grand on hardware, I upgrade, and I use said hardware for a long time. It always pays off, even now, and even with so much horse manure getting released. Gaming is only expensive if you want to chase the latest and greatest all the time, much like a lot of other things. But with gaming, it's really big that way: wait a year and you can budget-game everything. Wait two and you can get the hardware to max said games out on top. And best of all, the whole thing is bug-free :)
 
Joined
Apr 30, 2020
Messages
1,016 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
I was looking at old videos and found one of me playing Guild Wars 2 back in 2013, barely using 300 MB of VRAM at 5040x900 resolution.


I think I was using either a Radeon HD 7850 or an R9 380 at the time, and I recall GW2 being one of the best-looking games (or at least MMORPGs) of that era. It must have been relatively optimized too, for me to be pulling off 50 FPS :p

What's with some game requirements nowadays asking for 8GB or more VRAM for 1080p? I've heard A Plague Tale and some Jedi game are big offenders.

The 900 vertical lines keep it low.

If you double both sides of a square, its area gets 4 times bigger. You didn't really double a square; you made a wide rectangle (5040 wide but only 900 tall). The vertical resolution didn't move.

Now, if you go from 1080p to 2160p/4K on desktop, you double both dimensions and quadruple the area.
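The pixel-count arithmetic is easy to check (plain Python, standard resolution figures):

```python
def pixels(w, h):
    """Total pixel count of a w x h display."""
    return w * h

p_1080p = pixels(1920, 1080)   # 2,073,600
p_4k    = pixels(3840, 2160)   # 8,294,400
p_wide  = pixels(5040, 900)    # 4,536,000 (the triple-wide setup above)

print(p_4k / p_1080p)          # 4.0  -> 4K is four times the pixels of 1080p
print(p_wide / p_1080p)        # ~2.19 -> wide but short, well under 4K's load
```
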

Here is something to remember about ray tracing: the first game to ship with it was Battlefield V. The RTX 2080 Ti lost about 50% of its frame rate when ray tracing was enabled, and that was at 1080p.

I have to admit I've been wrong about doubling the screen size and rendering inside the 3D world in games, which I referred to as "quadrupling" what needed to be rendered. The issue is that we usually think about increases in gaming resolution from the perspective of the 2D plane we've all become accustomed to.



[attached diagram: the 2D plane comparison]



So one would think a 50% performance increase should get us close to making that back up when going from 1080p to 2160p/4K.
Except we're wrong, because that, my friend, is only the 2D plane you're viewing.

This is the problem when moving to rendering inside the 3D world from 1080p to 2160p/4K:
[attached diagram: the 3D world volume comparison]


It's not HD texture packs at work here; it's the sheer volume increase of the 3D world that needs rendering, which goes up by 8 times.
Sitting here claiming that the RTX 4090 can play games at 4K with a bunch of software band-aids (DLSS and frame generation) shows how bad this really is. It only gets to about half of what it actually needs to render 2160p/4K path tracing. What does it get, 45-60 FPS? Not long ago we were gaming at about 144 to 240 FPS on DX11; that's about 6-8 times faster than what we're getting with all these new RT games and the move to 2160p/4K.
Instead of eating crappier details and latency hits from all these upscaling techs and frame generation, the plain truth is that pure raw power wins. Multiple cards are the only way to overcome this bottleneck right now; a 50% increase in performance is a laughable amount compared to the second picture.
 
Joined
Mar 7, 2023
Messages
943 (1.40/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte ud850gm pg5
As I pointed out earlier, your numbers are measuring allocation, not actual usage. And again, going over the VRAM size doesn't immediately cause an issue. The 3070, for example, is completely fine using 10GB in RE4 because the actual hot data set the game frequently accesses is smaller than the total VRAM usage. You are saying DLSS is solving a VRAM issue, but you have yet to prove that. You theorize that DLSS can, but the fact that we can't find this phantom scenario makes it either very rare or non-existent, which only proves my point.
Like I already said, without DLSS The Last of Us would crash on my 3070. This was early in its life; it might not crash today, but it's still stuttery on 8GB cards. What, do you want logs or something?
 
Joined
Jun 19, 2023
Messages
115 (0.20/day)
System Name EnvyPC
Processor Ryzen 7 7800X3D
Motherboard Asus ROG Strix B650E-I Gaming WiFi
Cooling Thermalright Peerless Assassin 120 White / be quiet! Silent Wings Pro 4 120mm (x2), 140mm (x3)
Memory 64GB Klevv Cras V RGB White DDR5-6000 (Hynix A) 30-36-36-76 1.35v
Video Card(s) MSI GeForce RTX 4080 16GB Gaming X Trio White Edition
Storage SK Hynix Platinum P41 2TB NVME
Display(s) HP Omen 4K 144Hz
Case Lian Li Dan Case A3 White
Audio Device(s) Creative Sound BlasterX G6
Power Supply Corsair SF750 Platinum
Mouse Logitech G600 White
Software Windows 11 Pro
Tom's Hardware and HotHardware align with TechSpot's results (considering, of course, that TechSpot's figures also include the HD texture pack):

View attachment 317589

View attachment 317595

The graph you provided seems to be the odd man out. I believe this large difference may be down to the test setup used. TechPowerUp tends to use lower-specced memory, whereas TechSpot (aka HWUB, who write their GPU articles) and other publications tend to use higher-frequency and/or lower-latency RAM. Far Cry games are usually very memory sensitive, and pushing your VRAM limit while spilling over to main system memory may combine into what we see in the TechPowerUp graph. What you experienced may have been a combination of using subpar memory while pushing your GPU's VRAM limit.

EDIT ** Replaced a webp image that wasn't displaying with a png. God I hate webp



Game data is already compressed both on disk and in memory.
I was playing it on my sig rig, with a 7800X3D and DDR5 6000 cl30 RAM... running native 4K +RT would still cripple my 3070 Ti.

Exactly like:

As you'll notice in the vid, turning on FSR Quality lowers the VRAM bottleneck and suddenly the game is playable! It's magic!

I'd load it up and bench it myself if I still owned the card. Maybe someone else here with an 8GB card can do it since you seem to think I imagined it all. :laugh:
 
Last edited:
Joined
Dec 25, 2020
Messages
7,215 (4.88/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
God, I wish more people understood this! The "VRAM" hysteria is out of control. People seem to be generally horrible at seeing the "big picture".
Much of this nonsense originates from Hardware Unboxed, and people just continue to regurgitate it.

Hardware Unboxed only showed proof that an 8 GB card is long in the tooth for high-resolution gaming. This is the truth. However, it's not like most 12 GB GPUs today are exactly what you would call heavyweights; I would say this is generally a non-issue in this segment (GA104 vs Navi 22). The AMD cards behave a little better due to having more memory, although neither can truly handle the workload without dropping the frame rate.

I personally consider 16 GB to be an adequate amount of video memory if you play at 4K. More is not truly necessary right now, although there are corner cases where it can be beneficial. What gamers really should have in their PC is 32 GB of RAM. More if possible.
 
Joined
Jan 14, 2019
Messages
13,237 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
I was playing it on my sig rig, with a 7800X3D and DDR5 6000 cl30 RAM... running native 4K +RT would still cripple my 3070 Ti.

Exactly like:

As you'll notice in the vid, turning on FSR Quality lowers the VRAM bottleneck and suddenly the game is playable! It's magic!

I'd load it up and bench it myself if I still owned the card. Maybe someone else here with an 8GB card can do it since you seem to think I imagined it all. :laugh:
I still have a 2070 on the shelf... but haven't got FC6. :laugh:
 
Joined
Jun 19, 2023
Messages
115 (0.20/day)
System Name EnvyPC
Processor Ryzen 7 7800X3D
Motherboard Asus ROG Strix B650E-I Gaming WiFi
Cooling Thermalright Peerless Assassin 120 White / be quiet! Silent Wings Pro 4 120mm (x2), 140mm (x3)
Memory 64GB Klevv Cras V RGB White DDR5-6000 (Hynix A) 30-36-36-76 1.35v
Video Card(s) MSI GeForce RTX 4080 16GB Gaming X Trio White Edition
Storage SK Hynix Platinum P41 2TB NVME
Display(s) HP Omen 4K 144Hz
Case Lian Li Dan Case A3 White
Audio Device(s) Creative Sound BlasterX G6
Power Supply Corsair SF750 Platinum
Mouse Logitech G600 White
Software Windows 11 Pro
I still have a 2070 on the shelf... but haven't got FC6. :laugh:

Well I, for one, can think of no better reason to buy the game than to settle this pointless internet argument :D

 
Joined
Jan 14, 2019
Messages
13,237 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Well I, for one, can think of no better reason to buy the game than to settle this pointless internet argument :D

I think I'll survive without Pacific Shooter 6534689, but thanks for the suggestion. :D
 
Joined
Apr 13, 2017
Messages
150 (0.05/day)
System Name AMD System
Processor Ryzen 7900 at 180Watts 5650 MHz, vdroop from 1.37V to 1.24V
Motherboard MSI MAG x670 Tomahawk Wifi
Cooling AIO240 for CPU, Wraith Prism's Fan for RAM but suspended above it without touching anything in case.
Memory 32GB dual channel Gskill DDR6000CL30 tuned for CL28, at 1.42Volts
Video Card(s) Msi Ventus 2x Rtx 4070 and Gigabyte Gaming Oc Rtx 4060 ti
Storage Samsung Evo 970
Display(s) Old 1080p 60FPS Samsung
Case Normal atx
Audio Device(s) Dunno
Power Supply 1200Watts
Mouse wireless & quiet
Keyboard wireless & quiet
VR HMD No
Software Windows 11
Benchmark Scores 1750 points in cinebench 2024 42k 43k gpu cpu points in timespy 50+ teraflops total compute power.
Maxed-out 1080p Starfield needs 5-6GB of video memory at least, and that should be the maximum for any game. 4070: 60 FPS, 4060 Ti: 45 FPS.
 
Joined
Jul 13, 2016
Messages
3,391 (1.09/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Like I already said, without DLSS The Last of Us would crash on my 3070. This was early in its life; it might not crash today, but it's still stuttery on 8GB cards. What, do you want logs or something?

?? You are quoting my comment that was a response to Fizban. No idea why you would reply with "like I already said" when you weren't part of the convo to begin with, unless you are posting from multiple accounts in this thread.

The Last of Us crashing on 8GB cards was a bug that has since been fixed, so whether or not DLSS could have helped in that scenario is really irrelevant. A 3070 absolutely has too little VRAM for ultra in that game, as you seem to be aware given the stuttering you pointed out. HWUB has a video on the stuttering of 8GB cards, including TLOU. This is akin to Resident Evil 4's RT crashing on 8GB cards: that was fixed, but it is still absolutely not recommended to use RT on 8GB cards with ultra settings, because the texture swapping and performance will be horrendous regardless of whether you use DLSS.

I was playing it on my sig rig, with a 7800X3D and DDR5 6000 cl30 RAM... running native 4K +RT would still cripple my 3070 Ti.

Exactly like:

As you'll notice in the vid, turning on FSR Quality lowers the VRAM bottleneck and suddenly the game is playable! It's magic!

I'd load it up and bench it myself if I still owned the card. Maybe someone else here with an 8GB card can do it since you seem to think I imagined it all. :laugh:

The problem is that some people have benchmarked 4 GB video cards in that game exceeding 4 GB of allocation without running into such a massive issue.

For example, we know the game takes 7.4GB at 1080p ultra / max (no RT):

[attached chart: FC6 1080p ultra VRAM usage]


and yet the 4 GB 5500 XT doesn't have issues:

[attached chart: 5500 XT 4GB FC6 results]


Neither does the 6GB 5600 XT:

[attached chart: 5600 XT 6GB FC6 results]


When reviews report that 4GB and 6GB cards run fine at settings that take 7.4GB of VRAM (because pushing low-priority data to main system memory when VRAM is full is a thing), it's reasonable to doubt a source that says going 0.035 GB over the VRAM size in the same game causes severe issues. There are a ton of games that TechPowerUp and other publications have benchmarked that go over 8GB of VRAM usage, and there is no ill effect on 8GB video cards until you get 1-3GB over their VRAM size. It doesn't make any sense. Yes, it could be a VRAM issue, and all the more reason to advocate for more VRAM, but it appears more likely to be either a bug or an issue with the test setup itself, since other reviews of the same game at the same settings with the same GPU don't display this issue, and cards with an even greater VRAM deficit don't display it either.
 
Joined
Mar 7, 2023
Messages
943 (1.40/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte ud850gm pg5
?? You are quoting my comment that was a response to Fizban. No idea why you would reply with "like I already said" when you weren't part of the convo to begin with

Sorry if that was too pointed. I was irritable this morning and found it annoying that you were calling Fizban's point theoretical and a 'phantom scenario' when I had an example just a couple of posts below his.


The Last of Us crashing on 8GB cards was a bug that has since been fixed, so really whether or not DLSS could have helped in that scenario is irrelevant.

You said 'DLSS doesn't help VRAM issues', so how is it irrelevant when it's a bug triggered by low VRAM and avoided by DLSS?


A 3070 absolutely has too little VRAM on ultra in that game

I didn't say I was using ultra, and I wasn't. I was just trying to find settings with reasonable visual quality that didn't crash at 1440p. Without DLSS nothing but low would work; with DLSS on balanced I could get away with high. And since I was getting 60 fps both with and without DLSS, it's clearly not a "game's too damn demanding" issue, but a VRAM issue.

it is absolutely not recommended to use RT on 8GB cards

What? I never use RT... Even with my 4090.
 
Last edited:
Joined
Sep 17, 2014
Messages
22,840 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Maxed 1080p Starfield needs 5-6GB of video memory at least, and that should be the maximum for any game. 4070: 60 FPS, 4060 Ti: 45 FPS.
You realize this is a prime example of stagnation, right? This game is practically twenty years old, a bit like an old lady full of botox and silicone. Matter of fact, the facial mocap might look very similar too ;)
 
Joined
Feb 1, 2019
Messages
3,684 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Ironically not long ago I noticed something interesting related to VRAM consumption.

It seems Firefox (and maybe other GPU-accelerated desktop apps) is not garbage-collecting its VRAM consumption, or is doing it badly. When I put my PC to sleep last night it was using just under 3 gigs of VRAM across all running desktop apps; when I woke it up, it was consuming 1.1 gigs. The apps are still running, but some kind of flush occurs when putting the PC to sleep and waking it up again. This phenomenon doesn't occur with system RAM usage.

The example I provided is from last night, but this is consistent behaviour on every sleep with Firefox running.

If it hasn't happened already in the thread, I know the old commit vs utilisation argument will come up, but Windows cannot overcommit memory by design, so high commit relative to utilisation is nasty on Windows. Spice Wars, for example, has a really bad issue with this on system RAM: it might be using, say, 8 gigs of RAM but 25 gigs of commit (virtual memory), and if Windows runs out of virtual memory it's game over. If you run out of VRAM, on the other hand, it falls back to shared RAM space, which is not game over, just a large drop in performance (assuming you have spare commit); games try to avoid this by aggressively controlling asset usage.
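That commit-versus-use distinction can be sketched with nothing but the standard library. This is a rough analogy only: on Windows an anonymous mapping counts against the commit limit the moment it is created, while physical memory is consumed page by page as the region is actually touched (Linux may overcommit the mapping instead).

```python
import mmap

# Reserve ("commit") a large region up front, the way a game might grab a
# big pool, but only touch a small hot subset of it. The untouched pages
# cost commit charge, not physical memory.
COMMIT_BYTES = 256 * 1024 * 1024          # 256 MB committed
HOT_BYTES = 16 * 1024 * 1024              # 16 MB actually used

pool = mmap.mmap(-1, COMMIT_BYTES)        # anonymous mapping: commit only
pool[:HOT_BYTES] = b"\xaa" * HOT_BYTES    # writing pages is real usage

print(f"committed {COMMIT_BYTES // 2**20} MB, touched {HOT_BYTES // 2**20} MB")
pool.close()
```

Tools that report "memory usage" may be showing either figure, which is exactly why allocation numbers in overlays overstate what a game actually needs resident.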

I think there are two clear things that could be done to improve the situation, which would allow GPUs with less than 16 gigs of VRAM to work better.

1 - Desktop GPU-accelerated apps could probably use the shared RAM space without a noticeable performance impact, so desktop VRAM usage would be close to zero like it was in the old days. If you have a CPU with an iGPU, this can be forced: I have been testing it with great results and am about to change my main desktop config. Basically, run the iGPU as your main GPU (you still get GPU acceleration for the browser etc., but it all goes into system RAM), then use the discrete GPU just for games.

2 - Game developers porting to PC should not carry over unified-memory optimisations from consoles (which load RAM data into VRAM). Remember the fix for Far Cry 4 a while back? Basically, anything that can work in RAM should go in RAM; only put things in VRAM that need to be there to avoid large performance losses. But of course this is more work for the devs, which they will try to avoid. The industry has a big issue with this right now: I have fired up games with really poor textures that are still consuming 4-6 gigs of VRAM, clearly loading generic data in there. FF7 Remake uses close to 4 gigs just to load the title screen.
 
Last edited:
Joined
Jun 19, 2023
Messages
115 (0.20/day)
System Name EnvyPC
Processor Ryzen 7 7800X3D
Motherboard Asus ROG Strix B650E-I Gaming WiFi
Cooling Thermalright Peerless Assassin 120 White / be quiet! Silent Wings Pro 4 120mm (x2), 140mm (x3)
Memory 64GB Klevv Cras V RGB White DDR5-6000 (Hynix A) 30-36-36-76 1.35v
Video Card(s) MSI GeForce RTX 4080 16GB Gaming X Trio White Edition
Storage SK Hynix Platinum P41 2TB NVME
Display(s) HP Omen 4K 144Hz
Case Lian Li Dan Case A3 White
Audio Device(s) Creative Sound BlasterX G6
Power Supply Corsair SF750 Platinum
Mouse Logitech G600 White
Software Windows 11 Pro
?? You are quoting my comment that was a response to Fizban. No idea why you would reply with "like I already said" when you weren't part of the convo to begin with, unless you are posting from multiple accounts on this thread.

The Last of Us crashing on 8GB cards was a bug that has since been fixed, so really whether or not DLSS could have helped in that scenario is irrelevant. A 3070 absolutely has too little VRAM on ultra in that game, as you seem to be aware given the stuttering you pointed out. HWUB has a video on the stuttering of 8GB cards, including in TLOU. This is akin to Resident Evil 4's RT crashing on 8GB cards: it was fixed, but it is absolutely not recommended to use RT on 8GB cards with ultra settings, because the texture swapping and performance will be horrendous regardless of whether you use DLSS.



The problem is that some people have benchmarked 4 GB video cards in that game exceeding 4 GB of allocation and have not run into such a massive issue.

For example, we know the game takes 7.4GB at 1080p ultra / max (no RT)

View attachment 317692

and yet the 4 GB 5500 XT doesn't have issues:

View attachment 317693

Neither does the 6GB 5600 XT:

View attachment 317694

When reviews report that 4GB and 6GB cards run fine at settings that take 7.4GB of VRAM (because pushing low-priority data out to main system memory when VRAM is full is a thing), it's reasonable to doubt a source claiming that going 0.035 GB over the VRAM size in the same game causes severe issues. There are a ton of games that TechPowerUp and other publications have benchmarked going over 8GB of VRAM usage with no ill effect on 8GB video cards until usage climbs 1-3GB past their VRAM size. It doesn't make any sense. Yes, that could be a VRAM issue, and all the more reason to advocate for more VRAM, but it appears more likely to be either a bug or an issue with the test setup itself, as in the same game we have reviews that don't display this issue at the same settings with the same GPU, and cards with an even greater VRAM deficit don't display it either.

Those reviews all have RT disabled though. If I had to guess, games can swap assets on the fly to minimize the perf hit when there's not enough VRAM available in rasterized titles, but with RT enabled, the GPU needs access to all the data all at once to calculate the light bounces. And pulling from system memory slows things to an absolute craaawl.

Upscaling can help all the crucial data fit into a smaller framebuffer.
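A rough back-of-the-envelope sketch of why a lower internal resolution frees memory. The buffer count and bytes-per-pixel figures here are illustrative guesses, not engine data; real renderers keep many G-buffer and intermediate targets of mixed formats.

```python
def render_target_mb(width, height, bytes_per_pixel=8, buffers=6):
    """Approximate memory for resolution-dependent render targets.
    bytes_per_pixel and buffers are hypothetical round numbers."""
    return width * height * bytes_per_pixel * buffers / 2**20

native = render_target_mb(3840, 2160)    # 4K native
internal = render_target_mb(2227, 1253)  # ~58% scale, roughly DLSS balanced

print(f"native 4K targets: {native:.0f} MB")
print(f"upscaled internal targets: {internal:.0f} MB "
      f"(saves {native - internal:.0f} MB)")
```

Resolution-scaled render targets alone rarely explain multi-gigabyte savings, though; engines that tie their texture streaming budget to the internal resolution can save considerably more on top of this.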
 