
12GB GPUs already obsolete, brand new game takes up to 18GB VRAM at 1440p

Joined
May 17, 2021
Messages
3,421 (2.51/day)
Processor Ryzen 7 5700X
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200 MHz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Not sure what you're getting at. What is wrong with this type of test methodology? The best card, with higher memory bandwidth and more than sufficient VRAM, to measure memory usage. Why do we need results from lower-specced GPUs (with hardware limitations) to identify max VRAM utilization? That's almost like testing how fast a car can go on a road limited by a 60mph speed limit.

Because it doesn't work like that: a card with 24GB will ALLOCATE more, so no actual utilisation is identified.
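The allocation-vs-utilisation gap is visible right in the graphics APIs. As a rough illustration, here is a minimal C++ sketch using DXGI's per-process video memory query (Windows, DXGI 1.4+); the point is that the OS hands out a budget that scales with the card's total VRAM, and tools report allocation against it, not true utilisation:

```cpp
// Minimal sketch (Windows, DXGI 1.4+, link dxgi.lib): the OS hands each
// process a VRAM *budget* that scales with the card's total memory, and
// overlays report allocation against it -- not how much is actually used
// for rendering each frame.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter1;
    ComPtr<IDXGIAdapter3> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter1)) ||
        FAILED(adapter1.As(&adapter))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // Budget: what this process is allowed to allocate (bigger card, bigger
    // budget). CurrentUsage: what it HAS allocated. Neither says what the
    // game actually needs -- the whole allocation-vs-utilisation gap.
    printf("budget: %llu MB, allocated: %llu MB\n",
           (unsigned long long)(info.Budget >> 20),
           (unsigned long long)(info.CurrentUsage >> 20));
    return 0;
}
```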
 
Joined
Dec 12, 2020
Messages
1,755 (1.16/day)
"Future Proof" is a myth in PC hardware. It has never existed and most likely never will.
I think back to a member here who, around 8 years ago, proposed a build with four Maxwell Titan Xs, almost $5,000 in GPUs alone. He said he wanted to build a beast that would be future proof for at least 7 years. SLI died years later; most people didn't see it coming, but it came all the same. Another member bought a Titan Z for $3,000 and complained about the lack of support from Nvidia even during the Maxwell series. He said he expected Nvidia to give the best quality support for his Titan Z for at least 10 years because of the price he paid.

Who knows what the future will bring, but whatever it brings, it is certain to make today's hardware look obsolete and puny in comparison.
$2,999, I can hardly believe that's how much the Titan Z went for. Amazing. According to the CPI Inflation Calculator that's $3,797.91 in today's dollars. I had always wanted a Titan Z or an AMD 295X2, but they were crafted of unaffordium back then. Interestingly enough, the 4090 I paid $1,800 for in April would've cost $1,422 in 2014 dollars (cheaper even than the AMD R9 295X2!), so even top-tier GPU prices really have gone down, although the Titan Black was cheaper in cost-adjusted dollars than a 4090 today (but not by much, maybe 100 bones).
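For anyone wanting to sanity-check the conversion, the arithmetic is just a CPI ratio; a trivial sketch, where the ~1.266 ratio is back-derived from the post's own numbers rather than an official CPI figure:

```cpp
// Sanity check of the CPI arithmetic above. The ~1.266 ratio is back-derived
// from the post's own numbers ($2,999 -> $3,797.91), not an official figure.
#include <cstdio>

int main() {
    const double cpi_ratio = 3797.91 / 2999.0;  // 2014 dollars -> today, ~1.266
    printf("Titan Z: $2999 then  = $%.2f today\n", 2999.0 * cpi_ratio);
    printf("4090:    $1800 today = $%.2f in 2014 dollars\n", 1800.0 / cpi_ratio);
    return 0;
}
```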
 
Joined
May 8, 2018
Messages
1,594 (0.65/day)
Location
London, UK
The way this trend is going, the next generation of GPUs will come with 32/64GB.
 
Joined
May 17, 2021
Messages
3,421 (2.51/day)
Processor Ryzen 7 5700X
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200 MHz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
The way this trend is going, the next generation of GPUs will come with 32/64GB.

If you give lazy developers/engineers more, why should they optimise? A dangerous path.
 
Joined
Jan 20, 2019
Messages
1,631 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
Because it doesn't work like that: a card with 24GB will ALLOCATE more, so no actual utilisation is identified.

Well, if it's allocating more than it requires, either the title is poorly optimised, or more is genuinely required to keep up with larger asset management/compression (etc.), or a mix of both. But let's not make assumptions to push the idea that less is adequate when it clearly isn't, "especially going forward".

Anyway, it doesn't make sense to test this configuration with less VRAM or in any memory-confined environment (bottleneck). Smart game engines will "dynamically" compromise on asset/graphics quality if VRAM is cramped. We're already seeing waning graphics fidelity when titles are close to topping out VRAM, especially dynamic asset swapping that is observably evident in real-time gameplay. More refined, lighter optimisations have been seen to work nicely, but in some cases they present less of an "optimal" state, traded for a feasible but lesser-quality compromise (the illusion of makeshift "ultra", I suppose). To make matters worse, the slower mainstream bandwidths devs are contracted to target come with pre-designated, shrivelled VRAM utilisation, often leaving the impression that more VRAM is unavailing. The same applies to badly ported titles. Just because some heavy lifters in the memory department haven't succumbed to stuttering, crashing, artifacts, etc., it doesn't mean we're getting the full produce of a game's rendered resources at the highest quality setting.
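To make that "dynamic compromise" concrete, here is a hypothetical sketch of the kind of tier selection an engine might do; the names and thresholds are invented for illustration and not taken from any real engine:

```cpp
// Hypothetical sketch of the silent compromise: an engine picking a texture
// quality tier from its VRAM headroom, so "Ultra" on a cramped card quietly
// renders lower-detail assets. Names and thresholds are illustrative only.
#include <cstdint>

enum class TextureTier { Low, Medium, High, Ultra };

TextureTier PickTier(uint64_t vramBudgetBytes, uint64_t nonTextureBytes) {
    // Whatever is left after geometry, render targets, etc. goes to textures.
    const uint64_t texBudget = vramBudgetBytes > nonTextureBytes
                             ? vramBudgetBytes - nonTextureBytes : 0;
    const uint64_t GiB = 1ull << 30;
    if (texBudget >= 12 * GiB) return TextureTier::Ultra;
    if (texBudget >= 8 * GiB)  return TextureTier::High;
    if (texBudget >= 5 * GiB)  return TextureTier::Medium;
    return TextureTier::Low;   // no stutter, no crash -- just less detail
}
```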

On top of that, with RT/PT on the mainstream horizon, surely it's time to break open the VRAM seal across all performance segments (incl. 1080p-class hardware). If we're paying the global-illumination tax, at the very least memory provisioning should be well balanced enough to support it.

I'm still trying to work out what all this consumer-driven fascination is with preserving current VRAM capacities, and at lower/slower bandwidths at that. Hypothetically speaking, even if VRAM presented zero compromises/bottlenecks, when did "more VRAM" become a bad idea? I would much rather developers weren't restricted to a smaller palette, working strenuously to cram everything in, which always ends up in a shit-show. Seeing as consoles have adopted ~16GB, that alone is a pretty good indication developers will unlock greater graphics eye-candy across the board. So far the 6-8GB mass market, which developers foremost respond to, has served as a limiting factor in graphics prowess. So which is it: preserve lower VRAM capacities, or increase them and unlock easily attainable graphics galore alongside today's copiously fast graphics cards and CPUs?

It's either more VRAM or a game-changing 2x/4x compression innovation - even Nvidia sees where things are heading with visualized VRAM exploitation: https://research.nvidia.com/labs/rtr/neural_texture_compression/
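Rough napkin math on what such a win buys, as a sketch; BC7's 4:1 ratio over RGBA8 is standard block compression, while the extra "neural" factor is the research target in the link above, not a shipping number:

```cpp
// Back-of-envelope sketch of what a 2x/4x win buys. BC7's 4:1 ratio over
// RGBA8 is standard; the extra "neural" factor is hypothetical.
#include <cstdio>

int main() {
    const double mipOverhead = 4.0 / 3.0;                // full mip chain: ~+33%
    const double texels = 4096.0 * 4096.0;               // one 4K texture layer
    const double bytesRGBA8 = texels * 4 * mipOverhead;  // 4 bytes/texel
    const double bytesBC7   = texels * 1 * mipOverhead;  // 1 byte/texel (4:1)
    const double MB = 1024.0 * 1024.0;
    printf("RGBA8: %.0f MB  BC7: %.0f MB  BC7 + 4x neural: %.1f MB\n",
           bytesRGBA8 / MB, bytesBC7 / MB, bytesBC7 / 4.0 / MB);
    return 0;
}
```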

Oddly enough we're all jumping on the trampoline at the sound of DirectStorage incorporation... but no, please not the VRAM. What gives?

If you give lazy developers/engineers more, why should they optimise? A dangerous path.

When did giving less require less optimisation?
 
Joined
May 17, 2021
Messages
3,421 (2.51/day)
Processor Ryzen 7 5700X
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200 MHz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
But let's not make assumptions to push the idea that less is adequate when it clearly isn't, "especially going forward".

We have to make assumptions; after all, it's apparently impossible to test how much VRAM a game ALLOCATES on different cards. Because of reasons and stuff. It's far better to just use a 4090 for 560p and conclude nothing and assume everything. Maybe we are meant to assume and not conclude; it certainly seems like it.
It does help to drive sales of new cards and the drama factor of any "review". I know it's hard to drive views when no one cares about the new cards, so there is that.
 
Joined
Jan 20, 2019
Messages
1,631 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
We have to make assumptions; after all, it's apparently impossible to test how much VRAM a game ALLOCATES on different cards. Because of reasons and stuff. It's far better to just use a 4090 for 560p and conclude nothing and assume everything. Maybe we are meant to assume and not conclude; it certainly seems like it.
It does help to drive sales of new cards and the drama factor of any "review". I know it's hard to drive views when no one cares about the new cards, so there is that.

I'm not sure how varied allocations across limited/unlimited capacities help when determining max VRAM usage - isn't that the primary objective? If you're just genuinely concerned whether a given card is capable of running a particular title at a given setting, that's fine. I too wouldn't mind seeing how other cards compare or what sacrifices/compromises ensue with older or inferior hardware. But max VRAM usage is max VRAM usage without limitations. No level of "assumption" is going to change that.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.86/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700 MHz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
please enlighten me

tested all the cards but only posts VRAM usage of one card


[attached: VRAM usage chart from the review]


isn't this feeding the fear mongering?
That's one card tested to show the maximum VRAM usage, so you can tell if it fits inside any GPU.
What part of that is useless or fear mongering?
 
Joined
May 17, 2021
Messages
3,421 (2.51/day)
Processor Ryzen 7 5700X
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200 MHz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
That's one card tested to show the maximum VRAM usage, so you can tell if it fits inside any GPU.
What part of that is useless or fear mongering?

Are you sure an 8GB card will show the same allocation as a 24GB card? Do the test and come back to me. No assumptions this time.
 
Joined
Mar 22, 2020
Messages
27 (0.02/day)
Well, if it's allocating more than it requires, either the title is poorly optimised, or more is genuinely required to keep up with larger asset management/compression (etc.), or a mix of both. But let's not make assumptions to push the idea that less is adequate when it clearly isn't, "especially going forward".
It's not a matter of optimisation. Many games will allocate VRAM depending on what's available, because it just can't hurt to have more. This is the default behaviour of UE, for instance. It's also clearly the behaviour of this title, since measured VRAM usage is much lower with the same settings on my 12GB card (around 9GB at UWQHD with RT, no FSR).
Measuring VRAM usage on a 24GB card at ultra is useless for determining how much VRAM is actually needed to run the game. It does nothing to inform on the behaviour of cards with less VRAM or at lower settings, i.e. for 99% of gamers. It's misleading and fuels the current hysteria regarding VRAM.
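As a sketch of that "allocate what's available" behaviour (illustrative only; the 70% fraction and the reserve are assumptions, not engine-accurate values), an engine that sizes its streaming pool from detected VRAM will naturally report very different "usage" on a 24GB card versus a 12GB one at identical settings:

```cpp
// Sketch of "allocate what's available": sizing the streaming pool from
// detected VRAM. The 70% fraction and 2GiB reserve are assumptions.
#include <algorithm>
#include <cstdint>

uint64_t StreamingPoolSize(uint64_t totalVramBytes) {
    const uint64_t GiB = 1ull << 30;
    const uint64_t reserve = 2 * GiB;        // OS, render targets, buffers
    const uint64_t minimum = 2 * GiB;        // floor the game needs to look OK
    const uint64_t fraction = totalVramBytes * 7 / 10;
    return std::max(minimum, fraction > reserve ? fraction - reserve : minimum);
}
// 24GB card -> ~14.8GiB pool, 12GB card -> ~6.4GiB pool: same game, same
// settings, very different bar on a "VRAM usage" graph.
```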

There is also the fact that the test does not specify how VRAM usage was measured. The default value shown by Afterburner represents the whole system's VRAM usage, for instance. So this graph really is doing a disservice to TPU readers.
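For reference, the two numbers being conflated can be queried separately; a minimal NVML sketch (NVIDIA-only, link against nvml; illustrative, not a drop-in tool):

```cpp
// Sketch of the two numbers being conflated: board-wide used memory (what
// overlays tend to show by default) vs the per-process breakdown.
#include <nvml.h>
#include <cstdio>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;
    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);

    nvmlMemory_t mem{};                       // whole-board figure
    nvmlDeviceGetMemoryInfo(dev, &mem);
    printf("total used on GPU: %llu MB\n", (unsigned long long)(mem.used >> 20));

    unsigned int count = 32;                  // per-process figures
    nvmlProcessInfo_t procs[32];
    if (nvmlDeviceGetGraphicsRunningProcesses(dev, &count, procs) == NVML_SUCCESS)
        for (unsigned int i = 0; i < count; ++i)
            printf("pid %u: %llu MB\n", procs[i].pid,
                   (unsigned long long)(procs[i].usedGpuMemory >> 20));

    nvmlShutdown();
    return 0;
}
```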

Are you sure an 8GB card will show the same allocation as a 24GB card? Do the test and come back to me. No assumptions this time.
The test itself proves it does not, at least not in the selected scene. Look at the performance of the 3080 and 4070 for instance.
 
Joined
Feb 3, 2017
Messages
3,898 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
That's one card tested to show the maximum VRAM usage, so you can tell if it fits inside any GPU.
What part of that is useless or fear mongering?
Today most if not all engines do some sort of texture streaming into a dynamic texture pool in VRAM. While there is a practical minimum size for that pool depending on the game's needs, making it bigger can be beneficial, or at least not harmful. Given a reasonably good texture streaming algorithm, that minimum size might be surprisingly small.
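A minimal sketch of how such a pool can behave (purely illustrative, no particular engine): each texture gets the mip level its on-screen size justifies, and when the pool overflows, the coldest residents are degraded one mip at a time.

```cpp
// Rough streaming-pool sketch: pick each texture's mip from its projected
// screen size; on overflow, degrade least-recently-used residents.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <list>

struct StreamedTexture {
    uint32_t fullResolution;   // e.g. 4096 texels across at mip 0
    uint32_t residentMip;      // 0 = full resolution
    uint64_t residentBytes;    // bytes of the currently loaded mip chain
};

// Enough texels to cover the texture's projected on-screen edge length.
uint32_t WantedMip(const StreamedTexture& t, float screenEdgePixels) {
    float levels = std::log2(t.fullResolution / std::max(screenEdgePixels, 1.0f));
    return levels <= 0.0f ? 0u : (uint32_t)levels;
}

// Shrink until the pool fits: dropping a chain's top mip frees ~75% of it
// (each mip is 4x smaller), which is why the pool degrades gracefully and
// the practical minimum can be surprisingly small.
void FitBudget(std::list<StreamedTexture*>& lru, uint64_t& used, uint64_t budget) {
    while (used > budget && !lru.empty()) {
        StreamedTexture* t = lru.back();   // coldest resident
        lru.pop_back();
        uint64_t freed = t->residentBytes * 3 / 4;
        if (freed == 0) continue;          // already at the floor, leave it
        t->residentBytes -= freed;
        t->residentMip += 1;
        used -= freed;
        lru.push_front(t);                 // still resident, one mip lower
    }
}
```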

Properly testing this is a real bitch though.
 

HTC

Joined
Apr 1, 2008
Messages
4,668 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 7 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
With my brand new RX 6600 with 8GB VRAM, I should be "future proofed" for at least the next 5 years.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,458 (1.30/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
When did giving less require less optimisation?
Giving a lesser-quality game, unoptimised to work on several tiers of GPU and CPU hardware, requires more of said hardware to run well. Remember, he said:
If you give lazy developers/engineers more, why should they optimise? A dangerous path.
So yeah, if you give them way overspecced hardware relative to consoles that (in theory) already run well/look good enough - gobs of unnecessary VRAM, CPU power, GPU compute power, etc. - why would they optimize further?

I mean, we're already bordering on some systems having 2x+ the physical hardware a console game needs to run XYZ settings, potentially feeding it unnecessarily, like the big VRAM boogeyman, and it could be a dangerous path. Nek minnit we might re-enter situations like craptacular budget/low-end cards with 2/3/4x the VRAM they need and can reasonably use relative to their compute power. None of this is to say that some current cards couldn't use more than they have, but simply advocating for all of them to have shittonnes doesn't actually solve the problem at all; in fact it exacerbates it.

My solution? Allow AIBs to make double-memory-capacity cards at their own markup. Then people who want their GPU to last an almost unreasonable amount of time, like 6+ years or 3+ generations, can spend big on VRAM without buying the top SKU, and run max textures far beyond the point where their GPU can run max anything else. Would a 20GB 3080 be cool? Sure! Would I have bought one given the choice at launch for $100-200 USD more than the 10GB? Hell no. Some might have, and I don't begrudge them that, hence why the option could be useful and well received. Then at least they'd get ahead of any boogeyman arguments by giving your everyman the option to buy double for a GPU that can only make purposeful use of it in niche situations: modding, multiple fucking terrible ports in a row, or playing the ultra-long game.
 
Joined
Jan 8, 2017
Messages
9,661 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Imagine how stupid this discussion is retrospectively when you look back every couple of years.

2000: These lazy developers, 32MB is enough.
2005: These lazy developers, 128MB is enough.
2010: These lazy developers, 1GB is enough.
2015: These lazy developers, 4GB is enough.
2023: These lazy developers, 8GB is enough.

You get the picture.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,237 (2.02/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga/S25U-1TB
Processor Ryzen 9800X3D @ 5.575 GHz all core 1.24 V, Thermal Grizzly AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 64 GB Dominator Titanium White 6000 MT, 130 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 White
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF1000 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White & Pulsar Supergrip tape, Razer Atlas, Razer Strider Chroma
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
Imagine how stupid this discussion is retrospectively when you look back every couple of years.

2000: These lazy developers, 32MB is enough.
2005: These lazy developers, 128MB is enough.
2010: These lazy developers, 1GB is enough.
2015: These lazy developers, 4GB is enough.
2023: These lazy developers, 8GB is enough.

You get the picture.
Wow what a meaningful contribution to the discussion.

18 GB > 8 GB, FYI.

18 GB is also more than almost every other game that offers 4K textures uses.
 
Joined
Jan 8, 2017
Messages
9,661 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Wow what a meaningful contribution to the discussion.
About as meaningful as 15 pages of people being in denial about the fact that VRAM requirements go up as the years go by.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,237 (2.02/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga/S25U-1TB
Processor Ryzen 9800X3D @ 5.575 GHz all core 1.24 V, Thermal Grizzly AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 64 GB Dominator Titanium White 6000 MT, 130 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 White
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF1000 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White & Pulsar Supergrip tape, Razer Atlas, Razer Strider Chroma
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
About as meaningful as 15 pages of people being in denial about the fact that VRAM requirements go up as the years go by.
Ah yes, an accurate and non-disingenuous representation from you (as usual :laugh:) of a discussion about a specific game taking much more VRAM than its fidelity justifies compared to other recently released games, and whether one badly produced game implies that anything under a halo-tier 24 GB card is obsolete.
 
Joined
Jun 19, 2021
Messages
164 (0.12/day)
System Name HAL
Processor AMD Ryzen 3700x
Motherboard ASRock B450 Pro4
Cooling AORUS Liquid 240
Memory 32GB Teamgroup 3200 MHz
Video Card(s) EVGA 2060 GTX
In every generation, you see some games come out with shitty coding and optimization. It's a poor port. No excuse for not optimizing VRAM routines.
 
Joined
Jun 1, 2011
Messages
4,827 (0.97/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
With my brand new RX 6600 with 8GB VRAM, I should be "future proofed" for at least the next 5 years.
If you believe buying an entry-level video card will "future proof" you for five years, then allow me to sell you a WD 256GB SSD, as it will hold your entire PC video game collection for the next five years, plus the OS, apps, pics, etc., etc. Plenty of room!


About as meaningful as 15 pages of people being in denial about the fact that VRAM requirements go up as the years go by.
Most of the comments are not in denial and know VRAM requirements go up. It's when those requirements are A) ludicrous or B) the game is poorly optimized to scale down to market-level GPU performance that people get ticked off, especially when they either just recently purchased a video card or the cost to move up to higher VRAM is absurd compared to what it was just three years ago.
 
Joined
Jun 6, 2022
Messages
622 (0.64/day)
If it takes AMD featured or sponsored titles to push for higher VRAM provisions, I'll back that too! You call it AMD, I call it progress!
For two consecutive days I played on the UHD 770. I call it pleasure. The pleasure of playing.
Is there really a game where you have to count the blades of grass and the apples on the tree?
 
Joined
Jan 8, 2017
Messages
9,661 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
It's when those requirements are A) ludicrous
How exactly do you determine whether the requirements are "ludicrous", whatever that even means? If a game uses more VRAM than any previous game, does that count as ludicrous? Because obviously that line of thought is ridiculous: as VRAM requirements go up, at some point there is always a game that needs more VRAM than any game before it; that's how it works.

B) the game is poorly optimized to scale down to market-level GPU performance that people get ticked off, especially when they either just recently purchased a video card or the cost to move up to higher VRAM is absurd compared to what it was just three years ago.
Is your thought process simply:

low VRAM usage -> good optimization
high VRAM usage -> bad optimization

Are you a game developer? How do you know if something is poorly optimized?
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,237 (2.02/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga/S25U-1TB
Processor Ryzen 9800X3D @ 5.575 GHz all core 1.24 V, Thermal Grizzly AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 64 GB Dominator Titanium White 6000 MT, 130 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 White
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF1000 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White & Pulsar Supergrip tape, Razer Atlas, Razer Strider Chroma
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
How exactly do you determine whether the requirements are "ludicrous", whatever that even means? If a game uses more VRAM than any previous game, does that count as ludicrous? Because obviously that line of thought is ridiculous: as VRAM requirements go up, at some point there is always a game that needs more VRAM than any game before it; that's how it works.


Is your thought process simply:

low VRAM usage -> good optimization
high VRAM usage -> bad optimization

Are you a game developer? How do you know if something is poorly optimized?
Because people who aren't clowns can look at two similar games built on the same engine, see that one has twice or more the VRAM requirements for zero additional (or perhaps worse) fidelity, and then draw conclusions.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
13,311 (3.00/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / console
Processor 5800X @ PBO +200 / i5-8600K @ 4.6GHz
Motherboard ROG Crosshair VII Hero / ROG Strix Z370-F
Cooling Alphacool Eisbaer 360 / Alphacool Eisbaer 240
Memory 32GB DDR4-3466 / 16GB DDR4-3600
Video Card(s) Asus RTX 3080 TUF OC / Powercolor RX 6700 XT
Storage 3.5TB of SSDs / several small SSDs
Display(s) 4K120 IPS + 4K60 IPS / 1080p60 HDTV
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Sony WH-CH720N / TV speakers
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Razer Basilisk / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
I still have to disagree. I have a 6700 XT and I play at 4K60 without any problems.
 
Joined
Jan 8, 2017
Messages
9,661 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Because people who aren't clowns can look at two similar games built on the same engine, see that one has twice or more the VRAM requirements for zero additional (or perhaps worse) fidelity, and then draw conclusions.

The biggest clowns by far are those who don't realize that, despite this, those games run just fine even without 24GB.

The 4070 Ti is still faster than a 3090 despite having half the VRAM.
 
Joined
May 17, 2021
Messages
3,421 (2.51/day)
Processor Ryzen 7 5700X
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200 MHz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Imagine how stupid this discussion is retrospectively when you look back every couple of years.

2000: These lazy developers, 32MB is enough.
2005: These lazy developers, 128MB is enough.
2010: These lazy developers, 1GB is enough.
2015: These lazy developers, 4GB is enough.
2023: These lazy developers, 8GB is enough.

You get the picture.

Quote me another time in the past when every single AAA game was absolute dog shit and, worse, the improvements were hardly visible over past games; in some cases they look worse, perform worse, play worse, stutter worse.

Not even a $2,000 GPU can get you a decent frame rate with no stutters, and games still don't look that much better than 2-3 year old games.
 