
Is 24 GB of video RAM worth it for gaming nowadays?

  • Yes

    Votes: 66 41.5%
  • No

    Votes: 93 58.5%

  • Total voters
    159
Joined
Jan 14, 2019
Messages
12,690 (5.83/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
One thing to note about VRAM usage in benchmarks: these are clean systems, or systems with very little on them. They don't have multiple applications open that consume a wide range of GPU resources. I don't have any games open, but I do have a lot of windows and different applications running that consume GPU resources. My VRAM utilization right now is 4.3 GB. If I close everything or restart, it will drop to 1~1.5 GB, but why would I do that? I have seen my VRAM usage go as high as 6 GB, so I typically only have 10~12 GB of VRAM available for games at 4K output.

I would absolutely love to have 24 GB of VRAM. 16 GB is the absolute minimum for 4K, and maybe even for 1440p. I'd only go for a 12 GB card if I had a 1080p monitor, though 12 GB would only leave 6~8 GB of VRAM, which I guess is good enough for 1080p? It's been a super long time since I played at that resolution.

Now, if your system is used strictly for gaming, you close your browser when you start a game, and the only other application you have installed is something like Discord, then you could get by with less.
There's also a part of system memory that's reserved to act as a sort of backup VRAM. If your actual VRAM gets full, currently unused things get dumped there (or even more, if the game needs it).

There's also the effect of the OS loading more things into your RAM and VRAM if you have more available. For example, I see around 1 GB VRAM used on the Windows desktop on my 6750 XT. I don't see nearly as much on the 1030 in my HTPC.
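A quick way to see this baseline usage for yourself (on NVIDIA hardware, at least) is to query `nvidia-smi`. The helper below is a hypothetical sketch, not something from any post in this thread; it assumes the `nvidia-smi` CLI is installed and on your PATH, and the function names are made up for illustration.

```python
import subprocess

def parse_vram_used(csv_output: str) -> list[int]:
    """Parse the output of
    `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits`
    into a list of per-GPU used-VRAM values in MiB."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

def query_vram_used() -> list[int]:
    """Ask nvidia-smi how much VRAM each GPU is using right now (MiB).
    Requires an NVIDIA driver install that provides nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_vram_used(out)
```

Running `query_vram_used()` on an otherwise idle desktop shows how much VRAM your background applications are already holding before a game even launches.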

Except it now looks like it isn't several years old, so that totally depends on whether you've already played it, and which version. That's my take on it, anyway. I had only played a few small parts of it on my friend's PS4 years ago, and it's still totally worth it to me, especially if the hardest two modes are noticeably harder.

The gameplay feels a bit easier than I thought it would. You only have to kill one Bloater (well, two if you count the one Ellie kills, which goes down a bit easier), you don't need to use many of the weapons, and you don't need a lot of the upgrades, with the ending being such an easy scenario of sneak kills, finishing with an uber-easy sneak.

That said, I'm hoping Survivor and Grounded modes will make up for the lack of challenge on Hard, because compared to games like Dead Space and The Evil Within 1, it feels more like adventure horror than survival horror so far. It kind of does the reverse of most horror games, in that the story is the most compelling thing about it - and the story is really good.
I only play games for their story and atmosphere, so I'll definitely buy it when it gets cheaper. :)

I've never played the original, but a lot of people have, and there have been a lot of videos, reviews, articles, etc. about it that were simply impossible to avoid, so it still feels like an old game. Besides, I know it's just a PC port, there's no way around it. The developers didn't put as much effort into it as they would have had to when making a new game, and that isn't worth £30, let alone £60 to me.
 
Joined
Jun 1, 2011
Messages
4,690 (0.95/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
For the end user that is entirely irrelevant tbh. Only the performance matters.
For most end users, sure, but if you are going to have a debate about it, then it's easier to discuss one thing at a time rather than jump around.
A lot has been said about the game being unreasonably CPU-heavy
The only thing I've seen about it (granted, I'm not looking for it) was the PC Gamer review, where the reviewer stated the game pushed his 9700K to full usage. Most likely the review was done prior to any patches in order to meet a deadline, so CPU performance may well be better now than at review time, judging from your statement (and Frag's).
 
Last edited:
Joined
Nov 9, 2010
Messages
5,689 (1.10/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
wait until you spend day two on a tech/gamer forum and we discuss hardware people feel won't be to their liking

LOL, been there, done that. Tech forums invariably wind up with arguments that go nowhere. I was one of the first on the tech forum we were talking about to say the 8700K was going to be a beast of a CPU when the announcement of it first hit. Years later I'm still running it, and still loving it, with no need whatsoever for an OC.

For the end user that is entirely irrelevant tbh. Only the performance matters.

A lot has been said about the game being unreasonably CPU-heavy, and I don't see that being the case. My 11700 isn't exactly the fastest CPU, and the performance is stellar.

That's what I keep thinking through all TLoU Part I discussions. Too many people hung up on the numbers instead of the actual performance.

I only play games for their story and atmosphere, so I'll definitely buy it when it gets cheaper. :)

I've never played the original, but a lot of people have, and there have been a lot of videos, reviews, articles, etc. about it that were simply impossible to avoid, so it still feels like an old game. Besides, I know it's just a PC port, there's no way around it. The developers didn't put as much effort into it as they would have had to when making a new game, and that isn't worth £30, let alone £60 to me.

That sounds a lot like you're claiming price gouging, but only because you spoiled the game for yourself. C'mon man, you can do better than that! :D

Seriously though, I get what you're saying, the assets are already there to draw from. However, when you look at the difference in visual quality, all the graphics features and settings added, the RT implementation, and the testing required, it took a fair bit of time just the same. And like it or not, there are FAR fewer people that buy AAA games on PC compared to on console, so it wouldn't be worth it to them to charge only half the price because of that alone.

Some things about the gaming industry like the attrition vs high demand disparities between platforms we cannot control, and neither developer, publisher, or consumer are at fault. It just is what it is.

For most end users, sure, but if you are going to have a debate about it, then it's easier to discuss one thing at a time rather than jump around.

The only thing I've seen about it (granted, I'm not looking for it) was the PC Gamer review, where the reviewer stated the game pushed his 9700K to full usage. Most likely the review was done prior to any patches in order to meet a deadline, so CPU performance may well be better now than at review time, judging from your statement (and Frag's).

Yeah, and I even get a bit cynical about highly regarded sites like Digital Foundry when they start out poking fun at the performance and showing blurry-as-hell brick walls, without even saying what build version it was. Then I go in game, on patch 1.0.1.0 mind you, and find no such blurriness on the same Medium textures. It actually makes me wonder if they somehow got hold of a prerelease version that hadn't received ANY patching.

Digital Foundry used to be my number one source for optimized PC settings on games, now I'm not so sure if I can trust what they are saying and showing is even true.

THAT SAID, game studios should be damn careful about releasing unpatched versions of their games without THOROUGHLY checking them for performance and bug problems. So IMO, if Iron Galaxy did that, bad on them, as it's only asking for TONS of bad feedback and can ruin the reputation of a game at its critical release time.
 
Last edited:
Joined
Jun 14, 2020
Messages
3,559 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I really don't get how understanding what a CPU bottleneck is, and how to trigger it, is so hard for "some" people. But as you correctly said @fevgatos, you just lower the resolution until you don't see an fps increase by lowering it any further... and that is your CPU performance level.

4k - 82 fps



1440p - 121 fps, but only 92% gpu load, so really we already know that it is cpu limited



1080p - as expected didn't increase fps any further. So in this particular scene, with a 11700f cpu, the cpu performance level is 120 fps.



Then one could argue that a game is CPU-heavy if you aren't able to obtain a satisfying fps and the CPU bottleneck (i.e. GPU load below 100%) is apparent all the time - in other words, the CPU performance (or lack thereof) is the culprit of the poor performance in the game.

Judging CPU performance in a game on anything other than the game's actual performance seems rather ridiculous.
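The resolution-sweep method described above can be sketched as a tiny helper: feed it (resolution, fps) readings from highest to lowest resolution, and it reports the fps level at which lowering resolution stops helping. The function name and tolerance are illustrative, and the sample numbers are the ones quoted in this post.

```python
def cpu_fps_ceiling(samples, tol=0.05):
    """Given (resolution, fps) samples ordered from highest to lowest
    resolution, return the fps level where lowering the resolution stops
    increasing fps -- i.e. the CPU-limited frame rate.

    `tol` is the relative fps change below which two readings count as
    "no further increase"."""
    prev_fps = None
    for _res, fps in samples:
        if prev_fps is not None and (fps - prev_fps) / prev_fps <= tol:
            return prev_fps  # fps plateaued: the CPU is the limit here
        prev_fps = fps
    return prev_fps  # still scaling at the lowest resolution tested

# Readings from the post: 4K 82 fps, 1440p 121 fps, 1080p no further gain
measurements = [("4k", 82), ("1440p", 121), ("1080p", 120)]
print(cpu_fps_ceiling(measurements))  # 121
```

If the fps keeps climbing all the way to the lowest resolution tested, the scene is GPU-limited everywhere you measured, and the last reading is only a lower bound on what the CPU can do.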
Yeah, but look at your CPU usage to achieve that result. You're hitting 90% on a relatively empty scene, man. You can't tell me it's not a heavy game. Lots of games (Atomic Heart, for example) get similar / much higher framerates with much lower CPU utilization.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
24 GB VRAM is overkill. Also just because a card is using a lot of VRAM doesn't necessarily mean the game requires that much VRAM. In some cases the engine is loading up the VRAM just because it's there.
 
Joined
Jun 14, 2020
Messages
3,559 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
What I find fascinating is that A Plague Tale requires 4.5 to 6 GB of VRAM at 4K ultra. TLOU requires 9.5 to 11 GB of VRAM at 720p. I'm sure this has nothing to do with lazy devs, it's just greedy Nvidia not offering enough VRAM, right?
 
Joined
Mar 29, 2023
Messages
1,045 (1.63/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
Yeah but look at your CPU usage to achieve that result. You are hitting 90% on a relatively empty scene man. You can't tell me it's not a heavy game. Lot's of games (atomic heart for example) get similar / much higher framerate with much lower CPU utilization.

You really don't seem to get it - in a CPU-bottlenecked scenario, you want the usage to be as high as possible. In the opposite scenario, you are leaving a ton of performance on the table when it's just a single thread bottlenecking the performance.

As for other games getting higher fps with less usage - there are many reasons for that, and in this context they are rather irrelevant tbh. But the primary reason is that this game is specifically made for the PS5's decompression engine - something we don't have on PC, so the work is done by the CPU instead - much like in the Spider-Man port. The same will be the case with other Sony games that are PS5 ports.
But as said, it's really rather irrelevant, as long as the performance is good, which it is.

What I find fascinating is that A Plague Tale requires 4.5 to 6 GB of VRAM at 4K ultra. TLOU requires 9.5 to 11 GB of VRAM at 720p. I'm sure this has nothing to do with lazy devs, it's just greedy Nvidia not offering enough VRAM, right?

It's a question of how much bandwidth you want to spend streaming textures versus how much VRAM you want to spend storing them. Using as much VRAM as possible as a cache usually provides better performance, provided you don't go over your VRAM limit.
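That bandwidth-versus-capacity trade-off can be illustrated with a toy LRU texture cache: the smaller the VRAM budget, the more megabytes have to be re-streamed. This is a deliberately simplified sketch (fixed-size textures, pure LRU eviction), not how any real engine manages residency.

```python
from collections import OrderedDict

def streamed_mb(access_trace, texture_mb, cache_mb):
    """Replay a texture access trace through an LRU cache of `cache_mb`
    and return how many MB had to be streamed in (i.e. cache misses).
    Every texture is assumed to be `texture_mb` in size."""
    slots = cache_mb // texture_mb           # textures the cache can hold
    cache = OrderedDict()
    streamed = 0
    for tex in access_trace:
        if tex in cache:
            cache.move_to_end(tex)           # hit: just refresh LRU order
        else:
            streamed += texture_mb           # miss: spend bandwidth streaming it
            cache[tex] = True
            if len(cache) > slots:
                cache.popitem(last=False)    # evict the least recently used
    return streamed

trace = [0, 1, 2, 0, 1, 2]                   # three textures, reused once each
print(streamed_mb(trace, 100, 300))          # 300 MB: everything fits, misses once
print(streamed_mb(trace, 100, 200))          # 600 MB: cache thrashes, streams twice
```

With a cache big enough for the working set, each texture streams in once; shrink the cache below the working set and the same trace costs double the bandwidth, which is exactly why engines fill spare VRAM when it's available.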
 
Joined
Apr 14, 2022
Messages
769 (0.78/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
What I find fascinating is that Plague Tale requires 4.5 to 6GB of VRAM at 4k ultra. TLOU requires 9.5 to 11gb of VRAM at 720p. Im sure this has nothing to do with lazy devs, it's just greedy nvidia not offering enough vram, right.

I think it's about priorities, not laziness.
Console devs make games that way so they lean first on the CPU, then the RAM, and then the GPU.
The CPUs in consoles are decent enough and the RAM is a marginally acceptable amount, but the GPUs are always the limiting factor.

So they develop the games in the exact opposite way to how they would develop them if PC came first.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.90/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
As for the 6800 having too much VRAM... a GPU can't have "too much VRAM". That's like saying a car's gas tank is too big.
That's a very poor analogy, and actually really great at proving why you're incorrect.

Do you know why planes don't fill up to 100%, and instead weigh passengers and cargo as accurately as they can so they carry as little fuel as possible? Because they have to burn fuel to carry the weight of the extra fuel.

Do you know why GPUs don't have heaps of VRAM, cost be damned?

Because they use power and produce heat. My 3090 with its 375 W limit loses performance the more VRAM is used, because it can't power all the VRAM and the GPU at full speed and stay under 375 W - I need to undervolt or run a 450 W BIOS to use it all. That's why Nvidia didn't include more VRAM on the other models in the series, except the 3090 Ti, which used half the VRAM chips at double density once they were available to avoid that problem.

You're saying a car can never have enough fuel, so let's attach a fuel tanker to the back - let's go even further and put a refinery back there!
(You'll inevitably say "that's not the point", "you took it too far", etc., but no, that was your claim. YOU made the claim that you can never have too much, but there are drawbacks to it.)
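The power trade-off in this post reduces to simple arithmetic: whatever the memory chips draw comes straight out of a fixed board power limit. A back-of-the-envelope sketch, using the 24-chip, ~7 W/chip figures that come up later in the thread (forum estimates, not measured teardown numbers):

```python
def core_power_budget(board_limit_w: int, vram_chips: int, watts_per_chip: int) -> int:
    """Watts left for the GPU core after the VRAM chips take their share
    of a fixed board power limit."""
    return board_limit_w - vram_chips * watts_per_chip

# 3090-style board: 375 W limit, 24 first-gen GDDR6X chips at ~7 W each
print(core_power_budget(375, 24, 7))  # 207 W left for the core
```

At 8 W per chip the core budget drops to 183 W, which is why heavy VRAM use on a power-limited board forces the core to clock down.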
 
Joined
Dec 25, 2020
Messages
7,092 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
24 GB VRAM is overkill. Also just because a card is using a lot of VRAM doesn't necessarily mean the game requires that much VRAM. In some cases the engine is loading up the VRAM just because it's there.

The problem with this logic is that it technically applies every single time there's an advance in capacity. Ten years ago, when the original Titan launched, the only thing one would have to replace in your post is "24" with "6" to make the exact same argument against it.

Yet... it's been six years since the GTX 1060 made that same 6 GB the "minimum acceptable" VRAM amount, and gaming on a 4 GB card today is an exercise in patience to find which trade-off you're going to accept to play your game, no matter how powerful the GPU may be.

24 GB isn't overkill, IMO. It's an ample and adequate amount for a high-end GPU which is expected to do high-end things.

We'd eventually have had 24 GB cards in the more popular segments, if it wasn't for the... death of the budget and performance segment GPUs. What an anomaly this generation has been...
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.90/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
You really don't seem to get it - in a CPU-bottlenecked scenario, you want the usage to be as high as possible. In the opposite scenario, you are leaving a ton of performance on the table when it's just a single thread bottlenecking the performance.

As for other games getting higher fps with less usage - there are many reasons for that, and in this context they are rather irrelevant tbh. But the primary reason is that this game is specifically made for the PS5's decompression engine - something we don't have on PC, so the work is done by the CPU instead - much like in the Spider-Man port. The same will be the case with other Sony games that are PS5 ports.
But as said, it's really rather irrelevant, as long as the performance is good, which it is.



It's a question of how much bandwidth you wanna use on streaming textures, and how much vram you wanna use storing textures. Using as much vram as possible to cache usually provides better performance, provided you don't go over your vram limit.
You don't ever want a bottleneck.
A CPU bottleneck results in stuttering, while a GPU bottleneck results in the CPU rendering frames ahead and input lag.
 
Joined
Mar 29, 2023
Messages
1,045 (1.63/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
That's a very poor analogy, and actually really great at proving why you're incorrect.

Do you know why planes don't fill up to 100%, and instead weigh passengers and cargo as accurately as they can so they carry as little fuel as possible? Because they have to burn fuel to carry the weight of the extra fuel.

Do you know why GPUs don't have heaps of VRAM, cost be damned?

Because they use power and produce heat. My 3090 with its 375 W limit loses performance the more VRAM is used, because it can't power all the VRAM and the GPU at full speed and stay under 375 W - I need to undervolt or run a 450 W BIOS to use it all. That's why Nvidia didn't include more VRAM on the other models in the series, except the 3090 Ti, which used half the VRAM chips at double density once they were available to avoid that problem.

You're saying a car can never have enough fuel, so let's attach a fuel tanker to the back - let's go even further and put a refinery back there!
(You'll inevitably say "that's not the point", "you took it too far", etc., but no, that was your claim. YOU made the claim that you can never have too much, but there are drawbacks to it.)

That's not the point, you took it too far! :p

Seriously though, I did not find the VRAM to be a limiting factor at all on my 3090 :)
 
Joined
Dec 25, 2020
Messages
7,092 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
That's a very poor analogy, and actually really great at proving why you're incorrect.

Do you know why planes don't fill up to 100%, and instead weigh passengers and cargo as accurately as they can so they carry as little fuel as possible? Because they have to burn fuel to carry the weight of the extra fuel.

Do you know why GPUs don't have heaps of VRAM, cost be damned?

Because they use power and produce heat. My 3090 with its 375 W limit loses performance the more VRAM is used, because it can't power all the VRAM and the GPU at full speed and stay under 375 W - I need to undervolt or run a 450 W BIOS to use it all. That's why Nvidia didn't include more VRAM on the other models in the series, except the 3090 Ti, which used half the VRAM chips at double density once they were available to avoid that problem.

You're saying a car can never have enough fuel, so let's attach a fuel tanker to the back - let's go even further and put a refinery back there!
(You'll inevitably say "that's not the point", "you took it too far", etc., but no, that was your claim. YOU made the claim that you can never have too much, but there are drawbacks to it.)

To be fair, this is a problem specific to the 3090; the 3090 Ti solved it, and the Titan RTX wasn't really affected. The blame lies more with the excessive amount of power-hungry first-generation GDDR6X than with the capacity itself. 24 chips that each suck 7-8 watts was a bad idea, but they needed a product to position as the Titan RTX's successor.
 
Joined
Mar 29, 2023
Messages
1,045 (1.63/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
You don't ever want a bottleneck.
A CPU bottleneck results in stuttering, while a GPU bottleneck results in the CPU rendering frames ahead and input lag.

Of course you don't - you will have a lot more frametime variance while CPU-bottlenecked. But I'm saying he got it backwards about high potential CPU load being bad - the opposite is what's bad: only having one thread maxed out while the rest of the CPU barely does anything. Sure, it will draw fewer watts, but it will also leave a LOT of potential performance on the table versus a game with properly parallelized CPU threads.
 
Joined
Jan 14, 2019
Messages
12,690 (5.83/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
That sounds a lot like you're claiming price gouging, but only because you spoiled the game for yourself. C'mon man, you can do better than that! :D

Seriously though, I get what you're saying, the assets are already there to draw from. However, when you look at the difference in visual quality, all the graphics features and settings added, the RT implementation, and the testing required, it took a fair bit of time just the same. And like it or not, there are FAR fewer people that buy AAA games on PC compared to on console, so it wouldn't be worth it to them to charge only half the price because of that alone.

Some things about the gaming industry like the attrition vs high demand disparities between platforms we cannot control, and neither developer, publisher, or consumer are at fault. It just is what it is.
So basically, look at the game as a remake, not as a resold old product. I get that. :)

It's only that when it comes to remakes, I can't help but think of the ones like Black Mesa, which isn't just a graphical upgrade over Half-Life, but a thoroughly redone game which was free in its early days, then started selling for something like 10 or 15 quid once Valve took on the devs to finish the project with proper funding. Now, that's something I consider worth your money! I also love what the devs did with Homeworld Remastered, not to mention that game wasn't too expensive at launch, either. As a big fan of The Witcher, I can only endorse the recent TW3 upgrade because it comes as a free patch for original owners, not as a new game (which it is definitely not).

The problem is, 9 out of 10 remakes these days are nothing more than slight graphical upgrades at extortionate prices, and TLoU is no exception, I'm afraid. Like I said, I'll buy it once its price drops to about £10, because I'm eager to play it. But that's as much as a remake is worth to me, unfortunately.

Accepting bad prices as "market conditions" and shrugging them off saying "that's just how it is these days" is the wrong thing to do. We have the power to turn trends around when we vote with our wallets, and that's exactly what I'm doing.
 
Joined
Mar 29, 2023
Messages
1,045 (1.63/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
To be fair, this is a problem specific to the 3090; the 3090 Ti solved it, and the Titan RTX wasn't really affected. The blame lies more with the excessive amount of power-hungry first-generation GDDR6X than with the capacity itself. 24 chips that each suck 7-8 watts was a bad idea, but they needed a product to position as the Titan RTX's successor.

They did indeed. The amount of VRAM the 3090 got will also make it the only 30-series card to age well.
 
Joined
Nov 9, 2010
Messages
5,689 (1.10/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
We have the power to turn trends around when we vote with our wallets, and that's exactly what I'm doing.

Yeah, well, people have been saying that for years, and look what good it's done - very little. If I were as nitpicky as you imply we should be, I'd be bored to death with the endless waiting.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,125 (2.00/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape, Razer Atlas
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
The problem with this logic is that it technically applies every single time there's any advance in capacity. Ten years ago, when the original Titan launched, the only thing one would replace in your post is "24" with "6" to make the exact same argument against it.

Yet... it's been six years since the GTX 1060 made the same 6 GB the "minimum acceptable" VRAM amount, and gaming on a 4 GB card today is an exercise in patience, figuring out which trade-off you're going to accept to play your game, no matter how powerful the GPU may be.

24 GB isn't overkill, IMO. It's an ample and adequate amount for a high end GPU which is expected to do high end things.

We'd eventually have had 24 GB cards in the more popular segments, were it not for the... death of the budget and performance segment GPUs. What an anomaly this generation has been...
Yeah, and if the original Titan had had 24 GB of VRAM, it would still be borderline useless today, because VRAM is only useful if the card is powerful enough to use it.

The point of this thread is that there are obvious scenarios where cards have more VRAM than they need or can use. For anything below halo cards today, 24 GB of VRAM is pointless.

I'd take a 4080 over a 3090 any day, because while it has 8 GB less VRAM and is therefore bad according to many in this thread, it's going to deliver a much better experience.
 
Joined
Jun 1, 2011
Messages
4,690 (0.95/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
Yeah, well, people have been saying that for years, and look what good it's done: very little. If I were as nitpicky as you imply we should be, I'd be bored to death with the endless waiting.
So what would be the alternative: hand over your money for things you find overpriced? I believe both you and @AusWolf have a right to spend your money as you please; that doesn't mean your opinions have to align with how you spend it or don't spend it.
 
Joined
Jan 14, 2019
Messages
12,690 (5.83/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Yeah, well, people have been saying that for years, and look what good it's done: very little.
That's because not enough people have been saying that. The majority buys anything for any price and puts up with everything that entertainment companies say or do.

If I were as nitpicky as you imply we should be, I'd be bored to death with the endless waiting.
I'm not bored. :) I've got more than 500 games on Steam and GOG, including old classics that I'm always happy to play again, and lots of other titles that I bought on sale but haven't tried yet.

Edit: I'm not saying one should be nitpicky about this stuff. I'm just stating that I am.
 
Joined
Jan 8, 2017
Messages
9,521 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
You don't ever want a bottleneck
A CPU bottleneck results in stuttering, while a GPU bottleneck results in the CPU rendering frames ahead and input lag.

You always have a bottleneck somewhere; there is no such thing as a "bottleneck-free system", otherwise you'd get what? Infinite frames per second? And no, neither of those things has to mean stuttering or input lag; that's not how it works. By that logic you'd always get either stutter or input lag, which makes no sense.

CPU bottleneck => lower bound on CPU time for each frame
GPU bottleneck => lower bound on GPU time for each frame

That's all these things mean, nothing more, nothing less.
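That frame-time view can be sketched in a few lines of Python. This is only an illustrative model with made-up numbers, not a profiler: it just shows that whichever stage is slower sets the floor on frame time, and the other stage simply waits.

```python
# Illustrative model: each frame costs whichever stage is slower.
# The slower stage is the bottleneck; the faster one waits on it.

def frame_time_ms(cpu_ms: float, gpu_ms: float) -> float:
    """Effective frame time is bounded below by the slower stage."""
    return max(cpu_ms, gpu_ms)

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second implied by the per-frame cost."""
    return 1000.0 / frame_time_ms(cpu_ms, gpu_ms)

# GPU bottleneck: GPU needs 10 ms, CPU only 4 ms -> capped at 100 FPS
print(fps(cpu_ms=4.0, gpu_ms=10.0))  # 100.0

# CPU bottleneck: CPU needs 8 ms, GPU only 5 ms -> capped at 125 FPS
print(fps(cpu_ms=8.0, gpu_ms=5.0))   # 125.0
```

In either case you still get a steady stream of frames; neither bound by itself implies stutter or added latency.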

I'd expect it to be pretty heavy
I don't. Why should it be heavy? Like I said, the game logic is still that of a game that's 10 years old; there is obviously something amiss about CPU performance, and I don't see what it could be other than legacy code.

The aforementioned Crysis Remastered was also updated to CryEngine 3, yet it had the same backend logic (the developers admitted it), so just because a game is ported to a new engine doesn't mean it's actually rewritten from the ground up, and I bet it's the same story here.
 
Last edited:
Joined
Jun 1, 2011
Messages
4,690 (0.95/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
That's a very poor analogy and actually really great at proving why you're incorrect

Do you know why planes don't fill up to 100% and instead get as accurate a weight estimate from passengers and cargo as they can, to carry as little fuel as possible? Because they have to burn fuel to carry the weight of the extra fuel.
Agreed, and I'll do you one better with the fuel analogy. Think of ultimate performance in race cars: open-wheel cars like F1 are extremely weight-conscious. Put in too big a fuel tank and you weigh the car down, preventing it from reaching top speed; too small a tank and the car needs more pit stops. You want a just-right tank that lets it hit top speed yet carries it through the race with minimal stops. Similarly with video cards: adding more VRAM than the chip can effectively use just increases power draw, can create more heat internally, and adds cost for the end user. Too little and you hamper performance, as we have seen. You want the just-right amount.
You don't ever want a bottleneck
A CPU bottleneck results in stuttering, while a GPU bottleneck results in the CPU rendering frames ahead and input lag.
I know what you're saying, but everyone has a bottleneck, be it CPU, GPU, SSD/HDD, monitor, RAM, games/software, even PSU. Really, it's: you don't want a bottleneck that severely prevents your hardware from performing to its full potential, with "severely" being the key word.
 
Joined
Mar 29, 2023
Messages
1,045 (1.63/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
You always have a bottleneck somewhere; there is no such thing as a "bottleneck-free system", otherwise you'd get what? Infinite frames per second? And no, neither of those things has to mean stuttering or input lag; that's not how it works. By that logic you'd always get either stutter or input lag, which makes no sense.

CPU bottleneck => lower bound on CPU time for each frame
GPU bottleneck => lower bound on GPU time for each frame

That's all these things mean, nothing more, nothing less.

I think he means that ideally you want to use an FPS limiter, which is also what I do.
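A basic FPS limiter just sleeps off whatever is left of the frame budget each frame, so the CPU idles instead of racing ahead of the GPU. A minimal Python sketch of the idea (real limiters in drivers and engines are far more careful about timer resolution and pacing; `simulate_work` is a hypothetical stand-in for the game's update-and-render step):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def run_frames(n_frames: int, simulate_work) -> None:
    """Render n_frames, sleeping off any leftover budget each frame."""
    for _ in range(n_frames):
        start = time.perf_counter()
        simulate_work()                       # game update + render
        elapsed = time.perf_counter() - start
        leftover = FRAME_BUDGET - elapsed
        if leftover > 0:
            time.sleep(leftover)              # idle instead of racing ahead
```

If the work finishes early, the loop waits; if it overruns the budget, the loop simply runs at whatever rate the hardware allows.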
 
Joined
Jan 14, 2019
Messages
12,690 (5.83/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
You always have a bottleneck somewhere; there is no such thing as a "bottleneck-free system", otherwise you'd get what? Infinite frames per second? And no, neither of those things has to mean stuttering or input lag; that's not how it works. By that logic you'd always get either stutter or input lag, which makes no sense.

CPU bottleneck => lower bound on CPU time for each frame
GPU bottleneck => lower bound on GPU time for each frame

That's all these things mean, nothing more, nothing less.
I know what you're saying, but everyone has a bottleneck, be it CPU, GPU, SSD/HDD, monitor, RAM, games/software, even PSU. Really, it's: you don't want a bottleneck that severely prevents your hardware from performing to its full potential, with "severely" being the key word.
IMO, ideally you'd always want a GPU bottleneck in your system, as it only means lower average framerates (whether "lower" means 60 instead of 80, or 200 instead of 250). A RAM or VRAM bottleneck means stutters or freezes during asset loading, and a CPU bottleneck usually means random stutters at random points, both of which are infinitely more annoying than a generally lower FPS.
 
Joined
Jan 8, 2017
Messages
9,521 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I think he means that ideally you want to use an FPS limiter, which is also what I do.

And what exactly does that achieve? Unless there is some issue with the game engine, there is no reason to artificially limit performance; you're not getting any less stutter, and you're certainly getting more input lag if you do that.
 