
Is 24 GB of video RAM worth it for gaming nowadays?


  • Yes

    Votes: 66 41.5%
  • No

    Votes: 93 58.5%

  • Total voters
    159
...... -_-

The usage DOES NOT MATTER - some games will bottleneck on a single thread, others will use as much as they can. But the fps at which you get cpu bottlenecked is what matters. You were running 130-170 fps there. Do a vid of you driving around in the city in cyberpunk, and show us if the fps is higher than that...
I'm not looking at the usage, I'm looking at the power draw. Nothing as of yet has managed to draw that much power from my 12900k. And sure, here is a video in cyberpunk, got one with the 13900k as well

 
I'm not looking at the usage, I'm looking at the power draw. Nothing as of yet has managed to draw that much power from my 12900k

Again, doesn't matter - it's just because it's fully using all the threads... the fps at which you get cpu bottlenecked is what matters....
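
To put it another way, what you want to read off your overlay is whether the GPU still has headroom once the fps stops climbing. A rough sketch of that check - the threshold and the readings below are made up, just to show the idea:

def classify_bottleneck(fps: float, gpu_util_pct: float) -> str:
    # If the GPU is pegged, a faster CPU won't help; if it has headroom while
    # the fps has stopped climbing, the CPU is the limit. The 95% threshold is
    # an illustrative guess, not an official figure from any monitoring tool.
    frame_time_ms = 1000.0 / fps
    if gpu_util_pct >= 95.0:
        return f"GPU-bound: {frame_time_ms:.1f} ms/frame with the GPU at {gpu_util_pct:.0f}%"
    return f"CPU-bound: {frame_time_ms:.1f} ms/frame with the GPU only at {gpu_util_pct:.0f}%"

# Made-up overlay readings, purely as an example:
print(classify_bottleneck(150, 97))   # busy scene, GPU maxed -> GPU limited
print(classify_bottleneck(110, 78))   # city driving, GPU idling -> CPU limited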
 
Again, doesn't matter - it's just because it's fully using all the threads... the fps at which you get cpu bottlenecked is what matters....
I get similar FPS on cyberpunk as well, but with lower power draw. Video above.

And that's with RT that makes it much heavier on the CPU
 
Wrong. I see you ran out of VRAM, so this framerate can't be taken seriously - that's why I asked you for a screenshot with ~90% VRAM usage, so it would actually mean something.

/facepalm

Welcome to my ignore list.
 
I get similar FPS on cyberpunk as well, but with lower power draw. Video above.

And that's with RT that makes it much heavier on the CPU

That does not look like RT, and the vid doesn't show it being enabled either - using presets is super buggy. You have to manually do the settings to make sure they apply. And the vid doesn't even show the cpu watt metric you are saying is the issue...

But again, it is simply a consequence of the game utilizing your cores to a higher degree - just like some games use more wattage on the gpu. What matters is the actual game performance, not which game uses 20 watts more.

Am I going to be in the company of other people who share facts you don't like? :roll:

No, you can disagree all you want, but that guy is beyond communicating with - he didn't even understand the point of my screenshot in the first place.
 
That does not look like RT, and the vid doesn't show it being enabled either - using presets is super buggy. You have to manually do the settings to make sure they apply. And the vid doesn't even show the cpu watt metric you are saying is the issue...

But again, it is simply a consequence of the game utilizing your cores to a higher degree - just like some games use more wattage on the gpu. What matters is the actual game performance, not which game uses 20 watts more.
Oh come on. Here is a video where you can see it's enabled - and you can also see the power draw.


How can the power draw not matter? I'm getting similar fps in cyberpunk with much lower power draw, how can that possibly not matter, lol
 
Oh come on. Here is a video where you can see it's enabled - and you can also see the power draw.


How can the power draw not matter? I'm getting similar fps in cyberpunk with much lower power draw, how can that possibly not matter, lol

So, as I expected, when you are driving inside the city (2 min and onwards), you are down to 110 fps - a fair bit lower than TLOU.

And again, it is obvious why you get lower power draw in cyberpunk - the 2 primary threads are at 85-90% load, and the rest at 60% load vs all threads being maxed out in the last of us. That will obviously lead to higher power consumption. But we are talking a difference of 30 watts. If that is a big deal to you, then maybe you shouldn't have bought a 13900k...
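
If you want a feel for why that adds up, here's a deliberately crude back-of-envelope model. It ignores boost clocks, voltage scaling, P- vs E-cores and instruction mix, and every number in it is invented - the only point is that an all-core 100% load costs more than two busy threads plus a lot of half-idle ones:

def estimate_package_watts(thread_loads, idle_w=15.0, active_w_per_thread=4.0):
    # Toy model: package power ~ idle power + per-thread active power * load.
    # Real CPUs scale clocks and voltage with load, so treat these as fake units.
    return idle_w + sum(active_w_per_thread * load for load in thread_loads)

cyberpunk_like = [0.90, 0.85] + [0.60] * 22   # 2 busy threads, rest ~60% (invented)
tlou_like = [1.00] * 24                       # every thread pinned (invented)

print(f"mixed load : {estimate_package_watts(cyberpunk_like):.0f} W")
print(f"all maxed  : {estimate_package_watts(tlou_like):.0f} W")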
 
So, as I expected, when you are driving inside the city (2 min and onwards), you are down to 110 fps - a fair bit lower than TLOU.

And again, it is obvious why you get lower power draw in cyberpunk - the 2 primary threads are at 85-90% load, and the rest at 60% load vs all threads being maxed out in the last of us. That will obviously lead to higher power consumption. But we are talking a difference of 30 watts. If that is a big deal to you, then maybe you shouldn't have bought a 13900k...
I drop lower than 110 in TLOU's heavy scenes as well, for example in the intro with all the flames I'm barely above 100. So what?

The above numbers are with a 12900k, not a 13900k, but that is not the point. TLOU is the heaviest freaking game in terms of CPU I've ever used, and you are insisting it's not - when obviously, with videos, it is. I mean wtf?
 
I drop lower than 110 in TLOU's heavy scenes as well, for example in the intro with all the flames I'm barely above 100. So what?

The above numbers are with a 12900k, not a 13900k, but that is not the point. TLOU is the heaviest freaking game in terms of CPU I've ever used, and you are insisting it's not - when obviously, with videos, it is. I mean wtf?

You mean the heaviest in terms of watts, which I honestly don't care about. I just find it ironic that you do, since you bought the most power-hungry cpu available...
 
If you use your brain just a bit... just a tiny wee bit... when I'm running 8K, then obviously you need A LOT less gpu power and vram to run at something like 1440p or 1080p...

How about you watch @Frag_Maniac's video, running the game on a gtx 1080 with high settings, at 60 fps...



Yeah you can clearly see in that vid that VRAM usage on High settings, which includes ALL Textures on High btw, stays at or under 7000MB. And CLEARLY a mere 8700K, despite being near 100% used, is handling the game fine on High. I also set AF to 16x, I really don't know why the game auto set it at 4x for me. And that frame dip to 38 FPS at the 8:33 mark is so brief you can hardly notice a single hitch, and it only did that in one other spot, which I did not capture, where you see the monkeys outdoors for the first time.

I DO recommend that anyone on 16GB RAM use a tool to disable all telemetry in W10 or 11 though, as it can free up at least 1GB of RAM. I use O&O ShutUp10, and it's as easy as a single click, and can be reversed the same way. Today I saw a thread on the TLoU Part I Steam forum where a guy said patch 1.0.7.0 fixed the crashing. Others of course saw it as sarcasm, you know how the madding Steam crowd can be. Some chimed in on a serious note that it was the Nvidia driver that fixed the crashing.

News flash, the Nvidia 531.41 driver for the game came out 3/23/23, a whole FIVE DAYS before the game released, and it's just now dawning on some of these people they need to install it. Now granted, Iron Galaxy's Arkham Knight and Uncharted LoTC ports were not so good, but it's clear to me this one is getting unfairly bashed by people that haven't a clue what it takes to get good gaming results on a PC, which includes compiling shaders 100% BEFORE playing games that have a main menu compiler.

I also feel the Tempest engine made for PS5 by Naughty Dog is a very efficient engine. It makes full use of your hardware, but stays within tolerable limits on 8GB GPUs even at High Texture settings, which look great btw; it has very few frame dips, doesn't stutter while doing so, and it EASILY performs better than their in-game VRAM graph leads you to believe it will. Furthermore, there's no reason not to try the game now that Valve have given the game an unlimited refund period.

Personally, I'm a bit worried Valve are giving those complaining too much leeway extending the refund period. Whenever you have people neglecting basic and obvious steps to make their games run as they should, it's only asking for more people to try the game on a whim, and levy more bashing that will no doubt continue to ruin the game's reputation and curtail sales. At this rate we'll be damn lucky if we ever even SEE a PC port for The Last of Us Part II. :rolleyes:
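
On the telemetry note above: if you'd rather not install anything, the single biggest item those tools toggle is the "Connected User Experiences and Telemetry" (DiagTrack) service, which you can turn off yourself from an elevated prompt. A minimal sketch of just that one toggle - it covers nowhere near everything ShutUp10 does, and it needs admin rights:

# Stops and disables only the main Windows telemetry service (DiagTrack,
# "Connected User Experiences and Telemetry"). Run as administrator.
# Call set_diagtrack(True) to turn it back on.
import subprocess

def set_diagtrack(enabled: bool) -> None:
    startup = "auto" if enabled else "disabled"
    subprocess.run(["sc", "config", "DiagTrack", "start=", startup], check=True)
    action = "start" if enabled else "stop"
    subprocess.run(["sc", action, "DiagTrack"], check=False)  # ok if already in that state

if __name__ == "__main__":
    set_diagtrack(False)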
 
You mean the heaviest in terms of watt, which i honestly don't care about. I just find it ironic that you do, since you bought the most power hungry cpu available...
It doesn't matter if you care or not - I said it's the heaviest in terms of CPU and it is, both in terms of %usage and power draw - nothing comes close to TLOU. 6 core CPUs struggle with it. Even the 7600x drops to 75 fps in heavy scenes.

I've sold my 13900k and went back to the 12900k but I don't see how this is relevant.
 
...6 core CPUs struggle with it.

This is not true at all, I'm running above auto detect settings on my 6 core 8700K at stock clocks, with a mere 8GB GTX 1080 GPU, and 2x8GB 3200 RAM, and the game stays at or near 60 FPS most of the time, with no stutters or crashes, which you can clearly see in the 28 min video I posted above. And that's just on patch 1.0.6.0, I've not even had a chance to try it yet on 1.0.7.0.

I mean it just seems to me you are echoing the negative reviews you're reading, vs trying the game yourself. :rolleyes:

This is a reputable TECH forum, people, with brilliant staff like W1zzard. Its chat needs to be more intelligent than a mere Steam forum, for God's sake!
 
This is not true at all, I'm running above auto detect settings on my 6 core 8700K at stock clocks, with a mere 8GB GTX 1080 GPU, and 2x8GB 3200 RAM, and the game stays at or near 60 FPS most of the time, with no stutters or crashes, which you can clearly see in the 28 min video I posted above. And that's just on patch 1.0.6.0, I've not even had a chance to try it yet on 1.0.7.0.

I mean it just seems to me you are echoing the negative reviews you're reading, vs trying the game yourself. :rolleyes:

This is a reputable TECH forum, people, with brilliant staff like W1zzard. Its chat needs to be more intelligent than a mere Steam forum, for God's sake!
I have the game, I've never seen any other game push the CPU that hard, either in terms of cpu usage or power draw, so what exactly am I echoing? Also, at 720p it requires double the amount of vram Plague Tale Requiem needs for 4K ultra. So....WHAT?
 
People arguing about a pc port by the worst studio at doing ports on the planet. I've seen it all now.
 
I've read the techspot reviews and how the hardware demands drop significantly at high and medium settings.

ok :wtf:

The game itself is not really my cup of tea but people can enjoy it at 720p, 1080p, 1440p, 4k, 8k, and every resolution in between.

True, although the hardware gets thoroughly used regardless of spec or settings, it DOES stay within reasonable limits, like under 7000MB VRAM on an 8GB GPU, and under 11000MB RAM on a 16GB mem kit. It also stays well under 100% CPU usage, even when at above auto detect settings, and my GPU and CPU stay well within reasonable temps.

On the comment not your cup of tea though, I have to slightly agree to a point. I LOVE the story and great dialog and voice acting, it's truly well executed in that regard, but from a survival horror standpoint, it just does not grip me like games such as Dead Space or The Evil Within 1 did. There are FAR too many places where you can sneak past or easily dispatch enemies. Only ONE Bloater actually needs to be killed. The upgrade system also feels a bit unnecessary, because there are many weapons you don't need to use, and the bottle/brick stun followed by a one-melee-hit kill is an easy fallback. Once you get the axe, it will do several one hit kills on human enemies.

It's just strange to me that Uncharted 4 and Lost Legacy's combat was so visceral, yet The Last of Us, their very biggest title, by comparison feels more like Adventure Horror than Survival Horror. That said, I've yet to play it on Survivor or Grounded modes, but unless they are somehow WAY harder than Hard mode, I'm going to be a bit disappointed in the gameplay.

I have the game, I've never seen any other game push the CPU that hard, either in terms of cpu usage or power draw, so what exactly am I echoing? Also, at 720p it requires double the amount of vram Plague Tale Requiem needs for 4K ultra. So....WHAT?

I said it because you made the claim 6 core CPUs struggle with it. Does my above video LOOK like it's struggling on my stock 8700K to you? You make remarks like that, you get labeled as someone listening to claims, vs your own experience, so YEAH, it seems OBVIOUS you have not played it on a 6 core CPU, which was my point!

Stick with what you actually KNOW bro, do NOT merely quote the status quo! I can't BELIEVE you choose to throw stones at Iron Galaxy playing it on the spec you have! :rolleyes:

I've also played a Plague Tale Requiem, extensively, on this same spec, and I know first hand it performs FAR worse than The Last of Us Part I.

Here's some vids I made just in case you don't believe I've played it.




Usage is one thing, performance another. Just because a game fully uses your spec doesn't mean it causes problems, and I feel I've clearly shown that with my above vid on TLoU Part I. A Plague Tale Requiem, however, suffers MANY frame drops and lag because of it.
 
The PS3 version of the engine used in the first Last of Us was designed to offload a lot of GPU functions, like culling, onto software which ran on the Cell processor. I bet they still use the same backend - there is no way they reworked the engine for PC, they probably just ported everything to use normal CPU multithreading.

That's why it has such CPU-bound garbage performance. It's a 10 year old game, so it's obvious the game logic can't be heavy - it's just doing a lot of work on the CPU that it shouldn't.
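
To be clear about what I mean by "normal CPU multithreading", something in this spirit - a per-object visibility test fanned out over worker processes. This is a toy sketch I made up, obviously nothing like Naughty Dog's actual code, but it shows how a job that used to live on the SPUs (or the GPU) just becomes more per-frame work for your cores:

from concurrent.futures import ProcessPoolExecutor
import random

# Toy frustum: four side planes as (normal, distance) pairs; a point p is
# inside a plane when dot(normal, p) + distance >= 0.
PLANES = [((1, 0, 0), 50.0), ((-1, 0, 0), 50.0),
          ((0, 0, 1), 50.0), ((0, 0, -1), 50.0)]

def visible(sphere):
    (x, y, z), radius = sphere
    for (nx, ny, nz), d in PLANES:
        if nx * x + ny * y + nz * z + d < -radius:  # fully outside this plane
            return False
    return True

def cull_chunk(chunk):
    return sum(visible(s) for s in chunk)

if __name__ == "__main__":
    rng = random.Random(0)
    spheres = [((rng.uniform(-100, 100), 0.0, rng.uniform(-100, 100)), 1.0)
               for _ in range(200_000)]
    chunks = [spheres[i::8] for i in range(8)]  # fan the work out to 8 workers
    with ProcessPoolExecutor(max_workers=8) as pool:
        print("visible objects:", sum(pool.map(cull_chunk, chunks)))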
 
The PS3 version of the engine used in the first Last of Us was designed to offload a lot of GPU functions, like culling, onto software which ran on the Cell processor. I bet they still use the same backend - there is no way they reworked the engine for PC, they probably just ported everything to use normal CPU multithreading.

That's why it has such CPU-bound garbage performance. It's a 10 year old game, so it's obvious the game logic can't be heavy - it's just doing a lot of work on the CPU that it shouldn't.

Possibly, but this is also technically a port of the PS5 version, and even on comparable pc hardware it seems to run significantly worse. I'd lean towards it just being a shit port vs legacy code causing issues.

Could be using the cpu to do a ton of decompression - something both the PS5/XSX have dedicated hardware for - just not in a very efficient way. Spiderman is also relatively CPU heavy, but it's also open world and uses RT, and it was done by the vastly superior Nixxes, who are probably the best pc porting studio.
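
That's roughly the picture on the PC side, in toy form. The real game's streaming format isn't zlib as far as I know - this is only to show the decompression work landing on a CPU thread pool instead of the console's dedicated hardware:

from concurrent.futures import ThreadPoolExecutor
import os
import time
import zlib

# Fake "assets": compressible blobs of ~2 MiB each. zlib releases the GIL
# while (de)compressing, so the thread pool really does spread the work
# across cores - work a console would hand to its decompression block.
blocks = [zlib.compress(os.urandom(16 * 1024) * 128) for _ in range(64)]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    total = sum(len(data) for data in pool.map(zlib.decompress, blocks))
elapsed = time.perf_counter() - start
print(f"decompressed {total / 2**20:.0f} MiB on the CPU in {elapsed * 1000:.0f} ms")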
 

People playing without the 531.41 driver, people not compiling shaders completely before playing, people running too many W10/11 apps instead of using a simple one click tool to turn off telemetry, AND finally people judging it by just the 1.01.0 patch condition, or those who played only on that patch, bashed it, and never tried it again for themselves.

In the world of gaming, considering how gamers are these days, PLENTY can go wrong that has ZERO to do with the developers. And that's coming from someone who played Arkham Knight and Uncharted LoTC and really loathed what Iron Galaxy left us with on those port releases.

It's clear to me Iron Galaxy have come a long way, but I suspect the Tempest engine made for PS5 by Naughty Dog is also why the game fully utilizes your hardware yet stays within limits regardless of spec, even with above auto detect settings, and does not stutter or crash constantly even when frames dip a bit, as LONG as you compile shaders first.

Like the old tech saying goes, the problem exists between the PC and the chair!
 
Possibly, but this is also technically a port of the PS5 version, and even on comparable pc hardware it seems to run significantly worse. I'd lean towards it just being a shit port vs legacy code causing issues.

Could be using the cpu to do a ton of decompression - something both the PS5/XSX have dedicated hardware for - just not in a very efficient way. Spiderman is also relatively CPU heavy, but it's also open world and uses RT, and it was done by the vastly superior Nixxes, who are probably the best pc porting studio.

The PS5 version is using the same engine, but it's capped at 60 so it doesn't matter. Shit port and legacy code causing issues are not mutually exclusive - the port can be shit because of legacy code. Crysis Remastered had the same problem: it would drop below 60 simply because of engine limitations caused by 15 year old legacy code.
 
The PS5 version is using the same engine, but it's capped at 60 so it doesn't matter. Shit port and legacy code causing issues are not mutually exclusive - the port can be shit because of legacy code. Crysis Remastered had the same problem: it would drop below 60 simply because of engine limitations caused by 15 year old legacy code.

It did it by underutilizing the cpu though - it was primarily capped to like 2 threads. This seems to use quite a bit more cpu resources in general, I'm just not sure if it's actually doing it efficiently.
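
That "capped to like 2 threads" bit is really just arithmetic: whatever the main thread costs per frame puts a hard ceiling on fps no matter how many other cores sit idle. A quick illustration with made-up millisecond figures:

def fps_ceiling(serial_ms_per_frame: float) -> float:
    # If the main/render thread needs S ms of serial work per frame, the frame
    # rate can never exceed 1000 / S, regardless of core count.
    return 1000.0 / serial_ms_per_frame

for serial_ms in (8.0, 12.0, 16.7, 20.0):   # invented main-thread costs
    print(f"{serial_ms:5.1f} ms on the main thread -> at most {fps_ceiling(serial_ms):5.1f} fps")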

I'm not a game developer so I couldn't say. I just look at the track record of the studio that did this port and I'm not surprised. It is amusing seeing all these obvious engine developers in this thread argue about it though.
 
People playing without the 531.41 driver, people not compiling shaders completely before playing, people running too many W10/11 apps instead of using a simple one click tool to turn off telemetry, AND finally people judging it by just the 1.01.0 patch condition, or those who played only on that patch, bashed it, and never tried it again for themselves.

In the world of gaming, considering how gamers are these days, PLENTY can go wrong that has ZERO to do with the developers. And that's coming from someone who played Arkham Knight and Uncharted LoTC and really loathed what Iron Galaxy left us with on those port releases.

It's clear to me Iron Galaxy have come a long way, but I suspect the Tempest engine made for PS5 by Naughty Dog is also why the game fully utilizes your hardware yet stays within limits regardless of spec, even with above auto detect settings, and does not stutter or crash constantly even when frames dip a bit, as LONG as you compile shaders first.
From the outside looking in (like I said previously, I don't have this game) it does seem to have been rushed and could have been optimized better, and those are the roles of the publisher and port studio. I'm not knocking their hard work, just their timeline and grasp of the situation (happens all the time in business). From what you are saying, and from the few things I have read, patches and driver updates have been released or are on the way that make the game enjoyable, which is great for gamers. Also, many of the earlier FPS issues seem to impact play only on the ultra settings, as per techspot:

Using slightly dialed-down quality settings, The Last of Us Part I appears quite easy to run, and provided you can manage the game's VRAM requirements, you shouldn't have any issues. It's remarkable how well a previous generation GPU such as the Radeon RX 6800 plays this game, it's buttery smooth and that's probably not something you'd expect to find after all the online controversy.


What I clearly despise is all the people early on who used this poorly optimized port (not the first game it's happened to) upon its release to rush out and justify their hardware. Comments like "it can't play on a six core CPU" or "you need 10GB of v-ram" are meaningless, since none of that talks about performance, just one aspect of a piece of hardware design.

It is amusing seeing all these obvious engine developers in this thread argue about it though.
I never knew how many of my friends had law and medical degrees until I saw their posts in facebook threads about covid and politics
 
The PS3 version of the engine used in the first Last of Us was designed to offload a lot of GPU functions, like culling, onto software which ran on the Cell processor. I bet they still use the same backend - there is no way they reworked the engine for PC, they probably just ported everything to use normal CPU multithreading.

That's why it has such CPU-bound garbage performance. It's a 10 year old game, so it's obvious the game logic can't be heavy - it's just doing a lot of work on the CPU that it shouldn't.

This is not based on the PS3 version of the game - it's an enhanced version of the PS5 remake (not remaster), which was completely remade from the ground up in the TLOU 2 engine. Similar to the recently released Resident Evil 4.

I'd expect it to be pretty heavy, although quality issues with the PC version were quite expected given a certain fellow's stance on PC games ;)

Sony will probably have the studio work it out.
 
You mean the heaviest in terms of watts, which I honestly don't care about.
How else do you suggest testing whether a game is CPU-dependent or not? FPS means nothing in this case, as it's only an indication of how well the game runs on YOUR system, not of how demanding the game is in general.
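
One way to keep the power comparison fair across games is to normalise it by frame rate, i.e. CPU energy per frame rather than raw watts. A quick sketch with made-up numbers, just to show the idea:

def joules_per_frame(package_watts: float, fps: float) -> float:
    # Watts divided by frames per second = joules of CPU work per frame,
    # which removes the "but my fps is different" objection from the comparison.
    return package_watts / fps

samples = {"game A (invented)": (140.0, 120.0), "game B (invented)": (110.0, 115.0)}
for name, (watts, fps) in samples.items():
    print(f"{name}: {watts:.0f} W at {fps:.0f} fps -> {joules_per_frame(watts, fps):.2f} J/frame")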

People arguing about a pc port by the worst studio at doing ports on the planet. I've seen it all now.
Just because I'm not gonna buy it for a whopping 60 quid (I'll wait until it drops to 10 or 15), it doesn't mean arguing about it isn't fun. :p
 