
The Last of Us Is Now Available on PC

I'll have to go through and look again, but I don't recall seeing any textures as blurry as Alex is showing at Medium textures. My main concern is the CPU being so heavily loaded, which makes me think bumping up to a 4070 Ti on my 8700K isn't necessarily going to solve the entire problem FPS-wise. I'm sure it would texture-wise though, because 12GB of VRAM should allow High textures, possibly even at 1440p. Then again, I have just 16GB of RAM and wasn't planning on going to 32GB until I upgrade to something like a 13700K, meaning I might still be stuck at 1080p. So I may end up just upgrading the core platform like I originally planned, and eventually go with something like a 4090 or 5080.

I've now finished the game though, and I only had that one crash (which, again, may very well have been caused by my minimizing the game at length while doing something else) and just the two minor bugs. I never had a problem with stutter or hitching, and actually finished out the game with FSR 2 on Quality mode at mostly near 60 FPS.

I have to say though, I was a bit disappointed by the ending of the game. The tunnel was FAR too easy to sneak through, and the hospital seemed cool at first, but then just felt too repetitive, with a WAY too easy sneak to get to Ellie.

This might be the very first Digital Foundry review where I don't entirely agree with the assessment of Alex, whom I've always admired for his attention to detail. I won't know until I look at the game closer though. Hopefully the devs are paying attention to this video and will be able to determine how they can make some further improvements.
I've only watched about 2/3 of the video so I'm not sure if this came up later, but I did wonder if DLSS being on every time he did texture comparisons played any part, especially since he mentioned that it was an older version. Even so, the resource usage is way too heavy.

In any case, while I'm certainly disappointed about the various issues, I already finished the game on the PS4. It was a slog, as I'll never get used to controllers, but the story kept me from quitting. Once the dust settles with patches, I will try it just to see how mouse and keyboard feel.

The ending put me in the same frame of mind as the airport section in Max Payne 3: no more sneaking, terminate with extreme prejudice.
 
I've only checked textures more closely on the DLC Left Behind so far. At first I thought they were fine, very detailed on a rolling door I looked at closely. Then I started noticing many walls that did look rather blurry. I'll know more when I look at brick walls when I play the main game again.

It is a shame about the heavy resource usage, and I feel just saying the High preset for 1080p is not enough. They really should have added "with Medium textures" in parentheses, because those are obviously some of THE most important settings.

Actually, I thought the opposite about the ending. It wasn't too hard for me to sneak-kill most enemies in the hospital, and the last bunch were all looking in only one direction down the hallway, making it easy to just sneak past most of them. Sure, the very last part was a mad, overt chase, but you only had to know the path.
 
As I wrote earlier in the thread, I tried the game. It ran fine aside from two lag spikes.

But I do also have a decently beefy PC: a 5950X, 32 GB of RAM and an RTX 4090.

But I agree with people: considering how old the original game is, it is way too demanding.

With one CCD active, the load on 8 cores/16 threads is 80 to 90%, with peaks at 100%. That's way too much for such an old game. Nearly maxing out 16 threads is wild.

At 1440p ultra, 18 GB+ of memory use is also quite high.

VRAM usage was 11 GB in the short time I had to play, but again, that is high for how old the base game is.

There is definitely a need for optimization. It's too demanding for what it is.
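
For anyone who wants to log the same kind of numbers on their own rig while playing, here is a rough sketch of the sort of script I would use (assumptions: Python 3 with the third-party psutil package installed, and an NVIDIA card so that nvidia-smi is on the path; it reports system-wide usage, not just the game's own process):

```python
# Rough monitoring sketch, not a polished tool. Assumes "pip install psutil"
# and an NVIDIA driver that provides nvidia-smi. Logs system-wide numbers.
import subprocess

import psutil


def vram_used_mib():
    """Return VRAM currently in use (MiB) via nvidia-smi, or None if unavailable."""
    try:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
            text=True,
        )
        return int(out.strip().splitlines()[0])
    except (OSError, subprocess.CalledProcessError, ValueError):
        return None


def log_usage(interval_s=5.0):
    """Print per-thread CPU load, system RAM use and VRAM use every interval."""
    while True:
        per_thread = psutil.cpu_percent(interval=interval_s, percpu=True)
        ram = psutil.virtual_memory()
        vram = vram_used_mib()
        print("CPU load per thread (%):", per_thread)
        print(f"RAM used: {ram.used / 2**30:.1f} / {ram.total / 2**30:.1f} GiB")
        print("VRAM used:", f"{vram} MiB" if vram is not None else "n/a")


if __name__ == "__main__":
    log_usage()
```

Running that in a second window (or logging it to a file) while playing is enough to see whether it's the 16 threads or the RAM that fills up first.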
 
18 GB+ of memory use is also quite high

I thought you meant VRAM, but then saw you specified VRAM separately later in the post... 18 GB of RAM is more than quite high; that's completely unacceptable for any program, let alone a game, to use.
 
There are many games that use more than 16 GB of RAM at this point...

Hardware requirements increase over time, as they always have, and always will...
 
Above-16 GB RAM usage will over time become the new norm in games, whether you like it or not.

Just as the quad-core CPU was the norm once and hexa-core has taken over that spot, octa-core will be the norm in the not-so-distant future.

GPU and VRAM requirements will also increase over time.

That's the downside of getting games with better graphics, better AI and more detail: they put more strain on PC hardware. Future games will become more demanding over time.

But to get back to The Last of Us: for how old the base game is, then yes, I agree. It is absolutely too demanding and uses too many resources. It should not be necessary to tax 16 threads almost to the max and use 16+ GB of memory. That's too high. It's sadly another bad port.

Unfortunately, way too many games get released way too soon, much to the annoyance of us gamers, and it's ruining the experience of PC gaming. I can definitely see why some people simply quit PC gaming for consoles.

All the badly optimized games over the years are also why I dropped pre-ordering games back in 2014. Far Cry 4 was simply unplayable for me, so since 2014 I have not pre-ordered a single game. It's too annoying and the disappointment is not fun. Game developers must earn my money by not releasing poorly optimized games.

Release crap = no money
 
Octa-core is already the norm, as the current consoles have AMD 3700 CPUs, thus all games being developed at the moment will target a minimum of 8 cores.
 
Octa-core is already the norm, as the current consoles have AMD 3700 CPUs, thus all games being developed at the moment will target a minimum of 8 cores.

They are mobile versions of Zen 2, making their IPC actually more similar to Zen+ due to the lower clocks. The 3700 is much faster.
 
But I do also have a decently beefy PC: a 5950X, 32 GB of RAM and an RTX 4090.
Yeah, because there is something better than an RTX 4090?

Honestly, yeah, people with pretty much $2500 GPUs shouldn't talk about how well optimized something is unless it makes that card whimper.
 
The RTX 4090 can be had for around $1600 to $1800 in the US.
And? It's still expensive at $1600, and not everyone lives in the US...

The fact is, the game has no business being so demanding for how crappy it looks compared to, let's say, RE4, HZD, Spider-Man, etc., which run way better.
 
Anyone in the world can become rich if they want. It's really easy. Call up a semi-truck company in the USA; they are all desperate and will pay for the work-visa process, and starting pay is $80k a year at most of them now. Training is free as well at most of them these days.

My dad works a truck gate; almost none of the drivers can even speak English.

Get paid $80k to listen to podcasts and audiobooks, sleep in your semi-truck's full-size bed, stack money for 5 years and live frugally. Then buy a small, modest house in cash, since you were smart and didn't spend any money in the last five years, just driving and driving! Now, after buying your house in cash, go buy a 5090 GPU, because those are out now.

Done. Enjoy.
 
Honestly, yeah, people with pretty much $2500 GPUs shouldn't talk about how well optimized something is unless it makes that card whimper.
But you can't really gatekeep who is allowed to complain or not, no matter how detached from actually experiencing 'the thing' they are. Case in point: the multitude of people who keep criticising the 3080's VRAM despite not owning one, and making that complaint to someone who does and has no issues... :kookoo: But hey, free speech and all.

In this example, I think the owner of a 4090 still has every right to talk about how well optimised a game is; their rig's performance is a perfectly valid data point to show how high-end hardware scales with the game.
 
It's not about who is and isn't allowed to complain. It's more about the fact that there's no need to state the obvious. When you have a 4090, of course your games run well, the same way that when you have a Ferrari, of course it's fast. When your games don't run well on your 4090, that's when you should complain.
 
It's more about the fact that there's no need to state the obvious. When you have a 4090, of course your games run well, the same way that when you have a Ferrari, of course it's fast.
Still a valid data point in context: it shows what level of brute-force hardware can overcome the poor optimisation, it can be used to extrapolate what one might expect from their own set-up, and so on. And what if it didn't run fine? That feedback is OK, but feedback that it does run fine isn't?
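
And to be clear about what I mean by extrapolate: nothing fancier than scaling the observed framerate by a relative-performance index, which only holds while the game stays GPU-bound and nothing else (VRAM, CPU, drivers) gets in the way. A back-of-the-envelope sketch with made-up numbers, purely for illustration:

```python
# Naive extrapolation sketch. Assumes the game is fully GPU-bound and that an
# average relative-performance index transfers to this title; VRAM limits,
# CPU bottlenecks and driver quirks regularly break that assumption.
def estimate_fps(fps_on_4090, perf_index_4090, perf_index_other):
    """Scale a framerate observed on a 4090 by a relative-performance index."""
    return fps_on_4090 * (perf_index_other / perf_index_4090)


# Hypothetical numbers, not measurements: a card rated at 55% of a 4090
# would land around 66 FPS if the 4090 managed 120 FPS.
print(round(estimate_fps(fps_on_4090=120, perf_index_4090=100, perf_index_other=55)))
```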
 
Still a valid data point in context: it shows what level of brute-force hardware can overcome the poor optimisation, it can be used to extrapolate what one might expect from their own set-up, and so on.
No, it's not. When you get people with low-end and mid-range hardware commenting on how the game runs for them, that's valuable information. There's nothing new about a game running well on the fastest graphics card in existence.

And what if it didn't run fine? That feedback is OK, but feedback that it does run fine isn't?
I'm not saying it's not OK. What I'm saying is, it has no information content, so it's not relevant.

Stating that your Lamborghini has good acceleration is not information because everybody knows that. It's just bragging.
 
No, it's not
Well, that's your opinion and you're entitled to it; nobody can force you to find validity or value in it. I find it to be a valuable data point in the context of this specific game and how it plays on certain hardware, whether that's of value to other 4090 owners, to people considering what to buy, or to anyone else who can extract even an ounce of value from it, as I have for my own comparative analysis. As for your Ferrari and Lamborghini examples, I find them remarkably irrelevant. He's not saying his 4090 is fast; he's saying how it plays this game.

In any case, this has derailed the thread enough; I propose we agree to disagree.
 
Well, that's your opinion and you're entitled to it; nobody can force you to find validity or value in it. I find it to be a valuable data point in the context of this specific game and how it plays on certain hardware, whether that's of value to other 4090 owners, to people considering what to buy, or to anyone else who can extract even an ounce of value from it, as I have for my own comparative analysis. As for your Ferrari and Lamborghini examples, I find them remarkably irrelevant. He's not saying his 4090 is fast; he's saying how it plays this game.
Of course it plays this game. The 4090 is the fastest 2022-23 card designed to play every single 2022-23 game. Stating that it does is like stating that the kettle boils water. Big surprise!

In any case, this has derailed the thread enough; I propose we agree to disagree.
Fair enough, let's leave it at that. :)
 
Yep, I agree, many don't understand VRAM usage and thus, based on their past experience, simply set everything to 'Ultra' and expect the game to run smoothly. Just like DSR, the traversal and odd hitching issue have been blown up by mouth breathers. I've played the game on both my rigs, and with my weaker rig (R9 3900X + 32GB RAM + RX 6900 XT) I can have a pretty good experience with it on my 4K TV with FSR2 at 'Quality'. The framerate dips below 60fps, but it is still quite playable despite the ever-present hitching. It's not a "stuttering mess" as some have chosen to describe it, unless, like for TLOU players, they have set in-game graphics settings too high for their hardware to handle.
Yeah, those morons expect their 8GB GPUs + 16/32GB RAM to run a game with textures that look worse than the PS4 remaster (medium textures look like shit and exceed 8GB VRAM at native 1440p). Bunch of mouth-breathing dorks. /s
 
Octa-core is already the norm, as the current consoles have AMD 3700 CPUs, thus all games being developed at the moment will target a minimum of 8 cores.
lol, love it when people post about game development and obviously have no clue how game engines run.

It's about performance, not cores. Cores are meaningless as a standalone stat, and this has been proven so many times that it's no longer funny, just pathetic, every time it's posted.

Comparing a console CPU to a desktop CPU is like comparing a daddy longlegs to an octopus because they both have eight legs, and concluding that an octopus can infest your garage window with its web.

FYI, the PlayStation 4 had an eight-core CPU (custom Jaguar-based, using two quads) and that launched in 2013. You would think a decade of development would be enough to make 8 cores a minimum requirement for games... unless you are Valve working on Half-Life 3.
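
To make the "performance, not cores" point concrete: engines don't hard-code a core count, they size a worker pool to whatever hardware threads exist and throw jobs at it. A rough sketch of the idea (plain Python standing in for what real engines do with native job schedulers; none of this is taken from any actual engine):

```python
# Illustrative only: a toy "job system" sized to the hardware, not to a fixed
# core count. Real engines do this with native threads and work-stealing
# schedulers; Python is just used here to show the shape of the idea.
import os
from concurrent.futures import ThreadPoolExecutor


def frame_job(job_id):
    """Stand-in for one independent chunk of per-frame work (culling, audio, ...)."""
    return sum(i * i for i in range(10_000))


def run_frame(pool, job_count=64):
    # The same 64 jobs run on 6, 8 or 16 threads; only how many run at once changes.
    return list(pool.map(frame_job, range(job_count)))


if __name__ == "__main__":
    workers = os.cpu_count() or 4  # size the pool to whatever the machine has
    with ThreadPoolExecutor(max_workers=workers) as pool:
        run_frame(pool)
        print(f"Dispatched one frame's jobs across {workers} worker threads")
```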

They are mobile versions of Zen 2, making their IPC actually more similar to Zen+ due to the lower clocks. The 3700 is much faster.
Not only are they lower clocked (by a full 1 GHz, if I recall correctly), but they also had their cache cut significantly (I believe around 50%) compared to the 3700, and we know how much cache impacts gaming, specifically on AMD CPUs. In fact, when Digital Foundry tested the 3700 at similar speeds to the console CPUs, they got Ryzen 1500X-type performance, and that was with the 3700's cache intact. The console CPUs are fine but hardly impressive, but then name me one console CPU that really was. Consoles are always power limited, with the GPU soaking up most of the power; the CPU is always a cost- and power-cutting piece of hardware.

The AMD fanboys ran with the console CPU as a way to justify their purchases, only to watch the six-core 5600X blow anything from the Ryzen 3xxx series out of the water for gaming, with the 7600X proving that performance still rules for gaming. Unfortunately, it also created the 'eight cores = future-proof' myth, a flag the Intel fanboys ironically picked up once they found themselves on the wrong side of the CPU performance war.
 
Overall CPU performance is the #1 factor for sure. I'd still lean towards 8 cores, especially with how Intel cuts cache on its lower-tier CPU models. On Zen it's less of an issue, but I'd still lean 7700X/7700. That has absolutely nothing to do with the consoles and their, by today's standards, meh hardware.

People can't really go wrong with any modern CPU though; both Intel and AMD have pretty compelling options for every budget at this point.
 
I'd still lean towards 8 cores, especially with how Intel cuts cache on its lower-tier CPU models.
Yeah, their lack of cache on lower-tier CPUs sucks, but it's still great modern-day performance, as you said.
On Zen it's less of an issue, but I'd still lean 7700X/7700
They tend to use an equal amount of L3 cache, which seems to lessen the impact on gaming performance when moving from CPU to CPU within the same series.
People can't really go wrong with any modern CPU though; both Intel and AMD have pretty compelling options for every budget at this point.
It's the most pointless modern-day debate, especially when half the arguments are either GPU limited or at performance levels above the refresh rate of the monitor in question anyway.
 
It's the most pointless modern-day debate, especially when half the arguments are either GPU limited or at performance levels above the refresh rate of the monitor in question anyway.
I think the original point was something similar: for gaming, you can't go wrong with basically any modern CPU these days.
 
They are mobile versions of Zen 2, making their IPC actually more similar to Zen+ due to the lower clocks. The 3700 is much faster.

Single-threaded performance, not IPC :) The 3700 is faster, but its performance doesn't scale linearly with clocks, thus the CPU of the PS5 isn't the 30-ish % slower that the clock speeds would suggest. But it isn't really apples to apples either, as the PS5 CPU is of course a custom chip with a shared memory pool, using substantially faster memory.
lol, love it when people post about game development and obviously have no clue how game engines run.

The only pathetic thing here is the manner in which you replied to me.
 