
The Last of Us Part I Benchmark Test & Performance Analysis

Joined
Dec 22, 2011
Messages
3,890 (0.84/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600MHz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Damn, you now need a 4090 to get comfortably over 60FPS at 4K on a console game released 10 years ago.

No wonder I nab cards second hand these days.
 
Joined
Feb 20, 2019
Messages
7,848 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though...
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Valve already does this with the Steam Deck... and every other hardware/software combo in existence that has the pre-caching feature activated on Linux. If you have it enabled, you compile the shaders yourself if they aren't in the database, or you download them if they already exist. I wonder how big that database is; it must be massive.
Yeah, that's how this conversation thread started:
Unavoidable? It's 12GB of shaders that will be the same for any given graphics architecture. My Steam deck downloads pre-compiled shaders for deck-verified games hosted by Valve. Why does my Geforce or Radeon need to waste half an hour creating its own when they could be pre-compiled by the developer and downloaded as additional data during install?
You also have to bear in mind that the compiled shaders have to match the driver version. So it's not only per architecture, but also per driver version and possibly per OS (kernel).
Drivers, no - I only have to recompile shaders if I DDU or manually clear the shader cache, so no, driver upgrades don't invoke that. Kernel/OS, I can believe needing a different shader, though I'm not entirely sure why, as a DX12 or Vulkan shader targets the same API regardless of host OS.
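To make the architecture/driver keying point concrete, here's a rough Python sketch of how a shader cache might key its entries. This is purely illustrative: real drivers use their own undocumented keying schemes, but conceptually a compiled blob is only valid for a specific architecture/driver pair, which is why a shared download service would have to host one blob per combination.

```python
import hashlib

def shader_cache_key(gpu_arch: str, driver_version: str, shader_bytecode: bytes) -> str:
    """Derive a cache key for a compiled shader blob (illustrative only)."""
    h = hashlib.sha256()
    h.update(gpu_arch.encode())
    h.update(driver_version.encode())
    h.update(shader_bytecode)
    return h.hexdigest()

# The same shader under two driver versions keys differently, so the
# cached blob from one driver can't be reused for the other:
k1 = shader_cache_key("RDNA2", "23.3.1", b"\x03\x02\x23\x07 SPIR-V blob")
k2 = shader_cache_key("RDNA2", "23.3.2", b"\x03\x02\x23\x07 SPIR-V blob")
print(k1 != k2)  # True
```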
 
Joined
Mar 4, 2022
Messages
31 (0.03/day)
Have we reached a point where games are getting so sophisticated that shader compilation will not be done "quietly" anymore?
Yes, especially if you expect it to run on an i5-2500K and GTX 1060 as well as on an R9 7950X3D and RTX 4090.

There aren't that many architectures that meet the minimum specs. RDNA1, RDNA2, Turing, Ampere, Arc. That's it, just precompile those and host them on the digital download service.

If you want to game on a GPU that isn't covered then sure, compiling shaders yourself is the fallback, but the game was originally designed for RDNA2 and according to the Steam hardware survey, over 50% of the market is running Turing or Ampere.
Or you can use 10-30 minutes of your machine's time to compile them, instead of wasting thousands of terabytes of bandwidth downloading every one of them and terabytes of drive space hosting them...
Also, how many combinations would they need to host? You'd basically need every GPU/CPU combo possible; even for those four architectures it's a nightmare.
Come on guys, it's a one-off thing. Horizon Zero Dawn does the same, and I didn't see people freaking out online over that.
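The hosting argument above can be put into rough numbers. Everything below except the 12GB cache size (quoted earlier in the thread) is an assumption for illustration, not a real figure:

```python
# Back-of-envelope: what would a precompiled-shader download service cost?
architectures = ["RDNA1", "RDNA2", "Turing", "Ampere", "Arc"]
driver_versions_per_arch = 12   # assumption: roughly one driver branch a month
cache_size_gb = 12              # per-game shader cache size quoted in-thread
downloads = 1_000_000           # assumption: a million players fetching it

blobs = len(architectures) * driver_versions_per_arch
hosted_gb = blobs * cache_size_gb                 # storage for one game
bandwidth_tb = downloads * cache_size_gb / 1000   # traffic if everyone downloads once

print(hosted_gb)      # 720 GB of storage for a single game
print(bandwidth_tb)   # 12000.0 TB of bandwidth
```

The storage side is modest; it's the bandwidth side (petabytes per popular title, under these assumptions) that makes client-side compilation look cheap by comparison.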
 
Joined
Aug 7, 2019
Messages
345 (0.19/day)
Drivers, no - I only have to recompile shaders if I DDU or manually clear the shader cache, so no, driver upgrades don't invoke that. Kernel/OS, I can believe needing a different shader, though I'm not entirely sure why, as a DX12 or Vulkan shader targets the same API regardless of host OS.
Meaningful driver updates, not incremental. If the core of the driver doesn't change you don't need to recompile.
 
Joined
Nov 18, 2010
Messages
7,356 (1.46/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX. Water block. Crossflashed.
Storage Optane 900P[Fedora] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO+SN560 1TB(W11)
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) SMSL RAW-MDA1 DAC
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 41
A 1080Ti should exist in the charts. It would show the difference between the generations when vram is not an issue.

No, not really needed anymore. Just look at the 2080 Ti; they are not that far off.

While it could even be useful for me, my card's performance never matched W1zzard's; I was always under water and shunt modded. So what's the point? The 2080 Ti is enough to get an idea about that generation.
 

Winssy

New Member
Joined
Mar 31, 2023
Messages
20 (0.04/day)
40+ minutes for shader compilation + 10.5GB of VRAM at 900p in a decade-old game? Excellent job, Sony, you managed to make the buggiest port of your most popular game. I would give each person involved in creating this port a 3060 Ti or 3070 and make them complete the game twice at 1440p on ultra settings.
 

Hammerman

New Member
Joined
Jul 29, 2022
Messages
14 (0.02/day)
Anybody notice how almost every console port uses a ton of VRAM? It's not like AMD is doing this intentionally, right?

It's not just a remake
It's a remake of a remake of a remake :roll:
Original on PS3, a remaster on PS4 (no original release on PS4), and now this PS5 remade version.

The GPU situation doesn't help for these sorts of games. This is quite a bit more demanding than most games out there.
 
Joined
Jun 20, 2008
Messages
2,872 (0.49/day)
Location
Northants. UK
System Name Bad Moon Ryzen
Processor Ryzen 5 5600X
Motherboard Asrock B450M Pro4-F
Cooling Vetroo V5
Memory Crucial Ballistix 32Gb (8gb x 4) 3200 MHz DDR 4
Video Card(s) 6700 XT
Storage Samsung 860 Evo 1Tb, Samsung 860 Evo 500Gb,WD Black 8Tb, WD Blue 2Tb
Display(s) Gigabyte G24F-2 (180Hz Freesync) & 4K Samsung TV
Case Fractal Design Meshify 2 Compact w/Dark Tempered Glass
Audio Device(s) Onboard
Power Supply MSI MPG A850GF (850w)
VR HMD Rift S
Seriously, at least for me, the game runs great at 2560 x 1440 with medium texture settings, everything else maxed, and FSR 2 Quality on an old 5700 XT 8GB. The initial shader compile was long, but I have no stutters, pop-in or crashes. The mouse issues are/were there, but I use my DualSense anyway. It looks better than the majority of games on high/v.high, and this is coming from someone who used to play at 4K vhigh/high regularly with a 3080 12GB.

Edit - Also convinced the missus that a 7900 XTX would be a good 'investment' too, so I'm not as fussed :)
 
Joined
Dec 5, 2013
Messages
631 (0.16/day)
Location
UK
I mean, why is this acceptable with today's gaming tech?
It's "acceptable" because an endless stream of moronic "Real Gamers (tm)" constantly lower their expectations into the gutter while simultaneously having all the "don't pre-order" self-control of Steven Seagal at a burger stand... Until PC gamers regrow their spines and start calling out sh*tty console ports for what they are again, they will get to continue to 'enjoy' a new wave of bad console ports while the PC tech sector quietly cheers on 12GB VRAM usage for 900p gaming as a reason for 'needing' a $1200 GPU as the new 'budget gamer' option...
 
Joined
Mar 2, 2022
Messages
146 (0.16/day)
Processor Intel Core i9-13900KS
Motherboard ASUS ROG STRIX Z790-H Gaming DDR5 ATX Motherboard
Cooling ASUS ROG Ryujin II 360mm
Memory Team Group Delta RGB DDR5 32GB 7800
Video Card(s) MSI Gaming GeForce RTX 4090 GAMING TRIO 24GB
Display(s) LG CX 55" OLED 120Hz
Case Corsair 5000D Airflow Tempered Glass Mid-Tower ATX
Mouse Xtrfy M4 Retro
Keyboard Logitech G810 w/ Romer-G Tactile Switches
VR HMD Valve Index
Software Win 11
Just like the RTX 4070 12GB & 4070 Ti 12GB become obsolete at 2160p in this game...

The RTX 3090 24GB & RTX 3090 Ti 24GB still rule, the way it's meant to be played...
Something noteworthy is that the 4080, even with 16GB of VRAM, is faster in this title than the 3090 with its 24GB.

Edit: As well, the 4070 Ti is faster than the 3090 Ti at 1440p. So it does seem like the 4070 Ti is a 1440p king, unless you're using DLSS 3, in which case you can get some good 4K performance too.
 
Joined
Jun 30, 2008
Messages
234 (0.04/day)
Location
Sweden
System Name Shadow Warrior
Processor 7800x3d
Motherboard Gigabyte X670 Gaming X AX
Cooling Thermalright Peerless Assassin 120 SE ARGB White
Memory 64GB 6000MHz CL30
Video Card(s) XFX 7900XT
Storage 12TB NVME + 16TB SSD + 2x12TB 5400rpm
Display(s) HP X34 Ultrawide 165hz
Case Fractal Design Define 7 (modded)
Audio Device(s) Sound BlasterX AE-5+ / Topping DX5 Lite (Dan Clark Audio Aeon2 closed)
Power Supply Corsair hx1000i
Mouse Roccat Kain 120 aimo / Roccat Burst Pro
Keyboard Cherry Stream 3.0 SX-switches
VR HMD Quest 1, Pico 4 128GB
Software Win11 x64
Joined
Mar 20, 2010
Messages
246 (0.05/day)
It's crazy to play new games upon release these days :laugh: I always give the developers and GPU manufacturers 3-4 months to iron out the majority of issues.
Heck, I have just started playing Horizon Zero Dawn and Cyberpunk. They were released years ago. The bugs (well, mostly for Cyberpunk) have been resolved.

Although that's a long time to compile shaders, yes, it's worth it. The constant shader-loading stutter *ahem, Dead Space Remake, ahem* is super annoying, even on the 4090 build.
 
Joined
Nov 26, 2021
Messages
1,495 (1.49/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
No, not really needed anymore. Just look at the 2080 Ti; they are not that far off.
You're being rather optimistic. A stock 1080 Ti should be around the 3060 or 5700XT in newer games.
 
Joined
Jul 13, 2016
Messages
3,090 (1.04/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Something noteworthy is that the 4080, even with 16GB of VRAM, is faster in this title than the 3090 with its 24GB.

No, it's really not noteworthy that a game that uses 14GB of VRAM doesn't choke on a 16GB card.

Edit: As well, the 4070 Ti is faster than the 3090 Ti at 1440p. So it does seem like the 4070 Ti is a 1440p king, unless you're using DLSS 3, in which case you can get some good 4K performance too.

No, the 3090 Ti is actually faster at 1440p:

1680277567563.png


Pretty clear from the above chart that the 4070 Ti is slower at 1440p. This isn't the only game where the 3090 Ti is faster either, and given the limited memory and memory bandwidth of the 4070 Ti, I only expect the 3090 Ti to look better as time goes on.

If you are going to comment please read the article before posting.
 
Joined
Apr 21, 2005
Messages
174 (0.02/day)
Something noteworthy is that the 4080, even with 16GB of VRAM, is faster in this title than the 3090 with its 24GB.

Edit: As well, the 4070 Ti is faster than the 3090 Ti at 1440p. So it does seem like the 4070 Ti is a 1440p king, unless you're using DLSS 3, in which case you can get some good 4K performance too.

A 1440p king that drops like a stone at 4k is an utter disappointment at $800.
 
Joined
Nov 4, 2005
Messages
11,878 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400MHz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Seems like a prime candidate for Direct Storage.
 
Joined
Jul 13, 2016
Messages
3,090 (1.04/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
A 1440p king that drops like a stone at 4k is an utter disappointment at $800.

Ironically, not even a 1440p king; it loses to the 3090 Ti, contrary to what he said. It's trash: an $800 GPU that yet again has too little VRAM and way too little memory bandwidth.
 
Joined
Mar 2, 2022
Messages
146 (0.16/day)
Processor Intel Core i9-13900KS
Motherboard ASUS ROG STRIX Z790-H Gaming DDR5 ATX Motherboard
Cooling ASUS ROG Ryujin II 360mm
Memory Team Group Delta RGB DDR5 32GB 7800
Video Card(s) MSI Gaming GeForce RTX 4090 GAMING TRIO 24GB
Display(s) LG CX 55" OLED 120Hz
Case Corsair 5000D Airflow Tempered Glass Mid-Tower ATX
Mouse Xtrfy M4 Retro
Keyboard Logitech G810 w/ Romer-G Tactile Switches
VR HMD Valve Index
Software Win 11
No, it's really not noteworthy that a game that uses 14GB of VRAM doesn't choke on a 16GB card.



No, the 3090 Ti is actually faster at 1440p:

View attachment 289925

Pretty clear from the above chart that the 4070 Ti is slower at 1440p. This isn't the only game where the 3090 Ti is faster either, and given the limited memory and memory bandwidth of the 4070 Ti, I only expect the 3090 Ti to look better as time goes on.

If you are going to comment please read the article before posting.
Sorry, I meant 3090, not 3090 Ti. That was a typo. Thanks for correcting that.

A 1440p king that drops like a stone at 4k is an utter disappointment at $800.
But if you only have a 1440p monitor, it's faster than the 3090, and especially with DLSS 3 on you'll be WAYYY ahead of the 3090.

Ironically not even a 1440p king, looses to the 3090 Ti opposite to what he said. It's trash, $800 GPU that yet again has too little VRAM and way too little memory bandwidth.
The 3090 and 3090 Ti seem to cost a lot more than $800, though if you're playing at 4K those are faster natively, of course. But I would probably recommend an AMD card over a 3090 Ti for non-DLSS-3 performance at 4K.
 
Joined
Apr 21, 2005
Messages
174 (0.02/day)
Sorry, I meant 3090, not 3090 Ti. That was a typo. Thanks for correcting that.


But if you only have a 1440p monitor, it's faster than the 3090, and especially with DLSS 3 on you'll be WAYYY ahead of the 3090.


The 3090 and 3090 Ti seem to cost a lot more than $800, though if you're playing at 4K those are faster natively, of course. But I would probably recommend an AMD card over a 3090 Ti for non-DLSS-3 performance at 4K.

Why would I want to use upscaling at 1440p with an $800 GPU in the first place?

DLSS at lower resolutions is not as good as at 4K, and frame gen is worse at low frame rates because the difference between each frame is larger, so there's more room for artifacts. Let alone the fact that the main advantage of high FPS is reduced input latency.
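The smoothness-vs-latency trade-off being argued here can be sketched with a toy model. The numbers are deliberately simplified assumptions (real latency depends on the game, the render queue, Reflex, etc.), but they show why doubled presented FPS doesn't mean doubled responsiveness:

```python
# Toy model: frame generation doubles presented frames, but it has to hold
# back the newest rendered frame to interpolate, so input latency gets
# worse rather than better in this simplified model.
def presented_fps(native_fps: float, frame_gen: bool) -> float:
    return native_fps * 2 if frame_gen else native_fps

def input_latency_ms(native_fps: float, frame_gen: bool) -> float:
    # One-native-frame baseline; assume FG adds roughly one more native
    # frame of delay (an assumption, not a measured figure).
    base = 1000.0 / native_fps
    return base * 2 if frame_gen else base

print(presented_fps(60, True))                 # 120
print(round(input_latency_ms(60, False), 1))   # 16.7
print(round(input_latency_ms(60, True), 1))    # 33.3 - smoother, not snappier
```

It also shows the low-frame-rate artifact point: at 30 native FPS the interpolator has to bridge 33 ms between frames instead of 16 ms, so adjacent frames differ more and there is more to get wrong.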
 
Joined
Mar 2, 2022
Messages
146 (0.16/day)
Processor Intel Core i9-13900KS
Motherboard ASUS ROG STRIX Z790-H Gaming DDR5 ATX Motherboard
Cooling ASUS ROG Ryujin II 360mm
Memory Team Group Delta RGB DDR5 32GB 7800
Video Card(s) MSI Gaming GeForce RTX 4090 GAMING TRIO 24GB
Display(s) LG CX 55" OLED 120Hz
Case Corsair 5000D Airflow Tempered Glass Mid-Tower ATX
Mouse Xtrfy M4 Retro
Keyboard Logitech G810 w/ Romer-G Tactile Switches
VR HMD Valve Index
Software Win 11
Why would I want to use upscaling at 1440p with an $800 GPU in the first place?

DLSS at lower resolutions is not as good as at 4K, and frame gen is worse at low frame rates because the difference between each frame is larger, so there's more room for artifacts. Let alone the fact that the main advantage of high FPS is reduced input latency.
DLSS 3.0 actually generates entirely new frames between the natively rendered frames. It's not a form of upscaling. I often turn off DLSS 2.0 (upscaling) and turn on DLSS 3.0 (frame generation) because it provides the best visuals. And if you're already hitting 120, turning on FG doubles your 1% lows, which will do even more than upgrading to DDR5 RAM. Check out FG in person sometime; I think you might be impressed.
 
Joined
Apr 21, 2005
Messages
174 (0.02/day)
DLSS 3.0 actually generates entirely new frames between the natively rendered frames. It's not a form of upscaling. I often turn off DLSS 2.0 (upscaling) and turn on DLSS 3.0 (frame generation) because it provides the best visuals. And if you're already hitting 120, turning on FG doubles your 1% lows, which will do even more than upgrading to DDR5 RAM. Check out FG in person sometime; I think you might be impressed.

I'd rather stick to native and get lower input latency without the artifacts.

DLAA is the best piece of NV software tech; it just needs implementing in more titles.
 
Joined
Mar 2, 2022
Messages
146 (0.16/day)
Processor Intel Core i9-13900KS
Motherboard ASUS ROG STRIX Z790-H Gaming DDR5 ATX Motherboard
Cooling ASUS ROG Ryujin II 360mm
Memory Team Group Delta RGB DDR5 32GB 7800
Video Card(s) MSI Gaming GeForce RTX 4090 GAMING TRIO 24GB
Display(s) LG CX 55" OLED 120Hz
Case Corsair 5000D Airflow Tempered Glass Mid-Tower ATX
Mouse Xtrfy M4 Retro
Keyboard Logitech G810 w/ Romer-G Tactile Switches
VR HMD Valve Index
Software Win 11
I'd rather stick to native and get lower input latency without the artifacts.

DLAA is the best piece of NV software tech; it just needs implementing in more titles.
Okay yeah, everyone has their own preferences.

I've noticed on the 4090 that DLSS 3.0 looks better than turning on 2.0; maybe it's my eyes or something lol. Of course, you can run both together too if you want. DLAA is a bit tough to run, but it can yield some amazing AA.

Luckily it seems like we have slightly better choices this gen than last. Next gen will be like "HAS TECHNOLOGY GONE TOO FAR?" It's going to be FULL path tracing.
 
Joined
Apr 10, 2010
Messages
1,838 (0.35/day)
Location
London
System Name Jaspe
Processor Ryzen 1500X
Motherboard Asus ROG Strix X370-F Gaming
Cooling Stock
Memory 16Gb Corsair 3000mhz
Video Card(s) EVGA GTS 450
Storage Crucial M500
Display(s) Philips 1080 24'
Case NZXT
Audio Device(s) Onboard
Power Supply Enermax 425W
Software Windows 10 Pro