
Hogwarts Legacy Benchmark Test & Performance Analysis

Joined
Dec 12, 2012
Messages
778 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
You know how amazing games would be on PC if they didn't have to make them for consoles?

Like Crysis? ;)

Yeah, it looks beautiful even today, but it was a nightmare to run. And due to its dual-threaded nature, performance can still tank even on modern CPUs, especially if you use physics mods.
The game came out right around the time when CPU design shifted from chasing high clocks and IPC to adding more cores.
The 13900K gets 902 points in the CPU-Z single-threaded bench (probably at over 5.5 GHz), while the Core 2 Duo E8500 gets 272 points at 3.16 GHz. So, clock for clock, that's roughly double the IPC 14 years later, while multi-threaded performance is 32 times higher.
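
That clock-for-clock figure checks out if you normalize the scores by frequency. A quick sketch in Python, using only the numbers quoted above (the 5.5 GHz boost clock is the post's assumption, not a measured value):

    # CPU-Z single-thread scores and clocks quoted above
    st_13900k, ghz_13900k = 902, 5.5    # 5.5 GHz is an assumed boost clock
    st_e8500,  ghz_e8500  = 272, 3.16

    per_clock = (st_13900k / ghz_13900k) / (st_e8500 / ghz_e8500)
    print(f"clock for clock: {per_clock:.1f}x")  # ~1.9x, i.e. roughly double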

The weird thing is that consoles are basically PCs now. And they even made low-level APIs that are supposed to alleviate many bottlenecks.
So the issue must be with incompetent developers. If you look at Nixxes, they make absolutely fantastic PC versions, probably the best; they know exactly what the PC platform needs. But most PC versions are outsourced to companies that have no idea what they're doing, and nobody's controlling their work. They release garbage versions that take months to fix, and even after multiple patches, they still don't run perfectly. Take GTA IV, for example, one of the worst ports in history, where you can roughly double your framerate with a mod that translates DX9 to Vulkan. It's astonishing what "amateurs" can achieve sometimes, while other people get paid for incompetence.
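
For reference, the mod being described is DXVK, and on Windows it is installed by simply dropping its 32-bit d3d9.dll next to the game's executable, so the game loads it instead of the system Direct3D 9 runtime. A minimal sketch; both paths are placeholders for wherever you unpacked DXVK and installed the game:

    import shutil

    # copy DXVK's 32-bit d3d9.dll into the game folder (paths are examples)
    shutil.copy(r"C:\dxvk\x32\d3d9.dll",
                r"C:\Games\GTAIV\d3d9.dll")

Deleting that one file restores the stock DX9 path, which makes it an easy experiment.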

There's nothing wrong with consoles. In fact, the only reason we get so many amazing games is because consoles exist and are so popular. Most people don't want the hassle you have to put up with on PC. They just want to sit down and play.
 
Joined
May 3, 2019
Messages
169 (0.08/day)
Like Crysis? ;)

Yeah, it looks beautiful even today, but it was a nightmare to run. And due to its dual-threaded nature, performance can still tank even on modern CPUs, especially if you use physics mods.
The game came out right around the time when CPU design shifted from chasing high clocks and IPC to adding more cores.
The 13900K gets 902 points in the CPU-Z single-threaded bench (probably at over 5.5 GHz), while the Core 2 Duo E8500 gets 272 points at 3.16 GHz. So, clock for clock, that's roughly double the IPC 14 years later, while multi-threaded performance is 32 times higher.

The weird thing is that consoles are basically PCs now. And they even made low-level APIs that are supposed to alleviate many bottlenecks.
So the issue must be with incompetent developers. If you look at Nixxes, they make absolutely fantastic PC versions, probably the best; they know exactly what the PC platform needs. But most PC versions are outsourced to companies that have no idea what they're doing, and nobody's controlling their work. They release garbage versions that take months to fix, and even after multiple patches, they still don't run perfectly. Take GTA IV, for example, one of the worst ports in history, where you can roughly double your framerate with a mod that translates DX9 to Vulkan. It's astonishing what "amateurs" can achieve sometimes, while other people get paid for incompetence.

There's nothing wrong with consoles. In fact, the only reason we get so many amazing games is because consoles exist and are so popular. Most people don't want the hassle you have to put up with on PC. They just want to sit down and play.
That's only one side of it, though. Go back to the late 90s and look at all the games that were designed specifically for PC gameplay.
 
Joined
Nov 13, 2007
Messages
10,854 (1.73/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42 GHz
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Looks fake; I've seen the 5800X3D running at almost 300 fps with the 4090 in live tests. lol
I doubt that's fake. There's another YouTuber, Daniel Owen, who is seeing CPU bottlenecks with the X3D. Not saying you're wrong; I'm sure it can run at 300 fps as well. Maybe it's a specific area, or something else with the system, engine, or combo with Nvidia and some setting, etc. It would be good to get more data.
 
Joined
Jul 20, 2021
Messages
29 (0.02/day)
I doubt that's fake. There's another YouTuber, Daniel Owen, who is seeing CPU bottlenecks with the X3D. Not saying you're wrong; I'm sure it can run at 300 fps as well. Maybe it's a specific area, or something else with the system, or combo with Nvidia, etc. It would be good to get more data.
It is fake. He tested the Intel chips with DLSS 3 FG on. He admitted it himself in the YouTube comments.
 
Joined
Nov 13, 2007
Messages
10,854 (1.73/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42 GHz
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
It is fake. He tested the Intel chips with DLSS 3 FG on. He admitted it himself in the YouTube comments.
Ah, that's good to know. Thanks.

edit: I just read that comment and the fix, but there are still performance issues on AMD platforms; see his comment in the attachments.

Even after the fix, he's saying 7700X performance is brutally low, on par with the 12600K?
 

Attachments

  • AD6E2612-9061-434E-928F-7B2B61C959C1.jpeg (62.1 KB)
Joined
Dec 12, 2012
Messages
778 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
That's only one side of it, though. Go back to the late 90s and look at all the games that were designed specifically for PC gameplay.

I watch videos where people play old games on retro PCs. I can't really comment on what the experience was like back then (I wasn't really a gamer in the 90s), but one thing you can still see is that performance was pretty bad. 60 FPS was not really a thing on PC unless you had the absolute newest hardware (which was evolving extremely fast). And there were so many APIs, so many things that were not standardized (even audio). It was so much more nerdy than it is today.

People have gotten used to high-framerate gaming now, but that's actually a bad thing. The only reason we need it is that LCD technology is so poor compared to CRT when it comes to motion clarity and input lag; even OLED has those problems.
Back in the X360/PS3 era, PC ports were actually much worse than today. Pretty much every game was designed for 30 FPS, so it was hard to even get 60 FPS on PC, let alone more. That changed a lot with the XBO/PS4, whose CPUs were so weak that no PC had a problem outclassing them. And on XBSX and PS5, most games have a 60 FPS mode, which translates to PC really well.

The issue is still with the developers porting games to PC, not with the games or the platforms themselves.
 
Joined
May 3, 2019
Messages
169 (0.08/day)
I watch videos where people play old games on retro PCs. I can't really comment on what the experience was like back then (I wasn't really a gamer in the 90s), but one thing you can still see is that performance was pretty bad. 60 FPS was not really a thing on PC unless you had the absolute newest hardware (which was evolving extremely fast). And there were so many APIs, so many things that were not standardized (even audio). It was so much more nerdy than it is today.

People have gotten used to high-framerate gaming now, but that's actually a bad thing. The only reason we need it is that LCD technology is so poor compared to CRT when it comes to motion clarity and input lag; even OLED has those problems.
Back in the X360/PS3 era, PC ports were actually much worse than today. Pretty much every game was designed for 30 FPS, so it was hard to even get 60 FPS on PC, let alone more. That changed a lot with the XBO/PS4, whose CPUs were so weak that no PC had a problem outclassing them. And on XBSX and PS5, most games have a 60 FPS mode, which translates to PC really well.

The issue is still with the developers porting games to PC, not with the games or the platforms themselves.
I played most games at 60 fps at 800x600 with a Voodoo 3 2000 and a 500 MHz Celeron.
 
Joined
Jul 20, 2021
Messages
29 (0.02/day)
Ah, that's good to know. Thanks.

edit: I just read that comment and the fix, but there are still performance issues on AMD platforms; see his comment in the attachments.

Even after the fix, he's saying 7700X performance is brutally low, on par with the 12600K?
That's not "brutally low," it's about right. (?) 8C Zen4 @ ~5GHz vs. 6+4 ADL @ ~ 4.9GHz in a heavily multithreaded game that does a ton of background work all the time compiling shaders (or something.)
 
Joined
Jun 14, 2020
Messages
3,554 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Looks fake; I've seen the 5800X3D running at almost 300 fps with the 4090 in live tests. lol
In Hogwarts? With RT on? No. Unless you are looking at the skybox. Just no.
 
Joined
Nov 13, 2007
Messages
10,854 (1.73/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42 GHz
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
That's not "brutally low," it's about right. (?) 8C Zen4 @ ~5GHz vs. 6+4 ADL @ ~ 4.9GHz in a heavily multithreaded game that does a ton of background work all the time compiling shaders (or something.)
That's pretty low: 8-core Zen 4 at 5.5 GHz losing to six 4.9 GHz Alder Lake cores. E-cores have no effect, from my testing.

Seems like a utilization problem.
 
Joined
Dec 12, 2012
Messages
778 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
That's not "brutally low," it's about right. (?) 8C Zen4 @ ~5GHz vs. 6+4 ADL @ ~ 4.9GHz in a heavily multithreaded game that does a ton of background work all the time compiling shaders (or something.)

This game is not heavily multi-threaded, that seems to be one of its biggest problems right now.

And those background tasks you mentioned are not actually background in that sense. They are performed by the game's process, which is supposed to get assigned to the P-cores. Games are not aware of E-cores and I doubt they ever will be, they are meant for other background processes, but it's just copium. You're still limited by clocks and IPC, not by the number of cores (assuming you have at least 6 of them).
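
For what it's worth, you can inspect and steer this assignment yourself. A minimal sketch using the psutil library (cpu_affinity works on Windows and Linux; the logical CPU indices are an assumption, e.g. 6 P-cores with Hyper-Threading mapping to CPUs 0-11 on a 13600KF):

    import psutil

    p = psutil.Process()                  # current process, standing in for a game
    print("default affinity:", p.cpu_affinity())

    # pin to the assumed P-core hardware threads only (logical CPUs 0-11)
    p.cpu_affinity(list(range(12)))
    print("pinned affinity:", p.cpu_affinity())

Task Manager's "Set affinity" does the same thing interactively.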
 
Joined
Jan 8, 2017
Messages
9,521 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
This game is not heavily multi-threaded; that seems to be one of its biggest problems right now.
Even if that's the case, the differences between Intel and AMD in single-threaded performance are very small, certainly not enough to cause 50% worse performance.

This game seems to run suspiciously badly on anything AMD-branded, and then there is also that DLSS "bug" as well. It's all rather curious.
 
Joined
Jul 20, 2021
Messages
29 (0.02/day)
Even if that's the case, the differences between Intel and AMD in single-threaded performance are very small, certainly not enough to cause 50% worse performance.

This game seems to run suspiciously badly on anything AMD-branded, and then there is also that DLSS "bug" as well. It's all rather curious.
It doesn't have 50% worse performance. Read back up the thread; the screenshot posted earlier was from the video, which erroneously used DLSS 3 FG on the Intel machines. He posted revised performance numbers in a comment:
13600K = 107 fps average, 81 fps 1% lows
12600K = 92 fps average, 67 fps 1% lows
12100F = 74 fps average, 49 fps 1% lows
7700X = 87 fps average, 54 fps 1% lows
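
A quick sanity check on those figures (plain arithmetic in Python on the averages quoted above):

    avg = {"13600K": 107, "12600K": 92, "12100F": 74, "7700X": 87}

    # how far the 7700X trails each Intel part
    for chip in ("12600K", "13600K"):
        gap = 1 - avg["7700X"] / avg[chip]
        print(f"7700X vs {chip}: {gap:.0%} slower")   # ~5% and ~19%

So the 7700X trails the 12600K by about 5% and the 13600K by about 19%: underwhelming, but nowhere near 50%.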
 

nonfatmatt

New Member
Joined
Feb 11, 2023
Messages
2 (0.00/day)
Can confirm from developing on Unreal Engine that Epic has historically been biased towards Nvidia. The engine is primarily developed and tested on Nvidia hardware. AMD is often at something like a 30 percent deficit in raster performance at the same hardware level, and I would suspect that RT has that deficit or more. Using DXVK on Linux, for instance, UE games perform the same on similar-level hardware because of the vendor-agnostic driver abstraction, so I often get more FPS on Linux in a UE4 game using a Radeon card than I do on Windows on the exact same card with "native" drivers.
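
For context, this is roughly how a Windows build gets pushed through DXVK under Wine on Linux. A minimal sketch: "Game.exe" is a placeholder, while WINEDLLOVERRIDES and DXVK_HUD are real Wine/DXVK knobs (this assumes DXVK's DLLs are already installed in the Wine prefix):

    import os
    import subprocess

    env = dict(
        os.environ,
        WINEDLLOVERRIDES="d3d9,d3d11,dxgi=n,b",  # prefer DXVK's native DLLs
        DXVK_HUD="fps",                          # overlay to confirm DXVK is active
    )
    subprocess.run(["wine", "Game.exe"], env=env)  # placeholder game binary

The point is that every vendor then goes through the same Vulkan path, which is what makes the comparison vendor-agnostic.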
 

HTC

Joined
Apr 1, 2008
Messages
4,664 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
Can confirm from developing on Unreal Engine that Epic has historically been biased towards Nvidia. The engine is primarily developed and tested on Nvidia hardware. AMD is often at something like a 30 percent deficit in raster performance at the same hardware level, and I would suspect that RT has that deficit or more. Using DXVK on Linux, for instance, UE games perform the same on similar-level hardware because of the vendor-agnostic driver abstraction, so I often get more FPS on Linux in a UE4 game using a Radeon card than I do on Windows on the exact same card with "native" drivers.

Really?

In that case, I do wonder what the performance of this game is under Linux vs. under Windows.
 
Joined
Jan 8, 2017
Messages
9,521 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Can confirm from developing on Unreal Engine that Epic has historically been biased towards Nvidia. The engine is primarily developed and tested on Nvidia hardware. AMD is often at something like a 30 percent deficit in raster performance at the same hardware level, and I would suspect that RT has that deficit or more. Using DXVK on Linux, for instance, UE games perform the same on similar-level hardware because of the vendor-agnostic driver abstraction, so I often get more FPS on Linux in a UE4 game using a Radeon card than I do on Windows on the exact same card with "native" drivers.

Maybe that's the case for UE4; UE5 seems to run fine. In Fortnite, for example, a 7900 XTX is as fast as a 4080 with ray tracing.

(attached image: 1676153421775.png)
 

nonfatmatt

New Member
Joined
Feb 11, 2023
Messages
2 (0.00/day)
Maybe that's the case for UE4; UE5 seems to run fine. In Fortnite, for example, a 7900 XTX is as fast as a 4080 with ray tracing.

This is true, and it is related to DX12 being more driver- and hardware-agnostic than DX11; there is less in the driver that Nvidia can leverage for a performance advantage. However, it should be noted that there are several things that can be turned on and off in terms of DXR in Unreal, and I would imagine Fortnite is using a less robust / hybrid ray tracing method than Hogwarts. It's also a relatively simple game in terms of materials/shader complexity and geometry.

https://docs.unrealengine.com/4.27/en-US/RenderingAndGraphics/RayTracing/RayTracingSettings/ is a list of possible settings that were probably used in Hogwarts (as it's a UE4 game). Some of them cost more on AMD hardware by virtue of AMD's RT implementation being sub-par and partially software-based; others cost more because they began as Nvidia-specific features.
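
As a rough illustration of what that page covers, these are the kinds of console variables involved; which of them Hogwarts actually ships with is an assumption on my part (in UE4 they can be forced via Engine.ini / ConsoleVariables.ini):

    r.RayTracing=1
    r.RayTracing.Reflections=1
    r.RayTracing.GlobalIllumination=1
    r.RayTracing.Shadows=1
    r.RayTracing.AmbientOcclusion=0    ; each effect can be toggled independently

Each toggle has its own cost profile, which is why the same "RT on" label can land very differently on AMD and Nvidia hardware.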
 
Joined
Jan 8, 2017
Messages
9,521 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I would imagine Fortnite is using a less robust / hybrid ray tracing method than Hogwarts. It's also a relatively simple game in terms of materials/shader complexity and geometry.
It's pretty comprehensive, from ray-traced reflections to global illumination. Actually, the way lighting is handled is one of the most realistic I've seen in a game, despite the cartoonish art style. Also, Nanite increases geometry by a lot; Fortnite is not as visually basic a game as it once was.

But anyway, it's clear that with a proper game engine, RT performance (and performance in general) is fine on AMD and shouldn't differ much from Nvidia.
 
Joined
Jun 14, 2020
Messages
3,554 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%


From CapFrameX; the 12900K has a 3090, everything else is on a 4090.

This is the exact spot he is testing: stock 12900K + 4090.


(attached screenshots: 91.JPG, hogwarts.JPG)
 
Joined
Jan 14, 2019
Messages
12,682 (5.83/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I watch videos where people play old games on retro PCs. I can't really comment on what the experience was like back then (I wasn't really a gamer in the 90s), but one thing you can still see is that performance was pretty bad. 60 FPS was not really a thing on PC unless you had the absolute newest hardware (which was evolving extremely fast). And there were so many APIs, so many things that were not standardized (even audio). It was so much more nerdy than it is today.

People have gotten used to high-framerate gaming now, but that's actually a bad thing. The only reason we need it is that LCD technology is so poor compared to CRT when it comes to motion clarity and input lag; even OLED has those problems.
Back in the X360/PS3 era, PC ports were actually much worse than today. Pretty much every game was designed for 30 FPS, so it was hard to even get 60 FPS on PC, let alone more. That changed a lot with the XBO/PS4, whose CPUs were so weak that no PC had a problem outclassing them. And on XBSX and PS5, most games have a 60 FPS mode, which translates to PC really well.

The issue is still with the developers porting games to PC, not with the games or the platforms themselves.
Performance wasn't bad in the 90s. Like you said, 60 FPS wasn't a thing, and consequently, no one was missing it. To be honest, I still don't need it today in most games. I'll take a stable 40 FPS over a choppy 60 any time.

This game is not heavily multi-threaded; that seems to be one of its biggest problems right now.

And those background tasks you mentioned are not actually background in that sense. They are performed by the game's process, which is supposed to get assigned to the P-cores. Games are not aware of E-cores, and I doubt they ever will be; E-cores are meant for other background processes, but that's mostly copium. You're still limited by clocks and IPC, not by the number of cores (assuming you have at least six).
I don't think games are aware of cores at all. It's the OS that assigns processes to specific cores, keeping the CPU's built-in core hierarchy (preferred cores) in mind.
 
Joined
Jul 20, 2021
Messages
29 (0.02/day)
Performance wasn't bad in the 90s. Like you said, 60 FPS wasn't a thing, and consequently, no one was missing it. To be honest, I still don't need it today in most games. I'll take a stable 40 FPS over a choppy 60 any time.

This whole conversation thread is wildly off-topic, but let's get our facts straight, lol. 60 FPS absolutely was a thing way back in the 1970s; most console games before Gen 5 (not all!) ran at "60 FPS". Even for 3D games, 60 FPS absolutely was a thing in the 1990s: Quake could run at 60 FPS on the fast hardware of the day even before GLQuake came around in January 1997. Once 3D acceleration became popular, 60 FPS rapidly became the expectation; 3dfx's marketing even hinged on the idea that anything less than 60 FPS was inferior and unacceptable.
 
Joined
Jan 14, 2019
Messages
12,682 (5.83/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
This whole conversation thread is wildly off-topic, but let's get our facts straight, lol. 60 FPS absolutely was a thing way back in the 1970s; most console games before Gen 5 (not all!) ran at "60 FPS". Even for 3D games, 60 FPS absolutely was a thing in the 1990s: Quake could run at 60 FPS on the fast hardware of the day even before GLQuake came around in January 1997. Once 3D acceleration became popular, 60 FPS rapidly became the expectation; 3dfx's marketing even hinged on the idea that anything less than 60 FPS was inferior and unacceptable.
Well, those are fair examples, but Doom, for instance, was capped at 35 FPS by its engine. The other thing is that neither I nor my friends had a PC that could run the latest games anywhere near 60 FPS, and we were still happy. Your experience might have been different. :)
 