
Hogwarts Legacy Benchmark Test & Performance Analysis

Joined
Jul 13, 2016
Messages
3,252 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
I don't get the hate. The first "next-gen" games are always like this. Devs eventually learn to optimize better, and cards get more powerful. This has always happened in gaming.
Visuals are getting harder to improve meaningfully without impacting performance too much; that's how diminishing returns work. I also wouldn't expect AAA, story-based games to err on the side of performance rather than visuals.

There are two problems people have:

1) The performance, and the cost of that performance. I remember my Radeon 4850 Vapor-X doing a really good job of running Crysis, and that card was only $230 USD brand new. Of course people are going to complain: you could buy an entire console for less than what it would cost to buy just a GPU capable of running this game as well as the console does. Forget about the cost of the rest of the PC given this game's CPU requirements; you're talking $1,300+.

2) The visual benefit. There are comparably good-looking open-world games that run much better.


Thank you, I was waiting for someone to make this joke :)
 
Joined
Jan 14, 2019
Messages
12,273 (5.78/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Joined
Nov 13, 2007
Messages
10,684 (1.72/day)
Location
Austin Texas
System Name Planet Espresso
Processor 13700KF @ 5.5GHZ 1.285v - 235W cap
Motherboard MSI 690-I PRO
Cooling Thermalright Phantom Spirit EVO
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
@W1zzard are you playing it with DLSS3 on? If so, does it work well in this title to make it playable or not really?
 
Joined
Jun 14, 2020
Messages
3,290 (2.05/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Can someone provide a logical reason for the decline in performance of the 7900 XTX to the level of a mid-range 3060 when RT is enabled? lol
Yes, AMD cards suck at RT.
 
Joined
Jan 14, 2019
Messages
12,273 (5.78/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Yes, AMD cards suck at RT.
Not a valid explanation when it's at 3090 (Ti) / 4080 level in every other game.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,706 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Can someone provide a logical reason for the decline in performance of the 7900 XTX to the level of a mid-range 3060 when RT is enabled? lol
Probably some kind of bug.. do note this is at ultra rt .. as mentioned in the conclusion, dialing down RT to lower levels helps AMD a lot.
 
Joined
Nov 10, 2020
Messages
21 (0.01/day)
Processor Core i9 10900k @ 5.1 Ghz
Motherboard Asus ROG Strix Z490-E
Cooling DH15 3x fans
Memory 4x16 Crucial 3600 CL16
Video Card(s) 3090 FE
Storage 2x 970 Evo Plus 2Tb + WD Gold 14Tb 2x
Display(s) Dell AW3418DW + BenQ PD2700Q
Case BeQuiet Dark Pro 900v² + 2 fans
Audio Device(s) Ext
Power Supply Be Quiet Dark Power 13 1000w
Mouse Zowie FK2 for Quake/Logitech G600 + Artisan Hien soft XL
I feel like we've hit diminishing returns in terms of visuals per hardware requirements.
For ages I haven't felt a real sense of awe in any game.
The last one probably was The Witcher 3, which ran fine on a 980 Ti.
Nowadays we have many times the processing power, yet I don't feel visuals have improved much. In some cases they've even gone backwards.
Is it just me?
 
Joined
Jan 14, 2019
Messages
12,273 (5.78/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I feel like we've hit diminishing returns in terms of visuals per hardware requirements.
For ages I haven't felt a real sense of awe in any game.
The last one probably was The Witcher 3, which ran fine on a 980 Ti.
Nowadays we have many times the processing power, yet I don't feel visuals have improved much. In some cases they've even gone backwards.
Is it just me?
No, I feel the same.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,706 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Arc A770 results have been added .. especially the RT perf looks pretty good. I managed a benchmark run without crashes at 4K Ultra + RT Ultra by repeatedly rebooting and just not giving up. It's still suboptimal; I'll ping Intel about it.
 
Joined
Aug 4, 2020
Messages
1,608 (1.03/day)
Location
::1
I feel like we've hit diminishing returns in terms of visuals per hardware requirements.
For ages I haven't felt a real sense of awe in any game.
The last one probably was The Witcher 3, which ran fine on a 980 Ti.
Nowadays we have many times the processing power, yet I don't feel visuals have improved much. In some cases they've even gone backwards.
Is it just me?
good graphics do not a good game make
more polygons do not good graphics make
 
Joined
Dec 10, 2022
Messages
484 (0.70/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
I guess all the people who swear they can't enjoy a game without playing at 4K Ultra with RT are going to have to pass on this one! Good lord, I've never seen such a staggering hit to performance for what amounts to just slightly more than nothing in image quality. However, I see a big elephant in the room that has me puzzled.

There's something in these results that isn't making sense to me. One chart shows how much VRAM the game was demonstrated to need:

As we can see, the game uses 14,576 MB at 2160p Ultra settings with RT, which is mind-boggling, but that's what was measured.

However, then we see this:

The RTX 3090, which has more than enough VRAM, is only 2.6 FPS faster than the RTX 3080, which has nowhere near enough. The performance difference is about 17%, which is close enough to their 13% average performance delta that it can be inferred to have nothing to do with VRAM. I say this because at 2160p, the performance delta between them in Rainbow Six Siege is 16%, and Rainbow Six Siege does NOT use much VRAM.

Here's the TL;DR of this mystery:
  1. The game needs 14,576 MB at 2160p Ultra with RT.
  2. The RTX 3080 10 GB has, at most, 10,240 MB of VRAM.
  3. The performance delta between the RTX 3080 and the RTX 3090 in a game that doesn't use much VRAM is 15.5%.
  4. At 4K Ultra with RT, the RTX 3090 only holds a 17% FPS lead over the RTX 3080, despite the 3080 having 4 GB+ less VRAM than the game needs.
What could be the cause of this?
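For what it's worth, here is the arithmetic behind points 3 and 4 as a quick sanity check. The FPS value below is hypothetical, reverse-engineered from the quoted "2.6 FPS" and "about 17%" figures; the measured numbers are in the review's charts:

```python
# Back-of-the-envelope check of the 3080 vs 3090 gap described above.
# NOTE: fps_3080 is a hypothetical value chosen so that a 2.6 FPS gap
# works out to ~17%; see the review charts for the measured numbers.
fps_3080 = 15.3
fps_3090 = fps_3080 + 2.6

delta = (fps_3090 - fps_3080) / fps_3080 * 100
print(f"RTX 3090 lead: {delta:.1f}%")  # ~17.0%

# If the 3080 were truly starved of 4 GB+ of VRAM it actively needed,
# we'd expect its deficit to blow out well past the usual ~13-16%
# raw-performance gap, not sit right on top of it.
```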
 
Joined
Jan 14, 2019
Messages
12,273 (5.78/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Arc A770 results have been added .. especially the RT perf looks pretty good. I managed a benchmark run without crashes at 4K Ultra + RT Ultra by repeatedly rebooting and just not giving up. It's still suboptimal; I'll ping Intel about it.
Thanks for that! :) RT results look promising. Crashes do not, unfortunately.

I guess all the people who swear they can't enjoy a game without playing at 4K Ultra with RT are going to have to pass on this one! Good lord, I've never seen such a staggering hit to performance for what amounts to just slightly more than nothing in image quality. However, I see a big elephant in the room that has me puzzled.

There's something in these results that isn't making sense to me. One chart shows how much VRAM the game was demonstrated to need:

As we can see, the game uses 14,576 MB at 2160p Ultra settings with RT, which is mind-boggling, but that's what was measured.

However, then we see this:

The RTX 3090, which has more than enough VRAM, is only 2.6 FPS faster than the RTX 3080, which has nowhere near enough. The performance difference is about 17%, which is close enough to their 13% average performance delta that it can be inferred to have nothing to do with VRAM. I say this because at 2160p, the performance delta between them in Rainbow Six Siege is 16%, and Rainbow Six Siege does NOT use much VRAM.

Here's the TL;DR of this mystery:
  1. The game needs 14,576 MB at 2160p Ultra with RT.
  2. The RTX 3080 10 GB has, at most, 10,240 MB of VRAM.
  3. The performance delta between the RTX 3080 and the RTX 3090 in a game that doesn't use much VRAM is 15.5%.
  4. At 4K Ultra with RT, the RTX 3090 only holds a 17% FPS lead over the RTX 3080, despite the 3080 having 4 GB+ less VRAM than the game needs.
What could be the cause of this?
I think this only proves once again that VRAM allocation and VRAM usage aren't the same thing.
 
Joined
Nov 13, 2007
Messages
10,684 (1.72/day)
Location
Austin Texas
System Name Planet Espresso
Processor 13700KF @ 5.5GHZ 1.285v - 235W cap
Motherboard MSI 690-I PRO
Cooling Thermalright Phantom Spirit EVO
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
seems like there's a CPU limit in this game
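A quick way to test that hypothesis at home is to drop the render resolution and see whether FPS scales; if it barely moves, the GPU isn't the bottleneck. A minimal sketch of that heuristic (the sample FPS numbers are placeholders, not measurements):

```python
# Rough CPU-vs-GPU-bound heuristic: lowering resolution cuts GPU load
# roughly with pixel count, but leaves CPU work per frame unchanged.
# The numbers below are placeholders you'd replace with your own runs.
fps_4k = 62.0      # measured at 3840x2160, fixed settings
fps_1080p = 68.0   # measured at 1920x1080, same settings

scaling = fps_1080p / fps_4k
# 1080p has ~1/4 the pixels of 4K; a GPU-bound game should scale hard.
if scaling < 1.2:
    print("FPS barely moved -> likely CPU/engine limited")
else:
    print("FPS scaled with resolution -> likely GPU limited")
```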
 

iGigaFlop2

New Member
Joined
Dec 11, 2022
Messages
9 (0.01/day)
Great game, but man, it seems games are taking more and more power to run. I have a 4090 and it runs 4K over 60 FPS without ray tracing, but with it, nope, I've got to use DLSS. I don't see how people with GPUs like a 1060, RX 580, or 1650 play recent games even at 1080p. Your average person who just wants to game should just get a PS5 or a Series S or X. Plus, it seems like 75% of the PC ports from last year and this year have problems. It's sad: I have a 7950X and a 4090, but I just find it easier and less maddening to play on consoles. If consoles were like last gen, I'd say no, since most games were 30 FPS and they had horrible CPUs and mechanical HDDs. But this gen we get a decent CPU, pretty fast GPUs, and NVMe SSDs, and most games have a 60 FPS mode or even 120.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,706 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The RTX 3090, which has more than enough VRAM, is only 2.6 FPS faster than the RTX 3080, which has nowhere near enough.
"Allocated" is not "used". You can see cards with 8 GB dropping down in their relative positioning, that's when they are actually running out of VRAM (enough to make a difference in FPS)
 
Joined
Nov 13, 2007
Messages
10,684 (1.72/day)
Location
Austin Texas
System Name Planet Espresso
Processor 13700KF @ 5.5GHZ 1.285v - 235W cap
Motherboard MSI 690-I PRO
Cooling Thermalright Phantom Spirit EVO
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Great game, but man, it seems games are taking more and more power to run. I have a 4090 and it runs 4K over 60 FPS without ray tracing, but with it, nope, I've got to use DLSS. I don't see how people with GPUs like a 1060, RX 580, or 1650 play recent games even at 1080p. Your average person who just wants to game should just get a PS5 or a Series S or X. Plus, it seems like 75% of the PC ports from last year and this year have problems. It's sad: I have a 7950X and a 4090, but I just find it easier and less maddening to play on consoles. If consoles were like last gen, I'd say no, since most games were 30 FPS and they had horrible CPUs and mechanical HDDs. But this gen we get a decent CPU, pretty fast GPUs, and NVMe SSDs, and most games have a 60 FPS mode or even 120.


A 3080-class card can run this at 4K with DLSS Quality/Balanced, no RT, pretty easily; a 6700 XT ($300) with FSR is good at 1440p.

Sure, if you want to crank all settings and turn off resolution scaling, you get a slideshow, but those aren't the settings consoles run either. The PS5 uses upscaling on top of dynamic resolution, so sure, it's cheaper, but I'm not sure why having the ability to make the game look much better is infuriating.
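For context on why upscaling buys back so much performance: DLSS Quality renders at roughly 67% of the output resolution per axis and Balanced at roughly 58% (FSR 2's modes use similar ratios), so the number of pixels actually shaded drops sharply. A quick sketch of the arithmetic:

```python
# Internal render resolution and shaded-pixel savings for common
# DLSS modes at a 4K output. Per-axis scale factors are the standard
# published ones: Quality ~0.667, Balanced ~0.58, Performance 0.5.
out_w, out_h = 3840, 2160
modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

for name, scale in modes.items():
    w, h = round(out_w * scale), round(out_h * scale)
    saved = 1 - (w * h) / (out_w * out_h)
    print(f"{name}: {w}x{h} internal, ~{saved:.0%} fewer pixels shaded")
```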
 
Joined
Mar 31, 2010
Messages
333 (0.06/day)
Location
Los Angeles, USA.
System Name Intel 2023
Processor Intel Core i5 13600KF
Motherboard Gigabyte B660M Aourus Pro
Cooling Custom water cooling loop
Memory 2x16gb Adata PC 3600
Video Card(s) AMD 6950XT
Storage 2TB Corsair MP600
Display(s) Nixeus 27 EDG
Case Phanteks P600
Audio Device(s) Topping DX3 Pro +
Power Supply Corsair RM850
Mouse Razer Basilisk
Keyboard Womier K87 with Tecsee Purple Panda switches
Software Win 11 Pro 64bit
Benchmark Scores Unfortunately no time anymore to benchmark....
Pretty sure the VGA drivers could do with some optimization, too... Having said this, it pisses me off that AMD hasn't delivered a new driver since Dec 2022 for anything other than 7000-series cards!!!
 
Joined
Nov 13, 2007
Messages
10,684 (1.72/day)
Location
Austin Texas
System Name Planet Espresso
Processor 13700KF @ 5.5GHZ 1.285v - 235W cap
Motherboard MSI 690-I PRO
Cooling Thermalright Phantom Spirit EVO
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Based on the article's data, I would say that all GPUs are running terribly, especially if you compare the cost in fps with the graphical result obtained.

Honestly, this runs like Cyberpunk: if you crank everything and turn RT on, you get 38-45 FPS on a 4090. But the game looks identical with DLSS Quality/3.0, and that runs at 120+ FPS.

I think with some optimization and a good scaling implementation, the result will be very good.
 