
Hogwarts Legacy Benchmark Test & Performance Analysis

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.90/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700MHz 0.750V (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
People are quick to jump to conspiracy on the internet; the fact that the 3090 is barely faster than the A770 in 4K RT means something is not working right.
Do people seriously think that the A770 is supposed to match a 3080 Ti in 4K RT?
On top of that, the RT in this game is nothing special for it to run this poorly.
View attachment 283140

Edit: Apparently HUB also found a menu bug in this game.
Hardware Unboxed: "OMG! I solved the issue, it's not a Ryzen bug but rather a menu bug. Although DLSS and all forms of upscaling were disabled & greyed out in the menu, frame generation was on for just the 40 series. I had to enable DLSS, then enable FG, then disable FG and disable DLSS to fix it!" / Twitter
This isn't the first game to have had frame gen on while the setting was greyed out, either.


Oops, we didn't mean to do that - we'll patch it later, after our sponsor finishes looking good.
 
Joined
Sep 18, 2020
Messages
119 (0.08/day)
System Name Vedica
Processor Intel Core i7-9700K
Motherboard Gigabyte AORUS Ultra Z390
Cooling Alphacool Eisblock XPX
Memory 32GB DDR4, 4x Crucial Ballistix Sport LT BLS8G4D30AESBK
Video Card(s) Nvidia RTX 3080
Storage 2x Sabrent 1 TB Rocket - 1x Seagate Barracuda ST4000DM004
Display(s) Dell AW3423DWF
Case Fractal Design Define R6
Audio Device(s) Motu M2
Power Supply Corsair RM1000x
Mouse Cooler Master MM720
Keyboard Wooting One
RT also known as performance-killing glossifier.
 
Joined
Feb 3, 2017
Messages
3,836 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Can someone provide a logical reason for the 7900XTX declining to the level of a mid-range 3060 when RT is enabled? lol
AMD cards still have less horsepower when it comes to ray tracing. The specific problematic part has so far been the actual tracing of rays. There is a base performance hit for enabling RT due to all the preparatory work that needs to happen - generally BVH generation and all the associated hoopla - but after that it is all about how many rays can be traced in a given time. So far, Nvidia cards can do more of that, even comparing Ampere to RDNA3. There are suspicions about why exactly AMD sometimes takes a bigger hit, primarily revolving around AMD potentially doing some of the work on shaders, but that is a minor detail in the big picture.

AMD does relatively better when a given RT effect involves fewer actual rays. Some AMD-sponsored games with RT do exactly that - a simpler effect, fewer rays - and things are more even. Nvidia, of course, pushed for their strengths - more effects, more rays - knowing that AMD cards will take a hit sooner than theirs. And even when the rays needed or intended exceed what Nvidia cards can handle - which happens often enough in games with more intensive RT effects - the resulting hit is still smaller for them.

Capability for more rays is obviously better, but there is a balance to be struck, given that it clearly takes dedicated hardware for reasonable RT performance in the first place. AMD has banked on RT not being relevant (yet) and skimped on capability. Nvidia - maybe slightly weirdly - has also been holding back on increasing the relative number of RT Cores (only increasing their capability to some degree), so they seem to be hedging their bets a little too.
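The fixed-overhead-plus-per-ray-cost intuition above can be sketched as a toy frame-time model. This is a sketch with made-up numbers, not measured data; `mrays_per_ms` is a stand-in for a GPU's ray throughput:

```python
def rt_frame_time_ms(raster_ms, bvh_overhead_ms, mrays_per_frame, mrays_per_ms):
    """Toy model: frame time = raster work + fixed RT setup cost
    (BVH builds etc.) + traversal time, which scales with ray count
    divided by ray throughput."""
    return raster_ms + bvh_overhead_ms + mrays_per_frame / mrays_per_ms

# Two hypothetical GPUs: "fast" traces rays twice as quickly as "slow".
# With a light RT effect the gap is small; with a heavy one it balloons.
for mrays in (2.0, 12.0):  # light vs heavy RT workload, in Mrays/frame
    fast = rt_frame_time_ms(8.0, 1.5, mrays, 1.0)
    slow = rt_frame_time_ms(8.0, 1.5, mrays, 0.5)
    print(f"{mrays:>4.0f} Mrays/frame: fast={fast:.1f} ms, "
          f"slow={slow:.1f} ms, gap={slow / fast - 1:.0%}")
# -> at 2 Mrays the gap is ~17%, at 12 Mrays it grows to ~56%
```

This is why "fewer rays per effect" (the AMD-sponsored approach) narrows the gap: the fixed setup cost dominates, and the throughput difference barely shows.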

Edit:
Hogwarts-specific example - ComputerBase tested different RT quality levels, and the 7900XTX only takes the relatively big hit (compared to the 4080) at Ultra:
 
Joined
May 20, 2020
Messages
1,389 (0.82/day)
The Intel Arc A770 runs this Hogwash™ exceedingly well, perhaps finally showing its true potential. AMD Hoseron™ doesn't do all that well; it looks like optimisations can still be done. We shall see.
 
Joined
Feb 3, 2017
Messages
3,836 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Joined
Mar 20, 2008
Messages
898 (0.15/day)
System Name Raptor
Processor Core i7 13700K
Motherboard MSI Z690 Tomahawk WiFi
Cooling ArcticFreezer 420
Memory Corsair VENGEANCE® 32GB (2x16GB) DDR5 5600MHz C36
Video Card(s) Palit GameRock 3080Ti OC
Storage M.2 Addlink S70 Lite , Samsung SSD 980 PRO 2TB, SanDisk Ultra II 480GB, 1TB seagate
Display(s) ASUS TUF VG27AQL1A
Case LANCOOL III
Audio Device(s) Realtek® ALC4080 Codec + Philips SHP9500
Power Supply Seasonic GX-1000
Mouse G502 Proteus Spectrum
Keyboard ASUS CERBERUS
Software Windows 10
I'm disappointed that DLSS wasn't used in this review for such a poorly performing title (no wonder - it's Unreal Engine).
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,976 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I think that there is something wrong with the RT ON tests here for AMD
Definitely. I clean installed the newest drivers and ran the game. Not sure what else I could do
 
Joined
Jan 9, 2023
Messages
336 (0.46/day)
Just watched the HUB video; the differences in results are actually very stark. I'm honestly not sure what to believe here.
Not that I'm faulting anyone, but something is seriously different somewhere.
If I had to point at something, I'd wager it's the CPU; 13900K vs 7700X is a major difference. I'm not so sure I agree with Steve's "I tested the 7900XTX and the 4090 with the 13900K and saw comparable performance". Perhaps some more testing between CPUs is necessary.
 
Joined
May 31, 2016
Messages
4,447 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
@W1zzard , remake the test please

Seen that one. Steve said (and showed, which was astonishing) that graphics cards should have at least 12GB of VRAM if you want to play at serious resolutions and detail levels with RT. It was so damn weird seeing the 3060 overtake the 3080 and 3070 due to VRAM, even though the FPS was low.
The results here and on HWUB are strangely different, though, and not by a few % but by a noticeable margin. Maybe it is due to the 13900K Wizz used, but from what Steve said there is, or should be, no difference according to his findings.
What was also weird: the 7900XTX and 7900XT on top of the chart for 1080p with RT enabled at Ultra quality.
Simply put, the game needs some improvement here and there.
One more thing pops into my head: Intel and NVidia had a game-ready driver and AMD didn't. I wonder whether the driver, when released, will change anything noticeably.
I also wonder if there is a difference between ReBAR on and off for platforms that support it.
 
Joined
May 31, 2016
Messages
4,447 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Hardware Unboxed used a 7700X as their CPU.
Yes, and Steve mentioned it did not bring any significant improvement in FPS when he used the 13900K with a few GPUs. Maybe at some point there will be a follow-up video and the 13900K will be used for comparison.
 
Joined
Feb 23, 2019
Messages
6,113 (2.86/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3600 CL14
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
So Hogwarts is a resource hog - who would have thought.

It looks like a fun game, but optimization doesn't seem to have been a priority here. It sure makes the new GPUs look good.

@W1zzard any reason for not including the 3080 Ti in the performance charts? Clearly you can't extrapolate the results between the 3080 and 3090 due to the difference in VRAM.
 
Joined
Feb 11, 2009
Messages
5,582 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Jeez, did they follow Nvidia's guidance on implementing RT and stick with it...
This game is on consoles too, right? And it has RT there as well, right? So what gives?
 
Joined
Jun 20, 2005
Messages
80 (0.01/day)
Location
Leeds, UK
System Name My PC
Processor 6700K @ 4.5GHz
Motherboard GigaByte GA-Z170XP-SLI
Cooling Pure Rock 2 + 4 Fans
Memory 2 x 16GB Corsair 3200MHz DDR4
Video Card(s) MSI RX 6900 XT Gaming X Trio
Storage PNY CS3030 NVMe 1TB, MX500 2TB x 2, 3TB WD Blue
Display(s) 27" curved 165Hz VA 1080p (Gigabyte)
Case Corsair 200R
Audio Device(s) Creative X4, AVR + Monitor Audio MASS 5.1
Power Supply Corsair RM750
Mouse Deathadder 2
Keyboard Xtrfy K4
Software W10 Pro
Benchmark Scores 14k1 (ish) Timespy (20k2 gfx 5k2 cpu)
The HUB 1080p Ultra RT results show the Radeons in a far better light than here, massively so. It seems the area used for testing is as critical as the choice of graphics card. Further evidence the game's just broken.
 
Joined
Jan 14, 2019
Messages
12,768 (5.86/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
What is going on here? The A770 is on par with the 3090 in RT and much faster than the 7900XTX?! Bruh, this is some wild crap.
It is indeed. Intel's Alchemist is extremely good at ray tracing... not so much at anything else, unfortunately. Its drivers are also kind of crap.

HWiNFO64's monitoring is a bit extensive (it can lower FPS a little) when used with RTSS (RivaTuner Statistics Server).
Or, if you have an AMD Radeon, use the performance metrics section of the Adrenalin control panel.
Also, for Nvidia, MSI Afterburner includes RTSS.
Those tools (or any tool, in fact) only show VRAM allocation, as far as I know. They cannot differentiate between VRAM used for assets on screen and VRAM used to store extra data for later.
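The allocation-versus-working-set distinction can be illustrated with a toy model (a hypothetical class with made-up numbers; real drivers don't expose per-asset residency, which is exactly the point):

```python
class VramPool:
    """Toy VRAM pool: a game allocates memory for many assets, but only
    touches some of them in any given frame. Monitoring tools can see the
    pool size, not the per-frame working set."""

    def __init__(self):
        self.allocated = {}              # asset name -> size in MiB
        self.touched_this_frame = set()  # assets actually read this frame

    def allocate(self, asset, mib):
        self.allocated[asset] = mib

    def use(self, asset):
        self.touched_this_frame.add(asset)

    def reported_usage_mib(self):
        # What HWiNFO/Adrenalin-style tools report: total allocation.
        return sum(self.allocated.values())

    def working_set_mib(self):
        # What actually had to be resident this frame: not externally visible.
        return sum(self.allocated[a] for a in self.touched_this_frame)

pool = VramPool()
pool.allocate("castle_textures", 4096)
pool.allocate("prefetched_village", 2048)  # streamed in for later use
pool.use("castle_textures")

print(pool.reported_usage_mib())  # 6144 -- looks like 6 GiB "used"
print(pool.working_set_mib())     # 4096 -- only 4 GiB needed right now
```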

People are quick to jump to conspiracy on the internet; the fact that the 3090 is barely faster than the A770 in 4K RT means something is not working right.
Do people seriously think that the A770 is supposed to match a 3080 Ti in 4K RT?
On top of that, the RT in this game is nothing special for it to run this poorly.
View attachment 283140

Edit: Apparently HUB also found a menu bug in this game.
Hardware Unboxed: "OMG! I solved the issue, it's not a Ryzen bug but rather a menu bug. Although DLSS and all forms of upscaling were disabled & greyed out in the menu, frame generation was on for just the 40 series. I had to enable DLSS, then enable FG, then disable FG and disable DLSS to fix it!" / Twitter
So with a 40-series card, frame generation is on whether you want it or not... that's shady! Very shady! No wonder 40-series cards do so much better in reviews. :shadedshu:
 
Joined
Mar 20, 2008
Messages
898 (0.15/day)
System Name Raptor
Processor Core i7 13700K
Motherboard MSI Z690 Tomahawk WiFi
Cooling ArcticFreezer 420
Memory Corsair VENGEANCE® 32GB (2x16GB) DDR5 5600MHz C36
Video Card(s) Palit GameRock 3080Ti OC
Storage M.2 Addlink S70 Lite , Samsung SSD 980 PRO 2TB, SanDisk Ultra II 480GB, 1TB seagate
Display(s) ASUS TUF VG27AQL1A
Case LANCOOL III
Audio Device(s) Realtek® ALC4080 Codec + Philips SHP9500
Power Supply Seasonic GX-1000
Mouse G502 Proteus Spectrum
Keyboard ASUS CERBERUS
Software Windows 10
Was ReBAR enabled on the Radeons? For these cards this is very important.
It's enabled.

1676031901680.png
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,976 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
any reason for not including the 3080 Ti in the performance charts? Clearly you can't extrapolate the results between the 3080 and 3090 due to the difference in VRAM.
I can only include so many cards. I didn't include the GeForce 30-series Ti models except for the 3090 Ti, because it's the fastest GeForce 30 card.
 

brichard0625

New Member
Joined
Feb 10, 2023
Messages
1 (0.00/day)
And this is why I don't trust commercial reviewers when they post their results. Here's a video of my 7900XTX (default settings, no OC): at 1440p I was averaging 50 FPS, which is right below the 4080. How you're getting results in the teens, I don't understand.
Watch user videos with stats running. Tomorrow I'll be posting a 4K RT Ultra video comparing default vs OC. At 4K the 7900XTX gets around 20 FPS outside; inside, the FPS shoots up to about 45.
 
Joined
Dec 28, 2012
Messages
3,977 (0.91/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
This game's all over social media today for all the wrong reasons, but one thing stood out: one user found .ini files you can customise the ray tracing in, and found the defaults to be absolute garbage for both performance and quality.

The FB screenshots couldn't really show much of a difference, but the FPS values were a lot higher with his changes (and since that's some random FB user, I'm sure better guides will exist elsewhere soon enough).
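For reference, tweaks like that usually go through Unreal Engine's console variables in an `Engine.ini` file. The following is a hypothetical sketch using standard UE4 ray-tracing cvars; the actual variables and values that user changed weren't shown, and the game may override some of these:

```ini
[SystemSettings]
; Cap how rough a surface can be and still receive ray-traced reflections
r.RayTracing.Reflections.MaxRoughness=0.4
; Rays per pixel for reflections and ambient occlusion
r.RayTracing.Reflections.SamplesPerPixel=1
r.RayTracing.AmbientOcclusion.SamplesPerPixel=1
; Render ray-traced reflections at reduced resolution
r.RayTracing.Reflections.ScreenPercentage=50
```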


High-performance storage and RAM will alleviate that issue.
Someone on DDR4-2133 with a SATA SSD would have a stutter fest, but review-level hardware with high-speed RAM and storage behind it won't suffer anywhere near as much.


And yes, it really is that drastic an issue - I fixed a friend's system with a weak 2GB GTX 960 and nearly doubled her FPS in DX12 games by OCing her RAM from 2133 to 2667. Higher VRAM is a buffer, but if a system can stream the data fast enough it's not needed (though it can cause those 1% and 0.1% lows to dip).

One of my Intel machines (i7 6700, locked to DDR4-2133) has great CPU performance but was *garbage* with a 4GB GTX 980 - lots of stuttering - all gone with an 8GB 1070. The exact opposite fix to the same problem.
If VRAM were seriously running out, even with OC'd DDR5 and Gen 5 NVMe storage you would still see major stuttering and dropped frames in the 1% lows. That card has 760GB/s of bandwidth for a reason.
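The RAM-speed argument above can be sanity-checked with rough numbers. These are nominal peak bandwidths only; real-world throughput is lower, and the PCIe link also sits in the path:

```python
def dual_channel_gb_s(mt_per_s):
    """Nominal peak bandwidth of dual-channel DDR memory in GB/s:
    2 channels x 8 bytes per transfer x transfers per second (MT/s)."""
    return 2 * 8 * mt_per_s / 1000

# Time to stream a 2 GB burst of assets out of system RAM at peak rate:
for mt in (2133, 2667, 3600):
    bw = dual_channel_gb_s(mt)
    print(f"DDR4-{mt}: ~{bw:.0f} GB/s peak, 2 GB burst in ~{2000 / bw:.0f} ms")
# -> ~59 ms at DDR4-2133 vs ~35 ms at DDR4-3600
```

At 60 FPS a frame is ~16.7 ms, so a multi-GB streaming burst spans several frames either way; faster RAM just shortens the stall, which is consistent with the 2133-to-2667 overclock helping the 1% lows.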

The HUB 1080p Ultra RT results show the Radeons in a far better light than here, massively so. It seems the area used for testing is as critical as the choice of graphics card. Further evidence the game's just broken.
I think that's a big part of it. Getting consistent results is like pulling teeth.
 
Joined
Apr 7, 2016
Messages
68 (0.02/day)
The moment you hand over $2k for a 4090, it's being sold as a high-refresh-rate 4K gaming card. It should stay that way for years to come.
 
Joined
Jan 14, 2019
Messages
12,768 (5.86/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
The moment you hand over $2k for a 4090, it's being sold as a high-refresh-rate 4K gaming card. It should stay that way for years to come.
It should. But it doesn't. Therein lies Nvidia's marketing power.
 