Hogwarts Legacy is one of the best movie/book-to-game adaptations I've encountered in a long time. Not only did the developers get Hogwarts Castle right, they were also able to fill the world with lots of unique and entertaining places that are worth visiting. What's even more important is that these places are believable within J.K. Rowling's Harry Potter universe and don't feel like the devs went crazy with their own ideas. The story, while fairly linear, is decent, and there are plenty of ways to develop your character. There's also a good number of side quests available and plenty to explore and discover. It's not a looter shooter, so there won't be tons of gear to compare and sort, but there's enough to customize your character the way you like. It's everything a Harry Potter fan could have asked for. Even if you're not particularly into Harry Potter but enjoy the magic/fantasy/wizardry concept, you should definitely take a look; check out the numerous game reviews online.
In terms of graphics, Hogwarts Legacy can impress, too. I would rate them "good" to "very good," definitely better than most games out there. Avalanche Software (not to be confused with Avalanche Studios of "Just Cause" fame) used Unreal Engine 4 to create their masterpiece. They opted to use DirectX 12 exclusively, which makes sense, considering they are implementing technologies such as ray tracing, which requires DX12, alongside the various upscalers. Pretty much everything is there in terms of graphics technologies: DLSS, DLSS 3 Frame Generation, DLAA, FSR 2, Intel XeSS, RT Shadows, RT Reflections, RT Ambient Occlusion. If you've checked out our screenshots I'm sure you'll agree that the game looks great. I'm slightly unhappy with the cartoonish art style, which builds characters and objects from relatively simple polygon meshes, so they look lower fidelity. I actually think this is a deliberate choice to emulate the style seen in many recent Disney movies—Harry Potter is a kid's book after all.
Just like in most other recent releases, shader compilation stutter is a problem in Hogwarts Legacy. Right on startup, before the main menu, the game presents you with a "compiling shaders" screen that lasts for three to five minutes. Unfortunately, the shaders that get compiled at this time are only the most essential ones. As you progress through the game you'll encounter serious drops in framerate (to unplayable levels) for about 30 seconds. Just stop and wait for the shader compiler to finish—this breaks the immersion of course, and I wonder if it couldn't be solved more elegantly. I investigated a bit further: the game keeps compiling shaders in the background during normal, stutter-free gameplay, too, without affecting the game much. We've seen such problems in many games recently, and I keep hearing how they will get fixed in the "day one" patch, "soon" after launch, maybe with the next big patch, or with the next DLC. For nearly all games these issues never get fixed, and there are never any noteworthy performance improvements, so /doubt. My recommendation is to just live with it: stop playing for a few seconds when you get hit by the stutter, take a short break, and resume when things are smooth again. Just to clarify, for 98% of my play time in the first few hours there were no stutter or frame drop issues; everything was super smooth.
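For those curious what's going on under the hood, here's a minimal sketch of the general pattern engines use for this, assuming a simple job queue and a single worker thread (names like ShaderJob and ShaderCache are mine for illustration, not anything from the game): compile the most critical pipeline states during the startup screen, then push the rest onto a background thread so gameplay can start sooner. Hitches happen whenever the renderer needs a pipeline the background worker hasn't reached yet.

```cpp
// Illustrative sketch only -- not Avalanche's actual code.
#include <iostream>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

struct ShaderJob { std::string name; bool essential; };

class ShaderCache {
public:
    void Enqueue(ShaderJob job) {
        std::lock_guard<std::mutex> lock(mutex_);
        queue_.push(std::move(job));
    }

    // Blocking pass shown on the "compiling shaders" startup screen:
    // only the essential pipelines get built here.
    void CompileEssential() {
        std::lock_guard<std::mutex> lock(mutex_);
        std::queue<ShaderJob> rest;
        while (!queue_.empty()) {
            ShaderJob job = queue_.front();
            queue_.pop();
            if (job.essential) Compile(job); else rest.push(std::move(job));
        }
        queue_ = std::move(rest);
    }

    // Background pass that runs during gameplay; a stutter corresponds to
    // the renderer waiting on a pipeline this worker hasn't compiled yet.
    void StartBackgroundWorker() {
        worker_ = std::thread([this] {
            for (;;) {
                ShaderJob job;
                {
                    std::lock_guard<std::mutex> lock(mutex_);
                    if (queue_.empty()) break;
                    job = queue_.front();
                    queue_.pop();
                }
                Compile(job);
            }
        });
    }

    void Join() { if (worker_.joinable()) worker_.join(); }

private:
    void Compile(const ShaderJob& job) {
        // Stand-in for the real driver call (e.g. building a DX12 PSO).
        std::cout << "compiled " << job.name << "\n";
    }

    std::queue<ShaderJob> queue_;
    std::mutex mutex_;
    std::thread worker_;
};

int main() {
    ShaderCache cache;
    cache.Enqueue({"ui_pipeline", true});
    cache.Enqueue({"terrain_pipeline", false});
    cache.Enqueue({"water_pipeline", false});
    cache.CompileEssential();       // the startup "compiling shaders" screen
    cache.StartBackgroundWorker();  // the rest compiles while you play
    cache.Join();
    return 0;
}
```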
Even without shader trouble, the performance requirements of Hogwarts Legacy are very high. In order to reach 60 FPS at 1080p Full HD you need a GeForce RTX 2080 Ti, RTX 3070, or Radeon RX 6700 XT. For 1440p gaming, an RTX 3080 or Radeon RX 6800 XT is required to achieve 60 FPS and beyond. If you're gaming at 4K, then you'll need an RTX 4090 or RX 7900 XTX. Don't believe any benchmarks that are run inside the castle, where there isn't much to see and view distances are short. We've done our benchmark runs in the open-world areas, where FPS is much lower. I picked a test scene that's demanding, but not worst case. Such high hardware requirements are fairly similar to what we saw in Dead Space and Forspoken, and I'm starting to wonder if we'll even see any well-optimized games this year. Lowering the settings is possible, but the range is rather small: going from "Ultra" to "Low" you'll gain 10 to 20%, which isn't much. Sure, the game looks great on lowest settings, but I don't think that's the point of lower settings; rather, they should offer a way for the game to perform well on lower-end PCs, even if it means compromising on graphics quality—the gameplay will still be good and enjoyable.
As mentioned before, ray tracing is supported in Hogwarts Legacy, too. The developers added three RT options that can be toggled individually: RT Shadows, RT Reflections and RT Ambient Occlusion. While RT Reflections definitely look flashy inside Hogwarts, because the floors are super reflective, they do very little in the open world and in other areas. It seems the devs spent only a bit of time getting them added to the starting areas and then didn't bother finishing the rest of the map. One example: water in lakes and rivers is not enabled for RT reflections. The shadows are a bit hit and miss, RT or not. While they look pretty realistic most of the time, there are situations where things look terribly wrong. If you turn on ray tracing here, which promises physically accurate rendering, things don't magically turn perfect. Rather, things just look different, and I often wondered, "doesn't RT-off look better here?" Games are still developed with non-ray-traced rendering as the baseline, and the map designers have decades of experience building believable and immersive worlds without ray tracing, so it'll take time for us to see big improvements.
In terms of performance, the hit from RT is just shocking. Basically your performance gets cut in half on modern NVIDIA GPUs, and on AMD things just fall apart, with not a single Radeon card reaching 30 FPS at 1080p (!) and sub-10 FPS at 4K. It seems nobody tested the game on Radeon with ray tracing, because they would have noticed the white trees that only turn to their correct green color once you walk up to them. Let's hope that a new driver from AMD can fix that. While Intel has a game-ready driver available, neither AMD nor NVIDIA has released drivers for Hogwarts Legacy, which is a huge shame, considering this is a big AAA title published by Warner Brothers. Sure, you can make the argument that the game isn't officially released yet and that only the Deluxe Edition grants three-day early access—yet there are hundreds of thousands of people playing right now. Not the best support from AMD and NVIDIA for customers who spend hundreds of dollars (or more) on expensive GPUs.
The ray tracing performance hit can be reduced by lowering the ray tracing quality setting from its "Ultra" default, which helps a lot, especially on AMD. The RX 7900 XTX can then reach 60 FPS at 1080p with ray tracing set to "Low." Not impressive, but still a huge improvement over 28 FPS. In terms of VRAM usage, Hogwarts Legacy sets a new record: we measured 15 GB of VRAM allocated at 4K with RT, and 12 GB at 4K with RT disabled. While these numbers seem shockingly high, you have to put them in perspective. Due to the high performance requirements you'll definitely not be gaming on a sub-16 GB card at 4K. 8 GB+ at 1080p is certainly a lot, but here, too, owners of weaker GPUs will have to dial down their settings anyway. What's always an option is to use the various upscaling methods like DLSS, FSR and XeSS, which also lower VRAM usage by running at a lower internal rendering resolution.
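To give a rough idea of why upscalers help with VRAM: most resolution-dependent render targets are allocated at the internal resolution rather than the output resolution. The little sketch below uses the commonly documented per-axis scale factors for the Quality/Balanced/Performance presets (roughly 67/58/50%); how much of a game's VRAM footprint is actually resolution-dependent varies, so treat the numbers as illustrative only.

```cpp
// Rough sketch: internal render resolution per upscaler preset at 4K output.
#include <cstdio>

int main() {
    const int outW = 3840, outH = 2160;  // 4K output resolution
    const struct { const char* mode; double scale; } modes[] = {
        {"Quality",     0.667},   // ~2560x1440 internal at 4K
        {"Balanced",    0.580},
        {"Performance", 0.500},   // exactly quarter the pixels
    };
    for (const auto& m : modes) {
        const int w = static_cast<int>(outW * m.scale);
        const int h = static_cast<int>(outH * m.scale);
        // The pixel-count ratio roughly tracks the saving on
        // resolution-dependent buffers (G-buffer, depth, etc.).
        std::printf("%-12s renders at %dx%d (%.0f%% of native pixels)\n",
                    m.mode, w, h,
                    100.0 * (double(w) * h) / (double(outW) * outH));
    }
    return 0;
}
```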
If all of this makes you think that Hogwarts Legacy is a terrible game, then you're completely wrong. I'm loving every minute of it and can highly recommend it, despite all the technical difficulties. As mentioned before, I wouldn't bet on a future patch being able to bring big improvements; maybe new drivers could help, but that's not guaranteed.
Update Feb 10: The game is now released for everyone and there's a new patch. I've tested the patch on the RTX 4090 and RX 7900 XTX, with RT on and off, and there's no change in performance. The DLSS 3 menu bug is also not fixed.