# Ryzen 3300X vs. 5800X3D: 20 game benchmarks @ 1080p with max settings



## QuietBob (May 13, 2022)

Being a lucky owner of the latest gaming-focused processor from AMD, I thought it'd be interesting to compare it to their former budget gaming champ.

The *5800X3D* is positioned as a top-end SKU with an MSRP of $450, while the *3300X* was a low-cost option that launched at a mere $120. However, both CPUs are of special interest to gamers, since they share the efficient single-CCD, single-CCX architecture. This design maximizes the number of cores on a single die, with each of them having immediate access to the full L3 cache. As *games profit from increased L3 cache size*, this implementation provides an obvious advantage. Another *benefit for gaming* is the inherently *lower latencies*.

Obviously, the 5800X3D is based on the newer Zen 3 architecture which has seen numerous improvements over its predecessor. One huge upgrade is double the amount of L3 cache per CCX, and the addition of another 64 MB in the form of the groundbreaking 3D V-cache. That's a whopping 96 MB - six times what the 3300X has. And of course the 5800X3D has twice the number of cores and threads compared to its little brother.

So, can a humble quad, based on a three-year-old Zen 2 architecture, even hope to compete with a CPU that has been designed specifically to win the gaming crown?

In order to give the *3300X* a fighting chance, it has been tested with a 4500 MHz all-core overclock and FCLK:UCLK:MCLK running 1:1:1 at 1866 MHz. The RAM used is 2x8 GB Crucial Ballistix at CL16 with tweaked timings. The *5800X3D* boosts to 4450 MHz on all cores in games, meaning almost the same clock on both units. The newer CPU is running FCLK:UCLK:MCLK 1:1:1 at 1900 MHz, plus 2x8 GB G.SKILL Ripjaws V RAM at CL14, also tweaked. Both systems are paired with a mid-range GPU - the *Radeon RX 6600 XT*.








Since the main purpose of my testing was to *evaluate gaming performance*, I went with a selection of popular titles that are relatively easy to benchmark. I used the built-in benchmark function to try to obtain consistent results. Even though each consecutive run is not 100% identical due to some irregularity (initial actor placement, NPC AI, various random events, etc.), the results should provide a general idea of what to expect from each game - and CPU:


| *Title* | *Year of release* | *API* | *Engine* |
| --- | --- | --- | --- |
| Assassin's Creed: Valhalla | 2020 | DX12 | Anvil (new) |
| Deus Ex: Mankind Divided | 2016 | DX12 | Dawn (based on Glacier 2) |
| Far Cry 6 | 2021 | DX12 | Dunia 2 |
| Final Fantasy XV | 2018 | DX11 | Luminous |
| Forza Horizon V | 2021 | DX12 | ForzaTech |
| Gears 5 | 2019 | DX12 | UE4 |
| Gears Tactics | 2020 | DX12 | UE4 |
| Ghost Recon: Breakpoint | 2019 | Vulkan | AnvilNext (old) |
| Grand Theft Auto V | 2015 | DX11 | Rage |
| Hitman 3 | 2021 | DX12 | Glacier 2 |
| Metro Exodus Enhanced Edition | 2021 | DX12 | 4A |
| Middle-earth: Shadow of War | 2017 | DX11 | Firebird |
| Rainbow Six: Extraction | 2022 | DX12 | Anvil (new) |
| Shadow of the Tomb Raider | 2018 | DX12 | Foundation |
| The Division 2 | 2019 | DX12 | Snowdrop |
| The Riftbreaker | 2021 | DX12 | Schmetterling |
| Total War: Warhammer III | 2022 | DX11 | ? |
| Watch Dogs: Legion | 2020 | DX12 | Disrupt |

Rather than testing performance in CPU-bound but completely unrealistic 720p minimum-detail scenarios, I opted for the common *1080p resolution with details maxed out*. Note that this wasn't done by simply selecting the highest quality preset for each game, but by manually maxing out every available option, albeit with a few exceptions. Motion blur and anti-aliasing were turned off, unless the game didn't allow it - in which case the lowest value was picked. DirectX Raytracing (DXR) features were disabled. Variable Rate Shading (VRS) was enabled if it was presented as a toggle, otherwise it was left off.

All testing was done *without any form of resolution or quality scaling, uncapped minimum and maximum frame rates, and using game default FOV*. Gears 5, Middle-earth: Shadow of War and Watch Dogs: Legion had their respective UHD texture packs installed.

Would the two CPUs demonstrate any differences in such *heavily GPU-bound settings*? Instead of focusing on the often quoted average and minimum fps numbers, I decided to dig deeper. The problem with these figures is that neither of them represents the real gaming experience. For this reason I chose the metrics that most closely relate to what is being perceived as *"smooth gameplay"*:

*- median frame rate*
This is an average representing the overall feel, irrespective of outliers (extremely low or high but infrequent fps values)

*- 1% and 0.1% lows*
Severe dips in frame rate resulting in noticeable hitching

*- frame time variance*
The difference in how quickly consecutive frames are presented, where large and/or inconsistent values manifest as stuttering

I used the excellent *CapFrameX* tool to collect and aggregate the captured game data. Rather than measuring frames per second, this utility actually calculates the presented figures based on their respective frame times. Consequently, it most closely recreates the real experience. Let's look at the results:
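For anyone curious how such figures fall out of raw capture data, here's a rough sketch of deriving the median and x% low fps from a list of frame times. This is only an illustration of the general approach, not CapFrameX's actual algorithm; the function name and the synthetic sample data are my own:

```python
# Sketch: deriving fps metrics from raw frame times (in milliseconds).
# Illustrative only - not how CapFrameX computes its figures internally.

def fps_metrics(frame_times_ms):
    """Return median fps and 1% / 0.1% low fps from a list of frame times."""
    if not frame_times_ms:
        raise ValueError("no frames captured")
    n = len(frame_times_ms)

    # Sort slowest-first: the largest frame times are the worst frames.
    worst_first = sorted(frame_times_ms, reverse=True)

    def low_fps(fraction):
        # Average the worst `fraction` of frames, then convert to fps.
        count = max(1, int(n * fraction))
        avg_ms = sum(worst_first[:count]) / count
        return 1000.0 / avg_ms

    # Median frame time, converted to fps - insensitive to outliers.
    ordered = sorted(frame_times_ms)
    mid = n // 2
    median_ms = ordered[mid] if n % 2 else (ordered[mid - 1] + ordered[mid]) / 2

    return {
        "median_fps": 1000.0 / median_ms,
        "1%_low_fps": low_fps(0.01),
        "0.1%_low_fps": low_fps(0.001),
    }

# Hypothetical capture: 990 frames at ~60 fps plus ten 50 ms stutter frames.
times = [16.7] * 990 + [50.0] * 10
m = fps_metrics(times)
```

Note how the median stays near 60 fps while the lows expose the stutters - exactly why averaging alone hides a poor experience.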













Despite being one generation older, having half the number of cores/threads and a sixth of the L3 cache, the 3300X is able to keep up with its big brother! In fact, the two CPUs are quite evenly matched in twelve of the eighteen games. *Far Cry 6, Gears Tactics, Hitman 3 and Shadow of the Tomb Raider show notable improvements in 1% and 0.1% lows on the 5800X3D*, while *GTA5 and The Riftbreaker demonstrate marked gains in the 1% low and median* metrics.

But again, let's keep in mind that these tests were conducted with a *mid-range video card in largely GPU-bound scenarios*. The differences would be much more pronounced with a powerful GPU that could keep the 5800X3D fed with data. And the fps numbers don't reveal the whole story. Frame time graphs paint a more accurate picture:



In practically all games tested *the 5800X3D enables a smoother, much less juddery playing experience* - often visibly so. It goes to show that frame rates alone should not be taken to represent actual gameplay. Neither should they be used exclusively to judge a particular CPU's suitability for gaming.

Later I'm gonna post what should be the most relevant metric to a gamer: frame time variance charts, as well as some aggregated figures and other interesting statistics I gathered while testing.


----------



## phill (May 13, 2022)

Nice work man!!  I bet that took some time to set up, configure, test, get the results sorted and written up!!    I think I'm reading the results correctly, but maybe a little more definitive marking on the results to show which CPU is which.   I might be a bit tired, so apologies if it's clear as day!


----------



## elghinnarisa (May 13, 2022)

That's some good shit right there! Nice work!


----------



## eidairaman1 (May 13, 2022)

QuietBob said:


> Being a lucky owner of the latest gaming-focused processor from AMD, I thought it'd be interesting to compare it to their former budget gaming champ.
> 
> The *5800X3D* is positioned as a top-end SKU with an MSRP of $450, while the *3300X *was a low-cost option that launched at mere $120. However, both CPUs are of special interest to gamers, since they share the efficient single CCD-single CCX architecture. This design maximizes the number of cores on a single die, with each of them having immediate access to the full L3 cache. As *games profit from increased L3 cache size*, this implementation provides an obvious advantage. Another* benefit for gaming is* the inherent *lower latencies*.
> 
> ...


Sort of like the FX 8000 being better today and the Core i7 of the time being worse.


----------



## kapone32 (May 14, 2022)

The 3300X was my second favorite Ryzen CPU to own. I want a 5800X3D but can't do it just yet.


----------



## Arkz (May 14, 2022)

What's the deal with Hitman 3?


----------



## AlwaysHope (May 14, 2022)

There are differences between each setup with cpu / ram speeds & bios versions for example. Not an accurate test imo.


----------



## thesmokingman (May 14, 2022)

Bottlenecked...


----------



## tabascosauz (May 14, 2022)

Interesting test! But if you give the 5800X3D B-die, the 3300X should probably get the same even if it means running 3733 on the 5800X3D. There is an appreciable performance gulf, especially when subs/tRFC are tweaked, between B-die and Rev.E/B/CJR/DJR.


----------



## QuietBob (May 16, 2022)

Thanks for all the comments!

True, the results show a GPU bottleneck, but my goal wasn't to demonstrate it. I purposely picked the highest quality settings to see whether they'd deliver a playable frame rate on a mid-range video card in 1080p. I'm not really a fan of testing video games in the lowest resolution with minimum detail. What I wanted to see is how much improvement the new 5800X3D will bring in mostly GPU bound scenarios.

As for the differences in memory timings, I don't think using faster RAM would alter the results of the 3300X by much. Yes, tweaked B-die will show better latencies than Rev.E, but with diminishing returns for games over 1866 MHz IF. I wouldn't expect a gain of more than 2-3 fps in the 1% and 0.1% lows, with all the graphics settings maxed out.

As promised, here are the arguably most relevant charts for gaming: frame time variances. *The smaller the difference between two consecutive frames, the smoother the game feels*:
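As a minimal sketch of what these charts plot, frame time variance can be expressed as the absolute difference between each pair of consecutive frame times. The function name and sample sequences below are hypothetical, for illustration only:

```python
# Sketch: frame time variance as the absolute delta between consecutive
# frame times (milliseconds). Small deltas = smooth; large deltas = stutter.

def frame_time_variance(frame_times_ms):
    """Absolute differences between each consecutive pair of frame times."""
    return [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

# A steady ~16.7 ms cadence produces near-zero deltas...
smooth = [16.7, 16.6, 16.8, 16.7]
# ...while a single 50 ms spike shows up as two large deltas
# (one going into the slow frame, one coming back out).
stutter = [16.7, 16.7, 50.0, 16.7]
deltas = frame_time_variance(stutter)
```

Two runs can share the same average fps yet differ wildly here, which is why these charts reveal stutter the fps graphs hide.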



Again, we can see that *even in GPU limited scenarios the 5800X3D delivers more consistent frame times,* something which isn't obvious by looking at the fps charts alone.
The biggest differences are to be found in Hitman 3 - Dartmoor and The Riftbreaker. Those two scenes are very heavy on the CPU, taxing the processor with simulated physics (Hitman) or huge numbers of AI-driven units (Riftbreaker). The newer CPU does better here - mostly due to double the number of cores/threads.

Some other interesting facts for all the games tested:


| *average* | *3300X* | *5800X3D* |
| --- | --- | --- |
| CPU package power | 45 W | 55 W |
| total CPU load | 46% | 22% |
| single-threaded CPU load | 72% | 56% |
| CPU temperature (25 °C room ambient, CM 690 II case) | 55 °C (Thermalright Macho rev. C) | 58 °C (Deepcool Assassin III) |
| GPU load | 93% | 98% |
| GPU-bound time | 71% | 95% |
Overall, the 5800X3D was a nice upgrade in my eyes. I'll definitely hang on to it for a long time. If I ever feel the need to upgrade my video card, the CPU will be more than capable of keeping up with it. In all honesty, this probably won't happen soon, as I'm happy playing at 1200p60 and don't even game that often. Outside of games, the ST performance fully satisfies my needs, and the 16 threads are yet to see full utilization in what I do.


----------

