
RTX 4090 & 53 Games: Ryzen 7 5800X vs Ryzen 7 5800X3D

Good way to explain it. For some time it sat at the top of the FPS charts as well, but the new gen has outdone it there - at much higher wattages, and without always keeping those lows up.

I'm expecting mine to age like the old i7 920 did, or the golden-era K chips (2500K through 4790K), whose owners are still able to game on them today.

I immediately saw this correlation with my old X58 board and the 6-core 32nm chips I was using in it. I'll be gaming on this 3D for a few years or more, doing only GPU upgrades.
 
Thanks for the AM4 parting gift, AMD! I sold my 5800X for $250, bought the 5800X3D for $450, and got Uncharted for free, LOL. So a drop-in CPU upgrade for $150 with this kind of increase in performance? Wow. AM4 is truly a legend. Now if only AMD could get the price of B650 boards in line with B550 boards. With the new 7000-series pricing we would be back on the path of value once again.
 
Are the game settings listed anywhere?
 
Traditionally they're run at the highest in-game preset, but it may not be specified in this article.

@W1zzard ?
Yeah, it's max settings pretty much; some games run slightly higher than the ultra profile equivalent when that leaves some settings not maxed out. In some games I turn off things that are pretty much vendor-specific, like HairWorks in The Witcher.
 
I just picked up a 5800X3D today but probably won't install until Monday.

 
ONE OF US


Make sure to run a benchmark in something that shows you the minimum FPS values before and after - and don't forget a CMOS clear and 'load optimised defaults'; settings get weirdly sticky when changing Ryzen CPUs at times.
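If you'd rather crunch the lows yourself instead of trusting a single overlay number, something like this works on a frame-time log. A minimal sketch, assuming a CSV export of frame times in milliseconds - the `MsBetweenPresents` column name is what PresentMon-based tools like FrameView use, and the filename is just a placeholder; adjust both for your capture tool:

```python
# Minimal sketch: compute average FPS and the 1% / 0.1% lows from a
# frame-time log (CSV with one frame time in milliseconds per row).
import csv
import statistics

def fps_stats(path, column="MsBetweenPresents"):
    with open(path, newline="") as f:
        frametimes_ms = sorted(float(row[column]) for row in csv.DictReader(f))
    n = len(frametimes_ms)
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    # "1% low" = FPS equivalent of the frame time at the 99th percentile,
    # i.e. the boundary of the slowest 1% of frames.
    p99 = frametimes_ms[min(n - 1, int(n * 0.99))]
    p999 = frametimes_ms[min(n - 1, int(n * 0.999))]
    return avg_fps, 1000.0 / p99, 1000.0 / p999

avg, low1, low01 = fps_stats("before_swap.csv")  # hypothetical filename
print(f"avg {avg:.1f} fps | 1% low {low1:.1f} fps | 0.1% low {low01:.1f} fps")
```

Run it once on a log from the 5800X and again after the swap, and you get directly comparable numbers.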
 
The 5800X is currently PBO tuned.

I will run a couple of benchmarks before doing a CMOS reset.

Then I'll drop the new chip in and retest.
 
The good thing about doing this on the same board and BIOS version is that I was able to save profiles for each CPU in the BIOS.

So I have 4 profiles currently:

1. 5800X stock DOCP
2. 5800X tuned - memory sub-timings and PBO
3. 5800X3D stock DOCP
4. 5800X3D tuned - memory sub-timings and PBO

Switching CPUs resets the BIOS to defaults, so I can just load whichever profile I need after replacing the CPU.

Right now I'm running a -20 PBO offset per core on the 5800X3D. I can probably get more aggressive (-25 or -30), but I'll need to test per core, so this is a good start for now.
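For the per-core testing, the proper tools are things like CoreCycler, OCCT, or Prime95, but a quick first-pass smoke test can be scripted. A rough sketch, assuming the third-party psutil package (Linux/Windows) - this only catches gross instability, not the marginal errors a real stress test finds:

```python
# Per-core smoke test for Curve Optimizer tuning: pin a fixed
# floating-point workload to each logical core in turn and check the
# result is bit-identical every run. A mismatch (or a crash) suggests
# that core's offset is too aggressive. Not a substitute for CoreCycler.
import math
import psutil  # third-party: pip install psutil

def workload(iterations=2_000_000):
    acc = 0.0
    for i in range(1, iterations):
        acc += math.sin(i) * math.cos(i)
    return acc

proc = psutil.Process()
reference = workload()              # baseline result, computed unpinned
cores = psutil.cpu_count(logical=True)
for core in range(cores):
    proc.cpu_affinity([core])       # pin this process to one logical core
    result = workload()
    status = "OK" if result == reference else "MISMATCH - back off the offset"
    print(f"core {core}: {status}")
proc.cpu_affinity(list(range(cores)))  # restore the default affinity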
 
My X3D (and many others) runs at -30 instantly; they do *not* behave the same as a normal CPU with those settings.

You have to add manual undervolting on top to max it out, if the board allows it (newer AGESA added it for a lot more people) - mine runs at -0.06 V before any performance hit comes in from clock stretching.
The same method on a non-3D chip would cause instability, but the X3D treats it more like a VID request than a fixed voltage: it clocks down and loses performance instead of going unstable.
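A crude way to spot that clock stretching without staring at HWiNFO's effective clocks: time a fixed workload and compare the score across offsets - the reported clock can stay the same while real throughput drops, which is exactly what stretching looks like. A minimal sketch:

```python
# Crude clock-stretching check: score a fixed single-threaded workload.
# Run this before and after changing the voltage offset; if the score
# falls while reported clocks are unchanged, the chip is stretching.
import math
import time

def score(iterations=5_000_000):
    start = time.perf_counter()
    acc = 0.0
    for i in range(1, iterations):
        acc += math.sqrt(i)
    elapsed = time.perf_counter() - start
    return iterations / elapsed / 1e6   # millions of iterations per second

runs = [score() for _ in range(5)]
print(f"best of 5: {max(runs):.2f} Miter/s")  # compare across settings
```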
 
Currently at -25 on all cores and everything is good.

I'm wondering now if I should keep my RAM at 3200 CL14 or go to a faster speed with more relaxed timings.
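The first-word latency math is simple enough to sanity-check that trade-off: latency in ns is 2000 × CL / speed in MT/s. A quick illustration (the candidate kits are just examples, not a recommendation - bandwidth still favours the higher speed, and Infinity Fabric ideally stays 1:1 with the memory clock):

```python
# First-word latency: latency_ns = 2000 * CL / (speed in MT/s)
def latency_ns(mt_s, cl):
    return 2000 * cl / mt_s

for mt_s, cl in [(3200, 14), (3600, 16), (3600, 14), (3800, 16)]:
    print(f"DDR4-{mt_s} CL{cl}: {latency_ns(mt_s, cl):.2f} ns")
# DDR4-3200 CL14: 8.75 ns
# DDR4-3600 CL16: 8.89 ns  (slightly looser, but more bandwidth)
# DDR4-3600 CL14: 7.78 ns
# DDR4-3800 CL16: 8.42 ns
```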
 
Just made the switch from PBO Tuner 2 to the newest BIOS update from Gigabyte, which enables quite a few things...
CO works fine... with -30 on all cores...
But damn, I can set a higher temperature limit and PPT. I can also disable the 4.5 GHz clock limit with that new BIOS...
And I DO NOT want to fry my CPU...
 
Just don't raise any voltages and you should be fine - the reports of dead CPUs have involved people overclocking with higher voltages, so far (other than one bad ASUS BIOS on AM5).
 
Those cases were just core voltage over 1.35 V and very high VSOC voltage.
The 5800X3D has a 1.2 V VCORE max and only 1 V VSOC. And I want to undervolt, not overclock :D
 
I have a 5600X that is always at over 90% usage playing BF2042 (I like it); would a 5800X3D help me get more FPS? I'm always at 90 to 120 FPS, with the FPS all over the place, and it's annoying.

I have a 5600X
32 GB DDR4-3600
RTX 3080
Dell 34" (3440x1440, 144 Hz)

Anyone with a 5800X3D that can share some input?

Thanks
 
What clock speed is it at when it's doing this?

HWiNFO64 while gaming, watching the realtime information, would help, as would running Cinebench R23 and comparing clock speeds vs effective clocks while the test runs.
You could be throttling and suffering from lower clock speeds, and you'd have the same problem with a new CPU.

A quick Google search shows it's actually a game bug, and new hardware would have the same issue.

Battlefield 2042 'high CPU usage' issue gets acknowledged (piunikaweb.com)

Battlefield Comms on Twitter: "With Season 3's release we've made changes to improve game performance on PC. While we're generally seeing a positive increase in performance, some of you have reported a high CPU load. Our team is working on a fix and we’ll keep you updated on our progress. https://t.co/HmzcSaVeJI" / Twitter

Performance BF2042 : battlefield2042 (reddit.com)

Steam Community :: Guide :: Reduce CPU workload (fix for 100% load)


Unfortunately there are a lot of misinformed people defending the game's CPU usage. Once you pass 50% usage you've moved past the physical cores and into SMT threads - and they perform far worse. Past 50%, your performance suffers.

Try one of the fixes in the links, or wait for them to patch it.
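You can check the physical-vs-SMT ceiling for yourself: on an 8-core/16-thread chip, "50% overall usage" can already mean every physical core is busy. A small sketch of that heuristic (it mirrors the rule of thumb above; assumes the third-party psutil package):

```python
# Overall CPU usage is reported across logical cores, so on an SMT chip
# the physical cores can all be saturated well before the meter reads
# 100%. This flags when load has likely spilled onto SMT siblings.
import psutil  # third-party: pip install psutil

physical = psutil.cpu_count(logical=False)
logical = psutil.cpu_count(logical=True)
usage = psutil.cpu_percent(interval=1.0)  # sample overall usage for 1 s

print(f"{physical} physical cores, {logical} logical threads")
print(f"overall usage: {usage:.0f}%")
if logical > physical and usage > 100 * physical / logical:
    print("past the physical-core ceiling - work may be spilling onto SMT threads")
```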
 
Good evening. I just happened to see this thread and was wondering how the 1% lows would compare against a 5600X. Is it worth the upgrade? I've got a 5600X with a 3080, and while everything runs great (minus RTX games, lol), I feel like sometimes stuff drops a lot. I also play at 1440p. I'd love to get as many years as possible out of this setup, upgrading the GPU in the future. Thanks for your time!
 

Are you asking if you should pair a 4090 with a 5600X for 1440p gaming?
 
Oh sorry, could've sworn I put 5800X3D. I meant whether it's worth upgrading from a 5600X to a 5800X3D, basically.

Depends on the games you play... My secondary computer is a 5800X paired with a 3080 Ti, so not drastically different from your setup, and it's fine... If you planned to jump up to a 7900 XTX or 4080 it would probably make more sense, but again, it depends on what games you prefer to play.
 
TPU's CPU reviews contain this information; you can usually start at the newest CPU review and find what you're looking for.

Intel Core i9-13900KS Review - The Empire Strikes Back - Minimum FPS / RTX 4090 | TechPowerUp

Basically, if you're happy with 120 FPS, the 5600X is perfectly fine with any GPU - it's only higher frame rates that need faster cores in a CPU.

Values drop slightly at higher resolutions, where GPUs drop massively - so it may be 125 down to 110 on the CPU side, while a 4090 can't manage 4K 120 in many titles without DLSS or lowering settings anyway, so it matters less.


A 20 FPS loss going to 4K, but the GPU's frame rate itself would have halved.


Avg FPS sits around the 150 mark, so these are still quite solid even for 4K 144, and these values can of course be improved upon with some basic tweaking of PBO and RAM settings - or just cap yourself to 120 FPS and enjoy smooth gameplay anyway.
 
Wow, thanks a ton! This is all I wanted to know; this is great, lol. It's a crazy good CPU, huh. It's good to know that if I upgrade to a 40-series or 50-series GPU, it's just gonna be fine. I'm more than good with 120 FPS.

Thanks for taking the time to write this elaborate response; wish you a great night!
 
The short version is that CPUs are good up to a certain frame rate, measured by the minimums - maximum and average don't matter as much if the CPU can't sustain them long-term, because it could be 5 seconds at a higher frame rate and then a rollercoaster the rest of the time.

These minimums don't change much as you go up in resolution - usually the drop is smaller than the ~20% we saw from 1080p to 4K there, but those are all modern, demanding games too.

Correctly set up RAM makes a very large difference to those FPS minimums as well, and that's something few people seem to get done correctly. You can ask for help with that if you post your full system specs over in the Ryzen Owners Zen Garden thread on the TechPowerUp Forums - ZenTimings is a fast and easy way for us to spot things set up wrong.
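To see why averages mislead, compare a steady trace against a spiky one - a toy example with made-up numbers, nothing measured:

```python
# Toy demonstration: a trace with a fast burst then oscillation can
# out-average a steady 120 FPS run while feeling far worse to play.
steady = [120] * 60                          # 60 samples locked at 120 FPS
spiky = [200] * 5 + [90, 200] * 27 + [45]    # burst, oscillation, one dip

for name, trace in [("steady", steady), ("spiky", spiky)]:
    avg = sum(trace) / len(trace)
    worst = sorted(trace)[max(0, int(len(trace) * 0.01) - 1)]
    print(f"{name}: avg {avg:.0f} fps, worst {worst} fps")
# steady: avg 120 fps, worst 120 fps
# spiky:  avg 148 fps, worst 45 fps  <- higher average, far worse lows
```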
 
Hmm, that's interesting about the RAM. Unfortunately, at the time of getting it I did not know much, so I have 2 sticks of single rank and 2 sticks of dual rank; they're all 3200 MHz CL16. I also did not know it would affect the FPS minimums. Thanks for the heads up!
 
That combination isn't terrible - it's going to be easier to run than eight ranks (4x dual rank), while performance still gets that uplift from having 4+ ranks.
 