Friday, April 8th 2022

First Game Test With the Ryzen 7 5800X3D Appears as Promised

XanxoGaming has now posted its first game benchmark with the Ryzen 7 5800X3D, paired with an NVIDIA GeForce RTX 3080 Ti Founders Edition. They put it up against an Intel Core i9-12900KS and a Core i9-12900K. However, as you might have deduced from the headline of this news post, so far they've only run a single game, but they are promising to deliver more results shortly. That single game is Shadow of the Tomb Raider at 720p using low settings, which means this is a far cry from a real-world scenario, but it does at least give a first taste of what's to come. For whatever reason, the Core i9 systems are using an NVIDIA GeForce RTX 3090 Ti, and the CPUs are paired with DDR5 memory rated at 4800 MHz CAS 40. The Ryzen 7 5800X3D has been given another pair of 8 GB modules, so it's now using dual-rank memory, but still at 3200 MHz and CAS 14.

In their test, the Core i9-12900K averages around 190 FPS, which they use as their baseline. The Core i9-12900KS manages around 200 FPS, a bit over a five percent improvement. The Intel benchmark numbers were provided by CapFrameX, which claims that due to the low resolution used, the GPU doesn't really matter, and that although it's not an apples-to-apples comparison, it's very close. So what about the Ryzen 7 5800X3D? Well, it gets an average of 231 FPS, which is a bit odd, since the Intel CPU results are rounded and the AMD one is not. Regardless, that's over a 20 percent increase over the Core i9-12900K and over 15 percent over the Core i9-12900KS. XanxoGaming is promising more benchmarks, which according to the publication will be run at 1080p at Ultra settings. In other words, this is still not what most of us have been waiting for.
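For those who want to check the math, here is a minimal Python sketch using the rounded averages quoted above (the exact figures are XanxoGaming's to publish) that reproduces the percentages:

    # Reported averages from the article (FPS); the 231 for the 5800X3D is
    # as quoted, the Intel numbers are rounded approximations.
    results = {"Core i9-12900K": 190, "Core i9-12900KS": 200, "Ryzen 7 5800X3D": 231}
    base_k = results["Core i9-12900K"]
    base_ks = results["Core i9-12900KS"]
    for name, fps in results.items():
        print(f"{name}: {fps} FPS, "
              f"{(fps - base_k) / base_k:+.1%} vs 12900K, "
              f"{(fps - base_ks) / base_ks:+.1%} vs 12900KS")
    # Ryzen 7 5800X3D: 231 FPS, +21.6% vs 12900K, +15.5% vs 12900KS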
Source: XanxoGaming

109 Comments on First Game Test With the Ryzen 7 5800X3D Appears as Promised

#51
mahoney
MelvisLooking forward to just dropping this into my 4-5yr old system and get gaming performance like a boss for cheaper than a whole new system (Intel) talk about awesome!
You're still going to pay close to $500 for this and remember IT'S AN 8 CORE CPU
Posted on Reply
#52
lexluthermiester
RaresWow! If this is true, Intel is in trouble again.
Not surprising. Alder Lake only barely slid by the current Ryzens. It'll be interesting to see what Intel comes up with for Raptor Lake.
Posted on Reply
#53
SL2
mahoneyYou're still going to pay close to $500 for this and remember IT'S AN 8 CORE CPU
Well, everybody knows that he's paying for something other than core count or clock speed.

I've seen a lot of forum members having a hard time wrapping their head around this.
If he just wanted 8 cores there are definitely better choices out there lol.
Posted on Reply
#54
mahoney
MatsWell, everybody knows that he's paying for something other than core count or clock speed.

I've seen a lot of forum members having a hard time wrapping their head around this.
If he just wanted 8 cores there are definitely better choices out there lol.
I find the hypocrisy from a certain camp annoying. 4 years ago they were outraged at how Intel could sell the 9900K for $500, yet AMD is doing the same now. Fanboying over companies is never good.
Posted on Reply
#55
GaryPoisonOak
mahoneyI find the hypocrisy from a certain camp annoying. 4 years ago they were outraged at how Intel could sell the 9900K for $500, yet AMD is doing the same now. Fanboying over companies is never good.
You're right. But think of it as an early adopter's tax. If they continue to sell the 5800X3D past the launch of Zen 4, the price will drop. If Ryzen 6000 is out by Black Friday, these bad boys could be a great deal for AM4 owners this November.
Posted on Reply
#56
SL2
mahoneyI find the hypocrisy from a certain camp annoying. 4 years ago they were outraged at how Intel could sell the 9900K for $500, yet AMD is doing the same now. Fanboying over companies is never good.
lol you know that AMD pretty much invented the $1000 CPU, right? Edit: Ok, maybe not first, but at least in 2004.

You're the one who's complaining about the price, while the rest of us are waiting for better tests before judging whether it's crap or not, worth it or not.
Posted on Reply
#57
Totally
mahoneyI find the hypocrisy from a certain camp annoying. 4 years ago they were outraged at how Intel could sell the 9900K for $500, yet AMD is doing the same now. Fanboying over companies is never good.
Think people were mad because it didn't offer much over the outgoing chip and got a HEDT-like price tag/price hike on a mainstream chipset. Don't see how that compares to this $500 chip. They weren't mad at the fact it was $500, they were mad at what one was getting for $500.
Posted on Reply
#58
mechtech
RaresWow! If this is true, Intel is in trouble again. In some games/apps we'll see big improvements. About 10-15%, or even more...
Hmmm, trouble... looking at their profits, they don't seem to be in trouble...

;)

More importantly, how did "a Peruvian site called XanxoGaming" end up with a sample? Or do they normally get first dibs and blow NDAs?
Posted on Reply
#59
harm9963
Did 185 avg with same settings, so 231 vs my 185, AMD is sandbagging.
Did a mild OC, went from 185 to 194, not close to that beast.
Posted on Reply
#60
eidairaman1
The Exiled Airman
ratirtI think certain games might get a boost with the new Ryzen and others will decrease in performance due to lower clocks, but I guess we have to wait for some reviews. The RAM configs are a bit off here and could have been more closely matched between the Intel and AMD CPUs.
These 3D CPUs are for those with anything older than Gen 3 Ryzen
mahoneyI find the hypocrisy from a certain camp annoying. 4 years ago they were outraged at how Intel could sell the 9900K for $500, yet AMD is doing the same now. Fanboying over companies is never good.
Idk, I got a 5800 OEM for $262 on eBay, so base clock is only 400 MHz lower than the X model and boost is only 100 MHz less.
robbAgain all you're doing is showing how ignorant you are about CPU testing. Pull your head out of your rear end and realize that if you run it at higher resolutions you'll become GPU limited and therefore it's not even a CPU test anymore. My God, I don't understand how people like you can even find the forums.
He used to be lynx29 is why, just put em on ignore
Posted on Reply
#61
Max(IT)
Great news for people looking to play at 720p on a 3080 Ti
Posted on Reply
#62
aQi
Waiting for a head-to-head comparison with equivalent specs. This means benchers should use DDR4 with the i9 as well.
Posted on Reply
#63
TheinsanegamerN
CallandorWoTeven if it's not fake it's a stupid test... no one games at 720p... this is just dumb, and does not deserve my attention
Many could say the same thing of 4K gaming, or high refresh rate gaming. If you don't like it then feel free to ignore it and move on.
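To put the GPU-limited point from earlier in concrete terms, here's a toy Python model (all frame rates below are invented for illustration): each frame takes as long as the slower of the CPU and GPU stages, so lowering the resolution raises the GPU ceiling until only the CPU limit shows through.

    # Toy bottleneck model: frame time = max(CPU frame time, GPU frame time),
    # so observed FPS = min(CPU-limited FPS, GPU-limited FPS).
    # All numbers are hypothetical, for illustration only.
    def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
        return min(cpu_fps, gpu_fps)

    cpu_limit = 231.0  # pretend the CPU can feed 231 FPS at any resolution
    for res, gpu_limit in [("720p", 650.0), ("1080p", 380.0),
                           ("1440p", 210.0), ("4K", 95.0)]:
        print(f"{res}: {observed_fps(cpu_limit, gpu_limit):.0f} FPS")
    # 720p/1080p show the CPU's 231; at 1440p and 4K you're benchmarking the GPU.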
Posted on Reply
#64
eidairaman1
The Exiled Airman
aQiWaiting for a head-to-head comparison with equivalent specs. This means benchers should use DDR4 with the i9 as well.
ERM are you saying Ryzen 7 is equal to Core i9?
Posted on Reply
#65
aQi
eidairaman1ERM are you saying Ryzen 7 is equal to Core i9?
Lol no, I'm saying to use Z690 DDR4 motherboards, as the 5800X3D utilises DDR4. If the 12900KS is truly the world's fastest at anything, performing with DDR4 memory would not be an issue at all. This isn't Ryzen 7 vs i9. It's simply the 5800X3D's claim vs the 12900KS's claim.
Posted on Reply
#66
Chrispy_
Just waiting on 5800X vs 5800X3D like-for-like testing.

I have a 5800X and have no intention of upgrading it, even if the 5800X3D is 40% faster - simply because I'm GPU limited most of the time and the 5800X is plenty fast enough.

Whether it beats Intel or not is largely irrelevant - what matters is whether the 5800X3D is worth 50% more than a 5800X at real-world resolutions with common graphics cards.
Posted on Reply
#67
Melvis
mahoneyYou're still going to pay close to $500 for this and remember IT'S AN 8 CORE CPU
So? 1. 8 cores is plenty for gaming. 2. It's still way less than buying a whole new Intel system.
Posted on Reply
#68
SL2
harm9963Did 185 avg with same settings, so 231 vs my 185, AMD is sandbagging.
The OP has already explained why.
Posted on Reply
#69
mahoney
MelvisSo? 1. 8 cores is plenty for gaming. 2. It's still way less than buying a whole new Intel system.
It's about the price. Also, I'm pretty sure this is gonna be a limited run just like the 3300X was, which will drive the price even higher.
Posted on Reply
#70
lexluthermiester
MelvisSo? 1. 8 cores is plenty for gaming.
For gaming, yes. However, there are other tasks for which the extra cache will be useful, and a 12 or 16 core model would be very welcome!
Posted on Reply
#71
Punkenjoy
lexluthermiesterFor gaming, yes. However, there are other tasks for which the extra cache will be useful, and a 12 or 16 core model would be very welcome!
Yes, but not the common high-end desktop tasks (video editing/rendering, 3D rendering, etc.). Those see very little gain right now from the extra cache.

Some scientific simulations and similar stuff have shown gains, but those are a very niche market.
Posted on Reply
#72
lexluthermiester
PunkenjoyYes, but not the common high-end desktop tasks (video editing/rendering, 3D rendering, etc.). Those see very little gain right now from the extra cache.

Some scientific simulations and similar stuff have shown gains, but those are a very niche market.
You can't know that at the moment. The reviews and proper benchmarks have yet to be released.
Posted on Reply
#74
mrthanhnguyen
Intel is running with DDR5-4800 C40. Come on, people run it at 7000+ C30 or C32 already. And AMD is running with 3200 C14 while my AMD system runs at 3800 C13.
Posted on Reply
#75
Shou Miko
mrthanhnguyenIntel is running with DDR5-4800 C40. Come on, people run it at 7000+ C30 or C32 already. And AMD is running with 3200 C14 while my AMD system runs at 3800 C13.
As true as that might be, default DDR5 speeds of 5200 and 5600 MHz sadly ain't much faster than a good DDR4 kit, and that's what most pre-builds with DDR5 ship with, at least in my country.

3200MHz CL16 = 10ns
3600MHz CL18 = 10ns
5200MHz CL38 = 14.615ns

So for people on the cheap stuff running at default speeds and CL, DDR5 sadly ends up behind the faster DDR4 in latency, if the calculation is right.

Link: notkyon.moe/ram-latency2.htm
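If anyone wants to redo the numbers, here's a small Python sketch of the usual first-word latency formula (ns = CL x 2000 / MT/s; note that "MHz" in marketing speak is really MT/s). It matches the list above, with the article's DDR5-4800 C40 kit added for comparison:

    # First-word latency: CAS cycles / clock; the real clock in MHz is half
    # the transfer rate, hence ns = CL * 2000 / MT_per_s.
    kits = [
        ("DDR4-3200 CL16", 3200, 16),
        ("DDR4-3600 CL18", 3600, 18),
        ("DDR5-5200 CL38", 5200, 38),
        ("DDR5-4800 CL40", 4800, 40),  # the kit in the article's Intel rigs
    ]
    for name, mts, cl in kits:
        print(f"{name}: {cl * 2000 / mts:.3f} ns")
    # DDR4-3200 CL16: 10.000 ns ... DDR5-4800 CL40: 16.667 ns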

Posted on Reply