Friday, April 8th 2022

First Game Test With the Ryzen 7 5800X3D Appears as Promised

XanxoGaming has now posted its first game benchmark with the Ryzen 7 5800X3D, paired with an NVIDIA GeForce RTX 3080 Ti Founders Edition. They put it up against an Intel Core i9-12900KS and Core i9-12900K. However, as you might have deduced from the headline of this news post, they've so far only run a single game, but are promising to deliver more results shortly. That single game is Shadow of the Tomb Raider at 720p using low settings, which means this is a far cry from a real-world scenario, but it does at least give a first taste of what's to come. For whatever reason, the Core i9 systems are using an NVIDIA GeForce RTX 3090 Ti, and those CPUs are paired with DDR5 memory rated at 4800 MHz CAS 40. The Ryzen 7 5800X3D has been given another pair of 8 GB modules, so it's now running a dual-rank memory configuration, but still at 3200 MHz and CAS 14.

In their test, the Core i9-12900K averages around 190 FPS, which they use as their baseline. The Core i9-12900KS manages around 200 FPS, or a bit over a five percent improvement. These benchmark numbers come from CapFrameX, which claims that due to the low resolution used, the GPU doesn't really matter, so although it's not an apples-to-apples comparison, it's very close. So what about the Ryzen 7 5800X3D? Well, it gets an average of 231 FPS, which is a bit odd, since the Intel results are rounded and the AMD one is not. Regardless, that's over a 20 percent increase over the Core i9-12900K and over 15 percent over the Core i9-12900KS. XanxoGaming is promising more benchmarks, which according to the publication will be delivered at 1080p at Ultra settings. In other words, this is still not what most of us have been waiting for.
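For those checking the math, the quoted uplift percentages do follow from the reported averages. Here is a quick sanity check in Python; the FPS values are simply the averages reported above, not new measurements:

```python
# Sanity check of the uplift figures quoted in the article, using the
# reported average FPS values (rounded for Intel, exact for AMD).
def uplift_pct(new_fps, base_fps):
    """Percentage improvement of new_fps over base_fps."""
    return (new_fps / base_fps - 1) * 100

fps_12900k = 190   # baseline
fps_12900ks = 200
fps_5800x3d = 231

print(f"12900KS vs 12900K:  {uplift_pct(fps_12900ks, fps_12900k):.1f}%")   # ~5.3%
print(f"5800X3D vs 12900K:  {uplift_pct(fps_5800x3d, fps_12900k):.1f}%")   # ~21.6%
print(f"5800X3D vs 12900KS: {uplift_pct(fps_5800x3d, fps_12900ks):.1f}%")  # ~15.5%
```

So "over 20 percent" and "over 15 percent" check out against the numbers as reported.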
Source: XanxoGaming

109 Comments on First Game Test With the Ryzen 7 5800X3D Appears as Promised

#1
Rares
Wow! If this is true, Intel is in trouble again. In some games/apps we'll see big improvements, about 10-15%, or even more...
#2
ratirt
I think certain games might get a boost with the new Ryzen and others might see a decrease in performance due to the lower clocks, but I guess we have to wait for some reviews. The RAM configs are a bit off here and could have been more in line between the Intel and AMD CPUs.
#3
prtskg
Good to see it wasn't just hype. What kind of difference should we expect from the RAM in this benchmark? I mean the AMD vs. Intel difference.
#4
the54thvoid
Super Intoxicated Moderator
I like a comparison made using different hardware. Very informative.

Not.
#5
Melvis
Looking forward to just dropping this into my 4-5 year old system and getting gaming performance like a boss, for cheaper than a whole new system (Intel). Talk about awesome!
#6
ratirt
MelvisLooking forward to just dropping this into my 4-5 year old system and getting gaming performance like a boss, for cheaper than a whole new system (Intel). Talk about awesome!
That is actually a good point. I just need to see if there is any benefit for me coming from my current 5800X. I play at 4K mostly. I have doubts there will be any benefit in my case, but time will tell.
#7
Dicfylac
In the meantime, here at TechPowerUp, with an RTX 3080 non-Ti:

I don't know how the guy managed to achieve such FPS.

I guess the main reason for AMD to deliver these new CPUs this late is to sell them to the guys who have the 1000 and 2000 series Ryzen platforms, like me.
#8
AVATARAT
Yeah I think that this is fake too.
#9
Chomiq
I'll wait until someone who actually knows what they're doing gets their hands on it.
#10
Unregistered
Won't be surprised if it's faster, AMD created it on purpose for this.
But I don't think it would make any difference for gaming at reasonable resolutions such as 1440p or 4K; at Full HD most people won't be using a 3090 or a 6900 XT, hence they are GPU limited anyway.
#11
ExcuseMeWtf
Will definitely wait for full tests before drawing any conclusions. Lots of conflicting info about this one.
#13
TheLostSwede
News Editor
ST.o.CHIn the meantime, here at TechPowerUp, with an RTX 3080 non-Ti:

I don't know how the guy managed to achieve such FPS.

I guess the main reason for AMD to deliver these new CPUs this late is to sell them to the guys who have the 1000 and 2000 series Ryzen platforms, like me.
Different test scene. Not hard to achieve very different results.
#14
mb194dc
ExcuseMeWtfWill definitely wait for full tests before drawing any conclusions. Lots of conflicting info about this one.
It's not conflicting, just that the extra 64 MB of L3 cache is only useful in really specific workloads.

The idea of ultra-expensive gaming CPUs only serves a niche market anyway: people who will pay 3k for a system to play at 1080p at 200+ FPS.

If you play at 4K, non-competitively, any CPU, even a 7-year-old 4790K OC, can do the job.
#15
AVATARAT
TheLostSwedeDifferent test scene. Not hard to achieve very different results.
Yep, but when someone wants to show something, they need to do it the right way, not with a random scene from nowhere.
So this screenshot gives us no real information :)
#16
ratirt
mb194dcIf you play at 4k, non competitive, any cpu, even 7 year old 4790k OC can do the job.
Yes, but you may notice some pretty low 1% low figures even at 4K in some games with the 4790K.
#17
ZeppMan217
Why wouldn't you use the in-game benchmark, which provides easily comparable results?
#18
ExcuseMeWtf
mb194dcIt's not conflicting, just that the extra 64 MB of L3 cache is only useful in really specific workloads.

The idea of ultra-expensive gaming CPUs only serves a niche market anyway: people who will pay 3k for a system to play at 1080p at 200+ FPS.

If you play at 4K, non-competitively, any CPU, even a 7-year-old 4790K OC, can do the job.
And we haven't yet exactly determined what those loads are...
#19
medi01
TheLostSwedeShadow of the Tomb Raider at 720p
-Why are you running games at 720p in 2022???
-Because at higher resolution there is barely any difference between CPUs!
-Oh... OK then...
#20
Chomiq
ZeppMan217Why wouldn't you use the in-game benchmark, which provides easily comparable results?
You can compare things that are comparable:
Different CPU, Same memory type/speed, same gpu

Here you have:
Different CPU, Different memory type/speed, different GPU
#21
medi01
ChomiqDifferent CPU, Same memory type/speed
No.
It should be the best mem type/speed that CPU supports.
#22
ZeppMan217
ChomiqYou can compare things that are comparable:
Different CPU, Same memory type/speed, same gpu

Here you have:
Different CPU, Different memory type/speed, different GPU
I somehow missed that. Makes the entire "test" rather questionable, to put it mildly.
#23
TheLostSwede
News Editor
AVATARATYep, but when someone wants to show something, they need to do it the right way, not with a random scene from nowhere.
So this screenshot gives us no real information :)
It's not random, they're using the CapFrameX software and whatever scene that uses, hence the comparison to hardware they don't have on hand.

I don't think it's a good test of this new CPU by any means, but you really can't compare with tests done by TPU.
#24
btk2k2
mb194dcIt's not conflicting, just that the extra 64 MB of L3 cache is only useful in really specific workloads.

The idea of ultra-expensive gaming CPUs only serves a niche market anyway: people who will pay 3k for a system to play at 1080p at 200+ FPS.

If you play at 4K, non-competitively, any CPU, even a 7-year-old 4790K OC, can do the job.
Until you play a properly CPU-limited game like Stellaris, CK3, Cities: Skylines, Civ 6, Old World, Factorio, etc.

Then you can be getting 2,000 FPS at 4K Ultra, but slow turn times or slow tick rates make them a pain to play on the largest maps, which totally changes the gameplay experience.
#25
SL2
Cherry-picked game or not, what about the different GPUs? If the 3D had a 3090 Ti as well, those numbers should have been even higher?

I don't believe this for a second, because if it were true, the 3D would cost more. Neither AMD nor Intel gives something extra for nothing like that...

...well, unless there are very few CPUs available, like so many have suggested; then the price can be this low.