Friday, April 8th 2022

First Game Test With the Ryzen 7 5800X3D Appears as Promised

XanxoGaming has now posted its first game benchmark with the Ryzen 7 5800X3D, paired with an NVIDIA GeForce RTX 3080 Ti Founders Edition. They put it up against an Intel Core i9-12900KS and Core i9-12900K. However, as you might have deduced from the headline of this news post, so far they've only run a single game, but they are promising to deliver more results shortly. That single game is Shadow of the Tomb Raider at 720p using low settings, which means this is a far cry from a real-world scenario, but it does at least give a first taste of what's to come. For whatever reason, the Core i9 systems are using an NVIDIA GeForce RTX 3090 Ti, and the CPUs are paired with DDR5 memory rated at 4800 MHz CAS 40. The Ryzen 7 5800X3D has been given another pair of 8 GB modules, so it's now using dual-rank memory, but still at 3200 MHz and CAS 14.

In their test, the Core i9-12900K averages around 190 FPS, which they place as their baseline. The Core i9-12900KS manages around 200 FPS, or a bit over a five percent improvement. These benchmark numbers are provided by CapFrameX, which claims that due to the low resolution used, the GPU doesn't really matter, and although it's not an apples-to-apples comparison, it's very close. So what about the Ryzen 7 5800X3D? Well, it gets an average of 231 FPS, which is a bit odd, since the Intel CPU results are rounded and the AMD one is not. Regardless, that's over a 20 percent increase over the Core i9-12900K and over 15 percent over the Core i9-12900KS. XanxoGaming is promising more benchmarks, and according to the publication those will be delivered at 1080p with Ultra settings. In other words, this is still not what most of us have been waiting for.
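As a quick sanity check on those percentages, here is a minimal sketch using the rounded average FPS figures quoted above (the exact frame data is XanxoGaming's, so treat these as approximations):

```python
# Rough uplift math using the average FPS figures quoted in the article.
results = {
    "Core i9-12900K": 190,   # baseline
    "Core i9-12900KS": 200,
    "Ryzen 7 5800X3D": 231,
}

baseline = results["Core i9-12900K"]
for cpu, fps in results.items():
    uplift = (fps / baseline - 1) * 100
    print(f"{cpu}: {fps} FPS ({uplift:+.1f}% vs. 12900K)")

# 5800X3D vs. 12900KS: (231 / 200 - 1) * 100 ≈ +15.5%
```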
Source: XanxoGaming

109 Comments on First Game Test With the Ryzen 7 5800X3D Appears as Promised

#77
mrthanhnguyen
chrcolukDDR5 needs 5200CL30.
Wrong, at least 6400 CL30 1T or 7000 CL30 2T.
Posted on Reply
#78
Space Lynx
Astronaut
puma99dk|Me want :roll:

awwww :love::love::love::love::love::love:
Posted on Reply
#79
Block10
Brothers, there is something I don't understand about the leaked 5800X3D test. According to the leaker, he tested Shadow of the Tomb Raider at 720p, low graphics, and there is a difference between the 12900K, which with DDR5 gives 190 FPS with a 3090 Ti, and the AMD one, which gives 231 FPS with a 3080 Ti. However, how is it explained that my 12600K, with a 3070 and DDR4 3600 MHz on a B660 board, can achieve 201 FPS in that test? I'm attaching a capture of my test.
Posted on Reply
#80
NDown
MelvisSo? 1. 8 cores is plenty for gaming
lol, ive seen this baaaaack in the good ole days of 3570K vs FX 8350 daaays

How the tables have turned
Posted on Reply
#81
Shou Miko
Block10Brothers, there is something I don't understand about the leaked 5800X3D test. According to the leaker, he tested Shadow of the Tomb Raider at 720p, low graphics, and there is a difference between the 12900K, which with DDR5 gives 190 FPS with a 3090 Ti, and the AMD one, which gives 231 FPS with a 3080 Ti. However, how is it explained that my 12600K, with a 3070 and DDR4 3600 MHz on a B660 board, can achieve 201 FPS in that test? I'm attaching a capture of my test.
You need to take into account that most "normal" DDR5 buyers don't know about timings and latency and just buy a generic set of memory. We've seen before that lower-speed, bad-timing memory doesn't benefit Intel's 12th-gen lineup, and overall prices make DDR4 at 3200 CL16 or 3600 CL18 a much better buy than the cheapest DDR5.

Intel also said they would strongly advise against using DDR4 on their 13th-gen CPUs even though they support it. What's up with that I'm not sure; maybe they just want to steal your kidney, I don't know, but that's Intel in a nutshell.
CallandorWoTawwww :love::love::love::love::love::love:
I made this as a joke, but if AMD really hits it off with their V-Cache, it should be good for transcoding with the extra cache and other work-related stuff.
Posted on Reply
#83
Yraggul666
Going to wait for the real benchmarks......i still want it tho.
And yes, i know it's probably not worth the upgrade over the normal 5800X...
Posted on Reply
#84
thegnome
It's just like pre-orders... Wait for the real benchmarks to come out; I doubt it'll be very good in some games but much better in others. Either way, it's a very innovative product and a very interesting solution to the node-shrink issues.
Posted on Reply
#85
Valantar
puma99dk|As true as that might be, default DDR5 speeds of 5200 and 5600 MHz sadly aren't much faster than a good DDR4 kit, and that's what most pre-builds with DDR5 ship with, at least in my country.

3200MHz CL16 = 10ns
3600MHz CL18 = 10ns
5200MHz CL38 = 14.615ns

So even though the cheap DDR5 kits run at higher speeds and CL by default, the faster DDR4 sadly still wins on latency, if the calculation is right.

Link: notkyon.moe/ram-latency2.htm

While those calculations are absolutely true, they don't quite translate directly to reality in terms of real-world latencies due to the architectural/signalling differences between DDR4 and DDR5. DDR5 is quite a dramatic change compared to previous DDR generations, making comparisons more difficult than previously. (Surprisingly, LTT did a pretty decent explainer on this.) Most Alder Lake reviews showed DDR5 performing slightly better than DDR4 even in latency-sensitive workloads (and in non bandwidth sensitive workloads) despite the spec sheet seeming to indicate that it would be quite a bit worse.
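For reference, here's a minimal sketch of the first-word latency math those numbers come from (assuming the usual approximation: latency in ns = CL × 2000 / data rate in MT/s; the kits below are just the examples from the quote):

```python
# First-word latency estimate: CL cycles at the memory clock,
# which runs at half the data rate (hence the factor of 2000).
def first_word_latency_ns(data_rate_mts: float, cas_latency: int) -> float:
    return cas_latency * 2000 / data_rate_mts

kits = [(3200, 16), (3600, 18), (5200, 38)]
for rate, cl in kits:
    print(f"{rate} MT/s CL{cl}: {first_word_latency_ns(rate, cl):.3f} ns")

# 3200 MT/s CL16: 10.000 ns
# 3600 MT/s CL18: 10.000 ns
# 5200 MT/s CL38: 14.615 ns
```

That said, as noted above, these spec-sheet figures don't map one-to-one onto real-world DDR4 vs. DDR5 latency.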
Posted on Reply
#86
SL2
Block10However, how is it explained that my 12600K, with a 3070 and DDR4 3600 MHz on a B660 board, can achieve 201 FPS in that test? I'm attaching a capture of my test.
AGAIN, the OP has already explained that..
Posted on Reply
#87
Shou Miko
ValantarWhile those calculations are absolutely true, they don't quite translate directly to reality in terms of real-world latencies due to the architectural/signalling differences between DDR4 and DDR5. DDR5 is quite a dramatic change compared to previous DDR generations, making comparisons more difficult than previously. (Surprisingly, LTT did a pretty decent explainer on this.) Most Alder Lake reviews showed DDR5 performing slightly better than DDR4 even in latency-sensitive workloads (and in non bandwidth sensitive workloads) despite the spec sheet seeming to indicate that it would be quite a bit worse.
Still, other reviewers like Gamers Nexus show that DDR4 can still give DDR5 on the Alder Lake platform a run for its money.
Posted on Reply
#88
Valantar
puma99dk|Still, other reviewers like Gamers Nexus show that DDR4 can still give DDR5 on the Alder Lake platform a run for its money.
It absolutely can, especially with higher clocked RAM - but that's mostly due to DDR4 being extremely mature and DDR5 being the opposite. Current DDR5 clocks are very low even compared to the current range of JEDEC speed specs, let alone the planned JEDEC range expansion (up to 8400 IIRC). Once DDR5 matures a bit it'll leave DDR4 behind properly. I mean, we're barely seeing better-than-JEDEC timings on the market at all.
Posted on Reply
#89
Makaveli
Yraggul666Going to wait for the real benchmarks......i still want it tho.
And yes, i know it's probably not worth the upgrade over the normal 5800X...
If you are gaming at 1440p and above, it most likely won't be worth it.
Posted on Reply
#90
medi01
TheinsanegamerNThey test at 720p because this is supposed to show the sheer difference between cpus
Which was proved to be utter nonsense that doesn't really work.
Posted on Reply
#91
lexluthermiester
medi01Which was proved to be utter nonsense that doesn't really work.
And how do you arrive at that conclusion?
Posted on Reply
#92
Melvis
mahoneyIt's about the price; also I'm pretty sure this is gonna be a limited run just like the 3300X was, which will drive the price even higher.
Who knows? Until it's actually in stores you're all guessing, but for me it's a no-brainer, cheaper in every way. Have you seen the prices in Aus?
NDownlol, ive seen this baaaaack in the good ole days of 3570K vs FX 8350 daaays

How the tables have turned
Name a game that uses all 16 Threads

The 8350 (well, 8-core CPUs) ended up being the better choice, it seems.
Posted on Reply
#93
medi01
lexluthermiesterAnd how do you arrive at that conclusion?
The idea is flawed to begin with, as that is not how CPU power allocation works when game studios develop games. It is also not how scaling works; you can run into faux constraints that developers simply didn't give a f*** about, and all sorts of flukes.

As for "but what about in practice": we had Ryzen vs Intel, where Intel was showing an "edge" at low resolutions, which was supposed to hint at a huge advantage in upcoming games, but when said games arrived, its advantage evaporated.


High-FPS, low-resolution testing makes sense when applied to ACTUAL USE CASES such as competitive shooters, but then it's never 720p to begin with.

Oh, and it is also not Tomb Raider.
Posted on Reply
#94
SL2
medi01As for "but what about in practice": we had Ryzen vs Intel, where Intel was showing an "edge" at low resolutions, which was supposed to hint at a huge advantage in upcoming games, but when said games arrived, its advantage evaporated.
I wonder if there are any tests that focus on this: run 2018 games with CPUs and GPUs from the same year, then use a new GPU with the old CPUs and run 2021 games.
Posted on Reply
#95
Max(IT)
Block10Brothers, there is something I don't understand about the leaked 5800X3D test. According to the leaker, he tested Shadow of the Tomb Raider at 720p, low graphics, and there is a difference between the 12900K, which with DDR5 gives 190 FPS with a 3090 Ti, and the AMD one, which gives 231 FPS with a 3080 Ti. However, how is it explained that my 12600K, with a 3070 and DDR4 3600 MHz on a B660 board, can achieve 201 FPS in that test? I'm attaching a capture of my test.
Just wait for the real benchmarks on TPU or Guru3D.
Posted on Reply
#96
NDown
MelvisName a game that uses all 16 Threads
yeah well, that's the kind of response Intel die-hards always gave up until Coffee Lake released
MelvisThe 8350 (well, 8-core CPUs) ended up being the better choice, it seems.
lol, can't believe I just saw "Better Choice" and "8350" in the same sentence :D
Posted on Reply
#97
kapone32
MelvisWho knows? Until it's actually in stores you're all guessing, but for me it's a no-brainer, cheaper in every way. Have you seen the prices in Aus?


Name a game that uses all 16 Threads

The 8350 (well, 8-core CPUs) ended up being the better choice, it seems.
Ashes of the Singularity
Posted on Reply
#99
Melvis
NDownyeah well, that's the kind of response Intel die-hards always gave up until Coffee Lake released



lol, can't believe I just saw "Better Choice" and "8350" in the same sentence :D
Well, if you knew me then you wouldn't be saying that :laugh:

I'll just leave this here for you ;) www.youtube.com/c/RATechYT/videos
kapone32Ashes of the Singularity
You mean Ashes of the Benchmark, right? Come on, you gotta do better than that... :slap:
Posted on Reply