Monday, April 3rd 2023
AMD Shows More Ryzen 7 7800X3D Gaming Benchmarks
AMD has revealed more Ryzen 7 7800X3D gaming benchmarks ahead of the official launch scheduled for April 6th. AMD had previously shared results comparing this 8-core/16-thread Ryzen 7000X3D series SKU with Intel's Core i9-13900K and with its predecessor, the Ryzen 7 5800X3D, showing up to a 24 and 30 percent performance increase, respectively.
Now, a new slide has leaked online, part of AMD's Ryzen 7 7800X3D review guide, once again comparing it with the Intel Core i9-13900K, this time head to head in several more games. At 1080p resolution and high settings, the Ryzen 7 7800X3D is anywhere from 2 to 31 percent faster, though there are several games where the Core i9-13900K comes out ahead, such as CS:GO.

As noted, the Ryzen 7 7800X3D is the third SKU in the Ryzen 7000X3D series lineup, and it comes with a single CCD with 3D V-Cache, rather than the asymmetric two-CCD design of the Ryzen 9 7950X3D and Ryzen 9 7900X3D, where only one of the CCDs has the 3D V-Cache. It runs at up to a 5.0 GHz boost clock, has a total of 104 MB of L2+L3 cache, and a 120 W TDP.
The Ryzen 7 7800X3D launches at $449, which puts it at $20 more expensive than the Ryzen 9 7900 and about $100 cheaper than the 7900X. It is also around $110 more expensive than the Ryzen 7 7700X. Intel's Core i9-13900K is currently selling at $579.99, which is the lowest price in 30 days, according to Newegg.com.
Source:
Videocardz
49 Comments on AMD Shows More Ryzen 7 7800X3D Gaming Benchmarks
Understandable, I don't blame you. Time will tell if you're right or not. ;)
I've seen 13900K results that are 10% lower than expected because a board enabled something it shouldn't have, or Ryzen 7000 series performance 10% lower than expected because a motherboard vendor's BIOS was trash on memory timings. Most reputable reviewers will cross-reference their results, but not all do.
At this point, given all the variances in testing, I consider anything less than 5% a tie and anything less than 10% a toss-up. Only at around 20% or more, at least with CPUs, does it get interesting and I start paying attention.
His numbers are literally impossible. The same applies to every game he tested, but take a look at Cyberpunk: at the 1:30 mark, for example, the 12700K has the 3090 Ti at 94% usage with 48 fps, while the 3D part has the 3090 Ti at 77% usage with 69 fps. LOL!
Just for your information, a 12700K can do 120 to 150 fps (depending on the area) in this game with RT on. With RT off it can probably hit 180-200 fps. All of these CPUs would get basically the exact same fps at 4K with a 3090 Ti; you need to drop to 720p to see any difference between them. His tests were done with a different card or different settings on each CPU.
This is my 13900K running Cyberpunk, and this is with RT on. With RT off, add around 30 fps!
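A quick back-of-the-envelope check shows why those utilization/fps pairs can't come from the same GPU at the same settings: dividing fps by GPU utilization approximates the frame rate the GPU would deliver if it were the only bottleneck, and the two quoted runs imply very different ceilings. This is just an illustrative sketch of that arithmetic, not how the video measured anything:

```python
# Rough sanity check: fps divided by GPU utilization approximates the
# fully GPU-bound frame rate. The same GPU at the same settings should
# imply roughly the same ceiling regardless of which CPU drives it.
def implied_gpu_bound_fps(fps: float, gpu_util_pct: float) -> float:
    return fps / (gpu_util_pct / 100.0)

print(round(implied_gpu_bound_fps(48, 94)))  # ~51 fps ceiling (12700K run)
print(round(implied_gpu_bound_fps(69, 77)))  # ~90 fps ceiling (X3D run)
```

A ~51 fps ceiling versus a ~90 fps ceiling from the same 3090 Ti is exactly the inconsistency being called out.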
2) It's $200 more than the Ryzen 5 7600X for what, a 10% gain in non-GPU-limited situations? Is it a great gaming CPU for benchmarks? Undoubtedly. Is it a bit hyped up and overpriced at launch? People can make a case for either side.
He is not even using the same GPU between the CPUs, lol
2) people will BS to justify their upgrade anyway
For me personally, if I'm doing a GPU upgrade I look for a minimum of 30% improvement, but often at least 40%. A 25% improvement just means I'm going from 30 fps to 38 fps, 60 fps to 75 fps, or 100 fps to 125 fps. I can't tell the difference in any of those individual comparisons. In fact, once I'm over 90 FPS, I really could not tell you the difference between that and my 144 Hz limit unless I have an FPS overlay up.
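The arithmetic behind those thresholds is plain percentage gain; a minimal sketch of what a uniform 25% improvement looks like at different baselines:

```python
# Percentage framerate gain between an old and a new fps value.
def pct_gain(old_fps: float, new_fps: float) -> float:
    return (new_fps - old_fps) / old_fps * 100.0

# The same 25% improvement yields small absolute gains at low baselines
# and larger ones at high baselines, which is why it can be hard to notice.
print(pct_gain(30, 37.5))   # 25.0
print(pct_gain(60, 75))     # 25.0
print(pct_gain(100, 125))   # 25.0
```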
Although most 13900KS/13900K owners have likely tweaked the shit out of their CPU, likely making up or even surpassing any difference the 7800X3D will have. This CPU is for people who want to buy 6000 CL30/32 memory and call it a day. It also seems to scale pretty well down to 5600 memory, though.
Just make sure to bring a high-end GPU to the party; I think the informed PC hardware buyer is already well aware of this. RT complicates things as well, because depending on which games you play and how much RT they actually use, the differences can be massive game to game, at both the GPU and the CPU level. I think it depends on the display tech too: my 165 Hz Nano IPS LG monitor feels/looks hella sluggish compared to my 120 Hz 4K OLED, likely due to its nearly 5x slower pixel response.
Also, at 60 Hz the IPS monitor feels/looks horrible, whereas the OLED still feels pretty good.
It's gotten to the point where I have a hard time even gaming on my IPS monitor anymore, and that's without accounting for how much better the OLED looks.
I've tried high-FPS gaming; when I play Fortnite I definitely prefer 165 Hz. But for single-player games, as long as frame pacing is consistent, I have no issue playing at 30-40 FPS.
However, once I had played a few times on my new LG TV, I had to retire my VA panel and buy a QD-OLED; that is the biggest improvement to my gaming setup, regardless of GPU. So I can go back to low FPS without any issue, but I cannot go back from OLED. It depends on each of us, I guess.
Is it pure rasterization?
Are you using something like DLSS 3, which generates an entirely synthetic frame every other frame at the cost of increased latency? Is that a worthy trade-off?
Also, 40-series owners are the only ones who can use it, see how it feels, and decide whether a slight input latency hit is worth the improved visual smoothness. Watching compressed 60 fps YouTube videos of it is pretty much useless.
I'd still say frame generation doesn't count toward framerate; it only improves visual smoothness. Don't get me wrong, Cyberpunk with Psycho RT running at 4K on an OLED at 120+ fps is hella impressive, but frames that don't improve latency shouldn't count.
Improved latency is the primary reason people desire high framerates, I'd be willing to bet. Guessing you mean the 7800X3D, but I agree: if efficiency is most important to someone, the X3D chips are in a class of their own. At the same time, if someone's system runs at 600 W with a 4090/13900K while gaming versus 500 W with an X3D chip, I doubt they care. I know I wouldn't.
If that were the case, everyone would buy a 4090/4080, tune it down, and pair it with a 7800X3D, because you couldn't match that efficiency at that performance level with anything else, budget be damned.
You know how fanboys can be, thinking anything their hardware can use should be represented, regardless of how variable it can make image quality. While I love RT/DLSS, I completely understand why others don't, and I even think frame generation is impressive for a first-generation technology, but I also don't care whether they are included in day-one hardware reviews.
I do hope to see more RT included in CPU reviews, since it is more demanding on the CPU than rasterization alone, but if reviewers don't have time for that, I don't blame them.