
AMD Ryzen 7 9800X3D

Yup, unless you're like me, someone who held out but then couldn't hold out any longer lol :D
In all fairness, a 7800X3D is more than enough for years to come.
 
Considering Arrow Lake is a flop for gaming and actually regressed, AMD didn't really need to do much with this. Honestly though, I was expecting more from this than just being 4% faster at 1080p over the 7800X3D. The higher gaming power consumption and lower gaming efficiency compared to the 7800X3D don't help either, but considering the difference is small and that it is still very damn efficient, it isn't really that big of an issue. The release of the 5090 will be very interesting, to see how much further the performance can be stretched.


Application performance is a major surprise this time around, with the 9800X3D being ahead of the 9700X.

Thanks for the review. It's interesting to see that contrary to the received wisdom, extra cache helps rendering as well.

But how, though? All of the 7000X3D variants are slower than their regular counterparts. The 9800X3D does have a higher base clock (4.7 GHz vs. the 9700X's 3.8 GHz), though; could that be the reason?
 
From my point of view, the 7800X3D remains the better choice for 4K:
10% better on average at 1080p, but only 1% at 4K.
All for power consumption that roughly doubles in productivity and gaming.
Intel has messed up so badly on power that it makes this Zen 5 X3D look better than it is.

Compared to the 7800X3D, if you read the data carefully, it's just not that impressive.
I was ready to replace my 7800X3D, but now I'm just saying meh, I don't know.
I'm not going to gamble with the stability of my system after two years for such a small margin, with €550 out of my wallet, when so much can go wrong when swapping a CPU.
The games released in the next two or three years will not fundamentally run better on a 9800X3D than on a 7800X3D.
I hope Zen 6 will be more mature and will be an evolution that is worth it.
Done.
 
But how, though? All of the 7000X3D variants are slower than their regular counterparts. The 9800X3D does have a higher base clock (4.7 GHz vs. the 9700X's 3.8 GHz), though; could that be the reason?
According to HUB/Techspot, clocks under load are similar to the 9700X. The 7000 X3D variants had significantly lower clocks than their regular counterparts. It also seems that Zen 5 was held back by DRAM latency far more than Zen 4. That makes sense; if you improve the core, then bottlenecks shift elsewhere.
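The cache-and-latency interplay above can be framed with the textbook average-memory-access-time formula: a bigger L3 converts DRAM round-trips into cache hits, which matters more when the core itself is faster. A rough Python sketch; every latency and miss-rate number here is purely hypothetical, not a measured value for any of these CPUs:

```python
# Average Memory Access Time: AMAT = hit_time + miss_rate * miss_penalty
# All numbers below are hypothetical, chosen only to illustrate the idea.

def amat(l3_hit_ns: float, l3_miss_rate: float, dram_penalty_ns: float) -> float:
    """Average cost of a memory access that reaches L3."""
    return l3_hit_ns + l3_miss_rate * dram_penalty_ns

# Tripling the L3 (as 3D V-Cache does) might cut the miss rate of a
# cache-hungry game workload, trimming DRAM trips off the average access:
small_cache = amat(l3_hit_ns=10.0, l3_miss_rate=0.30, dram_penalty_ns=70.0)
big_cache = amat(l3_hit_ns=10.0, l3_miss_rate=0.12, dram_penalty_ns=70.0)
print(f"{small_cache:.1f} ns vs {big_cache:.1f} ns")
```

Under these made-up numbers, cutting the L3 miss rate from 30% to 12% drops the average access cost from 31.0 ns to 18.4 ns; that is the kind of lever extra cache pulls, and why a core that is less latency-bound to begin with would show a smaller X3D penalty in applications.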
 
I will be keeping my 7800X3D until the 11800X3D after all. Cool.
 
@W1zzard

Random question. What was the idle temp of the CPU?

My 7800X3D normally idles at 41-44°C in CoreTemp, but that reading is taken from Tdie. Currently it's about 16°C ambient; Tdie is 39°C, but the actual core reading is 20-25°C in HWiNFO.



I've also manually set the SoC voltage to 1.2 V to run the chip a little cooler.

I want to know if there are any changes to the way thermals are read.
 
Great part. Excellent progress by AMD: using an older node than Intel (Arrow Lake) and yet delivering superior gaming performance. AMD should be commended for a key discovery for gamers: some games can and do benefit from extra L3 cache. Intel has yet to augment its Core design with additional L3 cache, and the question is: why not?

However, if you game at 1440p or 4K, a 9800X3D isn't really necessary. Great job, AMD!
 
When can I expect to buy this?
 
I just skimmed the results and saw the C/O slide; thanks for clarifying that.

Something I didn't mention above, but I believe HUB also uses the most CPU-limited section they can find. I could be wrong, but I think TPU chooses more of a middle-ground section, which can also have a large impact on the numbers.
They also outfitted the Intel SKUs with faster RAM than TPU did and that still wasn't enough.


CPU | Memory | OS
AMD Ryzen 9000 & 7000 Series | G.Skill Trident Z5 RGB 32GB DDR5-6000 CL30-38-38-96 | Windows 11 24H2
AMD Ryzen 5000 Series | G.Skill Ripjaws V 32GB DDR4-3600 CL14-15-15-35 | Windows 11 24H2
Intel Core Ultra 200S | G.Skill Trident Z5 CK 32GB DDR5-8200 CL40-52-52-131 | Windows 11 23H2 (24H2 = slower)
Intel 12th, 13th & 14th Gen | G.Skill Trident Z5 RGB 32GB DDR5-7200 CL34-45-45-115 | Windows 11 23H2 (24H2 = slower)
 
Benchmarking and its methods are a total mystery even to moderators, apparently...

You benchmark at lower resolutions to push the CPU to its limit and test its actual capability, the same reason you test a GPU at 4K even if it can't push 4K properly. A CPU test at 4K is as pointless as a GPU test at 720p. This gets explained every time there is a CPU benchmark, and people still can't comprehend it.

Also, high-refresh-rate gamers? You know, the guys who play esports and competitive shooters? They all play at 1080p, and they want HIGH framerates. So it's not a useless test anyway.

Don't forget games like Factorio that slurp up those cache gains, or games like StarCraft II and Sins of a Solar Empire: Rebellion.
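The low-resolution argument reduces to a simple model: each displayed frame takes as long as the slower of the CPU and GPU takes to do its part. A toy Python sketch, with every frame time hypothetical and chosen only to illustrate the principle:

```python
# Toy model: frame time = max(cpu_ms, gpu_ms); the slower component sets
# the pace. All numbers are hypothetical, for illustration only.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower of CPU and GPU limits each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_old, cpu_new = 5.0, 4.0  # the "new" CPU preps a frame 25% faster
for res, gpu_ms in [("1080p", 3.0), ("1440p", 5.5), ("4K", 12.0)]:
    print(f"{res}: {fps(cpu_old, gpu_ms):.0f} fps vs {fps(cpu_new, gpu_ms):.0f} fps")
```

At 1080p the GPU is fast enough that the full 25% CPU uplift is visible (200 vs 250 fps in this sketch); at 4K both land on the same number because the GPU sets the pace, which is exactly why CPU reviews test at low resolution.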

Isn't 1440p also a more important metric for 4K gaming, since most people have DLSS enabled when gaming at 4K, and that upscales from 1440p?
So gains at 4K should be a lot bigger with DLSS enabled, if I'm correct.
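That intuition matches NVIDIA's published per-axis DLSS scale factors: Quality mode renders at roughly two thirds of the output resolution per axis, so 4K Quality is internally a native 1440p render. A quick sketch (treat the Balanced figure as approximate):

```python
# Per-axis DLSS render scales (published presets): Quality ~2/3,
# Balanced ~0.58, Performance 0.5.
SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple:
    """Approximate internal render resolution for a DLSS output resolution."""
    s = SCALES[mode]
    return (round(out_w * s), round(out_h * s))

print(internal_res(3840, 2160, "Quality"))      # (2560, 1440): native 1440p
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080): native 1080p
```

With the GPU rendering a 1440p-sized frame, GPU load at "4K" drops toward the 1440p results, so CPU differences can show through more than native 4K numbers suggest.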
 
The king is dead, long live the king! Gaming performance is top-notch, just as expected. But we knew it was going to be a one-horse race against Arrow Lake.

What impresses me far more is the application performance. The 9800X3D benchmarks similar to the 14900K in typical tasks such as web browsing, Excel, and AV scanning, but also in professional uses like Git, Visual C++, MySQL, photogrammetry, genome analysis, and even AV1 encoding. And it is faster in Photoshop.

What worries me is the worse energy efficiency in everything compared to the 7800X3D, despite the newer fabrication process and architectural optimizations.

Would I consider upgrading from my 5800X3D? No, even if it were purely for gaming. My target is 4K60, so it would make little sense. And with my interest in AAA titles all but gone, I'm planning to hold on to my current config for a good while.
 
Hey @W1zzard,
Can you benchmark Chernobylite with two RTX 4090s on driver 566, at 4K ultra with RT enabled, on both the 7800X3D and the 9800X3D?
I'm getting tired of people complaining about the low CPU % difference at 4K.
 
Great review.

No NPU for AI acceleration isn't really a negative; there's no need to waste die space when this CPU will be paired with a dGPU that offers much better AI acceleration than an NPU would.
 
Great review.

No NPU for AI acceleration isn't really a negative; there's no need to waste die space when this CPU will be paired with a dGPU that offers much better AI acceleration than an NPU would.
The nitpickers are out in full effect
 
So it replaces the 7800X3D as the gaming chip to get, but it isn't worth switching from any other X3D chip, debatably not even from the 5700X3D, unless you are actually using a 4090 to game at 720p/1080p (which is obviously a beyond-niche scenario).
 
I think Stalker 2 will be an excellent benchmark for this CPU and we will see it very soon (November 20th).
 
Impressive gaming performance, but I'll repeat my comment from 2-3 days ago: right now this money buys a 14600K and a decent Z790 board. The 9700X and 9900X are also way better deals if someone doesn't own a 4090. Of course, prices (and value) can change...
Yep, I just got a 14600KF from Amazon Poland for €220 last week.
And I got an Assassin's Creed game, worth about €40, for free.
BEST DEAL EVER
 
Thank you @W1zzard for the great review as always.

So, as expected, the 9800X3D improves on the weak area of the 7800X3D, which was application/multithreaded performance. A 17% improvement in applications is pretty good, but the power draw is higher too. Although I thought the minimum FPS in games would improve further.

Anyway, a pretty good all-round CPU overall.
 
So, minimal gains at normal use-case resolutions (1440p and up) versus three generations back. Presumably the gap will be even smaller against a tuned 5800X3D or 5700X3D.

And this is with a 4090, which has under 1% market share.

It's better than Intel, but still marginal gains for the median use case, for a lot of money.
You realize that the 9800X3D could have been 99999% faster than the 7800X3D and you'd still see minimal gains at 1440p, right? Do you even understand what a GPU bottleneck is? What is the CPU supposed to do, ask the 4090 politely to go faster? What the hell, man?
 
You realize that the 9800X3D could have been 99999% faster than the 7800X3D and you'd still see minimal gains at 1440p, right? Do you even understand what a GPU bottleneck is? What is the CPU supposed to do, ask the 4090 politely to go faster? What the hell, man?

This is why I want more simulation games, where there might actually be a big difference even at 4K.
 
Mediocre results, on par with the rest of "Zen 5%".

The real insult here is B840. PCIe 3.0 in 2024 should be illegal.
 