
AMD Ryzen 7 9800X3D

So it replaces the 7800X3D as the gaming chip to get, but it isn't worth upgrading to from any other X3D chip, debatably not even from the 5700X3D, unless you are actually gaming on a 4090 at 720p/1080p (which is obviously a stupid, beyond-niche scenario).
Honestly, the most interesting thing I got out of this review is that the 4090 is STILL somewhat CPU-limited in some scenarios, even in realistic loads at 1440p and, in rare instances, even at 4K (okay, it's the notoriously CPU-heavy BG3, but still). This kinda tells me that the eventual 5090's real performance will essentially be a mystery in the launch review.
This little factoid is unironically more interesting than the chip in question.
 
The nitpickers are out in full effect
It's not nitpicking at all. NPUs are useful in a laptop, where you have to deal with power constraints.

These CPUs are going into desktop systems with dGPUs, which will blow an NPU out of the water in AI. AMD is smart not to waste die space putting one in.

A mid-tier 3060 GPU will destroy all these NPUs in AI.
 
I expect AMD to release all future Ryzens with 4 or 8 MB of L2 cache per core.
 
Honestly, the most interesting thing I got out of this review is that the 4090 is STILL somewhat CPU-limited in some scenarios, even in realistic loads at 1440p and, in rare instances, even at 4K (okay, it's the notoriously CPU-heavy BG3, but still). This kinda tells me that the eventual 5090's real performance will essentially be a mystery in the launch review.
This little factoid is unironically more interesting than the chip in question.

NVIDIA is probably fed up with this and said... okay, these guys are useless, I'll make CPUs myself to push the 5090.
:D
 
It's not nitpicking at all. NPUs are useful in a laptop, where you have to deal with power constraints.

These CPUs are going into desktop systems with dGPUs, which will blow an NPU out of the water in AI. AMD is smart not to waste die space putting one in.

A mid-tier 3060 GPU will destroy all these NPUs in AI.
I wasn't referring to your post, just in general. I thought you were correcting someone complaining about the lack of an NPU.
 
You realize that the 9800X3D could have been 99999% faster than the 7800X3D and you'd still see minimal gains at 1440p, right? Do you even understand what a GPU bottleneck is? What is the CPU supposed to do, ask the 4090 politely to go faster? What the hell, man?
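The GPU-bottleneck argument above can be sketched with a toy model, where delivered frame rate is capped by whichever stage is slower (all frame-rate numbers here are hypothetical, purely for illustration):

```python
# Toy model of a GPU bottleneck: the delivered frame rate is limited by
# whichever of the CPU or GPU takes longer per frame.
# All figures below are made up, for illustration only.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The slower of the two stages sets the frame rate."""
    return min(cpu_fps, gpu_fps)

# Hypothetical 1440p scenario: the GPU can render 120 fps,
# the old CPU can feed 160 fps worth of frames.
old_cpu = delivered_fps(cpu_fps=160, gpu_fps=120)

# A CPU twice as fast changes nothing while the GPU is the limit.
new_cpu = delivered_fps(cpu_fps=320, gpu_fps=120)

print(old_cpu, new_cpu)  # both capped at 120 by the GPU
```

Only once the GPU ceiling rises (a faster card, or lower resolution/settings) does the CPU difference show up in the delivered numbers.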

You do realize it's being sold as a gaming CPU? 99%+ of the gaming market doesn't even have a 4090, so it will be bottlenecked even worse. So what's the point of the product? Better off just buying a $200 CPU, since the performance will be identical?
 
You do realize it's being sold as a gaming CPU? 99%+ of the gaming market doesn't even have a 4090, so it will be bottlenecked even worse. So what's the point of the product? Better off just buying a $200 CPU, since the performance will be identical?
A $500 CPU isn't targeting 99% of the gaming market. It's targeting those with high-end GPUs and fast monitors. Please, man, stop being silly. You don't have to play at 4K native ultra settings; there are other options.
 
Better off just buying a $200 CPU, since the performance will be identical?
Fffuuuuyeaaahhnnnah, kinda-ish? To be fair, for a loooong time the go-to chips for gaming were i5-tier, and everything above was often considered unnecessary. The only reason the i7/R7 are recommended nowadays is the 8 physical cores, since that seems to be the goldilocks zone. If, say, AMD ever goes to 12- or 16-core CCDs and we get an 8-core R5... yeah, "just spend 250 bucks, brah" would not be out of the realm of possibility as advice, especially since people usually don't run halo-tier cards costing the equivalent of a small nation's GDP.
 
You do realize it's being sold as a gaming CPU? 99%+ of the gaming market doesn't even have a 4090, so it will be bottlenecked even worse. So what's the point of the product? Better off just buying a $200 CPU, since the performance will be identical?
What? Do you also buy your CPUs to play one game exclusively on a 4090 and then never upgrade a GPU ever again? Do you do basic reasoning? Do you know what a graphics options menu is? Upscaling? Playing older games at high refresh? Modding? Running stuff alongside your games?

CPUs survive several GPU upgrades, and a 9800X3D can last a good 8 years without breaking a sweat in gaming. How is this not an absolute killer product for 480,-?

And yes, a lot of gamers are better off buying a $200 CPU, since performance will indeed be identical for them. Isn't it great to have options?
 
Indeed....:D

You do realize it's being sold as a gaming CPU? 99%+ of the gaming market doesn't even have a 4090, so it will be bottlenecked even worse. So what's the point of the product? Better off just buying a $200 CPU, since the performance will be identical?

It's a $500 AAA top-tier gaming CPU, mate. It doesn't target anything below the high-end tier.
You get it if you have a 4080+ and you plan to stay in the same league when the 50 series is out.
 
It's a $500 AAA top-tier gaming CPU, mate. It doesn't target anything below the high-end tier.
You get it if you have a 4080+ and you plan to stay in the same league when the 50 series is out.
Or play Farthest Frontier at 60+ FPS in the late game, using 60 W, even at 4K, and even on a 4070 level of GPU performance or less (I'm using 60-70% of a 7900XT here).

There's more than triple A and RT ;)

[Screenshot: Farthest Frontier]
 
Mediocre results, on par with Zen5%.

The real insult here is B840. PCIe 3.0 in 2024 should be illegal.

Absolutely disagree with this statement; borderline shocking, tbh... As fevgatos said, the bottleneck is in the GPU anyway in most games, so regardless of how much faster it is, many games won't show significant differences, which drags the overall average down.

Also, an extremely important point: this is most useful for games that actually use and benefit from the CPU, like strategy games and simulators. Take Homeworld 3: it's 58%/56% faster than the 285K/14900K at 1080p, and no, we aren't talking about 500 vs 600 fps; it's 86 vs 136. It's 17% faster than even the 7800X3D. Those are real gains even with current-gen GPUs. In fact, in games that are CPU-bottlenecked in general, the gains are insane. Hogwarts Legacy has it at 43%/20% faster than the 285K/7800X3D at 1080p. Go figure.

What do you even expect from a gaming CPU if you think this is mediocre? You also realise that these results will look even better once the 5090 launches and takes away some of the GPU bottleneck? Forget Intel, who aren't even in the picture; this has substantial gains with current GPUs over a 7800X3D, and that's no small feat. I was fully expecting less improvement, with the 5090 needed to finally reveal the differences.

Agree with B840 though; no reason for that to exist, but it has nothing to do with the 9800X3D launch.
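For what it's worth, the relative-gain arithmetic quoted above checks out; a quick sketch using only the 86 vs 136 fps Homeworld 3 figures from the post:

```python
# Relative speedup between two frame rates, expressed as "X% faster".
def percent_faster(fast_fps: float, slow_fps: float) -> float:
    """How much faster fast_fps is relative to slow_fps, in percent."""
    return (fast_fps / slow_fps - 1) * 100

# Figures quoted in the post: 136 fps vs 86 fps in Homeworld 3 at 1080p.
gain = percent_faster(136, 86)
print(f"{gain:.0f}% faster")  # 58% faster, matching the quoted number
```

Note this is relative to the slower chip; the same pair expressed the other way ("the slower chip is X% slower") would give a smaller percentage, which is a common source of confusion in these threads.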

I expect AMD to release all future Ryzens with 4 or 8 MB of L2 cache per core.
Not happening, because you also have to factor in latency when designing the L2 floorplan and size.
A 2 MB jump in Zen 6 is possible, but 4 MB is out of the question IMO.
 
Absolutely disagree with this statement; borderline shocking, tbh... As fevgatos said, the bottleneck is in the GPU anyway in most games, so regardless of how much faster it is, many games won't show significant differences, which drags the overall average down.

Also, an extremely important point: this is most useful for games that actually use and benefit from the CPU, like strategy games and simulators. Take Homeworld 3: it's 58%/56% faster than the 285K/14900K at 1080p, and no, we aren't talking about 500 vs 600 fps; it's 86 vs 136. It's 17% faster than even the 7800X3D. Those are real gains even with current-gen GPUs. In fact, in games that are CPU-bottlenecked in general, the gains are insane. Hogwarts Legacy has it at 43%/20% faster than the 285K/7800X3D at 1080p. Go figure.

What do you even expect from a gaming CPU if you think this is mediocre? You also realise that these results will look even better once the 5090 launches and takes away some of the GPU bottleneck? Forget Intel, who aren't even in the picture; this has substantial gains with current GPUs over a 7800X3D, and that's no small feat. I was fully expecting less improvement, with the 5090 needed to finally reveal the differences.

Agree with B840 though; no reason for that to exist, but it has nothing to do with the 9800X3D launch.

I guess it's a matter of perspective. It's really not a generational leap over the 7800X3D, but it does perform very nicely IMO. Then again, so did the 7800X3D, and that one seems to be much better value for money, at least until the prices on this one settle a little.

Absolutely agreed on the rubbish B840 chipset though. It's apparently even worse than the A620.
 
Absolutely disagree with this statement; borderline shocking, tbh... As fevgatos said, the bottleneck is in the GPU anyway in most games, so regardless of how much faster it is, many games won't show significant differences, which drags the overall average down.

Also, an extremely important point: this is most useful for games that actually use and benefit from the CPU, like strategy games and simulators. Take Homeworld 3: it's 58%/56% faster than the 285K/14900K at 1080p, and no, we aren't talking about 500 vs 600 fps; it's 86 vs 136. It's 17% faster than even the 7800X3D. Those are real gains even with current-gen GPUs. In fact, in games that are CPU-bottlenecked in general, the gains are insane. Hogwarts Legacy has it at 43%/20% faster than the 285K/7800X3D at 1080p. Go figure.

What do you even expect from a gaming CPU if you think this is mediocre? You also realise that these results will look even better once the 5090 launches and takes away some of the GPU bottleneck? Forget Intel, who aren't even in the picture; this has substantial gains with current GPUs over a 7800X3D, and that's no small feat. I was fully expecting less improvement, with the 5090 needed to finally reveal the differences.

Agree with B840 though; no reason for that to exist, but it has nothing to do with the 9800X3D launch.


Not happening, because you also have to factor in latency when designing the L2 floorplan and size.
A 2 MB jump in Zen 6 is possible, but 4 MB is out of the question IMO.
Offers too little over the 7800X3D to justify that price.
We shall see how much the picture improves with the 5090, but as it is right now, it's less efficient and overpriced.

And the new chipsets are an undeserved downgrade. Somehow Intel now offers a better low-end chipset lineup; how did this happen?
 
Best gaming CPU in the world; AMD has done it again!

I won't be upgrading from the 7800X3D; at 1440p gaming we are talking about 5% performance gains.

And the 7800X3D is just more efficient at 1440p and 4K gaming. I don't feel bad about owning a 7800X3D; it came out to $248 in the $470 Micro Center deal I got 6 months ago.

I will wait for the 10800X3D; hopefully it will have slightly better power consumption, with the same gaming performance as the 9800X3D or better.
 
Offers too little over the 7800X3D to justify that price.
We shall see how much the picture improves with the 5090, but as it is right now, it's less efficient and overpriced.

And the new chipsets are an undeserved downgrade. Somehow Intel now offers a better low-end chipset lineup; how did this happen?
Intel chipsets are usually a bit better, though it doesn't really matter if the chips aren't good.

You can pick up a good B650 for really cheap these days though... so I'm not sure which chipset you're referring to?
 
An improvement, but less of an improvement than hoped for.
 
So, no need to upgrade for 4K gaming, and I'm already happy with application performance. Excellent review; interesting to know my struggle with 6400 1:1 wasn't just me.
 
Intel chipsets are usually a bit better, though it doesn't really matter if the chips aren't good.

You can pick up a good B650 for really cheap these days though... so I'm not sure which chipset you're referring to?
The B650 is on an EOL path; we won't have it as an option going forward.
 
Waiting for the 9950X3D, as this architecture seems to love the extra cache.

The performance uplift is just OK, but the price increase is not. Not a fan of the power consumption either; one would hope it could be as efficient as the older X3D chips if tuned properly.
 
Offers too little over the 7800X3D to justify that price.
We shall see how much the picture improves with the 5090, but as it is right now, it's less efficient and overpriced.

And the new chipsets are an undeserved downgrade. Somehow Intel now offers a better low-end chipset lineup; how did this happen?

It offers "too little" because of a GPU bottleneck. It's pretty apparent just how much faster it is when it's not.

As I said, for people looking at a CPU upgrade to play games that actually benefit from it, double-digit gains over a 7800X3D, even with a 4090, are nothing short of impressive IMO.

Overpriced? Sure, for an 8-core chip it's terribly expensive, but the 7800X3D was roughly the same price, and you get much faster productivity and even a 5-15% bump in games. Considering the 7800X3D got gobbled up left, right and center, AMD has no incentive to price this any lower, more so now that the competition is much further away than before.
 
Very disappointed in the 9800X3D's gaming 1% lows (145 vs 143 fps at 1080p) and power consumption (65 W vs 43 W gaming, and 155 W! vs 74 W in MT) vs the 7800X3D.
The 7800X3D still looks like the better overall CPU, at least for gamers.
Yeah, they're terrible
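The efficiency side of that complaint is easy to quantify from the gaming figures quoted above (145 fps at 65 W vs 143 fps at 43 W; simple division, no other assumptions):

```python
# Performance per watt from the gaming numbers quoted in the post above.
def fps_per_watt(fps: float, watts: float) -> float:
    """Frames delivered per watt of package power."""
    return fps / watts

chip_9800x3d = fps_per_watt(145, 65)  # ~2.23 fps/W
chip_7800x3d = fps_per_watt(143, 43)  # ~3.33 fps/W

print(f"9800X3D: {chip_9800x3d:.2f} fps/W, 7800X3D: {chip_7800x3d:.2f} fps/W")
```

On those numbers the 7800X3D delivers roughly 50% more frames per watt in this gaming scenario, which is the crux of the efficiency argument either way you read the thread.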
 
I guess it's a matter of perspective. It's really not a generational leap over the 7800X3D, but it does perform very nicely IMO. Then again, so did the 7800X3D, and that one seems to be much better value for money, at least until the prices on this one settle a little.

Absolutely agreed on the rubbish B840 chipset though. It's apparently even worse than the A620.

In games that are CPU-heavy it does pretty well, and with Intel's performance regression in gaming, this could be a generation, if not two, beyond its competition. The 7800X3D is soon to be EOL; people should have scooped that up in June/July when it was around $330.

24H2 seems to help it quite a bit as well, and now I'm at least mildly excited about what the 9950X3D does, other than the price likely being stupid due to lack of competition. I don't think we can expect more than 5-10% generation to generation going forward anymore, so this might be king for a long while, at least until Zen 6 X3D is a thing; and honestly, if that's radically different, can we even count on a good increase...
 