
Zen 5 only 16 cores.

Right, you don't know, because you only have 6 cores.

Oh yeah. I totally forgot that nobody else with a higher core count plays GTA V. My mistake.

how would anyone know if it's yet to be released on PC?


Maybe it scales past six, maybe a 4c/8t runs it just fine. It needs to be properly tested, but I highly doubt it's core dependent as opposed to performance dependent.

I don't know. I could be from the future :rolleyes:

 
Hot take. Why?
As an outsider looking in, I dislike the division it causes. A CPU specifically made just for games, but it can do other stuff ok. I would prefer a CPU that can do it all, just like what we had before :D

As an insider who owns one, I dislike how locked down the CPU is, as I prefer to tune manually. If I wanted a Dell I would have bought one :)

Outside of that, it isn't a bad CPU, and pretty easy to cool. And the performance at lower resolutions is pretty good..
 
As an outsider looking in, I dislike the division it causes. A CPU specifically made just for games, but it can do other stuff ok. I would prefer a CPU that can do it all, just like what we had before :D

As an insider who owns one, I dislike how locked down the CPU is, as I prefer to tune manually. If I wanted a Dell I would have bought one :)

Outside of that, it isn't a bad CPU, and pretty easy to cool. And the performance at lower resolutions is pretty good..
I hear you on the locked down point at least. Valid point.
 
I for one am sad that we still don't see a bump to e.g. 2x 12-core CCDs.

This is for compilation, lots of it. And I need the full max per-core speed, too.
 
I wouldn't be surprised if Zen 6 gets 12-16 per CCD, but I wouldn't put it past AMD to go to a hybrid arch with Zen C cores, or whatever they're calling them these days...

I have a feeling this will be the case. We already see with Strix Point that "12 cores" is not the whole story... you can't just keep scaling up L3 when L3 doesn't keep scaling, so on the 12c Strix the L3 is not quite uniform throughout the CCX.

I'm not writing off 5c just yet, because it's not necessarily crippled on freq/V-F like 4c is, but the less reliance on CPPC2 the better. It hasn't got the best track record, much like Intel's Thread Director.

Core count is fine. What needs to go is the ancient-ass cheapo chiplet CPU interconnect. Wanna leverage CCD1 properly in games? Wanna make use of a better UMC for higher DDR5 speeds? Get rid of IFOP, bring us an Infinity Link derivative.
 
That was very insightful :)
 
No need for more than 16c/32t on a desktop platform whatsoever; even then I would say that is too much for a general consumer platform. Gaming doesn't need more than 8c absolute MAX, and only in minute scenarios! And if productivity is what you're after and you need more than 16c, then you'd be better suited to a workstation. Who gives a flying fig if Intel have 8 big (peen) cores and 12 little ones? AMD can compete without frankensteining their CPUs, keep the gaming edge, and use shit tons less energy. Zen 5 in July, no new Intel parts until Q3/Q4. Can't wait for my new 500 W RaptorMeteorRainbowLake home heater.

As an outsider looking in, I dislike the division it causes. A CPU specifically made just for games, but it can do other stuff ok. I would prefer a CPU that can do it all, just like what we had before :D

As an insider who owns one, I dislike how locked down the CPU is, as I prefer to tune manually. If I wanted a Dell I would have bought one :)

Outside of that, it isn't a bad CPU, and pretty easy to cool. And the performance at lower resolutions is pretty good..
You have a 5900X, not an X3D; you chose that. If you wanted better gaming performance at the expense of productivity, you could have gone X3D. Likewise, if you don't want the best in either, but roughly as much as the best of both, you could go Intel and consume 400 W of CPU power. #pentium4
 
I don't think there's much of a need to uplift core counts yet. With intel losing Hyper-threading, that might give AMD some multi-threaded breathing room that Intel's been pressuring against heavily with their e-core developments which, despite the criticism, are incredibly space efficient for multi-threaded tasks. And if the stated IPC uplifts of 16% for AMD and 14% for Intel pan out, AMD should do just fine in single threaded and gaming still.

The big question on my mind is this; the 7800X3D was a nice uplift over the 5800X3D, and Zen 5 is looking to be a bigger IPC uplift over Zen 4 than Zen 4 was over Zen 3, but how much of that was due to the DDR5 ecosystem? So how much will the 9800X3D offer over the 7800X3D? Time will tell.

While core count didn't go up, I'm loving the lower TDP numbers. Maybe they wanted to balance performance uplift with that this time around. In a world where everything is going up, up, up in power draw, this is nice. The X3Ds are already incredibly efficient in performance per watt in gaming (shout-out to the 7900 non-X in particular for multi-threaded stuff too), so if they get better? Yes, please. Maybe Zen 6 (which might end up on AM5 after all, given the recent update to 2027+ support for it) or Zen 7 will be when we see cores per CCD go up. Gaming doesn't need more than 16 cores (even over 6 or 8 is a luxury and not a necessity). The consoles are 8/16 and this will be fine for the foreseeable future. For multi-threaded tasks, higher options on both sides exist, so it's not like when we were stuck in the quad core era.
As an outsider looking in, I dislike the division it causes. A CPU specifically made just for games, but it can do other stuff ok. I would prefer a CPU that can do it all, just like what we had before :D

As an insider who owns one, I dislike how locked down the CPU is, as I prefer to tune manually. If I wanted a Dell I would have bought one :)

Outside of that, it isn't a bad CPU, and pretty easy to cool. And the performance at lower resolutions is pretty good..
I don't like the mindset that the X3D chips can only play games and suffer massively outside of it. That's false. A Zen 3 based chip with extra cache can do productivity just about as well as a Zen 3 chip without. It's slightly worse due to the lower clock speeds, but it's not much. The fact that the cache doesn't help everything doesn't make it worse for it.

Of course, from a buying perspective, if you wanted productivity gains, you wouldn't spend extra money on a chip with extra cache that won't help performance in those situations. It doesn't make the chip a bad performer at those things, though.

I'd also like to offer a counterpoint to the bold part; those chips we had before couldn't do it all, could they? They were obviously leaving performance in select scenarios on the table by lacking cache! Designing a chip is a balancing act. The fact that v-cache options exist, even if they don't help everything, is good! It's choice for consumers! We need more of that, not less.

I'm so in love with my 5800X3D (came from a 3700X, so only one generation older) that as long as these are offered, I'll probably never go back to a non-high-cache chip. This thing has been an unbelievable uplift in certain scenarios (Minecraft is a good example), as well as in 1% lows, where the modest "20%" uplift over the standard 5800X is actually understated.

With Intel still dealing with moving to a non-monolithic chip, I'm looking forward to them bringing their own high-cache options out in later generations, as opposed to AMD "burying" theirs. As of right now, AMD has zero reason to do that unless they want to cede the gaming crown back to Intel. So I'll be looking at the 9800X3D (or whatever it's called) and then hoping Intel and AMD both continue to innovate, or I'll simply go with whoever has the best option later on.

As for overclocking, I already didn't do that on my 3700X so the X3D's loss there doesn't impact me at all. CPUs and GPUs have gotten good at boosting to their limits, and this cuts in on the attraction of overclocking for the lazy casuals like me.
 
As an outsider looking in, I dislike the division it causes. A CPU specifically made just for games, but it can do other stuff ok. I would prefer a CPU that can do it all, just like what we had before :D

OK, let's ignore the X3D and use the 4090 at 4K as a reference point. The 7950X gives you 6% fewer fps than the 14900K/S and both CPUs are great for applications with the 7950X trading slightly lower performance for ~40% lower power consumption, choose your preference. The gaming differences will be smaller to zero with lower end GPUs.

At 1440p you get the same fps (and likely 6% difference) with a 4070 Super, 3090 or 7900 GRE. Everything else is the same with fps differences reduced to zero with lower end GPUs.

At 1080p you get the same fps (and likely 6% difference) with a 4060 Ti/3070 Ti and 6800/7700XT. Everything else is the same, yadda yadda yadda.

So unless you're using those GPUs or better at lower resolutions, the 7950X is a fine do-it-all pick, just like the 14900K/S. You'll get 6% lower fps, and I wonder if you'd notice 112 fps vs. 119 fps while you were playing.
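In frame-time terms that 6% is tiny. A quick back-of-the-envelope with the same example numbers (no new benchmarks, just the fps figures from above):

```python
# Quick frame-time math for the 6% fps gap above.
# The fps numbers are the example averages from the post, not new measurements.
def frame_time_ms(fps: float) -> float:
    """Average time per frame in milliseconds."""
    return 1000.0 / fps

fps_7950x = 112
fps_14900k = 119

delta_ms = frame_time_ms(fps_7950x) - frame_time_ms(fps_14900k)
print(f"{frame_time_ms(fps_7950x):.2f} ms vs {frame_time_ms(fps_14900k):.2f} ms per frame")
print(f"difference: {delta_ms:.2f} ms per frame")
```

So the "6% slower" CPU costs you about half a millisecond per frame; good luck feeling that without an fps counter on screen.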
 
OK, let's ignore the X3D and use the 4090 at 4K as a reference point. The 7950X gives you 6% fewer fps than the 14900K/S and both CPUs are great for applications with the 7950X trading slightly lower performance for ~40% lower power consumption, choose your preference. The gaming differences will be smaller to zero with lower end GPUs.

At 1440p you get the same fps (and likely 6% difference) with a 4070 Super, 3090 or 7900 GRE. Everything else is the same with fps differences reduced to zero with lower end GPUs.

At 1080p you get the same fps (and likely 6% difference) with a 4060 Ti/3070 Ti and 6800/7700XT. Everything else is the same, yadda yadda yadda.

So unless you're using those GPUs or better at lower resolutions, the 7950X is a fine do-it-all pick, just like the 14900K/S. You'll get 6% lower fps, and I wonder if you'd notice 112 fps vs. 119 fps while you were playing.

Enthusiasts tend to grossly overestimate the CPU impact on games. I wager if you sit a gamer in front of a black box with a 4080, 98% wouldn't be able to tell the difference between anything 7600X/13600k and up.
 
If you are wanting more than 16C/32T CPUs for AM5, then you might want to wait for AMD to release its AM5 Epyc CPUs.
 
I don't even see any use for more than 8 cores in a pure gaming PC. Shouldn't it already tell you something that R7 X3D chips are the fastest CPUs for gaming?
 
You only need 2-4 cores for so many games.

(two image attachments)
Absolute nonsense. That might be true for some games, or for games from 7 or 8 years ago, not for many modern titles. Granted, a 16-core CPU is a bit overkill for gaming, but a 6-core should be considered the starting point if 1440p is the target resolution. 8-core CPUs should be the target if 2160p is the target resolution.

Let's remember also that gaming is not always the exclusive activity to consider when selecting the CPU model to buy.
 
As an outsider looking in, I dislike the division it causes. A CPU specifically made just for games, but it can do other stuff ok. I would prefer a CPU that can do it all, just like what we had before :D

If you dislike the division it causes, it gets worse... the X3D magic doesn't shine across all games but in a select few (mostly CPU-bound titles), and it even does worse in some titles vs faster-clocked non-X3D. Division widened :p

Either way, non-X3D parts are also top-notch for gaming, and those who can benefit where X3D shines have the option to indulge - which can only be a positive thing. But I totally get it: non-X, X, X3D... if only things were simpler, with a single string of SKUs and core counts being the only divider.

Enthusiasts tend to grossly overestimate the CPU impact on games. I wager if you sit a gamer in front of a black box with a 4080, 98% wouldn't be able to tell the difference between anything 7600X/13600k and up.

Yep, even some of that more affordable 12400/5600 juice with the FPS counter obscured would have us quenched too. Can't speak for 1080p anymore, but at 1440p even with a 5800X3D I'm mostly GPU-limited in the BIG graphics-galore screeners, or 144 Hz-bottlenecked with tuned-down settings/games which are by design visually tasteless. The more concerning issue being most people can only dream of landing a 4080 to keep up with today's already invigorated processors.
 
As an outsider looking in, I dislike the division it causes. A CPU specifically made just for games, but it can do other stuff ok. I would prefer a CPU that can do it all, just like what we had before :D

So you mean like the 7950X3D which provides top of the line single-thread, multi-thread, and gaming performance while consuming a fraction of the power compared to the competition. X3D enables the very all-arounder CPU you seem to be referring to.

X3D cache provides power savings across the board. The notion that it's only beneficial to games is false.

Don't really understand the "it can do other stuff ok" comment either. No, it can do everything exactly as well as the base chip except for efficiency and gaming which both see a huge boost. Unless you think the non-X3D chips are horrid, it's strictly better than "ok".
 
So you mean like the 7950X3D which provides top of the line single-thread, multi-thread, and gaming performance while consuming a fraction of the power compared to the competition. X3D enables the very all-arounder CPU you seem to be referring to.
I am on AM4, you can reel yourself back in now :)

The notion that it's only beneficial to games is false.
Really? What else is it good for?

Don't really understand the "it can do other stuff ok" comment either. No, it can do everything exactly as well as the base chip except for efficiency and gaming which both see a huge boost. Unless you think the non-X3D chips are horrid, it's strictly better than "ok".
Yawn.
 
My title was based on the link I posted. Or are you clueless and did not get that? Maybe you did not bother to actually look at the link. Maybe I could have been a bit more creative with the title. No need to be offensive though, is there? Enjoy being ignored.



People always treat them as if they are; seems to me most people do.

Who's being "offensive"? :wtf: And oh gawd, some random person on a tech forum is ignoring me. What ever shall I do? :cry::cry: /s

But let's get back to the topic. You said:

Seems Zen 5 might be only 16 cores. It's all you need for gaming though, so I guess it's good enough.

*Only* 16 cores... you say this as if we should have server-grade core counts on mainstream PCs. Back when Ryzen first arrived, if you needed (or wanted) anything higher than 8 cores, you went HEDT with Threadripper, which started at 8 cores. But the fact is, unless you have more money than sense and figure a 16-core CPU is better than something like a 7800X3D for straight gaming, ya really don't see any tangible benefit in getting the 16-core vs. an 8-core. All I'm saying is 16 cores might be "good enough" for tasks that demand them. Gaming, however, isn't really one of them.
 
So you mean like the 7950X3D which provides top of the line single-thread, multi-thread, and gaming performance while consuming a fraction of the power compared to the competition. X3D enables the very all-arounder CPU you seem to be referring to.

X3D cache provides power savings across the board. The notion that it's only beneficial to games is false.

Don't really understand the "it can do other stuff ok" comment either. No, it can do everything exactly as well as the base chip except for efficiency and gaming which both see a huge boost. Unless you think the non-X3D chips are horrid, it's strictly better than "ok".
It does better in both speed and efficiency for workloads with a working set larger than the 30 MB or so of L3 on non-X3D and competing processors, but not much exceeding the 96 MB offered. Game engines apparently land in that sweet spot more often than most workloads.
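That sweet spot is easy to sketch. A toy model (cache sizes are the round numbers from above; the all-or-nothing fit test is obviously a simplification of gradual real-world behavior):

```python
# Toy model: does a workload's hot working set spill a standard L3
# but still fit in the stacked cache? 32 MB and 96 MB are the rough
# per-CCD figures discussed above; real cache behavior is gradual,
# not a step function.
L3_STANDARD_MB = 32
L3_X3D_MB = 96

def in_x3d_sweet_spot(working_set_mb: float) -> bool:
    """True when the hot set overflows 32 MB of L3 but fits in 96 MB."""
    return L3_STANDARD_MB < working_set_mb <= L3_X3D_MB

# e.g. a tight kernel, a game engine's hot data, a huge render dataset
for ws in (16, 64, 512):
    print(f"{ws:>4} MB working set -> X3D sweet spot: {in_x3d_sweet_spot(ws)}")
```

Below 32 MB the extra cache sits idle; far above 96 MB you're going to DRAM either way.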

As for power savings, I wonder if they bin CCDs for undervolt stability. I know mine is right at the limit, being unstable with any negative all-core CO. It still boosts above rated frequency at 4.2 GHz all-core even when limited to 65 W, even on AVX-512 workloads, so I'm not complaining.

I think synthetic benchmarks are not kind to it, since their working sets are often either smaller or much larger than that. Same with many productivity applications.
Really? What else is it good for?
It's one of the best things to have happened so far for Prime95 number crunchers, outpacing by a significant margin any other consumer processor currently in production, for instance. :p
 
*Only* 16 cores... you say this as if we should have server-grade core counts on mainstream PCs. Back when Ryzen first arrived, if you needed (or wanted) anything higher than 8 cores, you went HEDT with Threadripper, which started at 8 cores. But the fact is, unless you have more money than sense and figure a 16-core CPU is better than something like a 7800X3D for straight gaming, ya really don't see any tangible benefit in getting the 16-core vs. an 8-core. All I'm saying is 16 cores might be "good enough" for tasks that demand them. Gaming, however, isn't really one of them.
Agreed. 6-core CPUs have become the value point and an 8-core is practically everything a gamer needs. Maybe a streamer needs more cores, but not every gamer streams. And even then, a fast 16-core CPU should be more than enough.
 
So you mean like the 7950X3D which provides top of the line single-thread, multi-thread, and gaming performance while consuming a fraction of the power compared to the competition. X3D enables the very all-arounder CPU you seem to be referring to.

X3D cache provides power savings across the board. The notion that it's only beneficial to games is false.

Don't really understand the "it can do other stuff ok" comment either. No, it can do everything exactly as well as the base chip except for efficiency and gaming which both see a huge boost. Unless you think the non-X3D chips are horrid, it's strictly better than "ok".
X3D CPUs aren't inherently more efficient; they're just forced to use more efficient power limits. A 7950X limited to the same power as the 7950X3D will perform identically or outperform it in any task that doesn't use the extra cache.

Also, you're ignoring the economic factor. X3D CPUs are usually significantly more expensive and only perform better in specific tasks; for people that don't care about those, they are clearly worse than the standard models.
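For what it's worth, the iso-power comparison is just performance-per-watt arithmetic. A minimal sketch (the scores below are made-up placeholders to show the math, not benchmark results):

```python
# Performance per watt at a matched power limit.
# The scores and the 120 W limit are illustrative placeholders,
# NOT measured results for either chip.
def perf_per_watt(score: float, watts: float) -> float:
    """Efficiency metric: benchmark points per watt consumed."""
    return score / watts

POWER_LIMIT_W = 120.0
hypothetical_scores = {
    "7950X   @ 120 W": 38000.0,
    "7950X3D @ 120 W": 37500.0,
}

for name, score in hypothetical_scores.items():
    print(f"{name}: {perf_per_watt(score, POWER_LIMIT_W):.0f} pts/W")
```

The point being: whichever chip scores higher at the *same* wattage is the more efficient one, regardless of out-of-the-box power limits.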
 
So you mean like the 7950X3D which provides top of the line single-thread, multi-thread, and gaming performance while consuming a fraction of the power compared to the competition. X3D enables the very all-arounder CPU you seem to be referring to.

X3D cache provides power savings across the board. The notion that it's only beneficial to games is false.

Don't really understand the "it can do other stuff ok" comment either. No, it can do everything exactly as well as the base chip except for efficiency and gaming which both see a huge boost. Unless you think the non-X3D chips are horrid, it's strictly better than "ok".
According to this very site's review, the 7950X3D is closer to the i7-14700K than to "the top of the line". It's middle of the pack in ST performance (losing to i5s). Not a bad CPU at all, but top-of-the-line performance it isn't.
 
A CPU specifically made just for games, but it can do other stuff ok. I would prefer a CPU that can do it all, just like what we had before :D
It wasn't made for gaming, it was made for Azure (Microsoft); it's just that gaming and some other applications could make better use of it as well. Someone at AMD was lucky/smart enough to test it at the right time!

 
X3D CPUs aren't inherently more efficient; they're just forced to use more efficient power limits. A 7950X limited to the same power as the 7950X3D will perform identically or outperform it in any task that doesn't use the extra cache.

We know that the X3D chips are in fact more efficient because when you set the scheduler to prefer cache their efficiency increases further:

(chart: efficiency comparison with the scheduler set to prefer cache)


Therefore the cache itself does improve efficiency and not just the frequency limits imposed on X3D processors.

Also you're ignoring the economic factor.

No, it was just never a part of the conversation. It's illogical to imply that everything not discussed is purposefully being ignored.

X3D CPUs are usually significantly more expensive, and only perform better in specific tasks

$100 more expensive, whether that's significant is subjective.

people that don't care about those they are clearly worse than the standard models.

Which is irrelevant, because my argument was about the objective merits of the product. Subjective, ever-shifting metrics aren't really interesting, particularly on the internet, where those metrics will change based on whatever argument the person wants to support.


Cool, we get it you don't care about efficiency. Very constructive.
 