Thursday, January 5th 2023

AMD Confirms Ryzen 9 7950X3D and 7900X3D Feature 3DV Cache on Only One of the Two Chiplets

AMD today announced its new Ryzen 7000X3D high-end desktop processors to much fanfare, with availability slated for February 2023; you can read all about them in our older article. In our coverage, we noticed something odd about the cache sizes of the 12-core 7900X3D and 16-core 7950X3D. Whereas the 8-core, single-CCD 7800X3D comes with 104 MB of total cache (L2+L3), which works out to 1 MB of L2 cache per core and 96 MB of L3 cache (32 MB on-die + 64 MB stacked 3DV cache), the dual-CCD 7900X3D and 7950X3D were shown with total caches of 140 MB and 144 MB, when they should have been 204 MB and 208 MB, respectively, if both CCDs carried stacked cache.
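The arithmetic is easy to verify. Here is a minimal sanity-check sketch (Python, purely illustrative) using AMD's published figures of 1 MB L2 per core, 32 MB on-die L3 per CCD, and 64 MB per 3DV cache stack:

```python
# Back-of-the-envelope check of AMD's published cache totals (sizes in MB).
def total_cache_mb(cores, ccds, vcache_stacks):
    return cores * 1 + ccds * 32 + vcache_stacks * 64

print(total_cache_mb(8, 1, 1))   # 7800X3D: 104, matching AMD's figure
print(total_cache_mb(12, 2, 1))  # 7900X3D: 140, as shown (204 with two stacks)
print(total_cache_mb(16, 2, 1))  # 7950X3D: 144, as shown (208 with two stacks)
```

The published totals only add up if a single 64 MB stack is present per processor.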

In our older article, we explored two possibilities: one, that the 3DV cache is available on both CCDs but halved in size for whatever reason; and two, the more outlandish possibility that only one of the two CCDs has stacked 3DV cache, while the other is a normal planar CCD with just the on-die 32 MB L3 cache. As it turns out, the latter theory is right! AMD put out high-resolution renders of the dual-CCD 7000X3D processors in which only one of the two CCDs is shown with the L3D (L3 cache die) stacked on top. Even real-world pictures of the older "Zen 3" 3DV cache CCDs from the 5800X3D or EPYC "Milan-X" processors show a distinct appearance, with dividing lines between the L3D and the structural silicon that sits over the regions of the CCD holding the CPU cores. In these renders, we see those lines drawn on only one of the two CCDs.
It shouldn't be hard for such an asymmetric cache setup to work in the real world from a software perspective, given that we are now firmly in the era of hybrid-core processors thanks to Intel and Arm. Even well before "Alder Lake," when AMD started shipping dual-CCD client processors with the Ryzen 3000 "Matisse" based on "Zen 2," the company collaborated closely with Microsoft to optimize OS scheduling so that high-performance, less-parallelized workloads such as games are localized to just one of the two CCDs, minimizing DDR4 memory roundtrips.
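To illustrate what such localization amounts to in practice, here is a minimal sketch of pinning a process to one CCD from user space on Linux. The CPU numbers are a hypothetical mapping (we assume logical CPUs 0-15 are CCD0's eight cores plus their SMT siblings); a real tool would read the mapping from the system topology:

```python
import os

# Hypothetical: assume logical CPUs 0-15 (8 cores + SMT siblings) are CCD0.
CCD0_CPUS = set(range(16))

# Restrict the calling process (pid 0 = self) to CCD0 so its threads share
# one CCD's L3 and avoid cross-CCD traffic. Linux-only API.
os.sched_setaffinity(0, CCD0_CPUS)
print("Now running on CPUs:", sorted(os.sched_getaffinity(0)))
```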

Even before "Matisse," AMD and Microsoft confronted multi-threaded workload optimization challenges with dual-CCX architectures such as "Zen" and "Zen 2," where the OS scheduler would ideally localize a gaming workload to a single CCX before saturating both CCXs on one CCD, and only then spill over to the next CCD. This is achieved using methods such as CPPC2 preferred-core flagging, which is why AMD strongly recommends using the "Ryzen Balanced" Windows power plan included with its chipset drivers.
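For the curious, the CPPC preferred-core ranking is also exposed on Linux through sysfs. The sketch below (assuming a kernel and firmware that populate the acpi_cppc nodes; the files are absent on systems without CPPC support) ranks logical CPUs by the value the firmware reports:

```python
import glob
import re

# Rank logical CPUs by their CPPC "highest_perf" value; higher values mark
# the cores flagged as preferred, which the scheduler fills first.
ranking = []
for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/acpi_cppc/highest_perf"):
    cpu = int(re.search(r"cpu(\d+)", path).group(1))
    with open(path) as f:
        ranking.append((int(f.read()), cpu))

for perf, cpu in sorted(ranking, reverse=True):
    print(f"cpu{cpu}: highest_perf={perf}")
```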

We predict that something similar is happening with the 12-core and 16-core 7000X3D processors: gaming workloads benefit from being localized to the 3DV cache-enabled CCD, while any spillover workloads (such as the audio stack, network stack, and background services) are handled by the second CCD. In non-gaming workloads that scale across all cores, the processor works like any other multi-core chip; it's just that the cores on the 3DV-enabled CCD get better performance from the larger victim cache. There shouldn't be any runtime errors arising from ISA mismatch, as the CPU cores on both CCDs are the same "Zen 4."
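If the layout ships as rendered, software could in principle identify the 3DV-enabled CCD by comparing L3 sizes across cores. A speculative sketch using Linux's standard sysfs cache topology (index3 is the L3 level on current AMD parts; this assumes the stacked L3 is reported as a larger index3 size):

```python
# Group logical CPUs by the L3 slice they share, then treat the largest
# slice as the likely 3DV-cache CCD. Speculative, not a confirmed method.
l3_sizes = {}
cpu = 0
while True:
    base = f"/sys/devices/system/cpu/cpu{cpu}/cache/index3/"
    try:
        with open(base + "shared_cpu_list") as f:
            sharers = f.read().strip()              # e.g. "0-7,16-23"
        with open(base + "size") as f:
            size_kb = int(f.read().strip().rstrip("K"))
    except FileNotFoundError:
        break                                       # ran out of CPUs
    l3_sizes[sharers] = size_kb
    cpu += 1

if l3_sizes:
    vcache = max(l3_sizes, key=l3_sizes.get)
    print(f"Largest L3 ({l3_sizes[vcache]} KB) is shared by CPUs {vcache}")
```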

AMD Ryzen 7000X3D processors go on sale in February 2023.

164 Comments on AMD Confirms Ryzen 9 7950X3D and 7900X3D Feature 3DV Cache on Only One of the Two Chiplets

#151
HD64G
For new builds on AM5, gamers with expensive top-tier GPUs and high-refresh-rate, high-resolution monitors will get the X3Ds; budget gamers will get the non-X CPUs. Content creators will get the 7900X or 7950X.
#152
Gica
I never denied that an X3D is suitable for an expensive gaming build with top-of-the-line video cards. I have big doubts that an X3D helps entry-to-mid-range video cards and is worth the price. I repeat: in my opinion, it is better to buy a cheaper processor and put the money you save toward a more expensive video card. You will definitely WIN!
Note: the exception is kapone32, whose 5800X3D turned his 6500XT into a 6950XTX.
#153
kapone32
Gica: I never denied that an X3D is suitable for an expensive gaming build with top-of-the-line video cards. I have big doubts that an X3D helps entry-to-mid-range video cards and is worth the price. I repeat: in my opinion, it is better to buy a cheaper processor and put the money you save toward a more expensive video card. You will definitely WIN!
Note: the exception is kapone32, whose 5800X3D turned his 6500XT into a 6950XTX.
I never said that; all I was establishing is that the X3D is just as important for budget gaming as for high-end gaming. You seem to misunderstand how the CPU works. Better 1% lows will be good for any GPU, period. As I said in my post to you, going from the 5600X to the 5800X3D with a 6500XT gave me up to 30 more FPS in the games I play. Maybe get yourself an X3D chip (you could use it with an A320 board) and a 6500XT, and you'll understand what basically everyone who has an X3D chip says: it is great for gaming, period.
#154
medi01
Is it me, or does most of the AMD news on this site highlight some deficiency, no matter how minor or vague, in AMD's products?

E.g., "oh boy, notebook chips don't have PCIe 5."
#155
umeng2002
Games will always be mainly GPU-dependent, so don't go all out spending $1,000 on a CPU and motherboard for gaming and "just" $1,000 on a GPU... unless you're going for extremely high refresh rates at low settings.
#156
Gica
kapone32: I never said that; all I was establishing is that the X3D is just as important for budget gaming as for high-end gaming. You seem to misunderstand how the CPU works. Better 1% lows will be good for any GPU, period. As I said in my post to you, going from the 5600X to the 5800X3D with a 6500XT gave me up to 30 more FPS in the games I play. Maybe get yourself an X3D chip (you could use it with an A320 board) and a 6500XT, and you'll understand what basically everyone who has an X3D chip says: it is great for gaming, period.
I'm going to buy one to turn my 3070 Ti into the fastest video card on the planet: a 4090 Ti Super Titan. :peace: :peace: :peace:
The problem is that my current 11600KF gives me no reason to. I don't see how an X3D would help this video card; maybe you can explain to me how the 6500XT miracle happened.
P.S. When you made those statements, you claimed that you compared the X3D with the 5900X, not the 5600X.
For me it's simple with games: either it runs fluently, or it doesn't. If not, I act accordingly.

As an idea: an 11600KF + 500 GB 980 Pro PCIe 4.0 + Seasonic 650 W Gold (ordered together) cost less than the 5800X3D processor alone did last year. The processor is not a spectacular one, but it does its job well, and it was definitely a more sensible investment than combining an X3D with a 6500XT. If you say that games are not worth that money, well, with that money you buy the 5900X.
#157
big_glasses
Gica: The TPU review is in total agreement with Guru3D's, Tom's, and AnandTech's reviews, and for me that is enough. Bonus: Puget.
By the way, on the next page (Inventor) the 5800X3D loses on all fronts to the 5800X. And their conclusion is: "Those who rather want to work should therefore better keep their hands off the Ryzen 7 5800X3D in most cases, because what is offered is simply too little in relation to the other products."
PS: The 5800X3D costs as much as a 5900X or 13600K, and most likely a 7800X3D will cost as much as a 7900X. I wish you success in gaming for a lot of money, gentlemen.
Did you or did you not say that the non-X3D won in ALL workloads? (spoiler: you did, let me remind you again)
Gica: Let's not forget that, except for gaming, a possible 7600X3D will perform below the 7600X in other applications.
So let's go through this (again): you are wrong. The X3D will NOT be outperformed in all workloads/"other applications" by the non-X3D part.
I have no clue why you keep trying to deny this. Is there or is there not a (major) application where the X3D beats the non-X3D?
AutoCAD is a major application used by many industries (inb4 "use Intel", not the discussion)


Puget is mostly content creation, not all workloads. It's Adobe (and similar), rendering, and a wee bit of Unreal. It is not representative of all workloads! No diss on them, but they mostly do photo/video-editing-related reviews.
Workloads include, among others (and I'm not mentioning all): coding, CAD design, simulation (like physics), AI/ML, mixed server stuff (encoding, decoding, zip, VMs, etc.), degridding.
Again, you can just look at all the workloads in the Phoronix/OpenBenchmarking test suites to get even more.


Puget, TPU, Guru3D, Tom's, AnandTech, and Phoronix are in no way the end-all for determining whether a product fits a user's use case. They can give numbers and pointers, but the end user (or IT department xd) needs to weigh their requirements against cost.

An example would be if you already have an AM4 system. You mostly run productivity workloads plus some gaming; what (AM4) CPU do you upgrade to, the 5800X or the X3D?
If your workload is rendering or image editing, then probably the 5800X; if you do AutoCAD (as seen in Igor's review), you go with the X3D, especially if it's the 2D performance you need, where the X3D crushes it (411 vs. 313).
Same for (at least some) physics simulations (as seen in TPU's own review); on the other hand, if it's rendering, then the 5800X (or even a 59x0X) will obviously be better.

edit: just for reference, a mate of mine had an AutoCAD sim going for a couple of hours... that kind of difference (from Igor's review) is massive when accumulated over a longer time
#158
Gica
I said that in Inventor it loses. Overall, the 5800X3D is not worth the money in applications, except for gaming, and this is Igor's Lab's conclusion.
Only you dream of green horses on the walls. The conclusion has been drawn since the launch of the 5800X3D and will be repeated in February: if it helps you in gaming, it's worth it. For anything else, no! It's not worth it, because they will be more expensive and in many applications suffer a loss of performance.
You have to be a true fanboy to say that it is worth it next to a video card that is not top-tier when, for the same money, you can buy a processor with 4 extra cores (the 5900X in the case of the 5800X3D, a processor that destroys it in all applications, with the exception of gaming).
Let's not forget that the 5900X was AMD's gaming flagship until the 5800X3D. It is not a weak processor in this segment either, on the contrary.
Now, the 7900X is not weak in gaming. On the contrary. Why would I buy, for the same money, a 7800X3D, with 4 cores/8 threads fewer????
#159
big_glasses
Gica: I said that in Inventor it loses. Overall, the 5800X3D is not worth the money in applications, except for gaming, and this is Igor's Lab's conclusion.
No you didn't; I've reposted what you said every time. You said it loses in all applications except for gaming, and it does not. You can argue perf/cost if you want, or you can argue other processors are better, and that can be correct, but this is 3D vs. non-3D only. That claim is not true; it's a mixed bag of performance winners.
I will not go through the conclusion; that is, again, up to "your" specific use case.
You are also wrong about Igor's Lab's conclusion; he says most cases (my highlight):
Those who rather want to work should therefore better keep their hands off the Ryzen 7 5800X3D in most cases
These are your exact words:
Gica: Let's not forget that, except for gaming, a possible 7600X3D will perform below the 7600X in other applications.
This is not correct; we've gone through this multiple times now: for certain applications the X3D is more performant than the non-X3D.
Gica: Only you dream of green horses on the walls. The conclusion has been drawn since the launch of the 5800X3D and will be repeated in February: if it helps you in gaming, it's worth it. For anything else, no! It's not worth it, because they will be more expensive and in many applications suffer a loss of performance.
My highlights:
Gica: You have to be a true fanboy to say that it is worth it next to a video card that is not top-tier when, for the same money, you can buy a processor with 4 extra cores (the 5900X in the case of the 5800X3D, a processor that destroys it in all applications, with the exception of gaming).
False again. Better check the data again? Who wins here:

inb4: This one doesn't count... or some of the others where the 5800X3D wins
or some of the Phoronix benchmarks (from the 79x0X review): www.phoronix.com/review/amd-ryzen-7900x-7950x-linux/16
Gica: Let's not forget that the 5900X was AMD's gaming flagship until the 5800X3D. It is not a weak processor in this segment either, on the contrary.
Now, the 7900X is not weak in gaming. On the contrary. Why would I buy, for the same money, a 7800X3D, with 4 cores/8 threads fewer????
I have no fucking clue why you pull this in. I've (tried) to refute only your one statement that the X3D ALWAYS performs below the non-X3D version in non-gaming applications. Nothing else, and that has now been shown to be a false statement.
This specific statement:
except for gaming, a possible 7600X3D will perform below the 7600X in other applications.
Do you have reviews of the 7600X3D vs. the 7600X? No? Then, based on the 5800X3D vs. the 5800X, it is a false statement!
edit: removed a "then"
#160
efikkan
Warrior24_7: Every time I read about these chips, the performance over the 13900K decreases! It went from 30% to 10-15% faster performance. A joke.
Those figures are probably either fake or misrepresented. It is not likely we'll see an overall performance advantage of 30% over the 13900K.

I believe some of the estimates from AMD were more along the lines of the 7800X3D being up to 30% faster than the 5800X3D.

Zen 4 isn't very different from Zen 3, so we should have an idea of how it will scale, though the percentages will differ, of course.
#161
spnidel
damn, so much pathetic arguing for the sake of arguing in this thread
#162
Lovec1990
Question for you guys: is the 7900X3D a 6c/12t CCD + 6c/12t X3D CCD, or an 8c/16t X3D CCD + 4c/8t CCD?
#163
OneMoar
There is Always Moar
This isn't a big deal, but Microsoft needs to get on the ball with their core scheduler, because there are still some instances where it falls flat on its face.
#164
Aashishkebab
Space Lynx: my guess is the next X3D chips in a year or two will have both CCDs; it's probably just a backup plan for if Intel comes out swinging again soon, they can just add another X3D cache to the other CCD and swing back to take the crown again.

lol. dumb, they should have just swung all the way and slam dunked.
The 3D cache cores are clocked much lower due to thermal and voltage limitations. Plus, having cache on both chiplets would negate any benefit unless they combined all 16 cores into one CCD.