
AMD Confirms Ryzen 9 7950X3D and 7900X3D Feature 3DV Cache on Only One of the Two Chiplets

Joined
Jun 6, 2022
Messages
622 (0.72/day)
Get yourself some glasses. You can find it in every processor review :peace:.
Latest TPU review. Enjoy your reading!

However, it is impossible to find (to pick a random example) how much the X3D helps an RX 6600 or RTX 3060 compared to the X counterpart. :confused:
 
Joined
Apr 30, 2011
Messages
2,695 (0.55/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Can you fight, though?
Which part is unclear to you? I can help you. Unlike you, who offered me a "fix", I will also use arguments.
I am willing to discuss the performance of the X3D models vs the X only after you admit being mistaken about your baseless power draw theory.
 
Joined
Jun 6, 2022
Messages
622 (0.72/day)
The arithmetic is simple:
7000X
7000X + 3D cache = 7000X3D
The only way the 7950X3D (for example) can consume the same as or less than the 7950X is to reduce the frequency.
This reality is officially confirmed:
7950X: 4.5 GHz base clock
7950X3D: 4.2 GHz base clock (300 MHz less)
7900X: 4.7 GHz base clock
7900X3D: 4.4 GHz base clock (300 MHz less)
What we don't know yet is how far AMD will push the limits of these processors. Only the reviews will show us the real consumption in applications, but it is certain that the X3D will be weaker in all applications that do not respond to the extra cache, such as rendering and encoding.
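For what it's worth, the clock deltas quoted above check out; a few lines of Python, using only the official base clocks listed here (boost clocks and actual power behaviour are a different story and will only be known from reviews):

```python
# Official base clocks as quoted in the post (GHz).
base_clocks_ghz = {
    "7950X": 4.5, "7950X3D": 4.2,
    "7900X": 4.7, "7900X3D": 4.4,
}

# Compute the base-clock deficit of each X3D part vs its X sibling.
for x, x3d in (("7950X", "7950X3D"), ("7900X", "7900X3D")):
    delta_mhz = round((base_clocks_ghz[x] - base_clocks_ghz[x3d]) * 1000)
    print(f"{x3d}: {delta_mhz} MHz lower base clock than {x}")
# Both pairs come out to a 300 MHz deficit.
```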

I don't know if it has been discussed before now, but there is a problem with the AM5 platform if you want to use all the memory slots.
 

Attachments

  • Clipboard01.jpg
Joined
Apr 30, 2011
Messages
2,695 (0.55/day)
Some useful info that answers many of the questions about the Zen 4 X3D models.

 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,159 (2.83/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Something like: 4090 versus 4080 with:
intel pentium 4
I seriously hope that this was an attempt at sarcasm. I cannot fathom the usefulness of such a comparison. I can maybe imagine comparisons with CPUs as old as Sandy Bridge (given skt2011 had PCIe 3.0 support), but a Pentium 4? That's laughable to say the least. I'd even go so far as to say the Core 2 series would also be laughable to compare.
 
Joined
Jun 6, 2022
Messages
622 (0.72/day)
It was sarcasm, he knows why. The idea of testing the X3D with weaker (but up-to-date) video cards doesn't seem bad to me. I bet that if, instead of the 5800X3D, you choose the 5800X and invest the price difference in a more powerful video card, you come out ahead. I'm referring to the entry-level to mid-range video cards, where no one has tested how much the X3D helps, because even the 5800X was among the most powerful processors in gaming; it's not a Pentium.
In the video: a 3090 Ti with a 13-year-old i7-860. And with a video card from 13 years ago plus the most powerful gaming processor of 2023, you can't play anything even at 1080p.
 
Joined
Jun 2, 2017
Messages
8,849 (3.28/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Are your glasses good? How do you think the 7700X3D (8 cores) and the 7950X3D (16 cores) will have the same power consumption?
The entire 7000X3D range is listed with a 120 W default TDP, but we will see the real consumption in the reviews.

As for the 7000X versus the 7000X3D, the consumption of that extra cache is added to the total consumption.
So, the X3D variants:
1. They will consume more than the X variants, or
2. They will consume the same or less, in which case the frequencies will be lower than on the X versions, with the performance penalty that the 5800X3D also suffered against the 5800X, with the exception of games.

It was sarcasm, he knows why. The idea of testing the X3D with weaker (but up-to-date) video cards doesn't seem bad to me. I bet that if, instead of the 5800X3D, you choose the 5800X and invest the price difference in a more powerful video card, you come out ahead. I'm referring to the entry-level to mid-range video cards, where no one has tested how much the X3D helps, because even the 5800X was among the most powerful processors in gaming; it's not a Pentium.
In the video: a 3090 Ti with a 13-year-old i7-860. And with a video card from 13 years ago plus the most powerful gaming processor of 2023, you can't play anything even at 1080p.
Total conjecture on your part. You have no idea what the advantages of the X3D are. It doesn't matter though, as your posts show that you don't understand how the X3D is better in gaming.
 
Joined
May 20, 2020
Messages
1,360 (0.84/day)
I just cannot see why they (AMD) could not put the same amount (however much) of cache on each chiplet. Do we really need another scheduler nightmare (and customer dissatisfaction)?
 
Joined
Jun 10, 2014
Messages
2,973 (0.79/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
This outcry is probably completely out of proportion. We are talking about L3, which is an LLC, a spillover for L2. The likelihood of poor performance from an asymmetric L3 cache configuration is fairly low.
But I'm sure many of you will blame this when the X3D models fail to live up to the extreme hype (when in reality many have shown unrealistic expectations), even though having extra L3 on both CCDs would result in the same problem of inconsistent latencies across L3.

The design makes sense/is logical for gaming purposes.

Games are mostly single- or lightly threaded, so you would want to keep all those threads on the one CCD for latency's sake. So it also makes sense to only have the X3D cache on one CCD, the one with the fastest/best cores. I imagine having it on both would create latencies that would hinder game performance, but in EPYC scenarios it's used totally differently.
I don't think there are many (or any) AAA PC games from the past 10 years that have been single-threaded; it's actually not possible to have rendering that's independent of I/O and simulation without at least two threads, which is why this has been common since the early 2000s. But this doesn't mean every thread carries significant load, so that may be the source of your confusion?

Still, games do use relatively "few" threads, if that's what you mean, but they still need top performance and low latency between certain threads. This is more important than whether the extra L3 is on the same die or not, but even core-to-core latency variance doesn't matter a whole lot.
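To illustrate the "at least two threads" point above, here's a minimal Python sketch (the names and timings are made up for illustration, not taken from any real engine): a simulation thread advances the game state at its own fixed rate, while the main thread "renders" by simply reading whatever state is current, decoupled from the simulation tick:

```python
import threading
import time

state = {"tick": 0}          # shared game state
lock = threading.Lock()      # guards access to the shared state
stop = threading.Event()

def simulate():
    # Fixed-rate simulation loop, independent of rendering.
    while not stop.is_set():
        with lock:
            state["tick"] += 1
        time.sleep(0.001)    # pretend each simulation step takes ~1 ms

def render_once():
    # The "renderer" just reads the latest state; it never blocks on the sim.
    with lock:
        return state["tick"]

sim = threading.Thread(target=simulate, daemon=True)
sim.start()
time.sleep(0.05)             # let the simulation run for a while
frame = render_once()        # render observes an advanced tick, not tick 0
stop.set()
sim.join()
print(f"rendered at simulation tick {frame}")
```

This is the structure that has been common since the early 2000s: the render side never waits for a full simulation step, which is exactly why even "lightly threaded" games need at least two threads with low latency between them.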
 
Joined
Dec 12, 2012
Messages
763 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
I just cannot see why they (AMD) could not put the same amount (however much) of cache on each chiplet. Do we really need another scheduler nightmare (and customer dissatisfaction)?

If both dies had the V-cache, productivity performance would be worse because of lower clocks. I doubt gaming performance would improve much from the second cache die.

The price would probably go up by another $100 or so, for no actual benefits.

Scheduling is a problem, but in a worst-case scenario you can always assign affinity manually.
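For anyone wondering what "assign affinity manually" looks like in practice: on Linux you can use `taskset`, or `os.sched_setaffinity` from Python; on Windows, Task Manager or `start /affinity` does the same job. A small sketch (Linux-only; the idea that cores 0-7 are the ones under the V-cache is my assumption for illustration, so verify it against your actual topology first):

```python
import os

# os.sched_setaffinity only exists on Linux, so guard the call.
if hasattr(os, "sched_setaffinity"):
    # Intersect with the current mask so this doesn't fail on small CPUs.
    cache_ccd = {0, 1, 2, 3, 4, 5, 6, 7} & os.sched_getaffinity(0)
    os.sched_setaffinity(0, cache_ccd)  # pin this process (pid 0 = self)
    print("affinity now:", sorted(os.sched_getaffinity(0)))
```

Whether games' launchers or the chipset driver end up doing this automatically is exactly the open question with the asymmetric design.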
 
Joined
May 20, 2020
Messages
1,360 (0.84/day)
If both dies had the V-cache, productivity performance would be worse because of lower clocks. I doubt gaming performance would improve much from the second cache die.
The price would probably go up by another $100 or so, for no actual benefits.
Scheduling is a problem, but in a worst case scenario, you can always assign affinity manually.
Well, I guess it all comes down to price. I'm curious to see how it pans out in the real world. Are users really that enthusiastic about micro-managing their CPUs?
 
Joined
Sep 4, 2022
Messages
280 (0.36/day)
Well, I guess it all comes down to price. I'm curious to see how it pans out in the real world. Are users really that enthusiastic about micro-managing their CPUs?
Well, if the R9 Zen 4 3D chips' performance in professional non-gaming applications is less than the 7950X (currently selling for $569) and the 7900X (currently $440), due to a significant number of cores clocked lower at base and boost, how much can they really charge? Also, the gaming performance would have to be superior to the i9-13900K, currently selling at $599, for them to be more expensive, and the i7-13700K is close to $400. And if the gaming performance is superior, Intel would have no choice but to lower its pricing, eventually making those chips more attractive for Z690 users. I believe AMD knows this, hence the delay in pricing.
 
Joined
Jun 6, 2022
Messages
622 (0.72/day)
Total conjecture on your part. You have no idea what the advantages are for the X3D. It doesn't matter though as your posts show that you don't understand how the X3D is better in Gaming,
AMD has every interest in launching a range that will shine in gaming. Just business. If the reviewers use the X3D in their test systems instead of a 13900K(S), hooray!
The trade-offs are a bit more delicate for the user; when choosing an X3D instead of an X, the following must be taken into account:
1. Higher price compared to its non-X3D siblings.
2. Less aggressive frequency curve (reduced performance in the many applications that are not affected by the extra cache).
3. Only 89°C Tmax.
4. Very limited overclocking (manual overclocking is impossible).
5. How much it actually helps, practically speaking.
In short, a lot of compromise.
 
Joined
Oct 30, 2022
Messages
236 (0.33/day)
Location
Australia
System Name Blytzen
Processor Ryzen 7 7800X3D
Motherboard ASRock B650E Taichi Lite
Cooling Deepcool LS520 (240mm)
Memory G.Skill Trident Z5 Neo RGB 64 GB (2 x 32 GB) DDR5-6000 CL30
Video Card(s) Powercolor 6800XT Red Dragon (16 gig)
Storage 2TB Crucial P5 Plus SSD, 80TB spinning rust in a NAS
Display(s) MSI MPG321URX QD-OLED (32", 4k, 240hz), Samsung 32" 4k
Case Coolermaster HAF 500
Audio Device(s) Logitech G733 and a Z5500 running in a 2.1 config (I yeeted the mid and 2 satellites)
Power Supply Corsair HX850
Mouse Logitech G502X lightspeed
Keyboard Logitech G915 TKL tactile
Benchmark Scores Squats and calf raises
Ok, someone explain what the cache and core layout on the 7900X3D is.
CCD A = x cores, x cache
CCD B = y cores, y cache

Because I can't see a way it works out simply (the CCDs are either both 6-core, or one 8-core and one 4-core).
 
Joined
Apr 30, 2011
Messages
2,695 (0.55/day)
Ok, someone explain what the cache and core layout on the 7900X3D is.
CCD A = x cores, x cache
CCD B = y cores, y cache

Because I can't see a way it works out simply (the CCDs are either both 6-core, or one 8-core and one 4-core).
6+6 cores with 3D cache on top of one of the CCDs.
 
Joined
Oct 30, 2022
Messages
236 (0.33/day)
6+6 cores with 3D cache on top of one of the CCDs.
Ok, so instead of 64 MB extra shared between 8 cores, it's shared between 6...

Sounds like a 7600X3D is definitely an option now (or should be).

Wonder if the roughly 25% more cache per core at the same boost clock will help.
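A quick back-of-the-envelope on that per-core figure, assuming the cache CCD carries 96 MB of L3 in total (32 MB native + 64 MB stacked, as on the 5800X3D); under that assumption it actually works out closer to a third more per core:

```python
# Assumed: 96 MB total L3 on the cache CCD (32 MB native + 64 MB stacked),
# mirroring the 5800X3D's configuration.
total_l3_mb = 96
per_core_8 = total_l3_mb / 8   # 8-core cache CCD (7800X3D-style): 12 MB/core
per_core_6 = total_l3_mb / 6   # 6-core cache CCD (7900X3D):       16 MB/core
gain = per_core_6 / per_core_8 - 1
print(f"{per_core_6:.0f} MB vs {per_core_8:.0f} MB per core (+{gain:.0%})")
```

Of course, L3 is shared, so "per core" is only a rough way to reason about it; a single game thread can touch the whole 96 MB either way.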
 
Joined
Jun 6, 2022
Messages
622 (0.72/day)
3D cache will only be used from the 7800X3D upward.
A full 8-core CCX with 3D cache + a CCX with 4 cores (7900X3D) or 8 cores (7950X3D).
For a 7600X3D the price is not justified: more expensive than the 13600K, with some wins in gaming but effectively destroyed in the other applications. Let's not forget that, except for gaming, a possible 7600X3D would perform below the 7600X.
 
Joined
Aug 10, 2021
Messages
166 (0.14/day)
System Name Main
Processor 5900X
Motherboard Asrock 570X Taichi
Memory 32GB
Video Card(s) 6800XT
Display(s) Odyssey C49G95T - 5120 x 1440
3D cache will only be used from the 7800X3D upward.
A full 8-core CCX with 3D cache + a CCX with 4 cores (7900X3D) or 8 cores (7950X3D).
For a 7600X3D the price is not justified: more expensive than the 13600K, with some wins in gaming but effectively destroyed in the other applications. Let's not forget that, except for gaming, a possible 7600X3D would perform below the 7600X.
wow, do you already have reviews? Or do you base that on how the 5800X and 5800X3D performed...?
1673250076068.png

inb4: This doesn't count, since it doesn't fit your workload usage...
 
Joined
Sep 23, 2022
Messages
30 (0.04/day)
Every time I read about these chips, the claimed performance advantage over the 13900K shrinks! It went from 30% faster to 10-15%. A joke.
 
Joined
Jun 6, 2022
Messages
622 (0.72/day)
wow, do you already have reviews? Or do you base that on how the 5800X and 5800X3D performed...?
View attachment 278262
inb4: This doesn't count, since it doesn't fit your workload usage...
The TPU 5800X3D review. Start there and compare with the 5800X. Take care that on many charts lower is better, sir.

Puget Systems:
Overall, the Ryzen 5800X3D is a great proof of concept product for using 3D V-Cache on desktop CPUs, but we would recommend following AMD's advertising of the product, and using it primarily for gaming. For content creation, it can be on par with the standard 5800X in many cases, but in others, you could encounter a decent drop in performance depending on the application you are using.
 

Attachments

  • puget.jpg
Joined
Oct 30, 2022
Messages
236 (0.33/day)
3D cache will only be used from the 7800X3D upward.
A full 8-core CCX with 3D cache + a CCX with 4 cores (7900X3D) or 8 cores (7950X3D).
For a 7600X3D the price is not justified: more expensive than the 13600K, with some wins in gaming but effectively destroyed in the other applications. Let's not forget that, except for gaming, a possible 7600X3D would perform below the 7600X.
Problem with that is they don't currently make 4-core CCXs, so why start now?

Also, the result for the 5800X vs the X3D is the same story; it's another product in their SKU stack though. Except for gaming, the 5800X3D has limited use cases.
 
Joined
Apr 1, 2017
Messages
420 (0.15/day)
System Name The Cum Blaster
Processor R9 5900x
Motherboard Gigabyte X470 Aorus Gaming 7 Wifi
Cooling Alphacool Eisbaer LT360
Memory 4x8GB Crucial Ballistix @ 3800C16
Video Card(s) 7900 XTX Nitro+
Storage Lots
Display(s) 4k60hz, 4k144hz
Case Obsidian 750D Airflow Edition
Power Supply EVGA SuperNOVA G3 750W
I seriously hope that this was an attempt at sarcasm. I cannot fathom the usefulness of such a comparison. I can maybe imagine comparisons with CPUs as old as Sandy Bridge (given skt2011 had PCIe 3.0 support), but a Pentium 4? That's laughable to say the least. I'd even go so far as to say the Core 2 series would also be laughable to compare.
damn you guys take yourselves way too seriously
 
Joined
Jun 2, 2017
Messages
8,849 (3.28/day)
AMD has every interest in launching a range that will shine in gaming. Just business. If the reviewers use the X3D in their test systems instead of a 13900K(S), hooray!
The trade-offs are a bit more delicate for the user; when choosing an X3D instead of an X, the following must be taken into account:
1. Higher price compared to its non-X3D siblings.
2. Less aggressive frequency curve (reduced performance in the many applications that are not affected by the extra cache).
3. Only 89°C Tmax.
4. Very limited overclocking (manual overclocking is impossible).
5. How much it actually helps, practically speaking.
In short, a lot of compromise.
I guess you have one and can explain what no one has ever had. You are free to your opinion, but you should preface it as such instead of presenting it like this. The clock speed on the 7950X3D is 5.7 GHz boost, 5.0 GHz normal. Now let's use objective reasoning. If a 5800X3D could clock to 5 GHz, then we could use it as a basis. If we reasonably think about the jump from 5000 to 7000 and realize that you get a huge bump in productivity tasks and clock speed, that must also be considered. And when, according to reviews, the heat limit is not a factor in performance, that is mitigating. All of this together leads one to think that the X3D will be that good, and is probably why AMD is bringing the launch of these chips forward by about 3 months. If you want a 13900KS, be my guest, but don't try to make people feel forlorn for considering a product that has no user information yet, just because you think it will be the same as the current offerings.

By the way, thank you for the Puget Systems assessment of what every owner of an X3D chip already knows. What they should have added is that you will have so much fun gaming and pushing your 1% lows that it will become a fading memory that the 5800X3D is not as fast in productivity as its non-cache brother or sister.

Have you seen AMD's new laptop lineup? A 30-hour battery life is nothing to sneeze at for business purposes, especially given that a large portion of the workforce has had to buy new tech recently and the motto in business is "as cheap as possible". That meant that when we transitioned to working from home, everyone in my company got laptops. Guess what kind of laptops they were? Yep, that's right: AMD, because they have better APUs, period. The efficiency that AMD has achieved will bode well for them with corporate customers going forward.

Did you really mention manual OC and AM5 together (for a regular user)? If you step back, all AMD has done is expand their stack. If you don't want an X3D chip, you can still enjoy AM5. If you don't want an X chip, you can get a non-X chip and be happy. There will probably be motherboard refreshes alongside both launches too, like a B620 with the non-X chips and a revised B650 and X670 lineup with the X3D, but the current lineup is also fine. More boards means more choice; what it also hopefully means is that board pricing will come back to reality.
 
Joined
Apr 30, 2011
Messages
2,695 (0.55/day)
Ok, so instead of 64 MB extra shared between 8 cores, it's shared between 6...

Sounds like a 7600X3D is definitely an option now (or should be).

Wonder if the roughly 25% more cache per core at the same boost clock will help.
I think a 7600X3D won't materialise, as the price hike would be big in percentage terms for the entry-level CPU of the Zen 4 series. And AMD most probably isn't willing to cannibalise their other CPU sales. They could also make X3D iterations only for the 12- and 16-core CPUs if they cared only about higher margins and not market share.
 
Joined
Aug 10, 2021
Messages
166 (0.14/day)
The TPU 5800X3D review. Start there and compare with the 5800X. Take care that on many charts lower is better, sir.

Puget Systems:
Overall, the Ryzen 5800X3D is a great proof of concept product for using 3D V-Cache on desktop CPUs, but we would recommend following AMD's advertising of the product, and using it primarily for gaming. For content creation, it can be on par with the standard 5800X in many cases, but in others, you could encounter a decent drop in performance depending on the application you are using.
Did you also read that it was a 5800X3D vs 5800X review that I linked?
lmao, thank you for proving my point:
inb4: This doesn't count, since it doesn't fit your workload usage...
Those are not (potential, unlikely) 7600X3D reviews... do you have those...?
And even in some of Puget's applications, the 3D part performs higher.

This is what you said, sir. My highlight:
Let's not forget that, except for gaming, a possible 7600X3D will perform below the 7600X in other applications.
These are your words. You are the one claiming it will perform below. Not that it will be mixed or application-dependent, but that in non-gaming workloads it will perform below.

W1zzard's review also has the 3D part sometimes being better than the non-3D in gaming workloads.

And a workload doesn't need to be rendering or Photoshop.

 